Think about surveillance, and you’re likely to picture CCTV cameras on street corners.
But Benjamin Chekroun, ESG analyst at Candriam, says there is more to surveillance technology companies than meets the eye, and investors aren’t always aware of this.
‘When we started looking into surveillance technology, we realized that all major tech giants were developing their own facial recognition solutions, despite how different their business models are,’ Chekroun told Citywire Selector.
This includes Google, Amazon, Meta, Apple and Cisco, as well as Chinese tech giants such as Alibaba, Huawei and Tencent. In addition, many semiconductor and camera manufacturers produce components for facial recognition products.
Selling personal data to third-party advertisers has been the main money maker for companies such as Meta, which derived 97.4% of its 2021 revenues from advertising, according to its Q4 and full-year 2021 results.
Facial recognition software, Chekroun believes, is the next frontier.
‘For these firms, it’s not about making money from selling the technology: one online retailer is offering the technology for free when customers take up other services. It’s about harvesting more behavioral data, to process it and benefit from it.
‘Tech giants have become experts at exploiting online data; today the new frontier for them is the physical world – and facial recognition technology will enable them to collect this important commodity. We want to make sure this is done in an ethical way and that the right safeguards are in place,’ said Chekroun.
The challenge of identifying which enabling technology companies are involved in surveillance technology has also been noted by Karen Kharmandarian, president of Thematics Asset Management and co-manager of the Thematics AI & Robotics fund.
‘Beyond companies that are identified for developing surveillance technologies, such as Hikvision in China or Palantir in the US, it is often difficult to ascertain how extensively products and services embedding enabling technologies are used for surveillance.
‘Either because the companies have no precise idea of the ultimate use, or sometimes do not want to disclose it,’ Kharmandarian told Citywire Selector.
Benefiting from the personal
‘As investors, we love technology,’ Chekroun said. ‘Technology has brought us some brilliant investments in the last decade or so. But there’s a real financial risk, there’s reputational risk, and we fear that risk has gone untold.’
It was against this backdrop that Candriam launched its collaborative engagement project with 51 other firms in 2021, to address the civil rights risks associated with facial recognition technology. In the year since its inception, Chekroun believes facial recognition software has become more widespread, but criticism has grown louder.
Kharmandarian agreed. ‘Surveillance technologies are most of the time being used in a legal and regulatory vacuum, with no clear framework as to how they should be used.
‘There is a growing recognition of the need to develop more effective policy and an adequate framework to ensure oversight, transparency and accountability, and to monitor what these companies do.’
This is evident in the growing number of companies being sued for collecting faceprints without consent, as Amazon, Google and Microsoft were in 2020.
In 2021, facial recognition company Clearview AI faced a flurry of US lawsuits, and a zoo in China was sued for forcing visitors to submit a facial scan. In February this year, the US state of Texas filed a privacy lawsuit against Meta, which had already paid $650m (€597m) to settle a similar case in the US state of Illinois.
Similarly, the need for investor engagement has been emphasized by non-profit organizations such as the Business & Human Rights Resource Centre, Heartland Initiative and Access Now, which recently launched a due diligence guide.
Beyond face value
Candriam’s engagement project has spent the past year mapping out the human rights policies in place among 30 key surveillance technology players, to be compiled into an annual report of best practices. After consulting with experts, the group is set to enter a second wave of dialogues regarding improvements.
Its questions for companies currently concern disclosures and technical aspects: Do they have a human rights policy, and does it reference facial recognition technology? How do they design their facial recognition products? Where do they source their databases? Do they publicly test their systems?
The reception so far, Chekroun said, has been mixed, with some companies reluctant to disclose much over the past two years. But pressure from civil society and the nascent regulatory framework have encouraged major players to be more open to constructive dialogue.
Similarly, Kharmandarian believes that investors have a responsibility and fiduciary duty to engage and ensure that investee companies are not unnecessarily exposed to legal, reputational and financial risks.
‘We need to gain a deeper understanding of the potential human rights issues going unnoticed in our companies’ value chains and to develop a framework for making rights-respecting investment decisions.’
Kharmandarian’s own process begins with a behaviour-based screening to exclude companies that violate or neglect international human rights agreements, standards and directives, as well as those facing repeated allegations of severe human rights abuses and those lacking human rights policies throughout the value chain.
His team also gauges whether a company’s governance, policies and practices are strong enough to identify, assess and address human rights abuses, and scrutinises product lifecycle management to check for human rights violations in the value chain or among end users.