The mayor of London has written to the owner of the King’s Cross development demanding to know whether the company believes its use of facial recognition software in its CCTV systems is legal.
Sadiq Khan said he wanted to express his concern a day after the property company behind the 27-hectare (67-acre) central London site admitted it was using the technology “in the interests of public safety”.
In his letter, shared with the Guardian, the Labour mayor writes to Robert Evans, the chief executive of the King’s Cross development, to “request more information about exactly how this technology is being used”.
Khan also asks for “reassurance that you have been liaising with government ministers and the Information Commissioner’s Office to ensure its use is fully compliant with the law as it stands”.
The owner of King’s Cross is one of the first property companies to acknowledge it is deploying facial recognition software, even though the technology has been criticised by the human rights group Liberty as “a disturbing expansion of mass surveillance”.
But the company has provided little detail of how the software is used beyond saying it employed “a number of detection and tracking methods, including facial recognition” in its CCTV systems.
Police forces use cameras running the software to scan faces in large crowds in public places such as streets, shopping centres, football stadiums and music events such as the Notting Hill carnival. The images harvested can then be compared with a database of suspects and other persons of interest.
But the legality of the technology is unclear, and South Wales police are being challenged in the courts in a test case, backed by Liberty, brought by an office worker who said it was wrong to scan an image of his face when he was not suspected of wrongdoing.
Camden council, the local authority in which King’s Cross falls and whose headquarters are in one of the development’s buildings, said the use of CCTV and facial recognition software had to be seen to be accountable.
A spokesperson for the Labour-run authority added: “The public will want to be reassured that they are not being monitored inappropriately – as do we.” Insiders said that Camden had first learned of the use of facial recognition technology on its doorstep in media reports.
On Monday, the ICO, the data protection regulator, said it was studying the deployment of the surveillance technology by King’s Cross and other companies to examine whether its use was “strictly necessary and proportionate”.
There are also concerns that facial recognition technology has a racial bias. A US study concluded that the error rate for darker-skinned women ranged between 21% and 35% in software supplied by three companies. The error rate for lighter-skinned men was 1%.
Alluding to those concerns, Khan’s letter to Evans said that London’s public spaces “should be places that all Londoners, regardless of their age, ability, gender, gender identity, religion, race, sexual orientation or social class, can enjoy and use confidently and independently, avoiding separation or segregation”.