Sarah Staniforth

Facial recognition technology needs to be effectively regulated to protect minorities

Does the implementation of facial recognition technology (FRT) by the Metropolitan Police signal the rise of the surveillance state? Or is it merely a way of making policing more efficient, at a time when police forces are feeling the strain of successive years of budget cuts?

In 2018, the number of police officers per head in London fell to its lowest in two decades, placing enormous strain on the police force. Using FRT for tasks such as spotting criminals could reduce the number of officers needed. So far, the Met claim to have used FRT six times as part of a trial of the technology. At the moment, we're a long way off an FRT dystopia. But one day, things could look very different.

If we look to China, the world's leader in FRT, we can see the extent to which the technology can be used to violate citizens' human rights. FRT surveillance is ubiquitous in China, but the use of the technology to surveil ethnic minorities is particularly intense. For example, in the Muslim-majority region of Xinjiang, FRT is reportedly used to prevent people from venturing more than 300 metres beyond areas deemed 'safe' by the government. Some critics of the regime have described the area as a 'police state', with residents required to undergo face scans to carry out everyday activities like shopping at the market. In 2017, a fifth of all arrests in China took place in the region, despite it being home to a mere 1.5% of the country's population.

Nonetheless, the Chinese government claims that this is all in the public's interest, pointing to the supposed terrorist threat posed by the Uighur Muslim minority that makes up most of Xinjiang's population. From an outside perspective, however, it's difficult to see the targeting of an ethnic group for FRT surveillance as anything but a state-sanctioned act of racism.

Of course, Britain is a very different place to authoritarian China. But both countries have problems with institutional racism that shape police use of FRT, and the situation in China gives us an indication of how the technology could be used to racially profile minorities.

For one of its FRT 'trials', the Met targeted attendees of 2017's Notting Hill Carnival (the largest celebration of African-Caribbean culture in the UK), attempting to 'match' crime suspects in the crowd to images in their database. By choosing a largely black event to test the technology, they contributed to racist perceptions of the Afro-Caribbean community. And the trial demonstrably did nothing to fight crime: the only correct 'match' was an individual who had already been through the justice system for their offence, and who was erroneously arrested as a result.

One false arrest might not seem like much, but if you were the person wrongfully arrested, you would probably feel differently. Furthermore, 35 incorrect matches were made between carnival-goers and images stored in the database, demonstrating that the technology is currently highly inaccurate. This raises questions about whether the police should be allowed to use results from the trials as grounds for arrest at all, given the technology's current level of sophistication.
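To spell out the simple arithmetic behind that claim, using only the figures reported above (one correct match alongside 35 incorrect ones), the proportion of the system's matches that were actually right works out to:

\[
\text{precision} = \frac{\text{correct matches}}{\text{total matches}} = \frac{1}{1 + 35} = \frac{1}{36} \approx 2.8\%
\]

In other words, on these figures, roughly 35 out of every 36 people flagged by the system were flagged wrongly.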

Compared to the likes of FRT-based targeted advertising, FRT's use in policing is particularly worrying, because it's so easily defended in the name of 'safety' and 'security' (both of which are frequently racialised). As a result, it's more difficult to condemn wholesale. And without a doubt, it can be used for good: police can use FRT to find missing children more efficiently, for example. In 2018, police in New Delhi reportedly identified almost 3,000 missing children in just four days using the technology.

But equally, it can be used as a tool of oppression. Therefore, if FRT is to be used by police at all, there must be much greater transparency about when and how it is used. We need the issue to be on the political agenda, so that laws can be put in place to protect our rights. We may not be able to stop the rise of FRT, but at the very least, we can have a say in how it is regulated.
