Amazon’s facial recognition technology has wrongly matched 105 UK and US politicians with police mugshots, according to a study by Comparitech.
To test the accuracy of Amazon’s facial recognition tool Rekognition, the information security and privacy comparison company used the faces of 1,959 politicians: 430 US Representatives, 100 US Senators, 632 UK Members of Parliament and 797 members of the House of Lords. These were matched against four sets of 25,000 randomly selected arrest photos from Jailbase.com.
Rekognition works by generating a percentage score indicating how likely it is that two images show the same person. At an 80% confidence threshold, Rekognition wrongly matched 32 US Congresspersons and 73 UK politicians to mugshots.
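For readers curious what such a comparison looks like in practice, the sketch below shows the kind of Rekognition call involved, using the AWS SDK for Python (boto3) with an 80% similarity threshold. The file names and region are hypothetical placeholders, not details from the Comparitech study.

```python
# Hedged sketch: comparing two photos with Amazon Rekognition at an 80% threshold.
# File names and region are illustrative assumptions, not from the study.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("politician.jpg", "rb") as source, open("mugshot.jpg", "rb") as target:
    response = rekognition.compare_faces(
        SourceImage={"Bytes": source.read()},
        TargetImage={"Bytes": target.read()},
        SimilarityThreshold=80,  # only matches at or above 80% similarity are returned
    )

# FaceMatches lists candidate matches with a Similarity score;
# an empty list means no match cleared the threshold.
for match in response["FaceMatches"]:
    print(f"Possible match, similarity: {match['Similarity']:.1f}%")
```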
Comparitech also discovered that of the 12 politicians misidentified at a confidence threshold of 90% or higher, half were not white, despite people of colour making up only a fifth of the US Congress and a tenth of the UK Parliament.
Police use of facial recognition raises concerns
Facial recognition technology uses biometrics to map a person’s facial features, which are then compared against a database of other images to determine whether any of the faces belong to the same person.
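A minimal sketch of that database-matching step, again using boto3: a probe image is searched against a Rekognition face collection and any matches above a threshold are returned. The collection ID and file name are hypothetical placeholders for illustration only.

```python
# Hedged sketch: searching a probe image against a pre-indexed face collection.
# Collection ID and file name are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("probe.jpg", "rb") as probe:
    response = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",  # hypothetical collection of indexed faces
        Image={"Bytes": probe.read()},
        FaceMatchThreshold=80,
        MaxFaces=5,
    )

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"Face {face['FaceId']} matched with similarity {match['Similarity']:.1f}%")
```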
Although it can be a useful policing tool, privacy advocates have objected to the technology, with Big Brother Watch describing the deployment of facial recognition surveillance cameras by police forces and private companies as “dangerously authoritarian surveillance” and “a threat to our privacy and freedoms”.
Earlier this month, a Freedom of Information request by the RSA revealed that UK police forces are rolling out technology such as facial recognition and predictive policing without public consultation.
Comparitech also points out that in real-world use cases, such as applying facial recognition to CCTV footage, images are far less clear, making errors in accuracy and performance more likely and raising the risk of false arrests.
Amazon facial recognition used to help “narrow the field”
Amazon has stated in a blog post that “in real-world public safety and law enforcement scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment (and not to make fully autonomous decisions)”.
Javvad Malik, security awareness advocate at KnowBe4, said that it is important for the technology to be used in conjunction with trained human operators:
“Even with the advancements of artificial intelligence and processing power to identify people from biometrics, it is far from reliable technology. It is why trained human operators will be needed in conjunction with such software for the foreseeable future in order to eliminate false positives or false negatives.
“One of the biggest challenges with this kind of software is they rely on quite basic pattern matching which can be bypassed quite easily with shadows, tattoos and so forth. We’ve seen issues with facial recognition before in misidentifying people of colour or minorities. This is often due to lack of diversity in the development and testing teams, which is why it’s important that any organisations developing such technologies ensures there is appropriate diversity and have a strong code of ethics to dictate what is or isn’t appropriate development practices.”
Read more: Police facial recognition: Public divided over law enforcement use as ban looms.