Today, artificial intelligence (AI) is being used widely across the UK public sector in several areas, including criminal justice.
It has found applications in the way police and judicial bodies identify and prosecute offenders.
AI and Live Facial Recognition (LFR) technology
The College of Policing defines Live Facial Recognition (LFR) as “a real-time deployment of facial recognition technology, which compares live camera feeds of faces against a predetermined watchlist, to locate persons of interest by generating an alert when a possible match is found.”
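In practice, systems of this kind typically convert each detected face into a numerical embedding and compare it against embeddings of the individuals on the watchlist, raising an alert when the similarity crosses a threshold. The sketch below is a minimal illustration of that matching step only; the embedding size, cosine-similarity measure, threshold value, and data are assumptions for demonstration, not details of any deployed system.

```python
import numpy as np

# Illustrative threshold; real deployments tune this setting, which strongly
# affects the balance between false positives and missed matches.
MATCH_THRESHOLD = 0.6

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(live_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Return (identity, score) pairs whose similarity exceeds the threshold."""
    alerts = []
    for identity, listed_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, listed_embedding)
        if score >= MATCH_THRESHOLD:
            alerts.append((identity, score))
    return sorted(alerts, key=lambda pair: pair[1], reverse=True)

# Toy example: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
watchlist = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
live_face = watchlist["person_A"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(check_against_watchlist(live_face, watchlist))
```

In this toy run, only the noisy re-capture of “person_A” clears the threshold, which is the behaviour the College of Policing definition describes as generating an alert for a possible match.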
The technology can also be used for mass surveillance programmes, for instance to scan crowds during protests, and was deployed during King Charles III’s Coronation in May 2023. The Metropolitan Police Service (MPS) is a leading user of LFR, deploying NEC’s NeoFace software during the event. At present, South Wales Police and the MPS are the only forces in England and Wales to have deployed LFR, with the latter ramping up its use of the technology in 2022 despite ongoing concerns.
In April 2023, both forces doubled down on their use of the technology after research found “substantial improvement” in the accuracy of their systems when using certain settings. Noting the proliferation of the technology and its growing support, senior officials at the Home Office have lobbied the Information Commissioner’s Office (ICO) to “act favourably” in an investigation into LFR camera provider Facewatch, in order to facilitate a further roll-out of the technology.
Efficacy
However, the use of LFR in policing continues to face scrutiny over its effectiveness. Critics highlight several concerns, most notably that flawed training data can lead to biased and even racist results.
The technology is also still developing after early trials delivered inaccurate results; eight trials that took place in London between 2016 and 2018 produced a 96% rate of false positive matches.
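A “false positive match” here is an alert in which the person flagged turns out not to be the watchlisted individual, and the rate is typically expressed as the share of all alerts that were wrong. The figures below are hypothetical and serve only to show how such a rate is calculated.

```python
# Hypothetical alert counts, used only to illustrate the calculation.
total_alerts = 50
confirmed_matches = 2
false_positives = total_alerts - confirmed_matches

false_positive_rate = false_positives / total_alerts
print(f"False positive rate among alerts: {false_positive_rate:.0%}")  # prints 96%
```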
These concerns have been echoed by several key bodies in recent years. The House of Commons Science and Technology Select Committee has evaluated the use of biometrics in the UK on several occasions since 2014-15, including as part of the government’s biometrics strategy. Most recently, the Justice and Home Affairs Committee launched a ‘short investigation’ into the use of LFR by police in December 2023.
Transparency and data security
One key argument from stakeholders who oppose the use of technology in predictive policing is the issue of transparency and trust. Many express concern about a lack of transparency in how the algorithms work, meaning that victims and perpetrators are not able to assess the accuracy and fairness of a system’s output. Moreover, delays in the deployment of biometric technologies appear to stem, in part, from issues relating to public trust.
To counter this, the Office of the Police Chief Scientific Adviser (OPCSA) has developed a set of principles, or ‘Covenant’, on how AI should be used in policing. This was endorsed by the National Police Chiefs’ Council (NPCC) in September 2023 and “should be given due regard by all developers and users of the technology in the sector.”
Furthermore, the new Centre for Police Productivity, established following the government’s independent Policing Productivity Review, will use new tools to spot emerging crime trends and drive efficiency. The Centre will also provide a central data hub and AI solutions that can significantly reduce the hours officers spend checking thousands of records.
AI use looking forward
With ongoing criticisms of the use of AI across policing and the erosion of public trust in the police, more needs to be done to increase levels of transparency in emerging technologies such as LFR.
However, the government is advocating greater uptake of AI technologies across the UK criminal justice space. An independent report by the Biometrics Commissioner of England and Wales highlights that the UK policing minister, Chris Philp, is pushing for greater adoption of LFR by police, and questions the government’s proposed changes to surveillance oversight.
In October 2023, Philp sent a letter to police chiefs stating that police forces should ‘double the number of searches they make using retrospective facial recognition technology to track down known offenders’.
Meanwhile, the University of Aberdeen has established the ‘PROBabLE Futures’ project, which focuses on Probabilistic AI Systems in Law Enforcement and has received £3.4 million in funding from Responsible AI UK (RAI UK) to investigate this use case.
Summary
The use of AI across policing, largely to support predictive policing and LFR, has become more apparent in the last couple of years, with much debate regarding the security, efficacy, and transparency of these technologies. As AI technologies mature, the long-term prospects for their use are becoming clearer, with a greater push from central government and police officials to implement them on a wider scale.