Lack of AI controls risks 'new Wild West', peers warn
Committee report says there is a risk of mistrials caused by the unregulated use of live facial recognition and machine learning tools
Machine learning, facial recognition and other artificial intelligence technologies need safeguards when used in investigations to avoid mistrials, the House of Lords Justice and Home Affairs Committee has warned.
The committee warned public awareness, government, and legislation “have not kept up with a new Wild West”.
The review by peers found that 30 public bodies, initiatives, and programmes play a role in the governance of new technologies for the application of the law. These include the National Police Chiefs’ Council and the College of Policing.
But concerns were raised over their differing remits, powers and resources.
Peers said: “We were concerned that, in some instances, the use of advanced tools at certain points of the criminal justice pipeline may impede an individual’s right to a fair trial: whether by a lack of awareness that they were being used, unreliable evidence, or an inability to understand and therefore challenge proceedings.”
The report, Technology Rules? The advent of new technologies in the justice system, sets out advice to ensure that advances such as automatic facial recognition do not generate errors that could affect trial outcomes.
Committee Chair Baroness Hamwee (Lib Dem) said over-reliance on computing systems was a significant risk.
“We had a strong impression that these new tools are being used without questioning whether they always produce a justified outcome. Is ‘the computer’ always right? It was different technology, but look at what happened to hundreds of Post Office managers.”
It is the first significant evaluation of these laws and guidance by the Lords, which could ultimately hear appeals against judgements.
As an early warning, the committee called for a mandatory register of algorithms used by forces and in the justice system. It also recommended the introduction of a duty of candour on the police.
Peers also called for a national body to be established to set strict scientific, validity, and quality standards and to ‘kitemark’ new technological solutions against those standards.
Ministers were advised to set a clear, long-term strategy for the development of new technologies for use in the application of the law.
The report said: “Cohesion, consistency, and clarity are urgently needed in this area.”
The report strengthens the stand taken by forces, including the Metropolitan Police, that want to use facial recognition cameras.
Outgoing Commissioner Dame Cressida Dick has said the technology will help in specific situations such as public order work, but that a legal framework is needed to support its legitimacy.
A framework would also protect forces from legal challenge in future; South Wales Police is pioneering the technology’s rollout but lost a court case brought by human rights campaigners.
Baroness Hamwee said: “What would it be like to be convicted and imprisoned on the basis of AI which you don’t understand and which you can’t challenge?
“Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose, and not be used unchecked.”
She added: “Humans must be the ultimate decision makers, knowing how to question the tools they are using and how to challenge their outcome.”