A senior judge said on Friday that lawyers could be prosecuted for presenting material that had been “hallucinated” by artificial intelligence tools.
The High Court of England and Wales warned lawyers on Friday that they could face criminal prosecution for presenting false material generated by artificial intelligence, after a series of cases in which filings cited made-up quotes and rulings that did not exist.
In a rare intervention, one of the country’s most senior judges said that existing guidance to lawyers had proved “insufficient to address the misuse of artificial intelligence” and that further steps were urgently needed.
The ruling by Victoria Sharp, president of the King’s Bench Division of the High Court, and a second judge, Jeremy Johnson, detailed two recent cases in which fake material was used in written legal arguments that were presented in court.
In one case, a claimant and his lawyer admitted that A.I. tools had generated “inaccurate and fictitious” material in a lawsuit against two banks that was dismissed last month. In the other case, which ended in April, a lawyer for a man suing his local council said she could not explain where a series of nonexistent cases in the arguments had come from.
Judge Sharp drew the two examples together using rarely exercised powers that were designed to enable the “court to regulate its own procedures and to enforce duties that lawyers owe.”
“There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused,” she wrote, warning that lawyers could be convicted of a criminal offense or barred from practicing for using false A.I.-generated material.