Judges warn over solicitors using AI after three cases thrown out

Judges have warned the public over solicitors using artificial intelligence to compose court documents, after three cases were thrown out over error-strewn submissions. One judge said it was ‘disheartening’ that people have ‘fallen into the clutches of such charlatans’.

It forms part of a global trend in which more judges are recognising, or implying, that legal teams have included what are called ‘AI hallucinations’ – plausible but fabricated responses – in their submissions before the courts. One expert from Oxford University told the Oireachtas on Tuesday that he believes lawyers who use the technology in court documents should face heavy penalties.

An international database has found four instances where ‘AI hallucinations’ have made their way into documents before a judge in Ireland. In three of these cases, one from 2024 and two from last year, judges quashed the applications entirely after the use of AI was discovered to have resulted in misinterpretation of the law. The so-called hallucinations arise when false information is presented by AI chatbots as fact, even when it clearly does not align with reality or relevant laws.

High Court Judge David Nolan issued a stark warning against the use of AI in a December 2024 judgment after the defendants appeared to misunderstand several legal terms and pieces of legislation relating to repossessions. Judge Nolan recounted that the defendants, a married couple in Dublin, attempted to introduce new arguments to the court at an inappropriate time during the trial.
According to the judgment, the husband informed the courtroom that he had been advised by a friend. In his judgment, Judge Nolan said: ‘The general public should be warned against the use of generative AI devices and programs in matters of law.’ The judge threw out the submission on the basis that it was ‘fallacious’, adding: ‘It is disheartening, to say the least, that good people, such as the defendants, have fallen into the clutches of such charlatans.’

Less than four months later, the same judge denied a judicial review appeal after the applicant in the case repeatedly referred to ‘subornation of perjury’, a term rarely used in Irish courts. ‘This sounds like something that derived from an artificial intelligence source,’ Judge Nolan said. ‘It has all the hallmarks of ChatGPT, or some similar AI tool.’

Almost 1,000 cases have been identified globally by the AI Hallucination Cases Database, which records instances of false information being fed to lawyers and others in legal cases.

Labhaoise Ní Fhaoláin, a solicitor and PhD researcher in AI technology and law at UCD, told Extra.ie that the use of AI in legal practice is ‘problematic’. She explained that no two cases are the same, and that AI technologies have no way of differentiating between their details. Ms Ní Fhaoláin added that the body of legislation in Ireland is also too small for generative AI chatbots to work from, which leads them to draw examples from other jurisdictions.

She said the public would benefit from a better understanding of how AI works, remarking: ‘A major problem that we have with AI is the lack of friction.
It feels so credible that people are sucked in by it – hence the charlatan comment. I feel that if people had a better understanding of the tools, they would understand that it is not appropriate for what it is being used for.’

At the Oireachtas AI Committee on Tuesday, Oxford associate professor of Human-centred Computing Reuben Binns said you are ‘undermining the faith that people place in [the judicial] system’ if hallucinated cases are allowed into the record. He added: ‘I think there should be heavy punishments for people engaging in that.’

Guidance issued to solicitors and judges by the Law Society and the Chief Justice of Ireland recommends that practitioners use the technologies cautiously. Both said, however, that AI should not be used to form legal arguments or conduct research, due to its limitations in respect of case law, recognising the type of case, and bias.