Attorney Pleads for Mercy After Using AI in Court, Where It Made Up Fake Lawsuits
futurism.com
Legally speaking: they may have f*cked up.

Few settings would seem worse suited for submitting AI-generated text than a court of law, where everything you say, write, and do is subjected to maximum scrutiny. And yet lawyers keep getting caught relying on crappy, hallucination-prone AI models anyway, usually to the judge's and the client's chagrin. After all the public shaming, you'd think they'd know better by now.

The latest high-profile instance comes from a 2023 lawsuit filed against Walmart and Jetson Electric Bikes, in which the plaintiff alleged that a hoverboard sold by the two companies was responsible for a fire that burned down their home.

These are serious claims. But the legal minds involved apparently took the easy route, to disastrous effect.

On Thursday, a federal judge in Wyoming asked the plaintiff's lawyers to give him a good reason not to impose sanctions on them for citing nine totally made-up legal cases in the suit. And you guessed it: they were conjured up by a shoddy AI model.

The lawyers, from the firms Morgan & Morgan and Goody Law Group, withdrew the filing that contained the botched case law, and ate humble pie in a follow-up one.

"Our internal artificial intelligence platform 'hallucinated' the cases in question while assisting our attorney in drafting the motion in limine," they wrote, per The Register. "This matter comes with great embarrassment and has prompted discussion and action regarding the training, implementation, and future use of artificial intelligence within our firm."

This is classic AI bullshittery. If a large language model can't come up with a confident answer, it'll make one up instead, usually convincingly enough to slip past anyone who isn't paying close attention, and without ever dropping its authoritative tone.

In this case, the defendants found that when they tried to look up some of the cited lawsuits, most couldn't be found under the names provided. At least one was a case fabricated by ChatGPT, and could only be found on the chatbot. And as it turned out, the case number provided for the fabrication actually belonged to a real lawsuit. Score one, and then some, for the corporate lawyers.

One of the plaintiff's attorneys explained that an "internal AI tool" was responsible for the errors. In a nutshell, he uploaded a draft of the motion he intended to file to the AI tool and asked it to pull relevant federal case law. And that it did: out of thin air.

Honest mistake or not, the consequences could be serious. Pointing to other instances where lawyers were sanctioned for reckless AI usage, the judge asked why he shouldn't descend on them with his full wrath, too. That could mean fines, a suspension, or even disbarment.

The guilt-stricken attorney pleaded for mercy. "This was the first time in my career that I ever used AI for queries of this nature," the attorney said in the latest filing, per The Register. "With a repentant heart, I sincerely apologize to this court, to my firm, and to colleagues representing defendants for this mistake and any embarrassment I may have caused."