A multimillion-dollar conspiracy trial that stretched across the worlds of politics and entertainment is now touching on the tech world with arguments that a defense attorney for a Fugees rapper bungled closing arguments by using an artificial intelligence program.
Prakazrel “Pras” Michel argued that the use of the “experimental” generative AI program was one of a number of errors made by his “unqualified, unprepared and ineffectual” trial attorney before his conviction earlier this year, according to a motion for a new trial his new lawyers filed this week. The company behind the program, on the other hand, said it was a tool used to help write closing statements and a harbinger of major changes in the field.
Generative AI programs are capable of creating realistic text, images and video. They’re raising tough questions about misinformation and copyright protections as well as industry calls for regulations in Congress. Programs like ChatGPT already have had ripple effects across professions like writing and education. The arguments in the Michel case could preview issues to come as the technology makes a rapid advance.
The Grammy-winning rapper’s trial was touted as the first time generative AI was used in a federal trial in a news release from the startup company that designed the system. Defense attorney David Kenner, well known for his previous representation of rappers like Suge Knight and Snoop Dogg, also gave a quote calling the system a “game changer for complex litigation.”
But in his last words to the jury, Kenner appeared to mix up key elements of the case and misattributed the lyric, “Every single day, every time I pray, I will be missing you,” to the Fugees, the 1990s hip-hop group his client co-founded. The line is in fact a well-known lyric from a song by the rapper Diddy, then known as Puff Daddy, according to court documents filed by Michel’s new attorney, Peter Zeidenberg.
Kenner did not respond to a phone call and email seeking comment from The Associated Press. The company, EyeLevel.AI, said the program wasn’t “experimental” but instead trained using only facts from the case, including court transcripts, not musical lyrics or anything found online. It’s intended to provide fast answers to complex questions to help—not replace—human lawyers, said co-founder and COO Neil Katz.
“We think AI technology is going to completely revolutionize the legal field by making it faster and cheaper to get complex answers to legal questions and research,” Katz said.
He denied an allegation from Michel’s new lawyers that Kenner appeared to have a financial interest in the program.
The case will likely be closely watched as more law firms adopt the technology, said Sharon Nelson, president of Sensei Enterprises, a digital forensics, cybersecurity and information technology firm. A substantial number of firms are using it now, and surveys indicate more than 50 percent of lawyers expect to adopt it within the next year, she said. “It’s gone much faster than we thought,” she said. “The problem is, if you don’t work with it, you’re going to be left behind.”
Michel was found guilty in April on all 10 counts he was charged with, including conspiracy and acting as an unregistered agent of a foreign government. He faces up to 20 years in prison on the top counts. He is free ahead of sentencing, which has not yet been set.
“At bottom, the AI program failed Kenner, and Kenner failed Michel. The closing argument was deficient, unhelpful, and a missed opportunity that prejudiced the defense,” wrote Zeidenberg. His other arguments for a new trial included the jury being prejudiced by being allowed to hear references to the “crime fraud exception” and “co-conspirators.”
Michel was accused of funneling money from a now-fugitive Malaysian financier through straw donors to Barack Obama’s 2012 re-election campaign, then trying to squelch a Justice Department investigation and influence an extradition case on behalf of China under the Trump administration. His trial included testimony from witnesses ranging from actor Leonardo DiCaprio to former U.S. Attorney General Jeff Sessions.
Kenner had argued during the trial that Michel simply wanted to make money and got bad legal advice as he reinvented himself in the world of politics.
It wasn’t immediately clear when a judge might rule on the motion for a new trial.
Use of generative AI in the legal profession is in the early stages, but it could see much more widespread adoption as products improve, said John Villasenor, a professor of engineering and public policy at the University of California, Los Angeles. The American Bar Association does not yet have any guidelines on the use of AI in the legal profession, though there is a new task force studying the issue, a spokeswoman said.
Using it for closing arguments is complicated because of the many factors that develop over the course of a trial, he said. Generative AI also sometimes produces “hallucinations,” statements that read as accurate but are not.
“A good attorney coming up with closing arguments will be mindful of basic goals of the case but also of the specific ways in which the trial has played out,” he said. Even as products improve, “attorneys who use AI should make sure they very carefully fact check anything they are going to use.”