Man Sues OpenAI after “Hallucinating” ChatGPT Accuses Him of Embezzling Money


In a groundbreaking lawsuit, OpenAI, the creator of ChatGPT, is facing legal trouble after a man claimed that the AI platform defamed him by falsely accusing him of embezzling money. The case has caught the attention of legal experts and AI enthusiasts alike, as it could set a precedent for the intersection of AI and defamation law.

As this unprecedented legal battle unfolds, it highlights the need for a regulatory framework around AI technologies, much as regulations govern automobiles. OpenAI’s disclaimer that ChatGPT is a “free research preview and not intended to give out advice” may come under scrutiny as well.

Georgia resident Mark Walters has filed a defamation suit against OpenAI in Gwinnett County Court, seeking damages for the statements made by ChatGPT. Walters argues that ChatGPT’s false accusation of his involvement in embezzlement has the potential to irreparably harm his reputation.

“While research and development of AI is worthwhile, it is irresponsible to unleash a system on the public that is known to make up ‘facts’ about people,” said Walters’ attorney, John Monroe, emphasizing the seriousness of the matter.

The peculiar incident unfolded when journalist Fred Riehl, covering a court case, asked ChatGPT to summarize the accusations in a complaint. Riehl, attempting to save time, even shared the URL of the genuine complaint for reference. However, instead of providing accurate information, ChatGPT appeared to slip into a bizarre “hallucination” mode.

In what can only be described as an AI-induced farce, ChatGPT conjured up an entirely fabricated narrative that implicated Walters in embezzlement. It falsely claimed that Walters had misappropriated funds from The Second Amendment Foundation, an organization involved in a separate lawsuit against Washington’s Attorney General’s office.

ChatGPT’s propensity for generating incorrect information, dubbed “hallucination,” has been documented before. From writing obituaries for the living to inventing fake legal citations, the AI’s credibility has faced repeated scrutiny. One Texas judge even vowed to strike any filing prepared with the assistance of AI unless it had been thoroughly checked by a human.

Alan Gottlieb, one of the plaintiffs in the actual Washington lawsuit, confirmed that ChatGPT’s allegations against Walters were entirely false. The statements attributed to Walters were nowhere to be found in the legitimate complaint.

Walters seeks damages and attorneys’ fees as compensation for the harm caused; the exact amount will be determined at trial, should the case move forward. Legal experts are closely watching the case, particularly regarding the burden of proof in defamation lawsuits involving AI. In Georgia, a plaintiff must demonstrate that the defendant was negligent in assessing the truthfulness of the statements. The complaint emphasizes that ChatGPT’s false and malicious allegations have harmed Walters’ reputation, subjecting him to public contempt and ridicule.

In a humorous twist, the case is listed in the Gwinnett Courts Portal as “Walters VS OpenAL LLC,” a clerical typo that only adds to the comical nature of the situation. The involvement of AI in legal matters continues to be a topic of debate and fascination, leaving many to wonder whether this peculiar lawsuit will stick or be dismissed as a fantastical tale of an AI’s misadventure.
