ChatGPT Lawsuit: Understanding the Platform Before You Use It
OpenAI is facing its first defamation lawsuit over false information generated by ChatGPT. The company is also under investigation in multiple countries over its handling of user data, and Europe is preparing to unveil a new AI law, which presents yet another challenge for OpenAI. The lawsuit was filed by radio host Mark Walters in Georgia’s Superior Court of Gwinnett County on June 5th. Walters is seeking monetary damages over ChatGPT-generated allegations that he committed fraud and embezzled funds from a non-profit organization. Journalist Fred Riehl encountered these allegations after asking ChatGPT to summarize a federal court case.
Chatbots and misinformation
Chatbots like ChatGPT have raised concern over the spread of false information. These systems struggle to differentiate between fact and fiction, often fabricating dates, facts, and figures. This mixing of real and invented information can mislead users and waste their time, and in some cases it has caused real harm. One professor questioned the authenticity of his students’ essays after suspecting they had been written with ChatGPT, and a lawyer faced disciplinary action after using ChatGPT for research and citing fictitious legal cases it had invented.
The disclaimer on ChatGPT’s homepage states that the system may sometimes produce incorrect information. Despite these reservations, OpenAI continues to endorse ChatGPT as a dependable resource for acquiring information and answers. Its CEO, Sam Altman, has even said that he prefers learning from ChatGPT rather than from books.
The legal responsibility for incorrect or harmful information created by AI systems is uncertain. In the US, Section 230 grants internet companies immunity from liability for content generated by third parties, but it is unclear whether this protection extends to AI-generated information. Walters’s defamation lawsuit could test this legal framework. The complaint claims that the chatbot produced a false summary of a federal court case, mixing factual and invented details. The summary accused Walters of misusing funds from a non-profit, a claim that was never actually made against him.
ChatGPT lawsuit details
It’s unclear how Walters came across the false information, as journalist Fred Riehl never published it and instead verified the details with someone else. ChatGPT cannot access external data, such as PDF files, without additional plug-ins, yet it failed to communicate this limitation to Riehl, a gap that can easily confuse users. Eugene Volokh, a law professor who specializes in the legal liability of AI systems, suggests that libel claims against AI companies could be legally viable, but that sustaining the OpenAI lawsuit may prove challenging: Walters did not notify OpenAI of the inaccurate statements, giving the company no chance to correct them, and no actual damages resulted from ChatGPT’s output. Nonetheless, the case’s outcome will be interesting to follow, according to Volokh.