Who is to blame when AI defames?



Webber Wentzel

24th February 2026


The rise of generative AI has brought with it an increase in litigation over false outputs that harm a person's reputation. Defamatory output is sometimes generated by the AI itself; at other times, AI is deliberately used to create false impressions about a person, as in the case of deepfakes. In South African law, harm to a person's reputation or dignity gives rise to a cause of action.

AI-related defamation lawsuits are being brought with increasing regularity. One of the first, brought in Australia in 2023, was against OpenAI, the owner of ChatGPT. Hepburn Shire Council Mayor Brian Hood launched a defamation lawsuit over a false output generated by ChatGPT claiming that the mayor had served time in prison on a bribery charge in a matter in which he was, in fact, the whistleblower. The lawsuit was resolved in early 2024 after corrections were made to the ChatGPT outputs.


Another interesting case, this time in the United States of America (USA), involved Robert Starbuck, an American filmmaker, journalist, and activist. His complaint, filed on 29 April 2025, set the scene:

"Imagine waking up one day and learning that a multi-billion-dollar corporation was telling whoever asked that you had been an active participant in one of the most stigmatised events in American history–the Capitol riot on January 6th, 2021–and that you were arrested for and charged with a misdemeanour in connection with your involvement in that event.


Further imagine that these accusations were completely false…

…Finally, imagine that the technology company continued to publish these and other lies about you for nine months after you first asked them to stop."

This was the basis on which Starbuck brought a defamation lawsuit against Meta Platforms, Inc., the owner of the Meta AI chatbot. In August 2024, Starbuck discovered that the chatbot was including these false and damaging statements about him in its outputs. According to his complaint, Starbuck "did everything within his power to alert Meta about the error and enlist its help to address the problem." However, despite his attempts to bring this to the company's attention, the defamatory outputs reportedly continued. It seems that while all information relating to Starbuck was eventually erased from text outputs, additional misinformation was added via the Meta AI voice feature, including claims that Starbuck had "pled guilty over disorderly conduct" relating to the Capitol riot and that he had "advanced Holocaust denialism."

The question "who is to blame when AI defames?" might have been answered by the Delaware Superior Court in this case, but a public apology by Meta's Joel Kaplan indicated that the "parties [had] resolved this matter" and that they were collaborating to mitigate the risks of hallucinations.

Another case, also in the USA, involved Mark Walters, a media personality, radio talk show host, and Second Amendment (right to bear arms) advocate, who launched a defamation lawsuit against OpenAI in 2023. He claimed that Frederick Riehl, a journalist and editor of a news site focusing on Second Amendment rights, used ChatGPT, which produced statements alleging that Walters had been involved in embezzlement. Walters sued OpenAI, the owner of ChatGPT. However, the Superior Court of Gwinnett County in the State of Georgia ruled in favour of OpenAI in May 2025, on various grounds, one of which was that, as a public figure, Walters had to demonstrate actual malice (knowledge of falsity) on the part of OpenAI.

The court held that OpenAI could not be held liable; the key basis for the decision appears to be that the disclaimer ChatGPT displays below the prompt bar meant that reasonable readers would know that ChatGPT makes mistakes. When considering whether the disputed output communicated a defamatory meaning as a matter of law, the court applied this "hypothetical reasonable reader" test, noting that "[d]isclaimer or cautionary language weighs in the determination of whether this objective, 'reasonable reader' standard is met". Given the recurrent disclaimers that applied, users of ChatGPT in Riehl's position could not have believed that the output consisted of "actual facts" without attempting to verify the information. The order referred to Riehl's testimony that he was "sceptical" of the output, knew that it "was not true" and consisted of "the wrong information", and was aware of ChatGPT's capacity to produce hallucinations. Because Riehl did not believe the output, the court concluded that it could not have communicated a defamatory meaning as a matter of law, and confirmed that this alone would have been sufficient to find in favour of OpenAI and grant summary judgment.

In South Africa, while no such cases have yet been decided, AI platforms may not be as fortunate as OpenAI was in the Walters case. In South African law, the publication would likely be regarded as defamatory despite the disclaimer; disclaimers are not "magic wands" that cure defamatory speech. And if, as we believe likely, platforms are required to show that they acted without negligence, a court will need to take a very close look at the systems and processes each platform has adopted. At the very least, such platforms will likely have a duty to act reasonably once notified of defamatory or unlawful content. As AI platforms operating in South Africa will soon see, there is nothing artificial about a defamation lawsuit.

Written by Dario Milo, Partner & Lia Wheeler, Candidate Attorney from Webber Wentzel

*Dario is a partner at Webber Wentzel and a member of the firm’s AI specialist team in dispute resolution, advising clients on emerging AI-related disputes, legal issues and potential risks.

