By Matthew B. Harrison
TALKERS, VP/Associate Publisher
Harrison Media Law, Senior Partner
Goodphone Communications, Executive Producer
When Georgia-based, nationally syndicated radio personality and Second Amendment advocate Mark Walters (longtime host of “Armed American Radio”) learned that ChatGPT had falsely claimed he was involved in a criminal embezzlement scheme, he did what few in the media world have dared to do. While others stayed silent, Walters stood up and took on an incredibly powerful tech company, one of the biggest in the world, in a court of law.
Taking the Fight to Big Tech
By filing suit against OpenAI, the creator of ChatGPT, Walters became the first person in the United States to test the boundaries of defamation law in the age of generative artificial intelligence.
His case was not simply about clearing his name. It was about drawing a line. Can artificial intelligence generate and distribute false and damaging information about a real person without any legal accountability?
While the court ultimately ruled in OpenAI’s favor on specific procedural grounds, the impact of this case is far from finished. Walters’ lawsuit broke new ground in several important ways:
— It was the first known defamation lawsuit filed against an AI developer based on content generated by an AI system.
— It brought into the open critical questions about responsibility, accuracy, and liability when AI systems are used to produce statements that sound human but carry no editorial oversight.
— It added fuel to the ongoing conversation about whether “use at your own risk” disclaimers hold up when real-world reputational damage hangs in the balance.
Implications for the Radio and Podcasting Community
For spoken-word creators on any platform, whether terrestrial, satellite, or the open internet, this case is a wake-up call, a canary in the coal mine. Many shows rely on AI tools for research, summaries, voice generation, or even show scripts. But what happens when those tools get it wrong, beyond the host being embarrassed, and in some cases fined or terminated? And worse, what happens when those errors affect real people?
The legal system, as has often been noted, is still playing catch-up. Although the court ruled that the fabricated ChatGPT statement lacked the necessary elements of defamation under Georgia law, including provable harm and demonstrable fault, the decision highlighted how unprepared current legal frameworks are for this fast-moving, voice-driven digital landscape.
Where the Industry Goes from Here
Walters’ experience points to the urgent need for new protections and clearer guidelines:
— Creators deserve assurance that the tools they use are built with accountability in mind, extending to both copyright infringement and defamation.
— Developers must be more transparent about how their systems operate and the risks they create, identifying bias and attempting to counteract it.
— Policymakers need to bring clarity to who bears responsibility when software, not a person, becomes the speaker.
A Case That Signals a Larger Reckoning
Mark Walters may not have won this round in court, but his decision to take on a tech giant helped illuminate how quickly generative AI can create legal, ethical, and reputational risks for anyone with a public presence. For those of us working in media, especially in formats built on trust, voice, and credibility, his case should not be ignored.
“This wasn’t about money. This was about the truth,” Walters tells TALKERS. “If we don’t draw a line now, there may not be one left to draw.”
To listen to a longform interview with Mark Walters conducted by TALKERS publisher Michael Harrison, please click here.
Media attorney Matthew B. Harrison is VP/associate publisher at TALKERS; senior partner at Harrison Media Law; and executive producer at Goodphone Communications. He is available for private consultation and media industry contract representation. He can be reached by phone at 724-484-3529 or by email at matthew@harrisonmedialaw.com. He teaches “Legal Issues in Digital Media” and is a regular contributor to industry discussions on fair use, AI, and free expression.