George Carlin’s estate settles lawsuit over comedian’s AI doppelganger | Artificial intelligence (AI)

by Pelican Press


The estate of comedian George Carlin settled a lawsuit on Tuesday against the owners of a comedy podcast who claimed to have used artificial intelligence to mimic the deceased stand-up's voice. The lawsuit was one of the first in the US to focus on the legality of deepfakes imitating a celebrity's likeness.

The Dudesy podcast and its creators – the former Mad TV comedian Will Sasso and the writer Chad Kultgen – agreed to remove all versions of the podcast from the internet and permanently refrain from using Carlin’s voice, likeness or image in any content. Danielle Del, a spokesperson for Sasso, declined to comment.

Carlin’s family and an attorney for his estate both praised the settlement. Neither side disclosed terms of the deal.

“I am pleased that this matter was resolved quickly and amicably, and I am grateful that the defendants acted responsibly by swiftly removing the video they made,” Kelly Carlin, the comedian’s daughter, said in a statement.

Carlin's estate filed its lawsuit in January after the Dudesy podcast, which touts itself as incorporating AI into its comedy routines, posted an hourlong special to YouTube called George Carlin: I'm Glad I'm Dead. The estate's suit claimed that the podcast violated both Carlin's rights of publicity and his copyright, calling it "a casual theft of a great American artist's work".

The special was introduced by the podcast's eponymous AI character "Dudesy", which claimed it had watched Carlin's work and then created a stand-up set in the style of the comedian. Following the suit, however, Sasso's spokesperson Del told the New York Times that the fictional Dudesy character was not AI-generated and that Kultgen wrote the entire fake Carlin special himself, rather than it being produced by an AI trained on Carlin's previous work. As the case did not reach the discovery phase, it is unclear exactly what parts of the fake Carlin set are AI-generated.

“While it is a shame that this happened at all, I hope this case serves as a warning about the dangers posed by AI technologies and the need for appropriate safeguards not just for artists and creatives, but every human on earth,” Kelly Carlin said.

Even if the podcast did not use Carlin's comedy to train an AI, an attorney for the estate said that using the technology to create an impersonation of him was still a violation of Carlin's rights, and that the disclaimer before the special was insufficient. Clips of the special could have been stripped of context and spread around the internet, misleadingly purporting to be genuine material from Carlin, who died in 2008.

“These kinds of fake videos create a real potential for harm because someone could, for example, just take a segment of it and send that around or post that on Twitter,” said Josh Schiller, a partner at Boies, Schiller, Flexner and lawyer for Carlin’s estate. “Someone might believe that they’re listening to the real George Carlin, because they’ve never heard him before and they don’t know he’s dead.”

The settlement comes at a sensitive time for the entertainment industry’s relationship with artificial intelligence. The boom in publicly available generative AI tools over the past year and a half has heightened creators’ concerns over unauthorized imitations of artists both living and dead. Recent deepfakes of celebrities such as Taylor Swift have additionally put pressure on lawmakers and AI companies to restrict malicious or non-consensual uses of the technology.

Earlier this week, over 200 musicians signed an open letter calling on developers and tech companies to stop producing AI tools that could replace or undermine their rights and steal their likenesses. Meanwhile, a number of states have passed legislation on the uses of deepfake technology, including Tennessee, which last month enacted a law blocking the replication of an artist's voice without their consent.

While the case settled quickly, it highlights the potential for future litigation over whether AI-generated imitations could be considered parodies allowed under fair use. Shows like Saturday Night Live have long been allowed to impersonate public figures on those grounds, but there have yet to be major legal tests of generative AI tools creating similar impressions, a situation that Schiller argues is fundamentally different from a human impersonation.

“There’s a big difference between using an AI tool to impersonate someone, and make it appear as if it’s authentic, versus someone putting on a gray wig and a black leather jacket,” Schiller said. “You know that that person is not George Carlin.”
