Senators introduce bill to protect individuals against AI-generated deepfakes

by Pelican Press

Today, a group of senators introduced the NO FAKES Act, a bill that would make it illegal to create digital recreations of a person’s voice or likeness without that individual’s consent. It’s a bipartisan effort from Senators Chris Coons (D-Del.), Marsha Blackburn (R-Tenn.), Amy Klobuchar (D-Minn.) and Thom Tillis (R-N.C.), fully titled the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024.

If it passes, the NO FAKES Act would create an option for people to seek damages when their voice, face or body is recreated by AI. Both individuals and companies could be held liable for producing, hosting or sharing unauthorized digital replicas, including ones made by generative AI.

We’ve already seen many instances of celebrities finding AI imitations of themselves out in the world. Taylor Swift’s likeness was used to scam people with a fake Le Creuset cookware giveaway. A voice that sounded like Scarlett Johansson’s showed up in a ChatGPT voice demo. AI can also be used to make political candidates appear to make false statements. And it’s not only celebrities who can be harmed this way.

“Everyone deserves the right to own and protect their voice and likeness, no matter if you’re Taylor Swift or anyone else,” Senator Coons said. “Generative AI can be used as a tool to foster creativity, but that can’t come at the expense of the unauthorized exploitation of anyone’s voice or likeness.”

The speed of new legislation notoriously lags behind the speed of new tech development, so it’s encouraging to see lawmakers taking AI regulation seriously. Today’s proposed act follows the Senate’s recent passage of the DEFIANCE Act, which would allow victims of sexual deepfakes to sue for damages.

Several entertainment organizations have lent their support to the NO FAKES Act, including SAG-AFTRA, the RIAA, the Motion Picture Association, and the Recording Academy. Many of these groups have been pursuing their own actions to secure protection against unauthorized AI recreations. SAG-AFTRA recently went on strike in an effort to secure a union agreement covering likenesses in video games.

Even OpenAI is listed among the act’s backers. “OpenAI is pleased to support the NO FAKES Act, which would protect creators and artists from unauthorized digital replicas of their voices and likenesses,” said Anna Makanju, OpenAI’s vice president of global affairs. “Creators and artists should be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference.”


