Proposed “No Fakes” Act will hold entities accountable for damages caused by deepfakes


In a nutshell: It took a lot longer than we’d hoped, but it’s finally happened. Ethicists and policymakers alike have been pushing for some form of regulation against unauthorized AI use for years. Now, a new bipartisan bill introduced on Wednesday proposes holding entities accountable for producing non-consensual “digital replicas.”

Earlier this year, deepfake pornography depicting Taylor Swift spread online, causing significant backlash. This incident seemed to be a tipping point for regulators, with members of Congress and even the White House weighing in on the need to address the deepfake crisis. Rumors that comprehensive legislation could be on the horizon began circulating soon after.

The new bipartisan Congressional bill aims to address the problem. The Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2024 (NO FAKES Act) would hold individuals and companies liable for damages if they create, host, or share AI-generated audio or visual depictions of a person made without that person’s consent. Online platforms would also be on the hook if they knowingly host prohibited replicas after receiving takedown notices.

The term the legislation uses for these AI-generated depictions is “digital replica.” Under the bill, individuals would have exclusive control over the use of their voice or visual likeness in such replicas, a right that extends for 10 years after death. The bill would also preempt existing state laws addressing deepfakes, creating a uniform national standard. Of course, it includes exemptions for First Amendment-protected works such as documentaries, parody, and commentary.

Co-sponsored by Senators Chris Coons, Marsha Blackburn, Amy Klobuchar, and Thom Tillis, the bill has gained broad support from entertainment industry heavyweights such as SAG-AFTRA, Universal Music Group, the Motion Picture Association, and top talent agencies. Several leading AI companies, including OpenAI and IBM, have also endorsed the legislation.

“Everyone deserves the right to own and protect their voice and likeness, no matter if you’re Taylor Swift or anyone else,” Coons stated, referring to the incident involving the pop star.

“For SAG-AFTRA members, the NO FAKES Act is especially important since our livelihoods are intrinsically linked with our likenesses,” said Fran Drescher, the union’s president, praising the legislation’s protections for actors and performers.

While celebrities have been the most high-profile victims of deepfake abuse, the potential harms extend far beyond famous entertainers. The bill would protect everyone from scammers using AI-generated fake audio or video for fraud, defamation, or worse.

“With AI technology becoming increasingly powerful, I’m thrilled to see this important legislation to protect human beings from abuses, exploitation, and fraud,” Drescher noted.

“Creators and artists should be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference,” said OpenAI Vice President of Global Affairs Anna Makanju.

The bill’s sponsors say they’ve worked extensively with stakeholders across entertainment, tech, and other sectors to balance protecting individuals with upholding free speech and enabling continued US innovation in AI.

The introduction of the NO FAKES Act follows another landmark bill, the DEFIANCE Act, which the Senate passed last month and which allows victims of sexually explicit deepfakes to sue for damages. The NO FAKES Act, however, still has a long road to the President’s desk. Senator Coons says the sponsors are working to pass it through the House and Senate as soon as possible.




