
Artificial Intelligence Regulation Pushes Social Media Platforms to Combat Deepfakes

Bill aims to hold digital platforms responsible for artificial intelligence-generated content, focusing on protecting voice and likeness from misuse.

Artificial Intelligence Regulation Drive: Major Platforms responding to the No Fakes Act


The NO FAKES Act, a federal bill aimed at preventing the unauthorized use of a person's voice, face, or likeness in AI-generated content, remains under active legislative consideration in 2025. If enacted, it would introduce a stringent regulatory framework imposing significant obligations on internet platforms and on developers of AI tools.

### Key Provisions of the NO FAKES Act

The bill establishes a new federal right called the "digital replication right." This right functions similarly to an intellectual property regime, granting individuals (and their heirs or assignees for up to 70 years after death) control over unauthorized digital replicas of their likenesses, voices, or images.

Under the bill, platforms would be required to:

  - take down AI-generated replicas upon receiving a notice from the right holder;
  - keep down recurring instances of the same or similar unauthorized content;
  - remove and filter tools capable of producing unauthorized digital replicas; and
  - potentially unmask anonymous users accused of uploading unauthorized replicas, based solely on the right holder's claim.

### Concerns and Criticism

Civil liberties and technology advocacy groups like the Electronic Frontier Foundation (EFF) warn that the bill is "heavy-handed," threatening online anonymity, free expression, and innovation. The bill's vague language and lax procedural safeguards leave broad interpretive gaps, raising concerns for authors, artists, and creators who rely on fair use, remix culture, and non-commercial expression. Critics argue the bill could lead to censorship, stifle creative reuse, and hand rights holders veto power over technological development related to AI-generated content.

### How Platforms and Entertainment Companies Are Adapting

The NO FAKES Act has strong support from entertainment industry unions, major technology companies such as YouTube and OpenAI, and labor groups. This coalition has shaped the bill's momentum and its framing as a measure protecting the creative community. To comply, platforms are expected to implement extensive content filtering and takedown systems, including AI detection tools, records of content removals, and cooperation in identifying users accused of violations.

The bill also targets tools and services that can produce unauthorized likeness replicas, making developers liable, which could result in heightened platform moderation and restrictions on AI tool availability. Given these requirements, platforms are likely increasing their investment in automated filtering technologies and rights management systems to reduce legal risk and align with the emerging regulatory environment.
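To make the record-keeping and "keep down" obligations described above concrete, here is a minimal, hypothetical sketch of a takedown registry: once content is removed after a notice, its fingerprint is stored so that identical re-uploads are blocked automatically. All names here (`TakedownRegistry`, `record_takedown`, `is_blocked`) are illustrative, not part of any real platform API, and a production system would use perceptual hashing and ML detection so near-duplicates also match, rather than the exact-match hash used here.

```python
import hashlib


class TakedownRegistry:
    """Hypothetical sketch of a notice-and-keep-down record, for illustration only."""

    def __init__(self):
        # Maps a content fingerprint to the takedown notice that triggered removal.
        self._blocked = {}

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # Exact-match fingerprint; real systems would use perceptual hashes
        # so that re-encoded or slightly altered copies are still caught.
        return hashlib.sha256(content).hexdigest()

    def record_takedown(self, content: bytes, notice_id: str) -> None:
        # Keep a record of the removal, as the bill's compliance duties imply.
        self._blocked[self.fingerprint(content)] = notice_id

    def is_blocked(self, content: bytes) -> bool:
        # Check new uploads against previously removed content ("keep down").
        return self.fingerprint(content) in self._blocked


registry = TakedownRegistry()
registry.record_takedown(b"unauthorized-replica-video", "notice-001")
print(registry.is_blocked(b"unauthorized-replica-video"))  # True
print(registry.is_blocked(b"unrelated-upload"))            # False
```

The design point the sketch illustrates is that a "keep down" duty requires platforms to retain state about past removals, not merely respond to individual notices, which is part of why critics see the compliance burden as substantial.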

In conclusion, the NO FAKES Act remains a proposed but influential bill in 2025, reflecting heightened legislative efforts to curb harmful AI-generated impersonations of individuals. While it enjoys industry and bipartisan political support, the bill faces robust opposition from free speech advocates and creative communities over what they see as broad censorship mandates and a potential chilling effect on innovation. Platforms and entertainment companies are adapting by preparing strict content moderation, filtering, and compliance mechanisms aligned with the bill's provisions. If enacted, the NO FAKES Act would significantly reshape how AI-generated likenesses are controlled and by whom, prioritizing rights holders' control while raising complex questions around free expression and internet freedom.

  1. If enacted, the NO FAKES Act would likely push technology companies developing AI tools for entertainment to adopt stricter compliance controls, potentially limiting the availability and creative scope of those tools.
  2. The pending bill, which aims to prevent AI-generated impersonations, has alarmed advocacy groups and creative communities in technology and entertainment, who fear it could threaten free expression, innovation, and the fair-use and remix practices many creators rely on.
