The doctrine
Right-of-publicity statutes, the federal TAKE IT DOWN Act, and state non-consensual-intimate-imagery laws now define liability for synthetic likenesses, voice clones, and AI-generated explicit imagery.
The right of publicity is a creature of state law and varies widely. Tennessee's ELVIS Act (2024) created a specific tort for unauthorized AI voice replication. California, New York, and Illinois have older publicity statutes that courts are adapting to AI-generated imagery and voices. The Scarlett Johansson / "Sky" voice episode (May 2024) — though never litigated — prompted industry-wide changes to voice-talent contracting and consent management.
The federal TAKE IT DOWN Act of 2025 criminalizes publishing non-consensual intimate visual depictions, including AI-generated imagery, and imposes a notice-and-takedown obligation on platforms. Civil enforcement layers atop existing non-consensual-intimate-imagery laws in 49 states.
Deepfake fraud cases — including the $25M Hong Kong CFO heist of February 2024 — sit at the intersection of fraud, securities, and AI law. Civil discovery in those matters is building a substantial evidentiary record about real-time deepfake capabilities.
Leading cases
"Sky" voice paused; no formal action filed.
Voice-actor class action over alleged unauthorized voice cloning by AI voice platform.
Federal NCII law signed 2025; first enforcement actions pending.
Key holdings
- Voice is increasingly protected. Tennessee's ELVIS Act and analogous state laws are creating standalone causes of action for unauthorized voice replication.
- Federal NCII law is now in force. The TAKE IT DOWN Act imposes platform takedown obligations and creates individual criminal exposure for AI-generated explicit imagery.
- Right of publicity travels with the depicted person, creating multistate exposure for any single piece of synthetic media.