San Francisco is suing the makers of non-consensual deepfake pornography. Deepfake porn consists of fake but realistic images of people in the nude, often depicting them engaged in graphic sexual acts. Such images can seriously embarrass the ordinary people they depict if they spread across the internet or viewers believe they are real, creating the false impression that the person shown actually engaged in pornographic activity.
Some states have laws against deepfake pornography or revenge porn, but there are obstacles to enforcing those laws. Some revenge porn laws may be vulnerable to a constitutional challenge because they define porn so broadly as to cover non-explicit images, or cover images of public figures posted for their newsworthiness rather than with intent to harass. One example is the images of Virginia politician Susanna Gibson, who deliberately produced porn for viewing by strangers on the pornographic website Chaturbate, then complained when images of her sex acts were released to the general public, including images that didn’t even show her private parts but did show that she intentionally engaged in public sex acts.
Some victims of deepfake porn may not want to sue over it, because the lawsuit could draw further attention to the images, and the pornographer may be overseas and hard for U.S. courts to reach, making some lawsuits futile.
But California has a sweeping law allowing government officials to sue over unfair business practices, like deepfake porn, and to seek civil penalties and injunctive relief (California Business & Professions Code Section 17200).
So on August 14, San Francisco, through its city attorney, filed a lawsuit against the operators of deepfake porn websites, in a state court case known as People of the State of California v. Sol Ecom, Inc.:
San Francisco filed a lawsuit against 16 AI-powered websites, marking the first government action to combat non-consensual deepfake pornography. The case aims to shut down these websites, which allow users to create AI-generated nude images without consent.
The sites collectively received more than 200 million visits in the first half of 2024.
Users can upload photos of fully clothed individuals, including children, which are then manipulated by AI to create non-consensual nude images.
Individuals used the technology to create deepfake nudes of people ranging from singer Taylor Swift and Rep. Alexandria Ocasio-Cortez to everyday citizens like middle-school girls.
In Beverly Hills, California, school administrators expelled five eighth-graders for creating and sharing AI-generated nude images of 16 female classmates.
Victims of these AI-generated nudes often face severe psychological trauma, damaged reputations, and in some cases, suicidal thoughts.
The pervasive nature of online content makes it exceptionally challenging for victims to trace the origin of their manipulated images.
“This is an enormous national and global problem,” Yvonne Mere, San Francisco’s chief deputy city attorney, said. “Businesses and websites that conduct these malicious practices have evaded enforcement. Individual victims of this conduct have little practical and legal recourse. It’s very difficult for victims to determine what websites are used to nudify their images because these images don’t have any unique or identifying marks that links you back to the websites.”
The lawsuit, filed on behalf of Californians, alleges violations of state and federal laws against deepfake and revenge pornography and child exploitation.
It seeks to permanently restrain those operating the websites from creating deepfake pornography in the future and asks the court to assess civil penalties.
Some deepfakes of public figures might be protected by the First Amendment. No one is deceived by a deepfake nude of Rep. Alexandria Ocasio-Cortez. Everyone who sees it will realize that the image of her is fake, because politicians almost never pose in the nude for pornographic images. So it won’t damage her image. Moreover, fake porn websites often have the word “fake” in their name, such as Cfake.com, so viewers are not deceived about whether those fakes are real. Fake depictions of public figures engaged in sexual conduct are often protected by the First Amendment when they are parodies or otherwise don’t purport to be true, even if some stupid people wrongly think they depict reality.
For example, the Supreme Court overturned a court ruling forcing Hustler Magazine to pay damages to the Rev. Jerry Falwell for a parody depicting him having drunken sex with his mother in an outhouse, which caused him severe emotional distress. The Supreme Court ruled that because he was a public figure, he could not recover damages for intentional infliction of emotional distress unless he could show that the parody intentionally or recklessly made false claims that deceived readers — which he couldn’t, because it was labeled as a parody. (See Hustler Magazine v. Falwell (1988)).
It is easier to sue in California state court than in federal court over things like deepfake porn. California state law contains no uniform requirement of standing to sue over violations of state law: for example, a black man was allowed to sue in state court to strike down California state affirmative-action programs for blacks, even though they didn’t injure him, so he wouldn’t have had standing to sue in federal court (see Connerly v. State Personnel Board (2001)). For certain kinds of lawsuits, standing simply isn’t required in California state court.