There have been concerns recently about how deepfakes can be used to spread political misinformation, but that's not the most common use for the technology: It's overwhelmingly being used to create nonconsensual porn.
In the years since this alarm was first raised, the legal fight against deepfake revenge porn and nonconsensual porn has been an uphill battle.
The idea of a "right to privacy" dates back to 1890, when the authors of a landmark paper in the Harvard Law Review warned that "photographs and newspaper enterprise" had invaded private life, and that many "mechanical devices" could make "what is whispered in the closet... proclaimed from the house-tops."
Even though the paper goes on to suggest that the law should step in to stop the spread of personal pictures against someone's will, little has changed legally since then. At the federal level, there is still no ban on creating deepfakes of someone without their consent.
There have been scattered cases involving the use of someone's identity to mislead people, but those have generally centered on individual disputes involving public figures.
In the 1980s, Bette Midler sued Ford and its ad agency for hiring one of her backup singers to imitate her voice in a commercial. The agency had permission to use the song, but Midler herself had refused to do the ad. The suit turned on whether imitating her voice, and thereby exploiting her likeness, was protected free speech. It wasn't: Midler won her case against the ad agency.
That raises the question: Why is exploiting someone's likeness for a commercial punishable, while exploiting someone's likeness for porn still isn't? A big reason is the difference in mediums and how they're regulated.
On the internet, websites and platforms are generally not legally responsible for the content they host; the creator or uploader is. But creators and uploaders can be very hard to track down. That legal shield for platforms comes from the pivotal Communications Decency Act and its infamous Section 230.
Section 230 has drawn criticism from politicians on both sides of the aisle who want to reform big tech, and for the first time there may be momentum to revisit the statute: The Supreme Court announced in early October that it would hear its first ever case regarding Section 230.
Holding platforms accountable for hosting deepfake porn wouldn't be a silver bullet, but it could be a game-changer in cracking down on the videos. Some experts argue that federal criminalization would be a vast improvement over the scattered state laws currently in place.
Currently, 48 states and D.C. have banned the nonconsensual distribution of pornography. However, many of those laws don't clearly apply to deepfake porn, which sits in a legal gray area: because a deepfake combines the images of multiple people (for instance, one person's face on another person's body), questions of identity and personal harm become ambiguous.
It's clear the law has some catching up to do when it comes to deepfake porn, but meanwhile, each year there are tens of thousands of victims who can't afford to wait for lawmakers to navigate these legal gray areas.