
Section 230 immunity could be carved out for intimate AI deepfakes

The Intimate Privacy Protection Act: A Bill to Protect People from the Misuse of Artificial Intelligence on Online Platforms

Microsoft wants a statute that gives law enforcement officials a legal framework to prosecute AI-generated fraud. Microsoft president Brad Smith wants lawmakers to ensure that federal and state laws on child sexual exploitation and abuse and on non-consensual intimate imagery are updated to cover AI-generated content.

The Senate recently passed a bill cracking down on sexually explicit deepfakes, allowing victims of nonconsensual explicit AI images to sue their creators for damages. The bill passed months after middle and high school students were found fabricating explicit images of female classmates and after trolls flooded X with graphic AI-generated fakes of Taylor Swift.

Microsoft has had to implement more safety controls for its own AI products, after a loophole in the company’s Designer AI image creator allowed people to create explicit images of celebrities like Taylor Swift. Smith says the private sector has a responsibility to prevent misuse of artificial intelligence.

Smith says the laws need to evolve to combat deepfake fraud: while the tech sector and nonprofit groups have taken recent steps to address the problem, he argues, it has become apparent that legislation must follow. He wants the US to make it a felony to use deepfake fraud to steal from everyday Americans.

Two bipartisan House lawmakers are working on a bill that would strip tech companies of their protection from lawsuits if they don't remove intimate artificial intelligence deepfakes from their platforms.

Reps. Jake Auchincloss (D-MA) and Ashley Hinson (R-IA) unveiled the Intimate Privacy Protection Act, Politico first reported, “to combat cyberstalking, intimate privacy violations, and digital forgeries,” as the bill says. Section 230 of the Communications Act of 1934 protects online platforms from being held responsible for what their users put on their services. Under the Intimate Privacy Protection Act, that immunity could be taken away in cases where platforms fail to combat the kinds of harms listed. It does this by creating a duty of care for platforms — a legal term that basically means they are expected to act responsibly — which includes having a “reasonable process” for addressing cyberstalking, intimate privacy violations, and digital forgeries.

Lawmakers on both sides of the aisle have long wished to narrow Section 230 protection for platforms they fear have abused a legal shield created when the industry was made up of much smaller players. But most of the time, Republicans and Democrats can't agree on how exactly the statute should be changed. One notable exception: Congress carved sex trafficking offenses out of Section 230 when it passed FOSTA-SESTA in 2018.