The Senate on Tuesday unanimously passed a bill that would allow victims of non-consensual intimate images created with artificial intelligence, known as “deepfakes,” to sue those who created them for damages.
The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act allows victims of sexually explicit deepfakes to seek civil remedies against those who produced or possessed the images with the intent to distribute them. Under the bill, identifiable victims could recover up to $150,000 in damages, and up to $250,000 if the incident is connected to “actual or attempted sexual assault, stalking, or harassment” or is their direct and proximate cause. The bill now goes to the House of Representatives; if passed there, it would head to the president to be signed into law.
In a speech on the Senate floor, Majority Leader Chuck Schumer (D-N.Y.) said explicit deepfakes “are not just a fringe problem that happens to a few people, but a pervasive problem,” adding that these malicious and hurtful images can destroy lives.
Schumer said the DEFIANCE Act is “one example of the guardrails on artificial intelligence that I talk about all the time. Artificial intelligence is a remarkable technology that can inspire incredible innovation, but we must put up guardrails to prevent its most severe consequences,” noting that such abuses are causing serious harm to real people. Earlier this year, he unveiled a roadmap for how Senate committees should approach artificial intelligence legislation.
He urged the House to pass the DEFIANCE Act, which already has a companion bill in that chamber, although only a week and a half remains before Congress adjourns for its August recess.
“With this bill, we are telling victims of clearly nonconsensual deepfakes that we hear them and we are taking action,” Schumer said.