AI deepfake porn bill would require Big Tech to police and remove images

U.S. Senator Ted Cruz (R-TX) speaks at a press conference on Capitol Hill in Washington on October 6, 2021.

Evelyn Hockstein | Reuters

WASHINGTON – Lawmakers on Capitol Hill are racing to combat the boom in fake pornographic images created using artificial intelligence, which have targeted everyone from celebrities to high school students.

Now a new bill aims to hold social media companies accountable for monitoring and removing deepfake porn images posted on their websites. The measure would criminalize publishing or threatening to publish deepfake porn.

Sen. Ted Cruz, R-Texas, is the bill’s lead sponsor. Cruz’s office provided CNBC with exclusive details on the bill.

The Take It Down Act would also require operators of social media platforms to develop a process to remove the images within 48 hours of receiving a valid request from a victim. In addition, sites would be required to make reasonable efforts to remove all other copies of the images, including those shared in private groups.

The job of enforcing these new rules would fall to the Federal Trade Commission, which enforces consumer protection laws.

Cruz’s legislation will be formally introduced Tuesday by a bipartisan group of senators. They will be joined in the Capitol by victims of deepfake porn, including high school students.

The rise of non-consensual AI-generated images has affected celebrities like Taylor Swift, politicians like Rep. Alexandria Ocasio-Cortez, D-N.Y., and high school students whose classmates have taken pictures of their faces and used apps and AI tools to create nude or pornographic photos.

“By leveling the playing field at the federal level and putting the responsibility on websites to have procedures in place to remove these images, our law will protect and empower all victims of this heinous crime,” Cruz said in a statement to CNBC.

Dueling Senate bills

According to a report from Home Security Heroes, deepfake porn producers increased their production by 464% year-over-year in 2023.

But while there is broad consensus in Congress that deepfake AI pornography needs to be addressed, there is no agreement on how to do so.

Instead, there are two competing bills in the Senate.

Sen. Dick Durbin, D-Ill., introduced a bipartisan bill earlier this year that would allow victims of non-consensual deepfakes to sue people who created, possessed or distributed the images.

Under Cruz’s bill, deepfake AI porn would be treated like highly offensive online content, meaning social media companies would be responsible for moderating and removing the images.

When Durbin tried to get a vote on his bill last week, Sen. Cynthia Lummis, R-Wyo., blocked the bill, saying it was “too broad in scope” and could “stifle American technological innovation.”

Durbin defended his bill by saying, “Under this proposed law, there is no liability for technology platforms.”

Lummis is one of the original co-sponsors of Cruz's bill, along with Sen. Shelley Moore Capito, R-W.Va., and Democratic Sens. Amy Klobuchar, Richard Blumenthal and Jacky Rosen.

The new bill also comes as Senate Majority Leader Chuck Schumer, D-N.Y., is pushing his chamber to advance AI legislation. Last month, an artificial intelligence task force released a “roadmap” on key AI issues that included developing laws to combat the “non-consensual distribution of intimate images and other harmful deepfakes.”

Source: www.cnbc.com, 2024-06-18 13:45:25