Lawmakers on Capitol Hill are urgently responding to a surge in deepfake AI pornographic images targeting individuals ranging from celebrities to high school students. A new bill aims to hold social media companies accountable for monitoring and removing such harmful content from their platforms. Known as the Take It Down Act, the legislation, spearheaded by Sen. Ted Cruz of Texas, would criminalize the publication of deepfake porn or threats to publish it. The bill also requires social media platforms to establish a process for removing these images within 48 hours of a valid request from a victim. Platforms would further be required to make reasonable efforts to remove any additional copies of the images, including those shared in private groups.
Enforcement of these regulations would be entrusted to the Federal Trade Commission, the agency that oversees consumer protection. The bill is set to be officially introduced by a bipartisan group of senators, who will be joined by victims of deepfake porn, including high school students. Nonconsensual AI-generated images have affected not only public figures like Taylor Swift and Rep. Alexandria Ocasio-Cortez but also ordinary individuals who have fallen victim to this disturbing trend. The proposed legislation aims to create a uniform federal standard and ensure that websites have mechanisms in place to promptly remove such harmful content.
In recent years, the production of deepfake porn has increased significantly, with a reported surge of 464% in 2023, according to a study by Home Security Heroes. While there is broad consensus within Congress on the urgency of addressing deepfake AI pornography, there is no agreement on the approach to take. Two competing bills have emerged in the Senate, highlighting divergent perspectives on how to combat the problem. Sen. Dick Durbin introduced a bipartisan bill earlier this year that would grant victims of nonconsensual deepfakes the right to sue individuals involved in creating, holding, or distributing the images.
On the other hand, Sen. Cruz’s bill treats deepfake AI porn as highly offensive online content, placing the onus on social media companies to regulate and remove such material. The clash between these two approaches was evident when Sen. Cynthia Lummis raised concerns about Durbin’s bill, citing its broad scope and potential negative impact on American technological innovation. In response, Durbin defended his bill, emphasizing that tech platforms would not face liability under the proposed law. Notably, Lummis is among the co-sponsors of Cruz’s bill, along with Sens. Shelley Moore Capito, Amy Klobuchar, Richard Blumenthal, and Jacky Rosen.
The introduction of the Take It Down Act coincides with Senate Majority Leader Chuck Schumer’s efforts to advance AI legislation in his chamber. A Senate task force on AI recently released a “roadmap” outlining key priorities in this domain, underscoring the growing importance of addressing emerging technologies and their societal impacts. As the debate on deepfake AI pornography unfolds in Congress, the divergent policy approaches highlight the complexity of regulating online content while balancing innovation with consumer protection. The ultimate challenge will be to devise effective measures that safeguard individuals from the harmful consequences of deepfake porn while fostering technological progress and digital freedom.