On Monday, May 19, 2025, President Donald J. Trump signed the TAKE IT DOWN Act into law, marking a major development in the effort to combat the non-consensual distribution of intimate images, including those created by artificial intelligence. The bipartisan legislation, backed by lawmakers across the political spectrum and championed by First Lady Melania Trump, introduces federal penalties for sharing explicit content without consent and places new responsibilities on online platforms.

The law mandates that websites and social media companies remove flagged content within 48 hours after a verified request from a victim. It also allows prosecutors to pursue criminal charges against individuals who knowingly distribute such images, including deepfakes. This move responds to the growing concern over AI-generated imagery being used to harm, harass, or defame individuals, particularly women and minors.

The legislation received widespread support from both parties. Senator Ted Cruz called it a “win for victims,” emphasizing the importance of addressing the use of technology for abuse. Democratic Senator Amy Klobuchar echoed those sentiments, noting that the law provides necessary protections in an evolving digital landscape. Other lawmakers, including Senators Marsha Blackburn and Shelley Moore Capito, voiced their support for the bill’s protections for children and vulnerable communities.

First Lady Melania Trump has been closely involved with the legislation, continuing her advocacy from her “Be Best” initiative launched during her husband’s first term. She described the law as a necessary step to ensure young people feel safer online, especially amid the rise of deepfake technology that has been used to target minors and women.

The law’s signing was welcomed by numerous organizations, including the National Center for Missing & Exploited Children (NCMEC), which emphasized how the legislation helps close existing legal gaps and enhances protection for child victims. Other groups, such as the National Organization for Women, praised the measure for helping restore control and dignity to those affected by digital exploitation.

Not all reactions were supportive. Digital rights organizations such as the Electronic Frontier Foundation expressed concern about how the law may affect free speech and due process. They warned that the 48-hour removal rule may lead platforms to rely heavily on automated systems, which have been known to incorrectly flag legal content, including satire, journalism, and advocacy.

Despite these concerns, the law’s passage was broadly seen as a response to public demand for stronger online safeguards. Platforms including Meta, TikTok, and X (formerly Twitter) voiced support and pledged to work within the new framework.

The bill was inspired in part by real-world cases, including one involving a 14-year-old girl whose deepfake image circulated online for nearly a year without removal. That case galvanized support from lawmakers and the public, drawing attention to the emotional and psychological toll of digital exploitation.

As AI tools become more accessible and powerful, policymakers have sought ways to respond to the risks they present. The TAKE IT DOWN Act sets a federal standard for how non-consensual intimate imagery is handled and holds platforms and offenders accountable. While implementation and oversight will be closely watched, the law represents a clear statement: digital abuse will no longer go unchecked.
