This article discusses the TAKE IT DOWN Act that President Trump signed into law. We talk about how it can be a tool for individuals and ORM companies to remove harmful online content from social media platforms and web search results. Note that we discuss the topic of nonconsensual intimate visual depictions in this article.
On Monday, May 19, 2025, President Trump signed the TAKE IT DOWN Act. The full name of the act is Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act. It went into effect immediately.
The act targets digital exploitation, mainly the publication of authentic and AI-generated nonconsensual intimate imagery (often called “revenge porn”) and deepfakes. Those who distribute the types of harmful content outlined in the Take It Down Act now face criminal consequences.
In this article, we’ll discuss the Take It Down Act, including support for it and criticism of it. We’ll also look at how the act influences online reputation management and online safety, including how ORM companies can take proactive measures to protect children, teens, and adults against online abuse.
NetReputation can help you protect your presence in the digital age. Call us at 844-461-3632 to learn more about protecting yourself online. You can also fill out the contact form below for a free consultation.
Request a Free Consultation
Take It Down Act Specifics
According to the summary of the Take It Down Act on the official U.S. federal legislative website, the bill “generally prohibits the nonconsensual online publication of intimate visual depictions of individuals, both authentic and computer-generated, and requires certain online platforms to promptly remove such depictions upon receiving notice of their existence.”
More specifically, the Take It Down Act prohibits nonconsensual intimate visual depictions in online publications. The authentic or computer-generated imagery must meet at least one of the following criteria:
- Intimate visual depiction of an adult subject, where the publication caused or was intended to cause harm, and where the subject did not consent to the publication. If the depiction is authentic (not computer-generated), it must have been created or obtained under circumstances in which the subject had a reasonable expectation of privacy.
- Intimate visual depiction of a minor subject, where the publication intended to abuse or harass the subject, or where the imagery’s purpose was to arouse or gratify sexual desires.
Covered platforms, including social media platforms and websites, must remove the material within 48 hours of receiving a valid removal request from the victim. They must also make reasonable efforts to delete any duplicates of the intimate visual depiction. Violators can face fines, prison time, or both, and even threatening to publish an intimate visual depiction may result in criminal penalties.
Covered Platforms
The Take It Down Act refers to “covered platforms,” which are the apps (both online and mobile) and websites that have to establish a process to remove nonconsensual intimate visual depictions upon the victim’s request. These covered platforms have a year to establish the process; the deadline is May 19, 2026.
The specific wording about covered platforms includes the following criteria:
- Mobile application, online application, online service, or website
- The platform serves the public
- The platform’s primary use is as a user-generated content forum, OR
- It curates, hosts, publishes, or otherwise makes nonconsensual intimate images available
The following types of platforms are not included in covered platforms:
- Apps or websites that primarily consist of non-user-generated content
- Broadband internet access service providers
- Email platforms
However, the act specifies that these exclusions don’t apply to apps or sites that host or publish nonconsensual intimate imagery.
Takedown Provisions
Under the Take It Down Act, a covered platform must provide clear and conspicuous notice to users regarding their removal process. For example, they must make it easy for users to access a disclosure agreement or web page that clearly defines the notice and removal process. This must include how the person can submit a removal request and what to expect from the covered platform.
The act requires covered platforms to respond to valid removal requests. A valid request must meet the following criteria:
- The request must be in writing.
- There must be a physical or electronic signature of either the person in the intimate visual depiction or their representative.
- The covered platform must be provided with a way to locate the imagery.
- There must be a statement from the victim indicating that the imagery was published without their consent.
- The victim or their representative must provide their contact information.
Once a covered platform receives a valid request, it has 48 hours to locate and remove the content. It must also make reasonable efforts to identify and remove identical copies of the content.
Protecting Good Faith Efforts
Sometimes a lawful intimate visual depiction will be removed from an online service. In these cases, the Take It Down Act protects the covered platform.
For example, a covered platform may remove reported content to comply with the act, only to discover later that the content did not include nonconsensual intimate images. In other words, the removed imagery was legally published online.
In cases like this, the act’s good-faith provisions shield the covered platform from liability if the content creator takes legal action.
Updates for Removing Intimate Visual Depictions
The act amends the Communications Act of 1934, adding criminal penalties for publishing nonconsensual intimate images. It also adds requirements for covered online platforms that the Federal Trade Commission can now enforce.
More specifically, the Federal Trade Commission (FTC) will enforce required removals of nonconsensual intimate images that violate the new act. The act also gives the FTC jurisdiction over nonprofit organizations, which are not usually covered by the Federal Trade Commission Act, the federal law that prohibits deceptive practices affecting commerce.
You can use mitigation measures to report an intimate visual depiction of yourself, remove information from online platforms, and take control of your online reputation. Call us at 844-461-3632 for more information about how the Take It Down Act and ORM strategies can protect you.
Take It Down Act: Bipartisan Support
The Take It Down Act is bipartisan legislation, meaning it has support from both the Democratic and Republican parties. U.S. Senator Amy Klobuchar, a Democrat from Minnesota, and U.S. Senator Ted Cruz, a Republican from Texas, introduced the bill.
Elliston Berry inspired the Take It Down Act in part. Berry was the victim of fake nudes shared on social media platforms in 2023, when she was just 14 years old. According to The Independent, Berry’s mother, Anna McAdams, reached out to Snapchat multiple times over an eight-month period to have the intimate visual depiction removed. However, Snapchat ignored her requests.
After President Trump signed the Take It Down Act, Senator Klobuchar said, “We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse.” She went on to say that while these actions can “ruin lives and reputations,” the government can now hold perpetrators accountable.
In his statement following the signing of the bill, Senator Cruz said, “Predators who weaponize new technology to post this exploitative filth will now rightfully face criminal consequences, and Big Tech will no longer be allowed to turn a blind eye to the spread of this vile material.”
First Lady Melania Trump’s Involvement
The Take It Down Act also gained support from First Lady Melania Trump, who helped usher it through Congress.
In March 2025, First Lady Melania Trump hosted a roundtable on Capitol Hill to discuss online protection for children. During opening remarks, the First Lady said, “It’s heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content, like deepfakes. This toxic environment can be severely damaging. We must prioritize their well-being by equipping them with the support and tools necessary to navigate this hostile digital landscape. Every young person deserves a safe online space to express themself freely, without the looming threat of exploitation or harm.”
Censorship and Lawful Speech Concerns
According to the Associated Press, critics of the Take It Down Act worry that its broad language could lead to censorship of lawful speech and raise First Amendment issues. Of particular concern is the potential removal of legitimate images, not just nonconsensual intimate ones. This includes LGBTQ content and legal pornography that does not fall under the umbrella of illegal content or nonconsensual intimate images.
The Electronic Frontier Foundation (EFF) has said that the Take It Down Act gives “the powerful a dangerous new route to manipulate platforms into removing lawful speech that they simply don’t like.” EFF claims the following:
- The takedown provision in the Take It Down Act may apply to any intimate visual depiction.
- Automated filters will flag content, which can lead to errors.
- The 48-hour takedown timeframe isn’t enough time to verify if the speech (meaning the content) is actually legal.
The Cyber Civil Rights Initiative (CCRI) also released a statement about the Take It Down Act. CCRI says that the takedown provision “is highly susceptible to misuse and will likely be counter-productive for victims.” The statement also says the act is “unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.”
Take It Down Act and ORM
Online reputation management (ORM) and the Take It Down Act are closely linked. Each effort has the high-level goal of protecting people from the damaging effects of harmful content online. While the act focuses mainly on non-consensual intimate images, ORM has a broader view of all types of content that could harm a person’s digital reputation.
Moreover, the Take It Down Act gives victims a federal legal avenue to combat nonconsensual images, both authentic and computer-generated. This can make it easier for individuals and online reputation management companies to have damaging content removed from the internet quickly, so ORM experts can restore the victim’s digital reputation sooner.
The longer negative content stays online, the more complex it becomes for ORM professionals to repair an individual’s reputation. With the 48-hour time limit of the Take It Down Act, websites have to respond rapidly to takedown requests. This can limit how much damage is done to a person’s online presence.
At NetReputation, we protect victims of online abuse and help them take control of their online reputation. You don’t have to suffer from non-consensual intimate imagery posted online. Give us a call at 844-461-3632 to learn more.
Protecting Your Dignity Online
Many states have already banned the dissemination of explicit deepfakes and revenge porn. However, according to the AP, “the Take It Down Act is a rare example of federal regulators imposing on internet companies.”
Moreover, artificial intelligence can produce and disseminate harmful content online at breakneck speed. Today, any young person could be the victim of harmful content going viral, whether hate speech, nonconsensual imagery, or other extremely damaging material.
It’s now a federal crime to knowingly publish revenge porn, whether the images are authentic or AI-generated deepfakes. By requiring websites and apps to remove deepfakes and other nonconsensual imagery, the act empowers victims to take action.
We recognize the risks of the Take It Down Act, including the potential for over-removal that could raise free speech issues. However, we also believe there are many instances when the act will help victims reclaim their privacy as quickly as possible.
By leveraging the Take It Down Act’s legal obligations, ORM professionals have a new tool for helping individuals remove damaging content from web results and build a positive online identity.
We offer ORM solutions for individuals suffering from harmful content who need to protect their dignity online.
Get started today with a free consultation with one of our experts. Call us at 844-461-3632 or fill out the contact form below. We’ll discuss your online safety and how the Take It Down Act may play a role.
Request a Free Consultation