
The Federal Trade Commission has issued a clear warning to online platforms: comply with the Take It Down Act by May 19, 2026, or risk enforcement.

On May 11, 2026, FTC Chairman Andrew N. Ferguson announced that the agency sent letters to more than a dozen major technology companies, including Amazon, Alphabet, Apple, Automattic, Bumble, Discord, Match Group, Meta, Microsoft, Pinterest, Reddit, Snapchat, TikTok, and X, reminding them of their obligations under the Take It Down Act. The FTC stated that it is prepared to “monitor compliance, investigate violations, and enforce” the law.

Although the FTC’s letters targeted major technology companies, the law is not limited to Big Tech. Many smaller platforms, apps, subscription sites, creator platforms, messaging services, image and video sharing services, gaming platforms, dating platforms, forums, and adult-content businesses may also be covered.

What the Take It Down Act Requires

The Take It Down Act requires covered platforms to create a process for victims to request removal of intimate photos or videos shared without consent. The law applies to both real intimate images and “digital forgeries,” including AI-generated or AI-altered intimate images.

Covered platforms must:

  1. provide clear and conspicuous notice of the removal process;
  2. make it easy for victims, including non-users, to submit takedown requests;
  3. remove reported nonconsensual intimate images within 48 hours after receiving a valid request;
  4. make reasonable efforts to identify and remove known identical copies of the reported content; and
  5. maintain a process that allows requests to be tracked and handled consistently.

The FTC’s guidance emphasizes that intimate content may appear in posts, messages, comments, livestreams, or other platform features, so platforms should consider where notice and reporting tools must appear in order to be effective.

Why This Matters Now

The FTC has made clear that it views Take It Down Act compliance as an enforcement priority. The law treats noncompliance as a violation of an FTC rule, and the FTC’s guidance states that platforms may face civil penalties of $53,088 per violation.

For platforms that host user-generated content, creator content, private messaging, image or video uploads, live chat, AI-generated media, or adult content, this is not just a policy issue. It is an operational issue. A compliant policy is not enough if the platform cannot receive, review, track, remove, and prevent reposting of covered content within the required timeframe.

Businesses Should Act Immediately

Platforms should review whether they are covered by the Take It Down Act and, if so, confirm that they have a functioning compliance process in place before the May 19 deadline.

At a minimum, covered platforms should consider:

  • updating terms of service, acceptable use policies, community guidelines, and content removal policies;
  • creating a dedicated Take It Down Act reporting form or workflow;
  • adding clear and conspicuous notice where users are likely to encounter intimate content;
  • establishing internal escalation procedures for nonconsensual intimate imagery reports;
  • documenting review standards and response timelines;
  • implementing tools to identify and remove identical copies;
  • training trust and safety, support, moderation, and compliance teams; and
  • preserving records showing timely response to each request.

Platforms that use or allow AI-generated images or videos should pay particular attention to the law’s treatment of digital forgeries. The compliance risk is not limited to traditional “revenge porn.” AI-generated intimate imagery can trigger the same takedown obligations.

How Silverstein Legal Can Help

Silverstein Legal advises online platforms, creator businesses, adult-content companies, AI platforms, dating services, social media businesses, and subscription platforms on content moderation, user safety, age verification, privacy, FTC compliance, platform terms, and related trust-and-safety obligations.

We can assist with:

  • Take It Down Act coverage analysis;
  • policy and terms updates;
  • takedown form and workflow design;
  • content moderation and escalation procedures;
  • adult-content and creator-platform compliance;
  • AI-generated content policies;
  • FTC enforcement risk review; and
  • practical implementation plans for legal, compliance, and trust-and-safety teams.

The May 19 deadline leaves little room for delay. Platforms that host user content, intimate content, creator content, AI-generated media, or private messaging should review their compliance posture now.

For assistance with Take It Down Act compliance, contact Silverstein Legal.

Corey D. Silverstein
Managing Attorney, Silverstein Legal

About Silverstein Legal

Founded in 2006 by adult entertainment lawyer Corey D. Silverstein, Silverstein Legal is a boutique law firm that caters to the needs of anyone working in the adult entertainment industry. Silverstein Legal’s clients include hosting companies, affiliate programs, content producers, processors, designers, developers, and website operators.
