The Children’s Online Privacy Protection Act (COPPA), passed in 1998, requires sites targeting children under 13 to obtain parental consent before collecting data. Over two decades later, the digital environment has transformed dramatically, yet COPPA has remained largely unchanged and out of step with how the internet operates today.

Recognizing this gap, the Federal Trade Commission (FTC) introduced a slate of proposed amendments to COPPA in late 2023, which were finalized in January 2025. These updates were intended to modernize the rule to reflect how today’s platforms collect, store, and use children's data, especially in an era of algorithmic targeting, third-party tracking, and long-term data retention. But now, those changes may not be implemented.

Amid growing political pressure and a regulatory freeze issued after the inauguration of President Trump in January 2025, three key updates are at risk of being eliminated: expanded parental consent requirements, new limits on data retention, and the absence of a clear exemption for data collected solely for age verification. For plaintiff attorneys handling privacy-related class actions, the loss of these amendments could affect both the volume and viability of future cases.

What is COPPA?

COPPA was enacted to protect the personal information of children under the age of 13. It requires websites, apps, and online services that are either directed at children or knowingly collect data from children to:

  • Post a clear privacy policy.
  • Provide direct notice to parents about data collection practices.
  • Obtain verifiable parental consent before collecting personal information.
  • Give parents the right to review and delete their children’s data.
  • Maintain reasonable procedures to protect the security of collected data.

The Act defines personal information to include names, addresses, screen names, geolocation data, persistent identifiers like cookies, biometric data, and more. Enforcement authority lies with the FTC and state attorneys general, and violations can result in substantial civil penalties. However, COPPA does not provide a private right of action, meaning individuals cannot sue directly under the law.

Despite the law’s broad reach, enforcement has historically been uneven, and the rule’s definitions and requirements haven’t kept pace with how kids interact with digital platforms today.

This is where the 2025 amendments come in.

The Stalled Amendments: What's at Risk of Being Eliminated?

In January 2025, the FTC finalized changes to modernize COPPA, known as the Final Rule, reflecting years of public comments and emerging concerns about children's digital safety.

“The updated COPPA rule strengthens key protections for kids’ privacy online,” said FTC Chair Lina M. Khan. “By requiring parents to opt in to targeted advertising practices, this final rule prohibits platforms and service providers from sharing and monetizing children’s data without active permission.”

However, the amendments are currently stalled due to a combination of political shifts and regulatory considerations. Following the inauguration of President Trump, a regulatory freeze has paused implementation, and new FTC Chair Andrew Ferguson has raised concerns about parts of the Final Rule.

While Ferguson voted in favor of the Final Rule, he later identified three areas for potential revision:

  • Clarifying when a change to privacy terms is "material" such that it requires new parental consent.
  • Modifying the prohibition on indefinite retention of personal information to clarify what qualifies as "reasonable" data retention.
  • Creating an exception for collecting children’s personal information solely for age verification, provided it is deleted immediately after use.

The following are three major proposed changes now at risk:

1. Parental consent expansion 

What was proposed: Currently, companies must obtain verifiable parental consent before collecting a child’s data. The amendment would go further, requiring companies to identify specific third parties receiving the data and obtain new consent if those third parties or purposes change.

Why it matters: This provision would close loopholes that allow companies to share children’s data with third-party advertisers or analytics platforms under vague consent notices. For attorneys, it could clarify liability for unlawful or undisclosed data transfers.

What’s the pushback: Chair Ferguson argued that companies should only need to obtain new consent when changes “materially increase” the risk to children's privacy or security. Without a clearer definition of "material," the rule could lead to excessive burdens on companies.

2. Limits on data storage 

What was proposed: The amendment would require companies to retain children’s personal data only as long as reasonably necessary for the specific purpose it was collected, and explicitly prohibit indefinite retention.

Why it matters: Many platforms retain data for extended periods, exposing it to potential misuse, breaches, or unauthorized reuse. For plaintiff attorneys, long-term storage often helps establish a pattern of negligent data practices or unreasonable risk of harm.

What’s the pushback: Ferguson criticized the lack of clarity around what counts as "indefinite" and warned that the rule could lead to perverse outcomes, such as premature deletion of data with value to families. He called for clearer guidance on what constitutes reasonable retention.

3. Lack of age verification exemptions 

What was proposed: The FTC declined to include an exemption for companies collecting children’s personal information solely to verify age, even if that data is deleted immediately after.

Why it matters: Age verification is essential for determining whether COPPA applies. But collecting age data itself counts as personal information, triggering COPPA’s requirements. Without an exception, companies may skip age checks to avoid liability.

What’s the pushback: Ferguson supported a narrow exception allowing companies to collect and promptly delete age verification data. Without such an exemption, companies face a Catch-22: they need consent to collect data that tells them if they need consent.

Impact on Plaintiffs' Attorneys

If these amendments are revoked or significantly revised, plaintiff firms handling privacy-related class actions may face greater difficulty proving violations of children’s privacy rights. Weakened protections could also mean fewer cases meet the legal thresholds for litigation.

  1. Fewer actionable violations: Stronger parental consent and data retention rules would have increased the chances that companies would fall out of compliance. Without these clarifications, many gray areas remain, and companies can argue they operated within the limits of the law. That means fewer clear-cut cases to pursue under state laws that mirror COPPA.
  2. Increased difficulty proving harm: In privacy litigation involving minors, proving harm is already challenging. Without updated rules that clearly prohibit practices like indefinite retention or vague third-party sharing, plaintiff attorneys may have a harder time establishing that a defendant acted unreasonably or illegally.
  3. Barriers to class certification: One benefit of the proposed amendments was that they could have led to more uniform violations, such as a platform sharing all children's data with the same third party. These shared experiences support commonality under Rule 23. Without clearer standards, violations may appear more individualized, complicating class certification.
  4. Missed opportunity to shift industry behavior: Without stronger regulation, enforcement shifts to civil litigation. But without a strengthened COPPA framework, it’s more difficult to build pressure on platforms to change data-sharing and retention practices that often happen behind the scenes. Plaintiffs’ attorneys, often the last line of defense, lose a key tool to drive systemic change.

While COPPA enforcement remains in the hands of the FTC and state attorneys general, state-law-based class actions citing parallel violations have become more viable in recent years. The FTC’s failure to modernize the rule would limit not only regulatory impact but also the ability of plaintiffs' attorneys to hold platforms accountable in court.

A New Era of Privacy Enforcement: The Role of Legal Intelligence

If regulatory reforms stall, the burden of enforcement will increasingly fall to the civil justice system and the attorneys prepared to take it on, making companies like Darrow a critical partner in litigating data privacy claims.

Darrow’s Legal Intelligence Platform uses a combination of AI-powered anomaly detection algorithms and human expertise to identify potential breaches of privacy laws by scanning, evaluating, and clustering publicly available data. We partner with plaintiffs’ attorneys to uncover and pursue high-value data privacy violations, ensuring that unauthorized disclosures don’t escape legal scrutiny and that those violating privacy laws are held accountable.

Beyond case discovery, our in-house legal consultants and data privacy experts provide ongoing strategic support, helping firms refine claims, anticipate defenses, and increase their chances of securing favorable resolutions. With Darrow, attorneys can focus on building strong arguments and driving meaningful change in data privacy enforcement and litigation.

Partner with Darrow to find new data privacy cases and grow your practice.
