New Mexico's second-phase trial against Meta seeks court-ordered restrictions on how Instagram and Facebook handle content for minors. The state is asking the court to require transparent content curation, age-gated moderation for users under 18, and algorithmic filtering designed to prevent exploitation and harassment of children.
What the state is asking for
The proposed restrictions target two core Meta practices:
- Algorithmic amplification: The state wants Meta to stop using engagement-maximizing algorithms that can push harmful content to minors. Instead, content curation for underage users would need to be transparent and auditable.
- Age-gated moderation: Meta would be required to enforce separate content moderation rules for accounts belonging to users under 18, with automated AI-driven filtering to block exploitative or harassing material.
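In engineering terms, age-gated moderation amounts to selecting a stricter policy for underage accounts before content is delivered. A minimal sketch, assuming a hypothetical two-bracket policy table (the categories, thresholds, and function names here are illustrative, not Meta's actual configuration):

```python
from dataclasses import dataclass

# Hypothetical policy table: stricter rules for under-18 accounts.
# Categories and thresholds are invented for illustration.
POLICIES = {
    "minor": {
        "blocked_categories": {"exploitation", "harassment", "adult"},
        "risk_threshold": 0.3,  # block borderline content much sooner
    },
    "adult": {
        "blocked_categories": {"exploitation"},
        "risk_threshold": 0.7,
    },
}

@dataclass
class Account:
    user_id: str
    age: int

def policy_for(account: Account) -> dict:
    """Select the moderation policy based on the account holder's age."""
    return POLICIES["minor"] if account.age < 18 else POLICIES["adult"]

def should_block(account: Account, category: str, risk_score: float) -> bool:
    """Block content whose category is disallowed for this age bracket,
    or whose classifier risk score exceeds the bracket's threshold."""
    policy = policy_for(account)
    return (category in policy["blocked_categories"]
            or risk_score >= policy["risk_threshold"])
```

Under this sketch, `should_block(Account("u1", 15), "harassment", 0.1)` returns True while the same content passes for an adult account, which is the asymmetry the state's proposal describes.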
The legal context
This is the second phase of a lawsuit New Mexico filed against Meta. The first phase focused on data collection and privacy violations. The second phase shifts to platform design and content delivery — specifically how Meta's algorithms recommend content to minors.
A ruling in favor of New Mexico could set a nationwide precedent. Other states have filed similar lawsuits against Meta and other social media platforms, but none have yet reached the stage where a court is asked to mandate specific technical changes to recommendation systems and moderation pipelines.
Technical implications
If the court grants the restrictions, Meta would need to:
- Build separate recommendation models for underage users that do not optimize for engagement metrics like watch time or shares.
- Implement age verification that is reliable enough to enforce the rules without creating a privacy-invasive system.
- Deploy AI-based content filters that can detect exploitation and harassment in real time, with a lower tolerance threshold for minor accounts.
None of these are technically impossible — similar systems exist in child-safety-focused apps and in enterprise content moderation tools. But retrofitting them into Instagram and Facebook's existing infrastructure, which serves billions of users, would be a major engineering undertaking.
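The first requirement, a recommendation model for minors that does not optimize for engagement, can be sketched as a contrast between two ranking functions. This is a toy illustration under assumed feature names (`pred_watch_time`, `pred_shares`, and so on are hypothetical), not a description of Meta's rankers:

```python
# Contrast between an engagement-optimized ranker and an auditable,
# non-engagement ranker for underage users. All field names are invented.

def rank_for_adult(posts: list[dict]) -> list[dict]:
    # Engagement-maximizing: order by predicted watch time and shares.
    return sorted(
        posts,
        key=lambda p: p["pred_watch_time"] + 2.0 * p["pred_shares"],
        reverse=True,
    )

def rank_for_minor(posts: list[dict]) -> list[dict]:
    # Auditable rule with no engagement signal: reverse-chronological
    # order, restricted to accounts the minor already follows.
    followed = [p for p in posts if p["from_followed_account"]]
    return sorted(followed, key=lambda p: p["timestamp"], reverse=True)
```

The point of the contrast is that the minor-facing rule is trivially explainable to an auditor (follow graph plus recency), whereas the engagement objective is only as transparent as the model behind its predicted scores.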
Tradeoffs
- Privacy vs. safety: Age verification at scale often requires collecting more personal data, which creates its own privacy risks.
- Overblocking: AI filters trained to be aggressive on minor accounts may flag legitimate content, such as educational material about puberty or mental health.
- Jurisdictional fragmentation: If each state imposes different technical requirements, Meta would need to maintain multiple moderation pipelines, increasing complexity and cost.
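The fragmentation cost in the last bullet scales with the number of *distinct* requirement sets, not the number of states, since states with identical mandates can share a pipeline. A toy illustration, with entirely invented state rules:

```python
# Toy model of jurisdictional fragmentation. The per-state requirement
# sets below are invented for illustration; no state has mandated these.
STATE_REQUIREMENTS = {
    "NM": frozenset({"age_gated_rules", "auditable_ranking", "realtime_filtering"}),
    "CA": frozenset({"age_gated_rules", "data_minimization"}),
    "TX": frozenset({"age_verification"}),
    "NY": frozenset({"age_gated_rules", "auditable_ranking", "realtime_filtering"}),
}

def distinct_pipelines(requirements: dict) -> int:
    """Each unique requirement set needs its own moderation pipeline."""
    return len(set(requirements.values()))
```

Here NM and NY happen to share a requirement set, so four states need three pipelines; in the worst case, every new state ruling adds one more.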
Bottom line
New Mexico's trial is a test case for whether courts can force social media platforms to redesign their core algorithms for child safety. A verdict is months away, but the technical requirements the state is asking for are specific enough that engineers at Meta are already evaluating what it would take to comply.