Child Safety Took a Back Seat to Profits as Meta Became a Marketplace for Predators
A New Mexico lawsuit accuses Meta of enabling child sexual exploitation on Facebook and Instagram while ignoring internal safety warnings in order to maximize engagement and revenue.

Meta Platforms, the parent company of Facebook and Instagram, is facing a high-stakes jury trial in New Mexico over allegations that child sexual exploitation and other harms to children thrived on its networks because safety took a back seat to engagement and profits.
The 2023 lawsuit, filed by New Mexico Attorney General Raúl Torrez, claims Meta allowed predators to target and groom children, failed to moderate groups used for commercial sex, and used addictive features like infinite scroll and auto-play to keep young users exposed to potentially harmful content. Internal documents reportedly show that safety staff repeatedly warned leadership about risks to children, but those concerns were allegedly ignored.
The state argues that these decisions directly exposed children to child sexual abuse material and solicitation, effectively making the platforms unsafe for their youngest users. Meta denies wrongdoing, emphasizing its investments in safety tools, moderation technology, and collaboration with law enforcement and child protection experts. The company has also invoked Section 230 and free speech protections, arguing that it cannot be held liable for user-generated content.
This trial marks a critical test of accountability for tech platforms. It raises urgent questions about whether children’s safety is being sacrificed for engagement metrics and profit growth. Experts warn that without proactive protections, online spaces designed for connection and learning can quickly become dangerous environments for children.
Why the Case Matters for Child Protection
Children cannot rely solely on age restrictions for safety. UNICEF has highlighted that protecting children online requires safer platform design, robust content moderation, and proactive measures that prevent harm while respecting their rights to privacy, participation, and expression. Companies, regulators, and families must collaborate to ensure children are not put at risk to boost profits or clicks.
Evidence presented at trial is expected to show how engagement-maximizing algorithms, poorly moderated groups, and unmonitored messaging can expose children to exploitation. Warnings from employees about AI chatbots and other risks were reportedly overlooked, pointing to a pattern of prioritizing financial gain over safety.
Beyond the courtroom, this trial is part of a broader movement to hold tech companies accountable for harms to children's mental health, for addictive design, and for unsafe online environments. The outcome could set a precedent for how social media platforms are required to protect their youngest users.
The case underscores a simple but critical principle: children's safety must never be traded for profit. Digital platforms must be designed with children's well-being at the center, not at the margins of a business model. As the trial unfolds, the world is watching to see whether child protection will finally take priority over revenue.