A jury in New Mexico has found Meta liable for violating state law by failing to fully disclose risks to children using its social media platforms, delivering a major legal setback for the company that owns Facebook, Instagram and WhatsApp.
The verdict, issued on March 24, found that Meta breached the state's consumer protection law and imposed a $375 million fine. The case was brought by New Mexico Attorney General Raúl Torrez, who accused the company of violating the state's Unfair Practices Act by concealing known risks that minors face on its platforms.
The lawsuit followed an undercover investigation in which adults were reportedly able to send sexually inappropriate material to investigators posing as underage users. Prosecutors argued that the findings demonstrated systemic safety failures and a lack of transparency about the dangers children face online.
Torrez's office maintained that Meta was aware of these risks but failed to adequately inform the public. The jury ultimately sided with the state, imposing one of the most significant legal penalties ever levied on a social media company over child safety concerns.
Prosecutor Linda Singer had urged jurors to impose a much steeper $2 billion fine, arguing that the company's shortcomings were not accidental.
"They were a product of a corporate philosophy that chose growth and engagement over children’s safety," Singer said during the trial. She added that young users in New Mexico and across the United States have borne the consequences of those decisions.
During closing arguments, Singer emphasized a broader pattern of negligence, telling jurors that safety issues on Meta's platforms "weren't mistakes" but the result of deliberate corporate choices. She also argued that young people's excessive use of the company's platforms reflected a loss of control that has heightened the risks they face online.
Meta pushed back against the claims, with attorney Kevin Huff defending the company's efforts to protect users. He told the court that Meta employs approximately 40,000 people dedicated to safety and security across its platforms.
"Meta has 40,000 people working to make its apps as safe as possible," Huff said, while acknowledging the challenges of moderating vast amounts of user-generated content. "No one can, with billions of pieces of content every day, even the best system cannot catch all of it," he added.
The case also drew attention to concerns raised by lawmakers last year after reports surfaced about internal company policies. According to those reports, Meta's artificial intelligence tools were permitted under certain guidelines to engage minors in conversations that could be interpreted as romantic or sensual, prompting calls for further investigation by U.S. senators.
As BrightU.AI's Enoch noted, the verdict is a significant victory for consumer protection and sends a clear message to tech giants that they must prioritize the safety and well-being of young users.
Watch this video discussing how Meta's "smart glasses" are extremely dangerous and could change society for the worse.
This video is from the Neroke-5 channel on Brighteon.com.