

State Tech Pulse Policy Brief: Digital Platforms & Youth Safety
Background: State leaders are sharpening their pencils and increasingly taking action on social media oversight. That action follows a wave of constituent feedback and an increasingly complex web of pressures, ranging from youth mental health concerns to long-standing federal inaction. This policy brief explores the 5 W’s (Who, What, Where, When, Why?) of the recent landmark social media ruling in New Mexico and its intersection with state-led social media regulations and actions aimed at children’s online safety.

Public Policy Problem: Although social media platforms have existed for over 20 years, the federal government has not passed comprehensive tech legislation. State legislators have heard from constituents about the increasing use of social media and its impacts on young people, and in recent years state leaders have aimed to fill the policy gap and protect youth from its negative effects. Social media has become increasingly ingrained in everyday life, alongside content creation and distribution amplified by artificial intelligence (AI) algorithms and AI-generated content.
Although research on the impacts of prolonged social media use on youth is ongoing and not fully settled, leaders from across society have stepped up and urged action. For example, the US Surgeon General issued a “Social Media and Youth Mental Health” advisory in 2023 calling for collective action from policymakers, technology companies, researchers, families and young people to engage in a proactive and multifaceted approach to minimize the harms of social media platforms and to create safer, healthier online environments for children.
Public policy rests on the idea that governments design interventions, incentives and regulations when there is a market failure. Mounting public pressure reflects a view that platforms’ own safeguards for youth are not enough and that children’s online safety demands attention. The tension is whether current laws on the books already protect consumers or whether laws need to be updated and modernized to cover emerging technology applications.

Proposed Policy Solutions: Over the past five years, bipartisan state‑led proposals on social media regulation, children’s online safety, and youth privacy have achieved considerable momentum across the country. Many of these laws have been challenged in court, with courts in several states pausing enforcement while litigation proceeds.
The lawsuit brought by New Mexico against Meta acts as an early test. This case will help clarify the question on many legislators’ minds: “Do emerging technologies require new statutory frameworks, or can existing laws on child exploitation, children’s privacy, and consumer protection already reach platform conduct?” Meanwhile, thousands of lawsuits against major social media companies are active nationwide, demonstrating the scale and complexity of the legal environment states are navigating.
Here are 10 types of legislative proposals state legislators across the US are putting forward:
Age verification for social media accounts and for access to harmful materials online.
Parental consent to open new social media accounts.
Children’s online privacy.
Specific requirements for settings on minor accounts, content moderation and platform design.
Streamlined search warrant processes that social media companies must comply with.
Platform-led impact assessments to address whether the design, service or feature of an online service harms children.
School cell phone policies.
Protections for children when interacting with AI chatbots.
Updating child pornography laws to include AI-generated content.
Protections for content creators.
Did you know that legislators in over 30 states passed more than 50 new laws focused on children’s online safety over the past few years? This number is growing.
Attorneys General across the US are filing lawsuits to test laws already on the books related to consumer protection, deceptive business practices, human trafficking, child exploitation and child pornography, among other areas of law.
Over 40 states elect their Attorney General, while the remaining states’ Attorneys General are appointed by the Governor, Supreme Court, or State Legislature. Twenty-five jurisdictions, including Puerto Rico and the District of Columbia, require the AG to be authorized to practice law there.
Aside from signing legislation, governors have also been active in this space in other ways.
New Jersey’s governor signed an executive order to create the Office of Youth Online Mental Health Safety.
New York’s governor unveiled a legislative package with several strategies, including a bill that would require social media companies to post warnings about the platform's potential impact on mental health.
Utah’s governor launched a “Harms of Social Media” public awareness campaign.

State of NM vs. Meta Court Case & Intersection Between Other State Action
Who? Attorneys General across the US, school districts and individuals are actively looking to the courts and state legislatures to create rules that foster children’s online safety across social media platforms.
What? In 2023, New Mexico Attorney General Raúl Torrez filed a lawsuit against Meta, its family of apps, including Instagram, and its CEO, Mark Zuckerberg. After a multi-week trial, a jury in March 2026 ordered Meta to pay $5,000 per violation, totaling $375 million in civil penalties for violating the state’s consumer protection laws; the state had sought up to $2 billion in damages. Jurors indicated there was initial disagreement over the number of violations because the figure was tied to the population of New Mexico youth, and they were presented with differing data points. Verdicts like these are likely to influence how future state legislation is drafted.
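For readers tracking how the headline penalty figure is built, the reported numbers can be checked with simple arithmetic. A minimal sketch (the variable names are illustrative, not drawn from any court filing):

```python
# Back-of-the-envelope check of the reported civil penalty.
# Figures as reported: $5,000 per violation, with the jury finding
# 37,500 violations on each of two liability questions.
PENALTY_PER_VIOLATION = 5_000     # dollars per violation
VIOLATIONS_PER_QUESTION = 37_500  # violations found per question
QUESTIONS = 2                     # liability questions the jury answered

total = PENALTY_PER_VIOLATION * VIOLATIONS_PER_QUESTION * QUESTIONS
print(f"${total:,}")  # $375,000,000 — matches the reported $375 million
```

The same arithmetic explains why the violation count mattered so much to jurors: the total scales linearly with whatever figure they settled on for the number of affected New Mexico youth.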
Where? The lawsuit was brought on behalf of NM consumers, including young social media users and their families. In addition to Attorney General lawsuits, schools, teachers, parents and individuals are increasingly coming forward with concerns and legal action across the US.
When? The timeline…
Oct 2023: Over 30 states filed a joint federal lawsuit (California, et al. v. Meta Inc., et al.) against Meta, alleging the company routinely collects data from minors without parental consent in violation of the federal Children’s Online Privacy Protection Act (COPPA) Rule. The states argued that their residents are protected under state consumer protection, fraud, deceptive business practices, and unfair trade practices laws. Other states have filed their own individual lawsuits.
Fall 2023: The New Mexico Department of Justice conducted an undercover investigation into Meta by creating test accounts representing minors on Meta products.
Dec 2023: The New Mexico Attorney General filed a lawsuit against Meta and the company’s CEO, alleging violations of New Mexico law and harms to New Mexicans.
May 2024: A NM judge rejected Meta’s motion to dismiss the case but granted the request to drop Meta’s CEO from the lawsuit.
May 2024: NM AG’s undercover operation and investigation into Meta led to several arrests of individuals in New Mexico for child solicitation, among other crimes.
December 2025: States involved in a multi-state joint lawsuit wrote a letter to the court arguing the issue would be best resolved in a joint trial.
March 24, 2026: A jury in Santa Fe, NM, sided with the state, finding Meta liable for 37,500 violations on each of two questions at $5,000 per violation. Meta says it intends to appeal the decision.
March 25, 2026: A California jury found Meta and Google responsible for the mental health struggles of a plaintiff who had used social media from a young age. The legal argument hinged on how the platforms were designed to encourage prolonged, harmful use. Before the trial began, TikTok and Snapchat settled with the plaintiff.
May 2026: A second phase of trial is scheduled in New Mexico, without a jury, where a judge will determine whether Meta created a public nuisance and if the company needs to take additional steps to address alleged harms.
Why? At the end of March, juries in both New Mexico and California reached verdicts in lawsuits focused on youth harms on online platforms. Although the cases were brought by different entities and relied on different legal arguments, both focused on platform design. According to the complaints, the platforms are designed to encourage prolonged use, leading to harms such as access to sexually explicit content and contact with child predators.
New Mexico’s case is unique. Over two years of litigation, the Attorney General gathered evidence that the company’s design choices were intentional and exposed young people to exploitation and harm. The case went on for weeks in a Santa Fe courtroom where jurors heard from current and former Meta employees, law enforcement officials, New Mexico educators, and industry experts. Evidence in the case included internal Meta documents discussing platform features and potential harm.
The case marked a turn in the debate, which previously centered on content moderation and First Amendment protections. For years, tech companies were largely shielded from liability by Section 230, which protects platforms from being held responsible for content posted by users. These new cases test the limits of that protection by focusing on product design rather than speech. New Mexico grounded its claims in state law, specifically four counts under the New Mexico Unfair Practices Act (NMSA 1978, Sections 57-12-1 through 57-12-26) and a public nuisance claim.
The complaint alleged that the company knowingly made false or misleading representations regarding platform safety, addictiveness and youth well-being, and that it engaged in coercive, exploitative, abusive, deceptive and predatory practices.
Longstanding protections are being challenged as states scrutinize the design choices that shape young people’s online experiences.
Do you have leads, tips, corrections, feedback or resources you would like to share? Send your advice to [email protected].
Disclosure: This is a human-written and driven publication. As a small business owner and mighty team of one, I use AI tools to optimize my small business operations as a part of my admin tech stack. Regarding this publication, AI is mainly used to help with catchy titles, as a thesaurus when writing and a partner when creating cartoons. (Thanks, Canva, and not an ad!) As a secret doodler, I add my human touch using my digital pad and pen. I also use Grammarly, with AI built in, to help with copy editing/grammar (again, mighty team of one!) Thanks for reading. 😊
