SOCIAL MEDIA LITIGATION—THE PROTECTION OF OUR CHILDREN SHOULD NOT BE LEFT SOLELY TO THE PLAINTIFFS’ TRIAL BAR


INTRODUCTION AND BACKGROUND

Last week we read the news of verdicts against a host of social media platforms, including Facebook (Meta), YouTube, TikTok, and others. The core allegations in both the New Mexico and the California cases were that the platforms engaged children through manipulated algorithms, ignored the mental health warnings resulting from those practices, and allowed themselves to become hunting grounds for child predators. The key distinction in the objectives of the two cases is at the core of this article. In New Mexico, the Attorney General took Meta to task for violating state consumer protection laws designed to protect the public at large. In California, a trial lawyer represented a single client who may someday be compensated for individual harms.

The attorney in the California case is highly skilled and crafted a workaround, alleging that the “design” of the platform created a state tort law exception to the broad immunity Congress gave such platforms in the early days of the internet through Section 230 of the Communications Decency Act of 1996, discussed below. That theory may not survive appeal on any number of grounds. In New Mexico, by contrast, the broad protections to the public at large were set forth in that state’s Consumer Protection Act, which prohibits misleading and outrageous trade practices that caused harm to a wide swath of the children of that jurisdiction.

I posit herein that the New Mexico approach is the one that will create the most change. States must act through their Attorneys General to enforce consumer protection laws as a fundamental right reserved to the states under well-settled principles of federalism. And should Congress address the modern state of such platforms and reform Section 230, the United States Attorney General would also wield real authority to protect our most vulnerable citizens from the profit-driven practices of platforms like Meta.
The plaintiffs’ trial bar is not the complete answer, and such litigation carries disadvantages compared with what we saw in New Mexico. State law enforcement carries far more weight in the courts when backed by the resources that states can wield in consumer protection litigation.

The California case is causing a buzz among the plaintiffs’ trial bar, which sees a link to the tobacco litigation of the 1990s and a potential gold mine in fees. I suggest that the ultimate protection of the public will come when Congress narrows the immunities gratuitously granted to the platforms collectively known to all of us as “Big Tech,” and when states are permitted to enforce targeted consumer protection laws that do not conflict with Section 230, discussed below. Many Attorneys General across the states, blue and red alike, are united in the view that the protections offered by Section 230 are overbroad and have led to purposeful gaming of the law, causing harms ranging from child predation and negative influencing to viewpoint discrimination. It should not be lost on the dear reader that the money wielded by Big Tech is, and has been, its force field against the accountability that is long overdue. The problems that litigants face, whether a case involves an individual plaintiff, as in California, or the citizens of New Mexico as plaintiffs in that litigation, are as follows:

 

  • Platforms like Meta have endless resources to appeal issues such as the preemption of state law by Section 230 and causation to specific harms;
  • The Supreme Court has not seemed interested in interpreting Section 230 narrowly so as to allow an expansive role for states to protect their citizens from the profit-fueled predation of Big Tech and has struck down some state laws targeting alleged viewpoint discrimination on such platforms;
  • Big Tech has successfully argued that state laws targeting its practices are preempted by Section 230, thereby shielding it from state law enforcement; this argument failed, for now, in New Mexico.

 

In the area of mental health and proximate causation of harm, the issues are very difficult in cases alleging specific harm to a plaintiff or a class of plaintiffs, and such personal injury tort cases are vulnerable to being overturned on appeal given the complexity of mental health issues in determining individual proximate causation. I submit that consumer protection causes of action do not face the roadblocks present in individual tort cases. States need to be aggressive in asserting their federalism-based rights to protect their citizens through well-crafted consumer protection laws. In fact, in the New Mexico litigation a further proceeding is scheduled in which injunctive relief, not available in routine tort litigation, will be imposed for the benefit of the citizens of New Mexico.

 

A BACKGROUND ON THE DATED SECTION 230 PROTECTIONS AND WHY THE POLICIES UNDERLYING THEM NEED TO BE REEVALUATED

I cannot say it more succinctly than this excerpt from an online essay published in the University of Chicago Law Review, authored by Michael Daly Hawkins and Matthew Stanford and published while Bill Barr was Attorney General:

 

“Section 230 of the Communications Decency Act of 1996 broadly immunizes internet platforms from liability for content published by third parties and for removing or restricting access to certain classes of content. Congress enacted Section 230 in the early days of the internet, when platforms were young and had a limited potential audience, as well as a limited ability to affect behavior. The concept of internet platforms—available worldwide, with free-flowing information and endless perspectives, unencumbered by government regulation, and encouraged to engage in self-regulation—seemed both new and attractive. Besides, there were billions to be made, innovations to be rewarded, and taxes on capital gains to help balance the federal budget. Of course, there were competing interests at stake: For instance, the European Union expectedly opted to prioritize privacy and antitrust regulation. The United States meanwhile placed a premium on innovation. But there were other reasons. Defending the need for immunity, internet platforms pointed to the sheer volume of information flowing through their servers, making it virtually impossible to screen everything. One commentator captured the moment this way:

When Section 230 was adopted in 1996, it would have been impossible for a service like AOL to monitor its users in a wholly effective way. AOL couldn’t afford to hire tens of thousands of people to police what was said in its chat rooms, and the easy digital connection it offered was so magical that no one wanted the service to be saddled with such costs. Section 230, which granted platforms broad immunity for third-party content published on their services, was an easy sell.

Section 230 immunity has since been stretched beyond these original aims, shielding even those platforms that deliberately solicit or host illegal activity. The modern internet is inescapable, even essential, seeping into virtually every aspect of our lives. And its potential for affecting human behavior has long since been realized. News reports following mass shootings tell of inflammatory websites the assailants frequented as they contemplated opening fire in classrooms, synagogues, and public gatherings. Social media platforms favored by the young have been used to encourage teen suicide, and disgruntled spouses have used social media to humiliate their exes.

All this has led to serious policy discussions about whether it is time to revisit the blanket immunity provided by Section 230. Earlier this year, Attorney General William Barr suggested the time may have come to hold internet platforms accountable for the content flourishing on their sites and services. ‘No longer are tech companies the underdog upstarts,’ he said in a February speech reflecting on the broad immunity provided by Section 230. Sounding a bit like President Teddy Roosevelt blasting away at the railroad and steel producers at the turn of the last century, Attorney General Barr stated: ‘They have become titans.’”

The Big Tech lobby money machine has, to date, prevented meaningful narrowing of these immunities by Congress, particularly around the protection of vulnerable persons in society, including children. Meta, in the California case, introduced evidence of filters and other protections calculated to limit children’s exposure. The jury rejected these defenses, as did the New Mexico jury. Because Congress has yet to meaningfully act, the various states must aggressively prosecute outrageous trade practices that run afoul of carefully crafted consumer protection laws. Just such an argument was made in an amicus brief to the United States Supreme Court in Gonzalez v. Google, a case the Court ultimately did not fully review. The outstanding Attorney General of the State of Tennessee, Jonathan Skrmetti, submitted the brief on behalf of 27 states, Virginia included, urging the Supreme Court to hear a case in which state law issues were implicated. The case was not a strong vehicle for the issues: it alleged that Google’s YouTube platform was used to recruit members of ISIS, and that a family was victimized by an ISIS-inspired attack. The Court ultimately did not reach the Section 230 issues and gave short shrift to the state-based ones. However, the state law claims in that litigation, and the power of states to regulate what their citizens are exposed to, remain compelling issues yet to be resolved by either the Supreme Court or Congress. Mr. Skrmetti argued as follows:

 

“‘As every schoolchild learns, our Constitution establishes a system of dual sovereignty’ that divides power ‘between the States and the Federal Government.’ Gregory v. Ashcroft, 501 U.S. 452, 457 (1991). Under that system, the federal government wields only the ‘enumerated powers’ surrendered by the States in the Constitution, M’Culloch v. Maryland, 17 U.S. (4 Wheat.) 316, 405 (1819), which are necessarily ‘few and defined,’ The Federalist No. 45, at 313 (J. Madison) (J. Cooke ed. 1961).

The States, by contrast, retain ‘numerous and indefinite’ powers that ‘extend to all the objects . . . concern[ing] the lives, liberties, and properties of the people; and the internal order, improvement, and prosperity’ of the country. Id.; see also U.S. Const. amend. X (‘reserv[ing] to the States [and] the people’ all ‘powers not delegated’ to the federal government or ‘prohibited’ by the Constitution). Thus, while legitimate acts of Congress are the ‘supreme Law of the Land,’ U.S. Const. Art. VI, cl.2, the very fact that ‘[t]he States exist’ and exercise broad residual power ‘refut[es]’ any notion that the federal government acts as the ‘ultimate, preferred mechanism for expressing the people’s will,’ Alden v. Maine, 527 U.S. 706, 759 (1999).”

 

The 27 states submitting the brief through Mr. Skrmetti also argued:

 

“Since 230’s enactment, however, advances in computer technology have made the internet a dramatically different place. Social media companies that now claim Section 230 immunity do not just ‘publish’ user-generated material; they actively exploit it. To make money, they ‘run ads.’ Facebook, Social Media, Privacy, and the Use and Abuse of Data: Joint Hearing Before the S. Comm. on Com., Sci., & Transp. and the S. Comm. on the Judiciary, S. Hrg. No. 683, 115th Cong., Tr. (Doc. J-115-40) at 21 (statement of Mark Zuckerberg, Chairman and CEO of Facebook). And their customers (the advertisers) ‘want as many [people] as possible to see th[ose] ads.’ Sang Ah Kim, Social Media Algorithms: Why You See What You See, 2 Geo. L. Tech. Rev. 147, 148 (2017). The platforms thus seek to maximize ad exposure by ‘engaging’ users, getting them to spend as much time as possible ‘interacting with content on the platform, including viewing, liking, commenting, sharing, and saving [third-party] posts.’ Id. at 147–148. To drive this ‘engagement,’ the companies use sophisticated computer programming that recommends content to users in an intentional, deliberate way.”

 

The Gonzalez plaintiff urged the following in that litigation: over the last two decades, many interactive computer services have, in a variety of ways, sought to recommend third-party materials, such as written matter or videos, to their users. Those recommendations are implemented through automated algorithms, which select the specific material to be recommended to a particular user based on information about that user known to the service. The public has only recently begun to understand the enormous prevalence and increasing sophistication of these algorithm-based recommendation practices.

In connection with this briefing urging a stronger role for states, then Virginia Attorney General Jason Miyares commented, “In order for our technology laws to be effective and ensure consumers are protected, these laws must modernize as technology does to ensure that social media companies claiming Section 230 immunity are not exploiting users.”

A July 2023 publication by the National Association of Attorneys General noted the harms to children posed by this manipulation of algorithms to target certain users who are particularly vulnerable.

 

“A particular area of concern regarding recommendation algorithms is the exposure of children to harmful or dangerous content on websites such as YouTube and apps like TikTok. According to a research study funded by the European Union, ‘Young children are not only able, but likely to encounter disturbing videos when they randomly browse the platform starting from benign videos.’ Because of Section 230’s far-reaching protections, there is substantial uncertainty about whether YouTube and other social media platforms are subject to any liability when their algorithms lead children to such content. Parents and school districts have expressed concern that social media websites use algorithms that ‘exploit the psychology and neurophysiology of their users,’ a technique which is ‘particularly effective and harmful’ to children. According to research from the Pew Research Center, three out of five YouTube users reported that they have seen ‘videos that show people engaging in dangerous or troubling behavior.’ This suggests that there is a high probability that YouTube algorithms will lead children to dangerous or disturbing content, and will then keep them engaged for as long as possible.”

 

CONCLUSION

The innovations of plaintiffs’ counsel in encouraging positive change through tort-based litigation can certainly be acknowledged, particularly in the products liability arena. In this specific context, however, the power of the states, exercised by their respective Attorneys General, must be encouraged and enhanced. Scores of state jurisdictions joining issue against platforms like Meta and Google, coupled with meaningful reform and modernization of these hollow and baseless Section 230 protections by Congress, will benefit society immeasurably. A broad federal reform of these immunities would be preferable, removing all doubt, but that may not happen given Congress’s political inability to address the issue to date. Plaintiffs’ counsel, in the private arena, have an undeniable profit motive, and this creates political pushback and division. The protective role of elected and accountable state Attorneys General, taking up the gauntlet to protect the citizens who elected them with the resources of their treasuries, is the greatest deterrent to what we all know: Big Tech is out of control. Money corrupts, and absolute money corrupts absolutely.

 

Mike Imprevento
March 30th, 2026


Author: Mike Imprevento

With decades of experience as a complex litigation attorney in private practice, Mike brings a deep understanding of the legal system to his writing. His insights are sharpened by his diverse background, having served as a Lieutenant in the Navy Judge Advocate General's Corps and a Captain in the Norfolk Sheriff’s Office. Together, these roles offer a unique, no-nonsense perspective on justice and the law.
