Opinion

CA must fix outdated regulatory system for digital platforms


OPINION – Last week, juries in two very different places – Los Angeles and New Mexico – reached similar landmark conclusions: that Meta and Google engineered their social media platforms in ways that intentionally addict children, erode their mental health and warp their development. After hearing weeks of argument from both sides, jurors ruled that online features disguised as engagement tools are, in effect, psychological levers designed to keep young users scrolling and scrolling.

Both rulings have drawn national and international attention because they speak to an uncomfortable truth: our current digital marketplace has allowed products that harm children to flourish without meaningful oversight.

Too many parents, educators, and child advocates have watched the toll of unregulated social media play out in real time. Their stories mirror what we heard in court: teens anxious, distracted, awash in comparisons, and increasingly overwhelmed by content optimized not for wellbeing, but for engagement.

On behalf of our families and children, California needs to lead the policy response. And that’s why I introduced legislation this year to establish a Social Media and e‑Safety Commission: an independent authority with the power to set safety standards, ensure transparency and hold digital platforms accountable before harm becomes litigation.

Right now, our regulatory system for digital platforms is woefully outdated. We regulate automobiles, pharmaceuticals and food products precisely because we recognize that unregulated markets can and do produce harmful outcomes. Yet tech companies can design, deploy and update products that reach hundreds of millions of users — including children — with little accountability and no mandated safety standards. The result, as the evidence in the Meta trial suggests, is a generation growing up in machines built to keep them hooked.

As a former executive in the tech industry, I am familiar with their playbook. I know how products and apps are designed, marketed and tested.

While some may say a state-by-state approach won’t work, I look to the work being done in Utah under Republican Governor Spencer Cox, and in other states.

Like them, California has already taken steps. Last year we passed the Youth Social Media Protection Act, which requires platforms to respond when trusted adults report dangerous content threatening a child’s safety. It was an essential first step because it gives families a mechanism to intervene when things go wrong.

But it is not enough.

The Act remains fundamentally reactive. It kicks in only after a trusted adult has already witnessed or reported harm. It does not prevent harm in the first place. It does not require platforms to demonstrate that their products are safe for children before distribution. And it does not ensure that design — the very code and user experience that affect a child’s attention, emotions and self‑image — is subject to oversight rooted in science and public health.

That is the gap the Social Media and e‑Safety Commission would fill.

This Commission would be mandated to develop and enforce safety standards grounded in child development expertise. It would require tech companies to disclose how their systems affect young users. It would set age‑appropriate safeguards, much as we do with other products that touch on public health and safety.

Some will object, suggesting that government regulation of tech stifles innovation. But regulation is not the enemy of innovation — unregulated markets that ignore human wellbeing are. True innovation should improve lives, not exploit human psychology for profit.

The verdicts against Meta and Google shine a spotlight on what is at stake. But trials — no matter how consequential — happen after the fact. They seek accountability after harm has occurred. As policymakers, we must also ask: how do we prevent harm before it becomes another headline, another courtroom case, another family crisis?

California’s leadership on this issue is not just timely — it is urgent. The digital world is not going away, and our children are not immune from its effects. If we care about their futures, we must pair accountability with proactive safeguards.

We have the opportunity — and the responsibility — to build rules that protect and empower rather than neglect and exploit. The social media trials may have just concluded, but the broader story will be written in the laws we choose to enact today.

Assemblymember Josh Lowenthal was elected to the State Assembly in 2022. Since 2024 he has served as the Speaker Pro Tempore.
