The California Age-Appropriate Design Code Act’s passage is part of a growing nationwide push to hold tech companies like Instagram, TikTok and Snapchat accountable for how their services may affect children’s mental health and safety. (Pexels)

California state lawmakers passed a major children’s online safety measure on Tuesday that would require digital platforms to vet whether new products may pose harm to kids and teens before rolling them out and to offer privacy guardrails to younger users by default.

Children’s safety advocates say the legislation, the California Age-Appropriate Design Code Act, would make the state a national leader in setting protections for kids and teens online. Its passage is part of a growing nationwide push to hold tech companies like Instagram, TikTok and Snapchat accountable for how their services may affect children’s mental health and safety.

Its passage is likely to heighten calls for Congress to pass new guardrails for children’s personal information and online activity. Similar efforts in Washington are dragging amid disagreement between House and Senate lawmakers over whether to prioritize expanding protections for children or advancing privacy safeguards for all consumers.

The California Senate approved the measure 33-0 late Monday, and the state’s Assembly, which previously greenlit an earlier version of the bill, voted 60-0 to send it to Democratic Gov. Gavin Newsom’s desk.

Newsom has not taken a public stance on the bill. His spokespeople did not return requests for comment on whether he intends to sign it into law.

“The passage of an age-appropriate design code in California is a huge step forward toward creating the internet that children and families deserve,” Josh Golin, executive director of the children’s safety group Fairplay, said in a statement Tuesday.

Trade groups representing major social media platforms including Facebook, TikTok and Twitter lobbied against the bill, arguing that it would hamper innovation and violate constitutional protections of free speech while failing to adequately protect families.

The legislation, proposed by Democratic Assemblymember Buffy Wicks and Republican Assemblymember Jordan Cunningham, explicitly requires platforms to “prioritize the privacy, safety, and well-being of children over commercial interests” when the two conflict in cases pertaining to users under 18.

If companies violate its provisions, they could be subject to civil fines of up to $7,500 per affected child or teen under enforcement action brought by the state’s attorney general.

“My hope is that now with the passage here in California, and hopefully soon to get the governor’s signature, it will be a model for the rest of the country and the world to keep our kids safe online,” Wicks, who modeled the legislation after a similar proposal in the United Kingdom, told The Washington Post on Tuesday.

California lawmakers earlier this year failed to advance a separate measure to open tech platforms up to liability if their design choices lead to addiction among users under 18, a topic of heated debate.

Common Sense Media CEO Jim Steyer, whose advocacy group backed both measures, said in a statement Tuesday that “the California Senate failed young people earlier this month by holding up” the measure targeting potentially addictive design features, such as autoplay functions aimed at keeping users online.

Steyer added that the group plans to “work with legislators next year to expand the meaningful progress just made on behalf of children and families across the state.”

Wicks called the legislation’s passage a “first step” in state efforts to protect kids online and said California legislators plan to pursue a broader package of kids’ safety bills next year.

Lawmakers in Washington have made more limited progress toward enacting guardrails for children online.

A key Senate panel in July advanced a pair of bipartisan proposals that would ban companies from collecting the data of users 13 to 16 years old without their consent and create an “eraser button” allowing children and parents to remove their data from platforms. Current federal law, passed in 1998, restricts the tracking and targeting of those younger than 13.

One of the measures, the Kids Online Safety Act, would require that platforms give kids the option to opt out of algorithmic recommendations and other potentially harmful features.

But neither of the bills has advanced in the House, where lawmakers are instead pushing to pass data privacy protections for all consumers, including by expanding guardrails for children. That effort faces major roadblocks in the Senate because of opposition from top lawmakers.

Amid years of impasse in those privacy efforts on Capitol Hill, a small but growing number of states have enacted or pushed to pass their own data protections, including the landmark California Consumer Privacy Act, signed into law in 2018.

The trend has spurred calls, particularly among Republican lawmakers and industry groups, to pass a federal standard to override state privacy laws and prevent a patchwork of state rules. But a slew of Democratic lawmakers and consumer advocates have pushed back, arguing that states should be able to expand on federal protections, including around children’s privacy.

“We’ll be initiating an aggressive outreach to both all my legislative leaders here in California, as well as the authors of the bills in D.C., and plan to meet with them to discuss this,” said Wicks, who said she’s “deeply concerned” about the push to override state privacy laws.
