CA Legislature Passes Bill Requiring Social Media Firms to Consider Children’s Health in Designing Products

Source: WSJ | Published on August 31, 2022

California’s legislature passed a bill Tuesday that would, for the first time in the United States, require developers of social-media apps such as Facebook, Instagram, and TikTok to consider minors’ physical and mental health when designing their products.

The bill passed the Assembly unanimously and with bipartisan support after clearing the state Senate unanimously on Monday. Democrats hold the majority in both chambers.
Governor Gavin Newsom, a Democrat, has not said whether he will sign or veto the bill. A spokesman for the governor declined to comment.

“California is home to the tech innovation space, and we welcome that,” said state Assembly member Buffy Wicks, a Democrat and the bill’s primary author, at a news conference Tuesday morning urging Gov. Newsom to sign the bill. “But I also want to make sure our children are safe, and they are not safe right now.”

Social media companies lobbied against the bill, arguing that a patchwork of differing state laws governing their apps would make compliance difficult.

Its passage follows the defeat of a separate bill that would have allowed government lawyers to sue social media companies if their apps cause harm or addiction in children.

Representatives from companies including Meta Platforms Inc., Snap Inc., and Twitter Inc. lobbied hard against the bill.

The bill that was passed on Tuesday would require social media companies to conduct research on products and features that are likely to be accessed by minors in order to assess and mitigate potential harm before making them public. Those assessments would have to be provided to the state attorney general if requested, but the contents would not be made public.

It would also require businesses to disclose their privacy policies in language that children understand, and it would prohibit the profiling of minors and the use of tools that encourage children to share personal information.

Furthermore, it would prohibit companies from tracking children’s precise geolocation unless the child is notified, and it would prohibit companies from using children’s personal information in ways that are deemed harmful to their health.

Companies that violate the rules may face product injunctions and fines of up to $2,500 per affected child for each violation, and up to $7,500 per child if the violation was intentional.

The provisions of the bill would take effect in July 2024 if signed by Mr. Newsom.

According to Ms. Wicks, the bill is modeled after a similar law in the United Kingdom that requires social media companies to design their products with children in mind. Alphabet Inc.’s Google, for example, has made safe search, which screens out potentially inappropriate content, its default browsing mode in the United Kingdom, while TikTok and Instagram have disabled direct messaging between children and adults they don’t follow.

Representatives from Meta and TechNet, a trade group for the technology industry, previously stated that they preferred a bill that regulated the design of their products over one that held them liable for child harm, such as the one that failed earlier this month.

However, industry representatives stated that they continue to oppose the measure that was passed and are urging Mr. Newsom to veto it.

“We support the bill’s intent, and protecting children online remains a top priority, but it must be done responsibly and effectively,” said Dylan Hoffman, executive director for California and the Southwest at TechNet, which lobbied heavily against both bills. “While this bill has improved, we are still concerned about its unintended consequences in California and throughout the country.”

TechNet, he said, is instead advocating for a federal privacy law that would establish national standards to protect children online.

Lobbyists for the tech industry and social media companies unsuccessfully pushed to lower the bill’s applicable age to 13 and to limit it to products and services “directed at” children rather than “likely to be accessed” by them.