Citing a mental health crisis among young people, California lawmakers are turning to social media

Carla Garcia said her son’s addiction to social media began in fourth grade, when he got his own virtual learning computer and logged onto YouTube. Now, two years later, the video-sharing site has replaced both schoolwork and activities he loved — like composing music or serenading his friends on the piano, she said.

“He just has to have his YouTube,” said Garcia, 56, of West Los Angeles.

Alessandro Greco, now 11 and a soon-to-be sixth-grader, watches videos even as he tells his mother to start homework, make his bed or practice his instrument. When she confronted him, she said, he got frustrated and said he hated himself because he felt like watching YouTube wasn’t a choice.

Alessandro tells her that he just can’t quit, that he’s addicted.

“It’s vicious — they took away my parenting skills,” Garcia said. “I can’t beat this.”

Some California lawmakers want to help Garcia and other parents protect their children’s mental health by targeting elements of the website they say are designed to attract children — like personalized posts that grab and keep viewers on a specific page, frequent push notifications that bring users back to their devices and autoplay features that provide a continuous stream of video content.

Two complementary bills in the state legislature would require websites, social media platforms or online products that children use, or could use, to remove features that can become addictive, collect their personal information or promote harmful content. Those who do not comply could face lawsuits and hefty fines. One of the measures would impose penalties of up to $7,500 per affected child in California, which could reach millions of dollars.

Federal lawmakers are making a similar push with bills that would tighten children’s privacy protections, target features that encourage addiction, and require online platforms to provide tools to help parents track and control their children’s internet use. The measures were approved by a U.S. Senate committee on July 27.

“We need to protect children and their developing brains,” California Assemblyman Jordan Cunningham (R-San Luis Obispo), the lead author of both bills and a father of four, said at a committee hearing in June. “We must end Big Tech’s era of unrestricted social experiments on children.”

But Big Tech remains a formidable foe, and privacy advocates are concerned that one of California’s measures could increase data collection for everyone. Both bills have passed the state Assembly, but it is unclear whether they will survive the state Senate.

Tech companies, which wield enormous power in Sacramento, say they already prioritize users’ mental health and are working to strengthen age-verification mechanisms. They also point to parental controls they have introduced and to restrictions that prevent minors from exchanging messages with adults they don’t know.

But the bills could infringe on companies’ free speech rights and mandate website changes that aren’t technologically feasible, said Dylan Hoffman, TechNet’s executive director for California and the Southwest. TechNet, a trade association for technology companies whose members include Meta (the parent company of Facebook and Instagram) and Snap Inc. (which owns Snapchat), opposes the measures.

“This is an oversimplified solution to a complex problem, and there’s nothing we can offer to alleviate our concerns,” Hoffman said of one of the bills that specifically targets social media.

Last year, the US Surgeon General, Dr. Vivek Murthy, highlighted the nation’s youth mental health crisis and pointed to the use of social media as a potential contributor. Murthy said social media use among teenagers has been linked to anxiety and depression — even before the stress of covid-19. Then during the pandemic, he said, teenagers’ average non-academic screen time jumped from nearly four hours a day to nearly eight.

“What we’re really trying to do is just keep our kids safe,” Assemblywoman Buffy Wicks (D-Oakland), another lead author of the California bills and a mother of two, said at a committee hearing in June.

One of Cunningham and Wicks’ bills, AB 2273, would require all online services “likely to be accessed by a child” — which could include most websites — to minimize the collection and use of personal information about users under 18. This includes setting default privacy settings to the maximum level unless users prove they are 18 or older, and providing terms of service agreements in language a child can understand.

Modeled after a law passed in the United Kingdom, the measure also says companies must “consider the best interests of children when designing, developing and providing that service, product or feature.” This broad wording could allow prosecutors to target companies for features that harm children, such as continuous notifications that demand a child’s attention or suggestion pages, based on a child’s activity history, that may lead to harmful content. If the state attorney general finds that a company has violated the law, the company could face fines of up to $7,500 per affected child in California.

California’s other bill, AB 2408, would allow prosecutors to sue social media companies that knowingly addict minors, with fines of up to $250,000 per violation. The original version would also have allowed parents to sue social media companies, but lawmakers removed that provision in June in the face of opposition from Big Tech.

Together, the two proposals attempt to impose some order on the largely unregulated environment of the internet. If successful, they could improve the health and safety of children, said Dr. Jenny Radesky, an assistant professor of pediatrics at the University of Michigan Medical School and a member of the American Academy of Pediatrics, a group that supports the data protection bill.

“If we were going to a playground, you’d want a place that’s designed to allow the child to explore safely,” Radesky said. “Yet in the digital playground, much less attention is paid to how the child can play there.”

Radesky said she has witnessed the effects of these addictive elements firsthand. One night, as her then 11-year-old son was getting ready for bed, he asked her what a serial killer was, she said. He told her he learned the term online when unsolved murder mystery videos were automatically recommended to him after watching Pokémon videos on YouTube.

Adam Leventhal, director of the Addiction Science Institute at the University of Southern California, said YouTube recommendations and other tools that mine users’ online history to personalize their experiences contribute to social media addiction by trying to keep people online as long as possible. Because developing brains favor exploration and pleasurable experiences over impulse control, children are especially susceptible to many of social media’s tricks, he said.

“What social media offers is highly stimulating, very rapid feedback,” Leventhal said. “Anytime there’s an activity where you can get a pleasurable effect and get it quickly and get it when you want it, it increases the likelihood that the activity will be addictive.”

Rachel Holland, a spokesperson for Meta, said in a statement that the company has worked with parents and teenagers to prioritize the well-being of children and to mitigate the potential negative effects of its platforms. She pointed to various company initiatives: in December 2021, for example, it added supervision tools to Instagram that allow parents to review and limit children’s screen time. And in June, it began testing new age-verification tactics on Instagram, including asking some users to upload a video selfie.

Snap spokesman Pete Boogaard said in a statement that the company is protecting teens through steps that include banning public accounts for minors and turning off location sharing by default.

Meta and Snap declined to say whether they support or oppose the California bills. YouTube and TikTok did not respond to multiple requests for comment.

Privacy groups are raising red flags about the measures.

Eric Null, director of the privacy and data project at the Center for Democracy and Technology, said a provision in the data protection bill requiring privacy agreements to be written in age-appropriate language would be nearly impossible to enforce. “How do you write a privacy policy for a 7-year-old? It seems especially hard to do when the child can barely read,” Null said.

And because the bill would limit the collection of children’s personal information — but still require platforms that children can access to collect enough details to verify a user’s age — it could increase data collection for all users, he said. “This will further incentivize all online companies to verify the age of all their users, which is somewhat counterintuitive,” Null said. “You’re trying to protect privacy, but you’re actually now requiring a lot more data collection for every user you have.”

But Carla Garcia is desperate for action.

Fortunately, she said, her son does not watch violent videos. Alessandro prefers videos from “America’s Got Talent” and “Britain’s Got Talent” and videos of one-hit wonders. But the addiction is real, she said.

Garcia hopes lawmakers will limit the ability of tech companies to constantly send her son content he can’t turn away from.

“If they can help, then help,” Garcia said. “Put in some rules and stop the algorithm, stop stalking my child.”

This article was reprinted with permission from the Henry J. Kaiser Family Foundation. Kaiser Health News, an editorially independent news service, is a program of the Kaiser Family Foundation, a nonpartisan health policy research organization not affiliated with Kaiser Permanente.
