Florida Governor Ron DeSantis signed legislation on Monday that prohibits people under 14 years old from having social media accounts, regardless of their parents' consent, one of the most restrictive laws aimed at curbing social media access for minors.
Under the new law, social media companies are required to close accounts believed to be used by minors under 14 years of age. Platforms must also terminate accounts at the request of parents or minors and must delete all account information.
The law is set to take effect on Jan. 1, 2025.
Minors who are 14 or 15 years old may have social media accounts with the consent of their parents, according to the new law. Existing accounts belonging to teens in this age group must be deleted if a parent or guardian does not give consent.
"Being buried in those devices all day is not the best way to grow up, it's not the best way to get a good education," DeSantis, a Republican, said at an event commemorating the bill signing on Monday.
The law does not name specific platforms but instead targets social media sites that rely on features like notification alerts and autoplay videos that encourage compulsive viewing.
Supporters of the law pointed to recent studies linking social media use among young people with an increased risk of depression and other mental health problems. Social media use can also leave minors vulnerable to online bullying and predators.
A Snap representative declined to comment. A representative for Meta Platforms had no immediate comment. Representatives for X and TikTok did not immediately respond to a request for comment.
Similar laws have been proposed in other states, though none goes as far as the comprehensive ban enacted in Florida. In late August, a federal judge in Arkansas blocked a law that would have required social media platforms to verify users' ages and obtain parental consent for minors' accounts.
NetChoice, a tech industry trade association whose members include Facebook parent Meta Platforms, TikTok, and Snap, sued to stop the Arkansas law last June. The association has filed similar legal challenges to proposed social media restrictions in California and Ohio.
Some social media platforms have already taken steps to restrict content shown to young users. On most sites, children under age 13 are protected by data collection laws, but these laws typically don’t apply to older minors.
In January, Meta announced new content filtering measures for teens of all ages.
Instagram and Facebook now automatically restrict teens from accessing harmful content, including videos and posts about self-harm, graphic violence, and eating disorders. Teens under 16 are also shielded from sexually explicit content. Previously, teenagers could opt into less restrictive settings.
The content restrictions came after more than 40 states sued Meta, alleging that the technology company misled the public about the risks its platforms posed to young people.
Some of these dangers were detailed in a series of Wall Street Journal articles published in 2021, including one showing that Instagram knew its platform was toxic for many teen girls. Meta has said it did not design its products to be addictive for teens.