The Federal Trade Commission on Wednesday proposed sweeping changes to bolster the key federal rule that has protected children’s privacy online, in one of the most significant attempts by the U.S. government to strengthen consumer privacy in more than a decade.
The changes are intended to strengthen the rules underlying the Children’s Online Privacy Protection Act of 1998, a law that restricts the online tracking of youngsters by services like social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the moves would “shift the burden” of online safety from parents to apps and other digital services while curbing how platforms may use and monetize children’s data.
Among other things, the proposed changes would require certain online services to turn off targeted advertising by default for children under 13. They would prohibit the online services from using personal details like a child’s cellphone number to induce youngsters to stay on their platforms longer. That means online services would no longer be able to use personal data to bombard young children with push notifications.
The proposed updates would also strengthen security requirements for online services that collect children’s data as well as limit the length of time online services could keep that information. And they would limit the collection of student data by learning apps and other educational-tech providers, by allowing schools to consent to the collection of children’s personal details only for educational purposes, not commercial purposes.
“Kids must be able to play and learn online without being endlessly tracked by companies looking to hoard and monetize their personal data,” Lina M. Khan, the chair of the Federal Trade Commission, said in a statement on Wednesday. “By requiring firms to better safeguard kids’ data, our proposal places affirmative obligations on service providers and prohibits them from outsourcing their responsibilities to parents.”
COPPA remains the central federal law protecting children online in the United States, even though members of Congress have introduced other child-safety bills since its passage.
Under the COPPA law, online services aimed at children, or those that know they have children on their platform, must obtain a parent’s permission before collecting, using or sharing personal details — such as first and last names, addresses and phone numbers — from a child under the age of 13.
To comply with the law, popular apps like Instagram and TikTok have terms of service that prohibit children under 13 from setting up accounts. Social media and video game apps typically ask new users to provide their birth dates.
Still, regulators have filed numerous complaints against large tech companies accusing them of failing to set up effective age-gating systems; showing targeted ads to kids based on their online behavior without parental permission; enabling strangers to contact children online; or keeping children’s data even after parents asked for it to be deleted. Amazon; Microsoft; Google and its YouTube platform; Epic Games, the maker of Fortnite; and Musical.ly, the social app now known as TikTok, have all paid multimillion-dollar fines to settle charges that they violated the law.
The F.T.C.’s proposal to strengthen children’s privacy protections comes amid heightened public concern over the potential mental health and physical safety risks that popular online services may pose to young people. Parents, pediatricians and children’s groups warn that social media content recommendation systems have routinely shown inappropriate content promoting self-harm, eating disorders and plastic surgery to young girls. And some school officials worry that social media platforms distract students from their schoolwork in class.
States have passed more than a dozen laws this year that restrict minors’ access to social media networks or pornography sites. Industry trade groups have successfully filed lawsuits to temporarily block several of those laws.
The F.T.C. began reviewing the children’s privacy rule in 2019, receiving more than 175,000 comments from tech and advertising industry trade groups, video content developers, consumer advocacy groups and members of Congress. The resulting proposal runs more than 150 pages.
Proposed changes include narrowing an exception that allows online services to collect persistent identification codes for children for certain internal operations, like product improvement, consumer personalization or fraud prevention, without parental consent.
The proposed changes would prohibit online operators from using such user-tracking codes to maximize the amount of time children spend on their platforms. That means online services would not be able to use techniques like sending mobile phone notifications “to prompt the child to engage with the site or service, without verifiable parental consent,” according to the proposal.
How online services will comply with such proposed changes is not yet known. Members of the public have 60 days to comment on the proposed changes to the children’s privacy rule. Then the commission will vote on them.