YouTube Unveils New Privacy Rules for Kids Content

The company will limit data collection and disable targeted ads on videos meant for children


YouTube unveiled new privacy rules limiting data collection and advertising on content intended for kids.

The changes are part of a series of updates YouTube promised after action from federal regulators in 2019.

The video platform’s parent company, Google, agreed to pay $170 million to settle claims that YouTube had violated the Children’s Online Privacy Protection Act (COPPA) by knowingly collecting personal data from children under 13 without parental consent and using that data to target youngsters with ads.


YouTube says it will now restrict data collection from anyone viewing children’s content and stop serving personalized ads with those videos.

Users may still see “contextual” ads based on the contents of the page or the video, though. The company says it will also turn off features such as comments and live chatting for kids content.

“YouTube now treats personal information from anyone watching children’s content on the platform as coming from a child, regardless of the age of the user,” the company said in a blog post.

To identify videos receiving the extra protections, YouTube will require content creators to tag videos for kids as children’s content.

“Creators know their content best, and should set the designation themselves,” the company said. According to YouTube, the platform will also use machine learning algorithms to help ensure that children’s content is being identified accurately.

“This is a big win for kids’ privacy,” says Josh Golin, executive director of the Campaign for a Commercial-Free Childhood. “It’s unfortunate that it took this long, when YouTube was so blatantly violating COPPA, but it’s a really big deal that the No. 1 children’s site in the world is changing its data collection and advertising practices on child-directed content.”

Despite the victory, some advocates say the new policy places an undue burden on content creators, many of whom have argued that limiting advertising will cause a dip in the revenue they earn from their videos.

“Google should be primarily responsible for ensuring that the privacy of children is protected—not the content creators,” says Jeff Chester, executive director of the Center for Digital Democracy, in an email to Consumer Reports. “Google has access to more data on who watches, what ads appear, and much more. While Google promises to use an AI tool to also identify child-directed content, to help supplement what creators tell them, it’s an insufficient response.”

The announcement from YouTube comes as children’s privacy rights receive greater attention from companies and regulators alike.

The new California Consumer Privacy Act broadens privacy rules for children, extending special protections to kids as old as 16. (COPPA applies only to those younger than 13.) Apple introduced new guidelines for app developers last June, banning third-party advertising and analytics in the kids category. And shortly after Google’s YouTube settlement last September, Epic Games, developer of the popular video game Fortnite, updated its privacy policy to clarify practices covered by COPPA.

Some privacy advocates worry, however, that the Federal Trade Commission, which has indicated that it may update COPPA rules in the near future, may be thinking about weakening the regulations.

Consumer Reports recently filed comments encouraging the commission to strengthen the restrictions and take greater steps to enforce them. The organization also joined 31 advocacy groups, including the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, in sending a letter calling on the FTC to conduct an investigation into the children’s digital media marketplace before proposing any changes to privacy protections.

“For years, the FTC has been shirking its responsibility to enforce the protections promised to families under COPPA, leaving parents in a situation where they have to rely on gatekeepers like YouTube and Facebook to ​police digital services and products directed at children,” says Katie McInnis, a policy counsel at Consumer Reports. “The FTC should be doing more to ensure companies are following the rules that are already on the books.”


Thomas Germain

I want to live in a world where consumers take advantage of technology, not the other way around. Access to reliable information is the way to make that happen, and that's why I spend my time chasing it down. When I'm off the clock, you can find me working my way through an ever-growing list of podcasts. Got a tip? Drop me an email or follow me on Twitter (@ThomasGermain) for my contact info on Signal.