A young boy watches a YouTube kids video on a tablet computer.

Whether they’re looking for a favorite Taylor Swift video or the latest viral video of a toy being unboxed, kids are heading to YouTube to find it.

By hosting such videos, children’s advocates argue, the platform violates a law intended to protect children’s data from being collected online and also exposes children to inappropriate content.

Now, reports in the Wall Street Journal and the Washington Post say that Google, which owns the website, could be preparing to remove children’s content from YouTube’s main platform, as the Federal Trade Commission nears the end of an investigation into the site’s privacy practices.

In an email to Consumer Reports, the FTC says it can’t comment on the matter. A Google spokesperson likewise declined to address questions about an investigation but said the company considers many ideas for improving YouTube.

Children’s advocates have long argued that YouTube’s business model is based in large part on luring kids onto the site, then keeping them there as long as possible, to track what they see and bombard them with targeted ads.

“All of this is unfair and manipulative to kids, and it violates COPPA [Children’s Online Privacy Protection Act], so it has to stop,” says David Monahan, campaign manager at the Campaign for a Commercial-Free Childhood (CCFC).

The federal law restricts the data that companies can collect from children under 13 and how that data can be used, and it requires explicit parental consent before any data gathering can begin.

In April 2018, Consumer Reports, the CCFC, and 20 other advocacy groups filed a complaint with the FTC arguing that YouTube’s data collection violates COPPA’s privacy protections.

Parents also criticize the site for failing to properly screen the videos that get posted. In some cases, inappropriate content even makes its way to Google’s YouTube Kids app, which is supposed to face a higher level of scrutiny.

Earlier this year, a Florida woman said she found clips on both platforms that gave children instructions for how to kill themselves. The instructions were spliced into the middle of a seemingly innocent cartoon. YouTube took the video down.

The woman reported finding other videos on YouTube Kids that depicted sexual exploitation and abuse, human trafficking, a school shooting, gun violence, and domestic violence.

Katie McInnis, policy counsel at Consumer Reports, says YouTube relies too much on parents to protect their children from objectionable content.

“Parents should be involved in what their children are watching, but many—if not most—cannot monitor every video a child watches,” she says. “Relying on parents to report the problematic videos their child comes across was never going to be an effective system.”

The Challenge of the Cleanup

According to the Wall Street Journal, Google executives are now thinking about shifting all children’s content to the YouTube Kids app. A change like that would take major revenue-generating channels off YouTube’s main video platform, which is flush with kid-friendly content.

A full 81 percent of parents with children 11 or younger have permitted their kids to watch videos on YouTube, according to a November 2018 survey conducted by the Pew Research Center.

To keep children from straying into objectionable content, Google may also modify a default setting, tied to YouTube’s recommendation system, that automatically plays new videos every time the one being viewed ends, the Wall Street Journal reports.  

At the moment, children and adults don’t need an account to view YouTube content. To subscribe to the site’s channels and like and comment on videos, though, you have to set up a Google account, which requires you to give your age.

YouTube’s terms of service limit access to those features to kids 13 and older. If the company determines that a user is younger than that, it can shut down the account.

In a June 3 blog post, the company said it terminates thousands of accounts each week. And it said it’s constantly removing content to keep YouTube safe for kids, noting that it took down more than 800,000 videos during the first quarter of this year for violations of its child safety policies.

But many children use the account of a parent or older sibling to gain full access to the site’s features. And while Google’s YouTube Kids app, rolled out in 2015, is intended to be a safer, COPPA-compliant alternative to the adult platform, experts say it has failed to gain much traction beyond very young children.

Meanwhile, advertisers pay to put their ads on channels geared toward younger viewers, such as ChuChu TV Nursery Rhymes & Kids Songs, which has nearly 24 million subscribers.

And according to Forbes, the top YouTube earner in 2018 was a 7-year-old boy named Ryan, who made $22 million from the Google ads tied to videos that show him unboxing the latest toys. His channel has nearly 20 million subscribers, and his videos have been viewed 30 billion times. He now has his own show on Nickelodeon, as well as his own line of toys.

What's at Stake

If the FTC were to take action against YouTube, it could come in the form of a hefty fine or a consent decree that mandates new protections for kids, McInnis says.

But given Google’s size and reach, a “hefty” fine would have to be in the tens of billions of dollars, Monahan adds. “Anything smaller will simply be written off by Google as a cost of doing business.”

Any FTC settlement also needs to come with a promise that all kids’ content will be shifted fully from the main YouTube site to the YouTube Kids app, Monahan says, a change his group has been urging for years. But that’s no small undertaking, particularly when it comes to defining what counts as kid content and what counts as adult content.

There are scores of classic Sesame Street clips, for example, along with lots of popular fare created for video game players, content that can appeal to children and adults alike.