YouTube Kids is getting a little safer with enhanced parental controls

Posted April 26, 2018

James Beser, product director for YouTube Kids, said: "From collections of channels from trusted partners to enabling parents to select each video and channel themselves, we're putting parents in the driver's seat like never before". The app is now used by 11 million viewers weekly.

YouTube Kids is a dedicated app from the video network that aims to provide a child-friendly experience. Beser added that the YouTube Kids app was meant "to give kids around the world a place to access videos that were enriching, engaging and allowed them to explore their endless interests".

When a video for children is uploaded to the main YouTube platform, it is not automatically added to the YouTube Kids library.

YouTube has introduced three new ways for parents to monitor the YouTube Kids app, which it says will roll out over the course of 2018.


YouTube is quick to point out that parents who are happy with how the YouTube Kids app works now will still be able to opt in to the wider selection of video content.

It's the kind of control parents have been asking for from the popular app, but it also puts the onus on them to filter content.

"One area of focus has been building new features that give parents even more control around the content available in the YouTube Kids app so they can make the right choice for their unique family and for each child within their family", said Google in the press statement. The company says its machine learning processes can take several days to evaluate a video. It's a side effect of the way YouTube Kids finds its videos.

In a blog post detailing the work it's doing to enforce its community guidelines, YouTube said the removed videos represented a "fraction of a percent of YouTube's total views" during the final three months of 2017. It's possible this safeguard isn't sufficient to catch every odd video your kid might see.


The Google-owned platform pulled down 8.3 million videos between October and December 2017, with more than half of the removed videos being spam or sexual content.

The company continues to change its rules to crack down on the issue, but creators are constantly looking for ways to trick the system.

The new controls make it easy for parents to select only the channel collections and topics they want their kids to access.
