photo: Pixabay

YouTube Kids recently got an upgrade. Your kiddo’s favorite way to get their screen time fix refreshed its design, added the ability for kids to create their own profiles and now lets kids set passcodes (which parents can override). But these recent changes have been overshadowed by some very unexpected controversy. Read on to find out what the trouble is, what it means for your child’s viewing, and what YouTube is doing to fix it.

So we all know by now that YouTube Kids is…umm, for kids. Obviously. And with that, we kind of figure it’s safe. That is, safe for kids to watch. Well, for the most part it is. Or at least, it should be.

YouTube Kids, like other “kid-friendly” sites, uses filters to keep the big bad content out. But like just about everything else, it’s not perfect. Hey, nothing is 100%.

Even though the site has made a real effort to keep adult content out of YouTube Kids, it seems that some not-for-little-eyes videos have slipped through. Parents are less than thrilled about the “troubling” content that has somehow made it past the filters.

What types of videos are getting through the filtering process? The offending clips are cartoons featuring much-loved kid characters (such as Mickey Mouse or Spider-Man). But instead of age-appropriate animation, these videos include scenes such as Mickey lying in a pool of blood.

While the slip-through slip-up has gotten more than its fair share of media attention, this isn’t a rampant problem and isn’t the norm for the site. The offending videos apparently made their way onto the site by mistake or as a result of hackers fooling the site’s filtering algorithms. Of the millions of videos viewed on YouTube Kids in the past 30 days, less than 0.005 percent were removed for inappropriate content.

Regardless, YouTube Kids has taken steps to keep this from happening again. Earlier in 2017, YouTube announced that uploaders would no longer be able to monetize videos that make “inappropriate use of family-friendly characters,” and it has now begun adding age restrictions to videos that have been flagged. That may seem like it should’ve already been in place, but at least the new system will automatically keep creepy spins on kid-friendly content off of the YouTube Kids app.

How do you make sure that your child stays safe online? Tell us in the comments below.