People think of Artificial Intelligence as the ultimate solver of all our problems. That can be quite accurate in some ways, but the truth of the matter is that AIs can be biased too if you feed them inaccurate data, as YouTube’s case shows.
Machine learning, the process at the base of most of what we file under the umbrella term “AI”, works roughly like this: programmers feed the system a lot of data, which it then uses to perform whatever tasks it is meant to do, with the added bonus of continually learning and adapting. If you provide the AI with erroneous or confusing data, it won’t be as effective at its job as you may want it to be, and that’s perfectly normal.
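The garbage-in, garbage-out effect described above can be sketched with a toy example. This is not YouTube’s system in any way – just a hypothetical nearest-centroid classifier on made-up 1-D data, showing how the same learner trained on mislabeled examples stops making sensible predictions:

```python
# Hypothetical sketch: how noisy training labels degrade a simple learner.
# A nearest-centroid classifier on 1-D points; purely illustrative data.

def train(points, labels):
    """Compute the mean (centroid) of each class from labeled points."""
    sums, counts = {}, {}
    for x, y in zip(points, labels):
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda y: abs(centroids[y] - x))

# Two well-separated classes: "safe" clips near 0, "bad" clips near 10.
points = [0.0, 1.0, 2.0, 8.0, 9.0, 10.0]
clean  = ["safe", "safe", "safe", "bad", "bad", "bad"]
noisy  = ["bad", "safe", "bad", "safe", "bad", "safe"]  # scrambled labels

test = [(0.5, "safe"), (9.5, "bad")]

clean_model = train(points, clean)
noisy_model = train(points, noisy)

clean_acc = sum(predict(clean_model, x) == y for x, y in test) / len(test)
noisy_acc = sum(predict(noisy_model, x) == y for x, y in test) / len(test)
print(clean_acc, noisy_acc)  # prints: 1.0 0.0
```

Trained on clean labels the model gets every test point right; trained on the scrambled labels it gets every one wrong – the algorithm is identical, only the data changed.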
This is exactly YouTube’s problem. While Google generally excels at AI development and is one of the leading forces in the field, sometimes it simply fails. Failure is normal, a part of life, but in this case it’s a lot worse because the ones “suffering” from this failure are kids.
If you don’t have a child who regularly watches YouTube on your phone, you may not know just how much disturbing material is out there. People take popular cartoon characters like Peppa Pig, Masha and the Bear, or Mickey Mouse and the rest of the gang and put them in disturbing situations that end in blood and gore. Kids will tap on these videos if they show up among the suggestions, since they’ll see their favorite characters, and then witness whatever content these people created.
YouTube Kids may be a solution for some people, but not for all, since the app isn’t available worldwide. If you want to clean up YouTube for your child, you have to do some serious sleuthing and spend loads of time after every session removing disturbing channels from the viewing history. There’s no way to block content from particular channels at all, and the videos don’t carry any kind of age rating. The protections YouTube has in place for sensitive content only work to a certain extent, so they’re not an answer either.
No, the answer to the problem sits with YouTube’s own team, which counts thousands of reviewers – people who watch tons of content every day and decide whether videos match the company’s guidelines, before the results are fed into the system for the AI to use. And if you haven’t spotted the issue yet, here it is: YouTube’s guidelines are sketchy at best.
YouTube’s biggest problem – itself
According to screenshots obtained by BuzzFeed News, the people who train YouTube’s search algorithms have a hard time figuring out what goes where – which videos are high quality and which content is disturbing. In fact, as wrong as it sounds, if the video looks good and the production is of top quality, then it will also get the “high quality” mark.
One rater told BuzzFeed that one of the things that upsets them most is the fact that these videos are aimed at kids but aren’t really for kids. “Content creators make these cartoons with fake versions of characters kids like, such as Paw Patrol, and then as you watch, they start using foul language, making sex jokes, hurting each other, and more. And so many children watch YouTube unsupervised, that kind of thing can be really scarring,” the reviewer told the publication under condition of anonymity.
In short, YouTube’s reviewers check out content themselves and hand out quality assessments, which change the videos’ algorithmic reach. If a video is disturbing, they can flag it, but they still have to rate its quality, so it might still end up in your feed. A lower rating would, in a way, “block” a video, but the conflicting guidelines make that quite hard to do.
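The conflict between the two signals can be made concrete with a small, entirely hypothetical sketch. Nothing here reflects YouTube’s actual ranking formula; it simply illustrates how a soft penalty for a “disturbing” flag can lose out to a high production-quality score, so the flagged video still surfaces:

```python
# Hypothetical sketch of the conflicting signals described above.
# NOT YouTube's actual algorithm: the scoring rule, weights, and penalty
# value are all invented for illustration.

def rank_score(quality, flagged, flag_penalty=0.3):
    """Combine a 0..1 quality score with a soft penalty for flagged content.

    Because the flag is only a penalty, not a hard block, a slick
    production (high quality score) can outweigh it.
    """
    score = quality
    if flagged:
        score -= flag_penalty  # a deduction, not an exclusion
    return max(score, 0.0)

slick_but_disturbing = rank_score(quality=0.9, flagged=True)
honest_low_budget    = rank_score(quality=0.5, flagged=False)

# The flagged, well-produced video still outranks the harmless one.
print(slick_but_disturbing > honest_low_budget)  # prints: True
```

Under these invented numbers, the flagged video scores 0.6 against the unflagged video’s 0.5 – exactly the kind of outcome reviewers described, where rating quality and flagging content pull in opposite directions.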
These guidelines, including those regarding content directed at kids, are partly at fault for the success some channels have gained over the years, including some that glorified child abuse for more views. The channel ToyFreaks, for instance, did just that, accumulating tens of millions of views and followers. YouTube finally blocked the channel after public backlash. Other similar content creators met the same fate, some even coming under investigation by child protection services.
YouTube clearly needs to do better, especially when it comes to content directed at children. The rest of us are pretty used to all sorts of things and may no longer bat an eye at a disturbing video. Our children, however, are not. Hopefully, a revision of the evaluation guidelines will come soon.