Posted on 11 December 2017 by Paul
The internet is buzzing about creepy YouTube videos at the moment. The first article to really catapult this issue into the mainstream was Something is Wrong on the Internet, which is lengthy but well worth reading as it details some of the immediate concerns around these videos.
A lot of the discourse since then has focussed on the mere existence of these videos: a concern that they are not just unsettling but inappropriate for children, and yet are being targeted at them, with the wool pulled over parents’ eyes by the way these videos are presented.
That is a problem, but we all need to take a collective step back and breathe. This kind of confusion isn’t new. Many of us have a friend who worked at a video store where an employee unwittingly put Japanese pornographic animation into the children’s section because “it’s a cartoon”. We also have shows aimed at adults that are easily mistaken for children’s programming, such as South Park, Archer, and Aqua Teen Hunger Force. Parents often buy their children videogames such as Grand Theft Auto IV without thinking about whether their kids are equipped to make sense of the satire of American society and the relentless brutality that it presents.
What is new is the scale at which this kind of confusion is occurring, and the fact that much of this problematic content is being created for and targeted at children (generally pre-schoolers), on the back of new digital marketing forces that no-one seems to fully understand.
What makes these videos visible in the first place, when most videos on YouTube end up languishing in obscurity? YouTube is driven by algorithms. Algorithms act in a way that is completely different to human curation, and as such they fail in new ways that go beyond simply misplacing hentai.
The creators of these videos know that.
Take a look at the jumble of keywords that make up the titles of these videos: “Mickey Mouse & Minnie Mouse Erase Test Scores's Minnie Mouse! Cartoons Sun & Moon Babies Funny Story”, “Bubble Guppies Gil & Molly Freezed by Frozen Princess Funny Story! Finger Family Song Nursery Rhymes”, “HD Mickey Mouse Babie DEAD In Gas Explosion Mickey Mouse & Dog Rescue! Finger Family Nursery Rhymes” (I've linked to a mirror of HD Mickey Mouse Babie because the original has been taken off YouTube). These titles barely make sense when parsed out into their individual parts - is the Mickey Mouse baby dead or not? How could this also be a finger family nursery rhyme? How can all of these things fit into one video? HD Mickey Mouse Babie is a 10-minute jumble of different plots, themes, and characters that’s basically incomprehensible, including a plot in which a Minnie Mouse knock-off fakes a pregnancy and tries to commit suicide. Disappointingly, there’s no finger family anywhere in sight.
These videos aren’t meaningful interpretations of beloved children’s characters (they certainly aren’t offering an intentional critique of the sanitised world of Disney) - the characters are simply there in the hopes that, once the video is tagged with “Minnie Mouse”, “Cartoons”, and “Sun & Moon”, it ends up being auto-played after a legitimate Disney or Pokémon Sun and Moon video.
The creators of these videos are also exploiting the fact that young children aren’t discerning media consumers. They certainly have their preferences (for example, my 3-year-old niece really likes Sofia the First and hates Octonauts) but they’re still learning to differentiate between different pieces of media. A good example of this is that young children can’t tell the difference between advertising and entertainment - though if you’re a parent, you’ll know this already. Peppa Pig is Peppa Pig regardless of whether she’s pink or green. That’s how these videos end up with so many views; it’s easy for young children to be ensnared in a cycle of bootleg media, driven by algorithms.
This presents a marked change in the way that programming is being targeted at children. A video store is curated by its owner or employees; they decide what is placed in the children’s section. Broadcast television is not only curated, it also has watersheds for Adults Only content. You’re unlikely to see Don Quixote beside Percy Jackson at Unity Books. Humans can contextualise and interpret media in a way that machines simply can’t, and parents have depended on this as a way to choose content for their children to watch - even if it’s as simple as understanding the television watershed. If South Park is on at 9pm (or was it 9:30?) on TV4, parents would know that, regardless of how it looks, it probably contains material inappropriate for children. They can then choose whether or not they trust their kids to be mature enough to watch it (and not to go on and spout anti-semitism at school).
That’s not the case with online content. It’s not that these human structures are inherently infallible; it’s that when they break down it’s generally a mistake and people are held accountable. When algorithms do the same, they are working as intended.
Some of you reading this probably aren’t particularly convinced that there are any harms to kids watching “wrong head” videos. I personally think that in a lot of cases, these clips are more likely to confuse or discomfort kids than to harm them. But some of these videos are likely to be harmful, regardless of how you want to spin it. Here’s a screenshot I took from a now-removed video (this is the nature of documenting online phenomena - it’s extremely volatile) in which Spider-Man knocks Elsa over, climbs on top of her and kisses her. Then he takes her dress while she’s in the shower – she knocks him down with her ice powers, but he manages to trip her up with webbing. She falls on top of him, then they kiss each other. Spider-Man also kisses Elsa while she’s sleeping! And then he knocks her out, carries her away, and throws her onto a web before kissing her again.
Before it was removed, it had over 17 million views. Some of those views would have come from adults and teenagers with a morbid interest in the video, but the majority would have come from children who stumbled onto it. These videos share tell-tale engagement patterns with genuine children’s videos: very few comments and ratings relative to their view counts, and a surprisingly even like-to-dislike ratio.
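That engagement pattern can be sketched as a rough heuristic. To be clear, this is purely illustrative: the thresholds and numbers below are invented for the sketch, not figures from YouTube or from any published analysis.

```python
def looks_like_child_audience(views, comments, likes, dislikes):
    """Illustrative heuristic for the pattern described above: huge view
    counts, almost no comments or ratings, and a like/dislike split close
    to even. All thresholds are made up for the sake of the sketch."""
    if views == 0:
        return False
    comment_rate = comments / views            # child viewers rarely comment
    ratings = likes + dislikes
    rating_rate = ratings / views              # ...or rate videos at all
    like_share = likes / ratings if ratings else 0.5
    return (comment_rate < 0.0005
            and rating_rate < 0.002
            and 0.35 < like_share < 0.65)      # surprisingly even split

# Hypothetical numbers in the shape of the video above: 17 million views,
# a few hundred comments, and a near-even like/dislike split.
print(looks_like_child_audience(17_000_000, 800, 9_000, 8_000))   # True

# A video with adult-typical engagement (lots of comments, lopsided likes)
# fails the check.
print(looks_like_child_audience(1_000_000, 5_000, 50_000, 1_000)) # False
```

The point of the sketch is only that the signal is mechanical: an audience that can’t type comments or find the like button leaves a distinctive statistical footprint, which is why these videos are identifiable even after they disappear.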
This is a particularly egregious example. But given young children’s tendency to watch the same (or similar) material over and over, most parents will not want their pre-schoolers binging on even the less problematic clips, given how mindless and incoherent they are. The trouble is, that is exactly what the producers of these videos want, and they are exploiting the way YouTube and other digital platforms work in order to achieve that.
I don’t think we can say that these videos are malicious in intent. This probably isn’t an attempt to groom kids (although that has been speculated), and for the most part doesn’t smell like internet “trolls” (just trust me on this one). It’s likely that these videos have been made because children are easy to keep engaged with simplistic, low budget productions that mimic more sophisticated children’s content, and are easy to exploit for money. Regardless of intent it’s clear that the people making these videos simply don’t care about making content for young children; they just want to make sure they have your kids’ eyeballs on their videos.
This is not like a video of an animated Peppa Pig getting high with Snoop Dogg (that is also available) made for adults to laugh at. These videos are for kids, intentionally injected into the stream via confusing tags, for them to watch instead of legit episodes of beloved shows.

Laura June at The Outline
Look, I understand that YouTube can be a fantastic resource. You can go there for topics ranging from how to carry a fox, a rabbit, and a lettuce across the river in a small dinghy to the history of the VHS. YouTube can also be a very convenient way to keep a kid occupied for a period of time - and there’s nothing wrong with that either.
But when it comes to kids – particularly pre-school kids – there’s a lot of content on YouTube that could potentially disturb, terrify or harm them. The risk of that happening has only increased with the growth of content such as Spider-Man and Elsa finger family nursery rhyme HD. YouTube might be taking these videos down now, but it’s likely that something new will take its place.
So what can you do if you’re concerned?