In November of last year, I read an article in the New York Times about disturbing videos targeted at children that were being distributed via YouTube. Parents reported that their children were encountering knock-off editions of their favourite cartoon characters in situations of violence and death: Peppa Pig drinking bleach, or Mickey Mouse being run over by a car. A brief Google of some of the terms mentioned in the article brought up not only many more accounts of inappropriate content, in Facebook posts, newsgroup threads, and other newspapers, but also disturbing accounts of their effects. Previously happy and well-adjusted children became frightened of the dark, prone to fits of crying, or displayed violent behaviour and talked about self-harm – all classic symptoms of abuse.

But despite these reports, YouTube and its parent company, Google, had done little to address them. Moreover, there seemed to be little understanding of where these videos were coming from, how they were produced – or even why they existed in the first place.

I’m a writer and artist, with a focus on the broad cultural and societal effects of new technologies, and this is how most of my obsessions start: getting increasingly curious about something and digging deeper, with an eye for concealed infrastructures and hidden processes. It’s an approach that has previously led me to investigate Britain’s system of deportation flights or its sophisticated road surveillance network, and this time it took me into the weird, surreal, and often disturbing hinterland of YouTube’s children’s videos.

And these videos are worrying on several levels. For adults, it’s the sheer weirdness of many of the videos that seems almost more disturbing than their violence. As I spent more and more time with them, I became perturbed not just by their content, but by the way the system itself seemed to reproduce and exacerbate their most unsavoury excesses, preying on children’s worst fears and bundling them up into nightmare playlists, while blindly rewarding their creators for increasing their view counts even as the videos themselves descended into meaningless parodies and nonsensical stories.

This is the part that’s harder to explain – and harder for people to understand – if you don’t immerse yourself in the videos, which I’d hardly recommend. Beyond the simple knock-offs and the provocations exists an entire class of nonsensical, algorithm-generated content – millions and millions of videos that serve merely to attract views and produce income, cobbled together from nursery rhymes, toy reviews, and cultural misunderstandings.

A scene from Minnie Mouse Choked Pizza for Eating Too Much.

Some seem to be the product of random title generators, others – so many others – involve real humans, including young children, distributed across the globe, acting out endlessly the insane demands of YouTube’s recommendation algorithms, even if it makes no sense, even if you have to debase yourself utterly to do it.

When I wrote an essay about the videos online, the public reaction largely mirrored my own.