Opinion, 12 April 2018
Exeter

The Cursed Autoplay

There are so many videos on YouTube that one 2016 estimate put the time needed to watch them all at 28,359 years. On top of this, as of August 2017 over 400 hours of video were being uploaded every minute, which scales up to over 24,000 years of YouTube video for every year of real time. Deletions mean that somewhat less video than this actually stays on the site, but the figure nevertheless shows that YouTube operates on a scale far beyond human comprehension.
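The scaling behind that second figure is simple arithmetic, as this quick back-of-the-envelope sketch shows:

```python
# Back-of-the-envelope check of the upload-rate scaling above.
hours_uploaded_per_minute = 400
real_minutes_per_year = 60 * 24 * 365                      # 525,600 minutes
hours_uploaded_per_year = hours_uploaded_per_minute * real_minutes_per_year
years_of_footage = hours_uploaded_per_year / (24 * 365)    # convert hours to years

print(f"{years_of_footage:,.0f} years of video per real year")  # 24,000
```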

Rather than leaving you to navigate this abyss alone, YouTube’s algorithms create navigable paths through Suggested Videos and the optimised Autoplay stream. The next videos shown are chosen based on their relevance to the current video, on where other viewers went after it, and on what people with similar demographics and watch histories went on to spend time watching. The strongest pull of the suggestions is towards videos which people like you went on to watch at length. As with Amazon’s “Customers Who Bought This Item Also Bought” recommendations, the algorithm follows the audience. The primary goal of the suggestions is to maximise the time all users spend on YouTube. Within this, the suggestions also aim to maximise engagement with the videos, for example through likes or subscriptions, so as to further improve the suggestion algorithm and thereby increase watch time even more. As put succinctly, though somewhat ominously, by Todd Beaupre of YouTube’s Search and Discovery team: “The way we think about the systems that we build is that they’re really a gigantic feedback loop for the audience.”
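YouTube has never published its exact formula, so the following is only a minimal sketch of the logic described above, with invented weights: candidate videos are scored mainly by how long similar viewers watched them, and every observed watch feeds back into future rankings.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    video_id: str
    relevance: float          # similarity to the current video (0..1)
    co_watch_rate: float      # share of viewers who moved here next (0..1)
    watch_score: float = 0.5  # running estimate of watch time for similar users (0..1)
    impressions: int = 0

def score(c: Candidate) -> float:
    # Watch time dominates; relevance and co-watch behaviour are supporting
    # signals. The weights are invented purely for illustration.
    return 0.6 * c.watch_score + 0.25 * c.co_watch_rate + 0.15 * c.relevance

def suggest(candidates: list[Candidate], top_n: int = 10) -> list[Candidate]:
    return sorted(candidates, key=score, reverse=True)[:top_n]

def record_outcome(c: Candidate, watched_fraction: float) -> None:
    # The "gigantic feedback loop": each observed watch nudges the running
    # estimate, so videos that hold attention rise in future rankings.
    c.impressions += 1
    c.watch_score += (watched_fraction - c.watch_score) / c.impressions
```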

Underlying this is the ultimate goal of maximising profit through selling advertising space. More time spent by users watching and engaging with videos means more time spent viewing adverts, and more money for the platform. It seems like a decent arrangement for both sides: users are given a choice between a number of potentially interesting videos, while YouTube profits from the extensive reach and precise targeting of its adverts, saving each from having to sift through the barren thousand-year video deserts. In fact, however, user responses to the suggestion algorithm are rarely celebratory: indifferent at best, and at worst annoyed, concerned or even fearful.

One factor that has contributed to this reception is the zero-sum nature of the suggestion process and its influence on content. Within YouTube’s “gigantic feedback loop”, a video is either fit or not. In their words, “the ones that work the best tend to stick around and the ones that don’t, tend to not”. Given YouTube’s immense scale, this means that thousands of potentially interesting videos are passed over each time a suggestion is produced, simply because they are not the single best suggestion. Overall, this disadvantages in-depth content, which draws a smaller and narrower viewership, and produces repetition, as certain videos become so popular that they get suggested again and again. In addition, the ultimate aim of helping advertisers reach large, specific audiences requires YouTube to sort its users into limited categories, so that they can be offered to advertisers as a particular ‘type’. As a result, suggestions tend to reflect stereotypes more than actual personal characteristics, because diverse users are grouped together on the basis of a narrow range of shared interests.

To illustrate: I have found that watching Jordan Peterson being interviewed on C4 News caused YouTube to see me as a Jordan Peterson fanboy, and I can’t seem to shake the suggestions wanting me to watch him get ‘Heated’ and ‘DESTROY’ various people, despite the fact that I mostly dislike him and ignore videos featuring him. I suppose, as a young white British male (I assume YouTube knows this) who is interested in politics and has watched videos critical of ‘SJW’ culture, I’m a potential fit, but Jordan Peterson’s recent fame seems almost like a symptom of the problems with YouTube I’m describing. He is critical of political correctness without promoting a reactionary political stance, which is broadly appealing, since people across the political spectrum have problems with political correctness. At the same time, he appeals to those who defend political correctness as a figure to mock, owing to his extensive popularity within the ‘Manosphere’ and the Alt-Right associations this can entail. YouTube’s algorithm seems to have latched on to this dual appeal, propelling his digital and public prominence by repeatedly suggesting his videos to these loose pro- and anti-political-correctness types, and in the process creating a strangely composed audience of left and right, supporters and detractors, which interviewers have found difficult to characterise. This has led to a weird and insipid situation in which newsreaders frame him as a bigot, encouraging bigots to proclaim him as their hero, with neither side really listening to his statements, despite them usually being at least tenable and fairly typical of a conservative viewpoint, and with no one making any real progress in terms of political debate.

Beyond this, a more alarming aspect of YouTube suggestions is their well-documented tendency to promote more and more outlandish videos. Quite simply, extreme content is usually the most engaging content, so it gets suggested more often. This is a common problem in traditional media too, of course, but YouTube’s scale of production and audience, and the levels of extremity involved, are a worrying new development. In an official YouTube Q&A video, Beaupre states that they “want to be able to really recognize when viewers have maybe even a life-changing moment with a video, so that we can recommend more videos like that”. It’s quite a strange thing to aim for: when was the last time you went on YouTube with the intention of having a “life-changing moment”? What kinds of videos would even have the potential to change your life? It’s not hard to see that this drive to make life-changing suggestions opens up space for extreme, sensational and manipulative content which exploits people’s insecurities and destabilises their life-position, providing a grim fascination that still counts as “life-changing” engagement. Arguably the most significant example of this was its effect during the 2016 US election, where suggested videos, “wherever you started, whether it was from a Trump search or a Clinton search, [were] much more likely to push you in a pro-Trump direction”. These findings came from an experiment by former YouTube employee Guillaume Chaslot, reported in The Guardian. A major proportion of these suggestions contained outlandish false accusations, ranging from Hillary Clinton being a paedophile to her having Parkinson’s disease. Even the fake-video creators themselves were shocked by their success during the election, with one saying “every video I put out about the Clintons, YouTube would push it through the roof”. For his experiment, Chaslot developed a program to automatically follow each layer of suggestions from a starting “seed” video, recording every suggestion, which allowed him to analyse their tone and subject matter on a mass scale. He has applied it to all kinds of topics, including every major European government election since 2016, and has found evidence that “YouTube systematically amplifies videos that are divisive, sensational and conspiratorial”; the results can be seen on the project website algotransparency.org. Chaslot’s data shows that recommended videos relating to the 2016 US election were viewed over 3 billion times before the vote. With 150 million YouTube users in the US, and Trump’s win resting on around 80,000 votes across three swing states, these videos will have had a major influence, and it could even be argued that they were a key factor in the result.
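Chaslot’s actual code isn’t reproduced here, so the sketch below only shows the general shape of such an experiment; fetch_suggestions is a hypothetical stub standing in for however the suggestion lists were really collected.

```python
from collections import Counter

def fetch_suggestions(video_id: str) -> list[str]:
    """Hypothetical stub: return the video IDs YouTube suggests next to
    video_id. Chaslot scraped these; the exact method isn't described here."""
    raise NotImplementedError

def crawl(seed_id: str, depth: int = 3, branch: int = 5) -> Counter:
    """Follow suggestion chains `depth` layers deep from a seed video,
    counting how often each video is recommended along the way."""
    counts: Counter = Counter()
    frontier = [seed_id]
    for _ in range(depth):
        next_frontier = []
        for vid in frontier:
            for suggestion in fetch_suggestions(vid)[:branch]:
                counts[suggestion] += 1
                next_frontier.append(suggestion)
        frontier = next_frontier
    return counts

# The most frequently recommended videos from a given seed surface at the top:
# crawl("SEED_VIDEO_ID").most_common(20)
```

Repeating this from many different seeds and comparing the tone of the top-ranked suggestions is, in essence, what algotransparency.org visualises.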

Outside of political events, the fact that YouTube has surpassed 1 billion hours watched per day since February 2017 shows the potential for broader and deeper effects on society. Clearly the stakes are high, and there is one particularly disturbing example of the dangers involved: the proliferation of YouTube Kids content optimised for the kinds of things that captivate young children. Kids’ TV is frequently weird, often intentionally, but the massive quantity of eerie and disturbing clickbait that characterises YouTube Kids is truly nightmarish. A typical video in this style will include iconic children’s entertainment characters taken out of context (together, Spiderman and Frozen’s Elsa are the number one stars; just search ‘Spiderman and Elsa’ to see), garbled titles optimised for search engines, such as ‘#spiderman Takes Liberties Frozen Elsa, Covert Glance Her Dress# Elsa Is Stolen Her Snack’, cheap garish visuals, a fragmented, almost schizophrenic narrative structure, and distressing subject matter such as drownings, injuries and medical procedures. These videos may not always act out directly distressing situations, but the general atmosphere they create is one of fear and disorientation. With characters frequently attacking or humiliating each other unprovoked, and with no real structure to their relationships or the storyline, the only thing to take away from them seems to be mindless cruelty. It’s true that Tom and Jerry, a show I grew up with, was also based on mindless cruelty, but it had clever ideas, consistent characters and, most importantly, limits to the violence of its content. YouTube Kids clickbait, on the other hand, is a free-for-all, where anything that grabs children’s attention particularly well gets pounced upon and churned out in thousands of slightly different repetitions.

Writer James Bridle, in his essay ‘Something is wrong on the internet’, shows how the structure of YouTube recommendations is integral to creating things with such a sense of “offness”. Firstly, the desire for sidebar and Autoplay placement means that video producers will latch on to anything that strikes a chord with children, adding it into titles for search engine optimisation and into the videos themselves, mixing up themes, characters and situations that don’t belong together, like a dentist visit where Peppa Pig “is basically tortured, before turning into a series of Iron Man robots and performing the Learn Colours dance”. These interchangeable elements then get transposed into masses of videos with slightly different combinations to maximise potential views. Secondly, the production is usually automated to some extent: either human actors are “acting out the implications of a combination of algorithmically generated keywords”, or they are done away with completely “to create infinite reconfigurable versions of the same videos over and over again” using CGI programmes. In both cases there is a disturbing ambiguity as to “where the automation starts and ends”, with humans producing content directed by algorithms, and algorithms independently generating videos at a pace beyond human ability. Finally, as with adult clickbait, the drive for views and algorithmic warping push the content into areas of extremity and weirdness that presumably appeal to children for their novelty and emotional intensity. Needless to say, the end result of all this is very strange, and not something safe for children.
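As a toy illustration of the combinatorial production Bridle describes, the sketch below recombines a handful of stand-in elements into dozens of interchangeable titles. The element pools are invented, echoing examples from the paragraph above rather than any real production data.

```python
from itertools import product

# Invented stand-in pools of elements that score well with young children;
# real producers presumably mine these from search and view statistics.
characters = ["Spiderman", "Elsa", "Peppa Pig"]
scenarios = ["Dentist Visit", "Learn Colours", "Swimming Pool"]
hooks = ["Finger Family", "Bad Baby", "Real Life Superhero"]

def generate_titles():
    # Every combination becomes a separate, slightly different video,
    # maximising the chance that one lands in Suggested or Autoplay.
    for character, scenario, hook in product(characters, scenarios, hooks):
        yield f"#{character} {scenario} {hook} | Fun Videos For Kids"

for title in generate_titles():
    print(title)  # 27 interchangeable variants from just 9 elements
```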
The situation is so bad that a subreddit has been set up at www.reddit.com/r/ElsaGate to investigate whether this type of content is being produced systematically, by unknown actors, to traumatise millions of children worldwide. That theory misses the deeper causes of the problem in the structure of YouTube itself, but it shows how severe the situation has become.

Multiple news sources have reported on inappropriate YouTube Kids content, but they tend to portray it as incidental, the result of filtering failures or the actions of trolls, fixable with tighter regulation. But this is not something that can be fixed through stricter filters and uploader monitoring. Firstly, YouTube cannot ensure that no harmful content gets through, both because of the sheer quantity of uploads and because, as a platform that prioritises openness, it lacks the authority to make judgments on what is and isn’t suitable for children. The complexity and controversy involved are simply too great: how do you create a set of objective criteria for safe content, given that not everyone agrees on what is safe for children, and that not all harmful content is immediately obvious? For example, the fragmented storylines found in children’s clickbait might, over a long period, hinder children’s concentration and imagination, but it’s not as though you could create an automated framework to assess whether videos reach a minimum level of narrative coherence. Such judgements aren’t entirely rational; they are also emotional and intuitive. If something feels unsuitable for children, then it probably is, and a logical framework or artificial intelligence is never going to adequately account for this. Secondly, in a fundamental way, YouTube is complicit in the proliferation of these videos, implicitly encouraging their production by rewarding producers with money according to the views, and therefore the ad revenue, that they generate, favouring clickbait-style videos in just the same way as fake news and other harmful clickbait. James Bridle summarises this powerfully: “It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal.”

It seems, overall, that YouTube’s ambitions are overblown and inappropriate for the kind of company it is. YouTube is not there to provide “life-changing moments”; it should accept that, for the great majority of people, its primary purpose is as a source of simple entertainment and a tool for finding specific, useful information and media. My first memories of YouTube were good in this way: it was like a massive video warehouse or record shop, with a straightforward recommendation system linking videos by direct similarity, for example taking you to other releases from a record label. It felt more open, unpressured and rewarding of curiosity, whereas its intense personalisation now makes it seem bloated and intrusive. Relaxing the degree of personalisation would improve the recommendation system in this regard. It would also help because personalisation currently places too much weight on arbitrary characteristics like age, gender and geographical location, which makes it hard to reconcile the multifaceted and contradictory aspects of individual personality. This matters especially in politics, where people’s views are often ambivalent, and where people need exposure to a broad range of information and perspectives in order to develop a well-founded understanding of issues.

As part of Google, YouTube shares the mission to “organize the world’s information and make it universally accessible and useful”, while its self-description in ‘About YouTube’ is as a haven where humanity can “build a community through our stories” and flourish through free expression, opportunity and belonging. However, as long as the only aim of its video recommendation system, the backbone of its recent success, is to maximise watch time in service of ad revenue, these lofty ideals will ring hollow.

 

