Features

Content Warning: The Problem With YouTube

Recent months have seen controversy build around YouTube and its plans to restrict its more sensitive uploads. Jo Ralphs looks into what the real problem with YouTube is.

In the aftermath of the Logan Paul controversy, YouTube has come under fire from users and creators alike for its poor handling of inappropriate content on the site. With over 300 hours of video uploaded every minute, it’s easy to see how policing the site has come to be a problem.

The massive saturation of people using YouTube as their primary source of income has perhaps contributed to the problem – shock tactics are becoming increasingly common in order to generate views, popularity and income.

There remains a separate problem too. Not only a platform for entertainment, YouTube has unwittingly played host to radical propaganda videos and so-called ‘fake news’.

What is being done about it?

September 2016 – The first round of crackdowns on YouTube monetisation rights: tighter enforcement of user guidelines led to the platform being accused outright of censorship. Demonetising videos deemed ‘sexually suggestive’ or ‘violent’, or featuring ‘inappropriate language’, ‘promotion of drugs’ or ‘controversial or sensitive subjects’, struck many users as altogether too broad. Channels like Philip DeFranco’s and Seeker Daily had videos demonetised which they did not feel had violated the terms. The subjective nature of the guidelines, they felt, suppressed their freedom of speech and dictated the kind of content they made, even if it also removed some of the more offensive videos on the platform.

“Enabling restricted mode was supposed to help parents protect young children against sexually explicit content, yet many videos did not feature explicit content”

March 2017 – YouTube introduced ‘Restricted Mode’, which was heavily criticised after numerous videos by LGBT+ YouTubers, or discussing LGBT+ issues, could not be viewed while the mode was enabled. While Restricted Mode was supposed to help parents protect young children against sexually explicit content, many of the affected videos did not feature explicit content at all. YouTube later admitted to the error as it emerged LGBT+ YouTubers were having their work wrongly censored – yet another instance of ineffectual policing of the platform taking on a political dimension.

January 2018 – Logan Paul is dropped from Google Preferred (a reward scheme for popular creators) and two of his YouTube Red projects are axed after he publishes a video containing disturbing footage. However, before Paul removed the video from his channel, YouTube itself had handpicked the footage for its trending page, where it remained for 48 hours – in which time almost 6 million people viewed the video. YouTube’s Chief Business Officer, Robert Kyncl, called the shocking video “borderline” in reference to its content, citing this as the reason punishment was delayed by roughly a week. This approach was clearly flawed for a number of reasons, but notably for YouTube’s preferential treatment of Paul over other controversial YouTubers, such as PewDiePie.

February 2018 – The most recent action YouTube has taken has been rolled out in the US – flagging news stories and channels on the site which are funded by state media. This was done amidst US government efforts to stop the circulation of Russian-funded news deemed propaganda for the Russian government. However, organisations like PBS have been caught up in the initiative. They object to the label ‘publicly funded broadcaster’ on the grounds that it implies the government is able to influence their content, which is not only incorrect but illegal under US statute. Far from clarifying sources and marking out propaganda, this move appears to have added to the confusion over news sites.

“The site has become notorious in recent months for demonetising videos which break guidelines by including copyrighted content and swearing”

Also this month, YouTube CEO Susan Wojcicki announced the company’s priorities for the coming year, apparently in direct response to the Logan Paul controversy. She wants not only to allow YouTubers to find new revenue streams, interact better with viewers and further their learning, but also to tighten and enforce the policies already in place, with continued transparency. The effect of her statements is as yet unclear – the site has become notorious in recent months for demonetising videos which break guidelines by including copyrighted content and swearing, yet videos such as Logan Paul’s still get onto the site. Paul himself acknowledged in the video that it would be demonetised – yet he still uploaded it. Clearly, Wojcicki’s approach is flawed.

Whose responsibility is it to prevent this kind of thing happening?

“Is it right to restrict the movement of ideas and intellect on the basis that something is damaging to the public?”

These are just a handful of the issues on the site, and many more controversies have found their roots on the platform, including the horrific prevalence of child abuse. Clearly, YouTube’s measures are ineffectual in preventing public outrage and personal injury (cf. the viral video in which Shane Dawson was accused of paedophilia with no evidence). There is a valid argument for government intervention in cases like Logan Paul’s, where public interest – and especially the protection of young viewers – is at stake. But which government is supposed to police this global corporation – and to what extent? YouTube is notably blocked by internet providers in China, Iran and North Korea for highly political reasons; so, is it right to restrict the movement of ideas and intellect on the basis that something is damaging to the public, or is allowing controversial content to remain online the lesser of two evils?

Jo Ralphs


Featured image courtesy of ‘Aurelien Blaha’ via Flickr. Image licence found here.
