Features

Are Online Companies Doing Enough To Protect Children?

Sarah Harris

In a monumental move for the porn industry, Pornhub this week removed all unverified videos from its website. The move came after a New York Times article revealed that the site was riddled with non-consensual and child abuse content.

In response to the article, a spokesperson for the site claimed that the allegations were “irresponsible and flagrantly untrue.” However, this is not the first time that such statements have been made against the website.

Several activist groups have been fighting for change to the site for years. Kate Isaacs started the #NotYourPorn campaign after a friend had her iCloud account hacked and personal videos leaked. And this is just one of hundreds of similar stories.

This year, the UK’s Revenge Porn Helpline has experienced its busiest year on record, with 242 cases in April 2020 alone, compared with 122 in April 2019. There is a clear need for stricter laws regarding the online content we share, not just in the porn industry but in general.

When technology first began to advance, parents frantically worried about how they could keep their kids safe from harmful content, from graphic video games to profanity on TV. Technology has since developed so rapidly that we’re constantly having to find new ways to protect the children of today.

So, should other platforms such as YouTube and TikTok be following in the footsteps of Pornhub?

Although video-sharing platforms such as YouTube and TikTok weren’t designed for sharing adult content, there are no specific laws that stop people from doing so.

A 2018 Ofcom report revealed that YouTube is increasingly becoming the streaming platform of choice, with an estimated 77% of 8-11-year-olds using the site.

Since the creation of YouTube Kids in 2015, parents and children have turned to the app, which is supposedly home only to child-friendly content. But that doesn’t mean there isn’t a dark side to the site.

The head of YouTube’s global family and learning content told the New York Post that inappropriate videos were “the extreme needle in the haystack,” but what was being done to prevent this?

In June 2017, a YouTube video titled “Minnie Mouse Mommy has Pregnancy Problem and Doctor Treats Episodes!” racked up 3 million views within the space of a day.

The episode featured a woman dressed as Minnie Mouse falling down some stairs, only to find herself trapped and spurting blood. Her children, who bear witness to this, cry out for help, whilst traumatised viewers most likely do the same. The video was later removed amid concerns that the channel, which was aimed at children, was too violent.

Similarly, TikTok has been subject to heavy criticism since its rise to fame in 2019. From dangerous pranks to child predators, TikTok has introduced dozens of dangers that would have been unthinkable a decade ago. Although it is primarily full of creative videos, how are we preventing our toddlers from coming across a recreation of the WAP dance?

There is a version of TikTok that, like YouTube Kids, has been specifically designed to filter out content deemed inappropriate for children; it allows users to create videos but not post them. Yet this too has its dark corners, which begs the question: are online companies doing enough, or do they need to do more?

Pornhub was pushed into its decision after being publicly exposed, so maybe we need to do the same with other online companies. Although naming and shaming these companies may not be the most efficient way to spark change, what other options are we left with?

Most social media sites, such as Facebook and Instagram, require users to be at least 13 years old to create an account, yet an estimated 18% of 8-11-year-olds still have a social media account.

There’s only so much control parents and guardians have over the content their children consume and what they get up to behind closed doors, so perhaps online companies need to take on more responsibility.

Hopefully, this move by Pornhub will prompt change from other online companies, but how long will it be before we see it, and will the damage be irreversible by that point?

We’re already beginning to see the effects of this, with mental health and body image issues in children more common than ever. It’s time that these companies accepted that making their platforms overly accessible can, in fact, do more harm than good to certain groups.

Sarah Harris

Featured image courtesy of Ludovic Tionel on Unsplash. Image license found here. No changes were made to this image.

For more content including uni news, reviews, entertainment, lifestyle, features and so much more, follow us on Twitter and Instagram, and like our Facebook page for more articles and information on how to get involved. 

If you just can’t get enough of Features, like our Facebook as a reader or a contributor.
