YouTube must do more to protect children; blocking ads and comments is not enough

By Arvind Hickman | 27 November 2017

User-generated platforms like YouTube, Facebook and Twitter are impossible to police and will always pose a risk to advertisers, but there is a more important issue at play that often goes unreported.

Where do we draw the line at what is appropriate and acceptable to publish to the masses?

YouTube has moved quickly to block advertising and comments on videos of scantily-dressed young girls rolling around on beds or behaving in sexually suggestive ways.

A lot of this, no doubt, was posted innocently. But my view is that it should never be allowed to be published on a publicly-accessible media platform in the first place.

These are minors, children far too young to realise the consequence of their actions. Have their parents given them permission to publish this content and has YouTube checked?

YouTube should remove inappropriate videos

My question is why hasn't YouTube removed all videos of minors that most people in society would deem highly inappropriate to publish? Stopping ad dollars from funding them or paedophiles from commenting under them isn't solving the deeper problem at play.

And YouTube isn't alone. Instagram has tonnes of profiles and posts that would easily be regarded as soft porn. Facebook was unable to prevent a murder from being live broadcast on its platform.

The abuse and bullying directed at minors on Twitter can be so extreme and harmful that it leads to depression in young people, and worse.

There is a huge tension between user-generated platforms providing freedom of expression, democratising publishing to the masses, and running platforms that are not only brand safe but also socially acceptable.

Worse still, these platforms are targeting young people, many of whom are legally minors.

I've now investigated two brand safety scandals on YouTube this year and both left me with a chilling feeling for different reasons.

Earlier this year I saw a video of an Indonesian man being burnt alive with a Marketo ad running on top of it. How on earth is this acceptable?

[Image: This shocking video of an Indonesian man being burned alive has been viewed 1.2 million times and carried a Marketo ad next to it.]

Today I've seen many videos of scantily-dressed young girls posing in bikinis or pyjamas, probably copying their idols on MTV. One video showed two minors twerking on their bed, with several disturbing lewd comments posted underneath (see below).

The number of videos of this nature is appalling and shocking, but YouTube, for some reason, refuses to remove them.

If anyone were caught by the police with such videos on their laptop, they would rightfully be arrested and charged with possession. Yet YouTube thinks it is acceptable to publish them on a public server that can be accessed by sexual predators and other criminals across the world?

If my 10-year-old son ever posted anything of this nature on social media, I would not only ban him and remove all of his accounts, but I would also have strong words with any platform that allowed it, and possibly with the police.

There are extremely strict laws that prohibit what magazines and websites like AdNews are allowed to publish with very serious ramifications if we stray. Content featuring minors is tightly controlled for obvious reasons.

I agree with GroupM's chief digital officer John Miskelly on this issue. It is no longer acceptable for tech companies like Google and Facebook to shirk their responsibility as publishers by publicly stating they are technology companies and not media companies.

If you provide the means for anyone to publish content, journalist or child, you must play by the same rules and laws as a media company. There are no 'ifs' or 'buts'.

It is your responsibility to ensure that the content does not exploit minors and other vulnerable members of society.

I would like to see far stricter rules and restrictions imposed on what is acceptable content because clearly the status quo isn't working.

This is a far bigger societal issue than a few advertisers threatening to boycott advertising on YouTube. A serious rethink of publishing standards on these platforms is urgently needed; children should never be exploited in this way.
