Facebook slow to remove child pornography, terrorist videos

By Arvind Hickman | 14 April 2017

Facebook risks facing criminal proceedings for refusing to remove illegal content that had been flagged to moderators, a leading British QC has warned.

An investigation by The Times found that Facebook moderators had refused to remove dozens of posts with offensive content that breached English law, including ISIS beheadings, violent paedophilic cartoons, a video depicting a sexual assault on a child and propaganda glorifying recent terrorist attacks.

The Times says that when the content was initially flagged, moderators judged that it adhered to Facebook's community standards. However, when the newspaper later contacted Facebook directly, the posts were removed.

Justin Knowles QC told the newspaper that under English law Facebook could be liable for committing a criminal offence in the UK if it refused to remove illegal content after being made aware of it.

This latest investigation casts fresh doubt on Facebook's ability to deal with offensive content distributed on its platform, and on who should bear liability for it.

It comes at a time when advertisers are questioning the brand safety of digital user-generated content platforms.

In the past few months, several large companies, including Holden, Telstra and Vodafone in Australia, have paused advertising on YouTube amid brand safety concerns. Google is working on introducing tighter brand safety controls and expanding its monitoring team.

Although it is impossible for Facebook to manually review the billions of posts made on its platform each day, advertisers would expect content flagged as 'inappropriate' to be dealt with promptly.

Silicon Valley tech giants are not legally required to monitor content in the US and have resisted calls from other governments to better police the activities of their users.

Germany is the only country to have legislated penalties for social networks that fail to remove hate speech and fake news.

Many media and marketing industry leaders, including WPP boss Sir Martin Sorrell, believe social media companies should bear some responsibility for the content distributed on their platforms.

Have something to say on this? Share your views in the comments section below. Or if you have a news story or tip-off, drop us a line at adnews@yaffa.com.au

Sign up to the AdNews newsletter, like us on Facebook or follow us on Twitter for breaking stories and campaigns throughout the day.
