Online comment systems are broken: why defamation is just the tip of the iceberg

By Quiip social media strategist Amber Robinson | 28 June 2019

As the dust settles on Justice Rothman’s ruling in the Dylan Voller defamation case, publishers face a major challenge. They have always been responsible for libellous comments left on their own websites, but this ruling sets a precedent that they can also be held responsible for comments left on third-party platforms like Facebook.

This is a problem. Publishers, and their followers, generate a relentless tide of comments on Facebook, which offers minimal moderation options to publishers who run their own pages on the platform. One broadcaster we work with receives over 138,000 comments per week on just one of its pages. The only way to stop comments appearing on posts published to a page is to add word-based filters that automatically hide matching comments until moderators can review and publish them.

These filters are designed to automatically hide profanity, not to be ‘hacked’ with exhaustive word lists so that every single comment is held. That’s because Facebook is built in the United States, where the right to free speech is constitutionally enshrined and libel actions rarely succeed.
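For readers curious about the mechanics, here is a minimal, purely illustrative sketch of how that kind of filter behaves. Facebook’s keyword moderation is configured in Page settings rather than via code, so the word list and function below are hypothetical, but they show why moderators resort to listing common words to hold nearly everything.

```python
# Purely illustrative sketch: Facebook's keyword moderation is configured
# in Page settings, not via code, and this word list is hypothetical.
# Any comment containing a listed word is held for review, so stuffing the
# list with common words holds almost every comment.

BLOCKED_WORDS = {"the", "a", "i", "is", "it", "and", "to", "you"}

def hold_for_review(comment: str) -> bool:
    """Return True if the comment should be hidden pending moderator review."""
    words = {w.strip(".,!?\"'").lower() for w in comment.split()}
    return bool(words & BLOCKED_WORDS)

for comment in ["The show was great!", "Totally agree.", "Wow"]:
    state = "held for review" if hold_for_review(comment) else "auto-published"
    print(f"{comment!r}: {state}")
```

Note the gap this sketch exposes: “Totally agree.” sails through because none of its words are listed, which is exactly why using the filter as a catch-all requires an absurdly long word list.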

It’s obvious that this is not a workable solution. Either our defamation laws need to change to take social media into account, as has happened in Britain, or Facebook needs to provide better moderation tools to publishers whose content drives an enormous amount of usage of its product. (Publishers also spend large amounts of money with the platform just to have their posts seen in the news feed.) Publishers can, and should, push for these changes.

While social media teams take stock of this new regulatory environment, I also invite them to pause and consider the broader landscape of the online communities they have created, both on third-party platforms and in the comment sections of their own websites.

Are Australian online communities safe and respectful? Do they inform and support their members? Do they align with publishers’ organisational goals? I’d argue that, too often, the answer is no.

If social media platforms can be compared to a public square, we have allowed preachers of hate to enter our squares. We’ve allowed bots disguised as humans to infiltrate our places of discussion and manipulate conversations. We’ve given equal space to people just dropping by and those who’ve contributed to the community for years.

If we reframe our online communities as intentional rather than accidental spaces, there is a huge opportunity to create value and, yes, drive website traffic. To do that, we need to look at the operating rules of our digital communities and make sure they are fit for purpose.

On that front, WPP AUNZ interim CEO John Steedman this week called for an end to anonymous online comments.

“If somebody has something relevant to say about any issue, they should be required to log in,” he said in an open letter to media outlets.

While anonymity certainly gives some online trolls the guts to say nasty things, anyone who has moderated Facebook comments knows that people are also happy to say absolutely vile things with their full name on display. (For proof, look at some of the replies Change.org director and LGBTQI activist Sally Rugg received after appearing on the ABC TV show Q&A this week.) In addition, anonymity provides valuable opportunities to discuss sensitive topics such as mental health and abuse without fear of real-world repercussions.

There is another way. Websites like Reddit and The Guardian have tried to improve the quality of their online communities by implementing tools that let readers ‘upvote’ some comments and downvote others. Reddit also rewards pro-social behaviour with its ‘karma’ point system, which elevates comments from regular members whose opinions are respected by the wider group.

The Guardian features comments it believes are particularly worthwhile as a ‘Guardian Pick’, which acts both as an incentive and as an exemplar for others.
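As a rough illustration of the idea, and nothing more (this is a conceptual sketch, not Reddit’s or The Guardian’s actual ranking logic; the fields and the karma weighting below are invented), a community could surface comments by combining fresh votes with a contributor’s earned standing:

```python
# Conceptual sketch only: not Reddit's or The Guardian's real ranking.
# The 0.1 karma weighting and the Comment fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    upvotes: int
    downvotes: int
    author_karma: int  # reputation earned from past contributions

    def score(self) -> float:
        # Net votes, nudged upward for trusted long-term contributors.
        return (self.upvotes - self.downvotes) + 0.1 * self.author_karma

comments = [
    Comment("Drive-by reply", upvotes=4, downvotes=0, author_karma=0),
    Comment("Long-time member's take", upvotes=3, downvotes=1, author_karma=500),
]

# Highest-scoring comments surface first.
for c in sorted(comments, key=Comment.score, reverse=True):
    print(f"{c.score():5.1f}  {c.text}")
```

The design choice worth noticing is that reputation compounds: a long-standing member with fewer votes on a given comment can still outrank a drive-by reply, which is the ‘equal space’ problem described earlier being deliberately un-solved.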

Creating intentional communities isn’t easy. Facebook needs to take publishers’ moderation concerns seriously and provide more tools and control for page owners. Publishers need to invest not just in moderation services but in community management that takes a holistic view of their entire digital footprint.

In a sea of online news served up via algorithm, a great community is what can make your online presence worth visiting – and staying for.
