In the face of a tumultuous two years at Facebook, its head of global policy management, Monika Bickert, said the company is making headway in fixing issues around fake news and hate speech, but admitted it's a huge challenge to monitor two billion accounts.
In the two years since the 2016 US election, the social media giant has been embroiled in the Cambridge Analytica scandal and taken to court by advertisers for allegedly misleading them about its potential reach.
Locally, the Office of the Australian Information Commissioner announced it is informally investigating the platform over another privacy breach. In addition, the Australian government, along with six other nations including the UK, has invited founder Mark Zuckerberg to testify before an international grand committee.
Despite these issues, Bickert focused on the company’s increased transparency around formulating policy to take down hate speech and remove fake accounts.
Facebook said it took down more fake accounts in the second and third quarters than in any previous quarter, removing 800 million and 754 million respectively. That puts the rate of fake accounts at 3% to 4% of monthly active users.
“We’re in far better shape,” Bickert said.
“Most of the misinformation is being shared by fake accounts and the technological tool to remove them at the time of upload is something we didn’t have a few years ago.”
Facebook is also focused on improving users' experience by reducing the amount of hate speech and bullying across its platform. So far, it has been able to detect 52% of hate speech before it is reported by users.
Bickert maintains faith in leadership
Queried about her faith in Zuckerberg and chief operating officer Sheryl Sandberg, following reports by The New York Times that they ignored and concealed the full extent of Russia's attempts to influence the 2016 US election, Bickert was adamant in her support.
“Although it’s my team working on making these policies, senior leaders do get involved,” she said.
“So when we have a big content issue such as deciding on how we’re going to treat certain hate-speech, we will involve Mark, Sheryl and other senior leaders in these conversations and make sure they have a voice in what we’re doing.”
She added that the platform failed to stop Russian groups from interfering in the election because the activity was hard to detect, not because Facebook wasn't paying attention.
The former federal prosecutor said it took a deeper look by Facebook's information operations team to uncover the full extent of Russian activity.
“You look at it as a prosecutor – when you talk about finding bad actors it’s a game where you make an advance and they make an advance and this was new behaviour we hadn’t seen before," she said.
Facebook continues to offload responsibility
When asked by AdNews about its fact-checking efforts in Australia, Bickert quashed the idea that Facebook is responsible for the accuracy of its content, but highlighted the progress it is making to eliminate fake news.
Facebook has faced criticism for years for positioning itself as a platform rather than a publisher, allowing it to profit from content by other businesses without taking on any editorial responsibility.
“Sometimes there is a misconception Facebook is acting as a truth police now, saying this is true and this is false,” Bickert said.
“That’s not what we’re doing.”
Bickert went on to explain that Facebook has a three-pronged approach to misinformation: removing content, reducing its distribution, and informing users about the publishers that post content on its platform.