One result of the 2016 Presidential election was that Congress realized how much news the average American reads on Facebook, the largest social network, and how much influence Mark Zuckerberg's company therefore wields. A second realization was how much fake news is being spread via social media as a whole.
To limit this, Facebook hired fact checkers to review news stories on its site and label false ones as such, hoping to regain its users' trust.
However, those fact checkers (working for independent news organizations partnering with Facebook) fear their reputations are at stake: the arrangement creates a conflict of interest, as their involvement lends credibility while the social media giant still gives fake news a pass.
According to multiple journalists who have spoken to the press anonymously, fact checking on Facebook remains a tool that largely fails to meet its goal.
For example, after the two latest mass shootings in the US, in Nevada and Texas, an article claiming the shooter was linked to anti-fascist groups immediately started trending on Facebook. Multiple fact checkers debunked the piece, and Facebook labeled it as false shortly afterward. Despite that warning, users still shared it more than 260,000 times.
Alexios Mantzarlis, director of the International Fact-Checking Network at Poynter, which is supposed to verify Facebook's third-party fact checkers, admitted: “We’re sort of in the dark. We don’t know what is actually happening. There are a lot of people at Facebook who really care about this but the level of information that is being handed out is entirely insufficient. This is potentially the largest real-life experiment in countering misinformation in history. We could have been having an enormous amount of information and data.”
Meanwhile, Facebook maintains that the project is still developing and will improve over time. According to the company, impressions of an article drop by 80% once it has been labeled false.
A Facebook spokesperson said in an email: “Our work with third-party fact-checkers is not just meant to educate people about what has been disputed. It also helps us better understand what might be false and show it lower in News Feed.” The spokesperson added that the data informs Facebook's algorithms to “more quickly and accurately detect future false stories.”