Facebook learns its fake news flagging system had the opposite effect, switches tactics

In a corrective move that is both unexpected and mildly amusing, Facebook has changed the way it fights “fake news.” Its recent experience with the “Disputed Flag” proved not only ineffective but at times actually made people more likely to click on a story.

It took an academic study for Facebook to realize the common sense reality that if you tell people not to look at something, many will instantly want to look at it.

Facebook’s next attempt to tackle fake news will be to place related articles next to stories that would previously have been flagged as disputed, giving people more context. The company will also continue its original practice of pushing anything deemed fake news lower in users’ news feeds. Facebook claims this demotion reduces a story’s views by 80%, but it can often take more than a day for a story to be demoted.

Here’s the video they put out about it last week:

My Take

I am adamantly opposed to fake news, but I do not trust Facebook to be the gatekeeper, nor do I believe its “reliable fact checkers,” such as PolitiFact, are unbiased. They clearly are not. These left-leaning organizations use left-leaning fact checkers to keep whatever they deem fake out of reach of the general public.

Do we even need gatekeepers at all? To some extent, yes. False reports and incorrect assessments are rampant on the internet and affect both sides of the political aisle. For every alleged instance of right-wing manipulation of the facts, there’s a left-winger pushing in the opposite direction.

The problem is finding trusted gatekeepers. There really aren’t any, at least none with enough prominence to make a real difference. Until something better emerges, it’s important to remain vigilant in promoting real news. That’s one of the reasons I write for this site.