Facebook is developing measures to fight “fake” stories in people’s news feeds. Users will soon be able to flag a story they believe is a hoax, which then gets sent to 3rd party “fact” checking organizations for review. If those groups decide there is “concern” about the information’s validity, a warning message appears underneath the post whenever you view it:
What this will do is make “false” stories less and less visible over time.
Imagine there is a report about vaccines being harmful. If enough people don’t like it, they flag it; the 3rd party organization blacklists it, and everyone else who tries to share it is shown a message about how “fake” it might be.
Then more people agree it’s fake, and more people flag it. Each time it gets flagged, it drops lower in other people’s feeds, becoming less and less visible.
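The feedback loop described above can be sketched in code. To be clear, this is a hypothetical illustration of the flag → review → warning → down-rank cycle, not Facebook’s actual algorithm; the class names, the review threshold, and the halving formula are all my own assumptions.

```python
# Hypothetical sketch of flag-based down-ranking -- NOT Facebook's
# real algorithm. Each flag cuts a post's visibility score, and a
# post marked "disputed" after review carries a warning when shared.

class Post:
    def __init__(self, title):
        self.title = title
        self.flags = 0          # times users reported it as a hoax
        self.disputed = False   # set once 3rd party review kicks in

    def flag(self):
        """A user reports the post; enough flags trigger review."""
        self.flags += 1
        if self.flags >= 3:     # arbitrary review threshold (assumed)
            self.disputed = True

    def visibility(self):
        """Assumed decay: each flag halves the visibility score."""
        return 1.0 / (2 ** self.flags)

    def share_banner(self):
        """Warning shown to anyone who tries to share a disputed post."""
        return "Disputed by 3rd Party Fact-Checkers" if self.disputed else ""

post = Post("Report about vaccines")
for _ in range(3):
    post.flag()

print(post.visibility())     # 0.125 -- buried lower in feeds
print(post.share_banner())   # warning attached to every share
```

The point of the sketch is the compounding effect: flags lower visibility, lower visibility plus a warning label discourages engagement, and disengagement ranks the post lower still.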
So if there is something I want to know about, something popular that I want to judge for myself, well, I can’t. Fakebook has decided to drive down its popularity and can essentially turn anything into “fakenews” just by warning people it’s fake. All it takes is a warning. People will take that as authentic, authorized proof of fakery.
Do many people go to a site after Google warns that it might be dangerous? Most don’t. This will have the same effect. People will believe the “fakenews” flag before they bother to judge for themselves.
Also, these “fakenews” flags will prevent flagged posts from being promoted through advertising. Sharing is still possible, but with a warning that most people will take seriously, since they buy into the whole “fakenews” hype. Facebook already limits what gets more attention, now promoting family and friends’ posts to the top of people’s news feeds. Posts people read but don’t share are also ranked lower in a feed.
The NY Times even admits why the whole hype about “fakenews” and “post-truth” is even a thing right now, and that’s because of Donald Trump winning:
“The company has been under that spotlight since Nov. 8, when Donald J. Trump was elected the 45th president.”
If Trump hadn’t won, apparently Facebook wouldn’t be under any spotlight… Zuckerberg at first called it a “pretty crazy idea,” but Facebook has since developed internal divisions pushing Zuckerberg to address the issue.
The fakestream media is commenting about how dire these important changes to Facebook are:
“But the fake cat is already out of the imaginary bag,” Ms. Bell added. “If they didn’t try and do something about it, next time around it could have far worse consequences.”
OMG! Next time… it could be far worse! LOL.
And who are the 3rd party “fact checkers”? Snopes, PolitiFact, The Associated Press, FactCheck.org and ABC News.
I feel safer already, don’t you?
Concerns for Facebook are now about how this flagging will affect advertising revenue, since flagging popular content will reduce engagement with that content. Facebook executives say they recognize that they have to override their focus on engagement. Zuckerberg has said as much:
“I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through… We have a responsibility to make sure Facebook has the greatest positive impact on the world.”
Facebook itself, along with the mainstream media, is not the only one laying responsibility at Facebook’s feet. Others agree, but also recognize the problem with this system: “They will get so many things false-flagged as fake news by people with an axe to grind, so it’s going to make it more challenging to moderate… But it’s a step in the right direction,” said Daniel Sieradski, an activist and journalist who developed a plugin called BS Detector to flag questionable news sources. There are issues with this system, but people still want it. What about helping people discern knowledge for themselves? Nah… we wouldn’t want people to think better…
The mainstream “fakenews”-induced worry is not going away any time soon. They are desperately trying to quash any independent information and thinking outside of the establishment. After all, Facebook is a bastion of truth, right?
“It’s important to us that the stories you see on Facebook are authentic and meaningful.” – Facebook Press Release
All of the “truther” and alternative information being put out on Facebook is going to take a serious hit when this rolls out. Facebook, through its mainstream, establishment-accepting userbase, will decide what is “authentic” and “meaningful” information for you to see more or less of in your feed.
On Steemit, ideas about flagging councils have been around for a while. Facebook is going ahead with a similar model through 3rd party outsourcing of flagged-content review, but focused specifically on “fakenews”.
Some users on Steemit have raised concerns about flagging abuses. A council of users would be one method to address such abuses and possibly correct undeserved damage caused by people maliciously abusing the flagging feature.
Is a similar model worth implementing here? Are people against an organization for flagging review?
– Facebook now flags and down-ranks fake news with help from outside fact checkers
– Facebook to begin flagging fake news in response to mounting criticism
– Facebook Mounts Effort to Limit Tide of Fake News
Upvote, Share, and Resteem below.