Facebook Shouldn’t Fact-Check

By Jessica Lessin

Nov. 29, 2016

We finally got a grudging mea culpa from Mark Zuckerberg: an admission that fake news is a significant problem that his social network must help solve.

But as a journalist who has been covering the inner workings of the technology industry for more than a decade, I find the calls for Facebook to accept broad responsibility for fact-checking the news, including by hiring editors and reporters, deeply unsettling.

What those demanding that Facebook accept “responsibility” for becoming the dominant news aggregator of our time seem to be overlooking is that there’s a big difference between the editorial power that individual news organizations wield and that which Facebook could. Such editorial power in Facebook’s hands would be unprecedented and dangerous.

We can all agree that Facebook should do much more to make sure that blatantly fabricated claims that Donald J. Trump won the popular vote or received the pope’s endorsement don’t spread and are, at a minimum, labeled fakes.

Facebook admits, and my sources confirm, that it can do a better job of this by helping users flag dubious articles and predicting fakes based on data it has for search. This doesn’t have to involve humans: Facebook could decide to label content as suspected of being fake if it was flagged a certain number of times and if it displayed other questionable attributes. Such a move would not amount to Facebook taking broad responsibility for what’s true.

But hiring editors to enforce accuracy — or even promising to enforce accuracy by partnering with third parties — would create the perception that Facebook is policing the “truth,” and that is worrisome for two reasons. The first has to do with the nature of Facebook’s business; the second, with the news business.

One thing is clear to anyone who has worked in a newsroom: Not all fact-checking decisions are black and white.

Did the pope endorse Mr. Trump? He did not.

[snip]