Media in the Age of Algorithms

By Tim O’Reilly
Nov 11, 2016
https://medium.com/@timoreilly/media-in-the-age-of-algorithms-63e80b9b0a73

Since Tuesday’s election, there’s been a lot of finger pointing, and many of those fingers are pointing at Facebook, arguing that its newsfeed algorithms played a major role in spreading misinformation and magnifying polarization. Some of the articles are thoughtful in their criticism, others thoughtful in their defense of Facebook, while others are full of the very misinformation and polarization that they hope will get them to the top of everyone’s newsfeed. But all of them seem to me to make a fundamental error in how they are thinking about media in the age of algorithms.

Consider Jessica Lessin’s argument in The Information:

“I am deeply, deeply worried about the calls I am hearing, from journalists and friends, for Facebook to intervene and accept responsibility for ensuring citizens are well-informed and getting a balanced perspective….
Facebook promoting trustworthiness sounds great. Who isn’t in favor of accepting responsibility and ferreting out misinformation? But major moves on Facebook’s part to mediate good information from bad information would put the company in the impossible position of having to determine “truth,” which seems far more objective than it really is. Moreover, it would be bad for society.”
My response: Facebook crossed this river long ago. Once they got into the business of curating the newsfeed rather than simply treating it as a timeline, they put themselves in the position of mediating what people are going to see. They became a gatekeeper and a guide. This is not an impossible position. It’s their job. So they’d better make a priority of being good at it.

But those who argue strongly for Facebook’s responsibility to weed out the good from the bad also get it wrong. For example, on Vox, Timothy B. Lee wrote:

A big issue here is about the way Facebook has staffed its editorial efforts. In a traditional news organization, experienced editorial staff occupy senior roles. In contrast, Facebook has relegated the few editorial decisions it has made to junior staffers. For example, until earlier this year Facebook had a team of 15 to 18 independent contractors who were in charge of writing headlines for Facebook’s “trending news” box.
When Facebook faced accusations that these staffers were suppressing conservative stories, Facebook panicked and laid all of them off, running the trending stories box as an automated feature instead. But that hasn’t worked so well either, as fake news keeps popping up in the trending news box.
The problem here wasn’t that Facebook was employing human editors to evaluate stories and write headlines. The problem was that Facebook’s leadership didn’t treat this as an important part of Facebook’s operations.
If Facebook had an experienced, senior editorial team in place, there’s a lot it could do to steer users toward high-quality, deeply reported news stories and away from superficial, sensationalistic, or outright inaccurate ones.
Lee is right to say that curating the news feed isn’t a job for junior staffers and independent contractors. But he’s wrong that it’s a job for “an experienced, senior editorial team.” It’s a job for the brightest minds on Facebook’s algorithm team!

And Lee is wrong to say that the problem wasn’t that Facebook was employing human editors to evaluate stories and write headlines. That was precisely the problem.

Like drivers following a GPS over a bridge that no longer exists, both Jessica Lessin and Timothy Lee are operating from an out-of-date map of the world. In that old map, algorithms are overseen by humans who intervene in specific cases to compensate for their mistakes. As Jessica rightly notes, this is a very slippery slope.

Jessica says:

…We shouldn’t let Facebook off the hook for every problem it creates or exacerbates. But we can’t hold it responsible for each of them either. We’re witnessing the effects of a world where the internet has driven the cost of saying whatever you want to whomever you want to zero, as Sam often says. This is an irreversible trend no company can stop, nor should we want them to.
But there is a good existence proof for another approach, one that Facebook has worked long and hard to emulate.

Google has long demonstrated that you can help guide people to better results without preventing anyone’s free speech. Like Facebook, they are faced every day with determining which of a thousand competing voices deserve to be at the top of the list. Google’s founding insight, that a link is a vote, and that links from reputable sources that had been around a long time were worth more than others, was their initial tool for separating the wheat from the chaff. But over the years, they developed hundreds if not thousands of signals that help to determine which links are the most valuable.
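That founding insight is the idea behind PageRank: a page’s rank is the sum of the votes it receives, with votes from highly-ranked pages counting for more. A minimal sketch of the iterative version follows; it is illustrative only, the `damping` constant and the toy `web` graph are conventional textbook values, not anything from Google, whose production ranking layers hundreds of additional signals on top.

```python
# Minimal PageRank sketch: "a link is a vote," and votes from
# well-linked pages count for more. Illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outgoing link casts an equal share of this
                # page's current rank as a "vote."
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A page linked to by many others ("c") outranks the pages linking to it.
web = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # prints "c"
```

The damping factor models a reader who mostly follows links but occasionally jumps to a random page, which keeps rank from pooling in dead ends; it is one example of how even the original signal needed human judgment baked into the algorithm rather than applied case by case.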

[snip]
