Surveillance-based manipulation: How Facebook or Google could tilt elections
From Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World.
By Bruce Schneier
February 26, 2015
Bruce Schneier is a cryptographer and security expert who has been blogging on those topics since 2004. He is the author of numerous books, including Carry On and Liars and Outliers. The following is an excerpt from his latest book, Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. Copyright © 2015 by Bruce Schneier. With permission of the publisher, W. W. Norton & Company, Inc. All rights reserved.
Someone who knows things about us has some measure of control over us, and someone who knows everything about us has a lot of control over us. Surveillance facilitates control.
Manipulation doesn’t have to involve overt advertising. It can be product placement that ensures you see pictures with a certain brand of car in the background, or simply an increase in how often you see those cars. This is, essentially, the business model of search engines. In their early days, there was talk about how an advertiser could pay for better placement in search results. After public outcry and subsequent guidance from the FTC, search engines visually differentiated between “natural” results generated by algorithm and paid results. So now you get paid search results in Google framed in yellow, and paid search results in Bing framed in pale blue. This worked for a while, but recently the trend has shifted back. Google is now accepting money to insert particular URLs into search results, and not just in the separate advertising areas. We don’t know how extensive this is, but the FTC is again taking an interest.
When you’re scrolling through your Facebook feed, you don’t see every post by every friend; what you see has been selected by an automatic algorithm that’s not made public. But someone can pay to increase the likelihood that their friends or fans will see their posts. Corporations paying for placement is a big part of how Facebook makes its money. Similarly, a lot of those links to additional articles at the bottom of news pages are paid placements.
The potential for manipulation here is enormous. Here’s one example. During the 2012 election, Facebook users had the opportunity to post an “I Voted” icon, much like the real stickers many of us get at polling places after voting. There is a documented bandwagon effect with respect to voting; you are more likely to vote if you believe your friends are voting, too. This manipulation had the effect of increasing voter turnout by 0.4% nationwide. So far, so good. But now imagine if Facebook had manipulated the visibility of the “I Voted” icon based on either party affiliation or some decent proxy of it: ZIP code of residence, blogs linked to, URLs liked, and so on. It didn’t, but if it had, it would have increased voter turnout on one side only. It would be hard to detect, and it wouldn’t even be illegal. Facebook could easily tilt a close election by selectively manipulating what posts its users see. Google might do something similar with its search results.
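To get a feel for the scale involved, here is a back-of-the-envelope sketch. The 0.4% turnout boost is the figure from the study described above; the electorate size and the idea of a single hypothetical swing state are invented purely for illustration, not taken from the book.

```python
def extra_votes(supporters_shown_icon, turnout_boost=0.004):
    """Additional votes gained if the 0.4% bandwagon nudge is shown
    to one side's likely supporters only."""
    return supporters_shown_icon * turnout_boost

# Hypothetical swing state: 3 million likely supporters of one party
# are shown the "I Voted" icon; the other side's supporters are not.
print(extra_votes(3_000_000))  # 12,000 extra votes for one side
```

Twelve thousand votes is larger than the margin of victory in several recent statewide races, which is the point of the thought experiment: a nudge far too small for any individual to notice can still be decisive in aggregate.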
A truly sinister social networking platform could manipulate public opinion even more effectively. By amplifying the voices of people it agrees with, and dampening those of people it disagrees with, it could profoundly distort public discourse. China does this with its 50 Cent Party: people hired by the government to post comments on social networking sites supporting, and to challenge comments opposing, party positions. Samsung has done much the same thing.
Many companies manipulate what you see based on your user profile: Google search, Yahoo News, even online newspapers like the New York Times. This is a big deal. The first listing in a Google search result gets a third of the clicks, and if you’re not on the first page, you might as well not exist. The result is that the Internet you see is increasingly tailored to what your profile indicates your interests are. This leads to a phenomenon that political activist Eli Pariser has called the “filter bubble”: an Internet optimized to your preferences, where you never have to encounter an opinion you don’t agree with. You might think that’s not too bad, but on a large scale it’s harmful. We don’t want to live in a society where everybody reads only things that reinforce their existing opinions, where we never have spontaneous encounters that enliven, confound, confront, and teach us.