Are Google’s Personalized Results Making Us Politically Partisan?


There is an unintended dark side to a search engine that provides only the information we want to see: it cocoons us in an echo chamber of political information that confirms our pre-existing opinions. Google competitor DuckDuckGo released a small study showing how Google’s personalized search yielded wildly different results on abortion and gun control, even for users who weren’t logged in to a Google service. “You search for raw information, but you’re getting more of what you already agree with,” says a clever advertisement from the company.

Since 2009, Google has pushed personalized search for every user, steadily adding more data to tailor results, including location and search history. Google argues that personalization is essential: the only way to know whether a search for “Taj Mahal” refers to the building in India or the jazz musician is to make an informed guess about the user’s intentions.

Personalization has unintended consequences for political information, where it’s commonly assumed that citizens need a balanced perspective, whether they like it or not. DuckDuckGo, however, found in its small study of 100 volunteers that personalized Google results for “abortion” yielded Obama’s stance on abortion for some users and information on pro-life activist Gianna Jessen for others. Political partisanship may thus be another unintended consequence of collecting private information.

The echo-chamber argument is not new. Legal scholar and former White House official Cass Sunstein calls this “cyberbalkanization” (or the “splinternet,” if brevity is your thing). Research, however, finds that we’re not as aggressively partisan as some worry. Scholars who conduct large-scale tracking of users’ news-reading habits find little evidence of partisan information seeking. “There is no evidence that individuals abandon news stories that contain information with which they disagree,” writes Ohio State University professor Kelly Garrett.

In Google’s case, personalized search produces unintentional partisanship, which could result in a spiral of ever more self-confirming information. DuckDuckGo would have to make the case that this effect holds up under experimental conditions and influences civic knowledge and voting behavior. As we see above, it’s easy to make a theoretical argument about the impacts of the Internet, but it’s quite another thing to prove it’s actually true.

Still, it’s worth studying, because the implications of personalized search could be very troubling for our democracy.

[Via Talking Points Memo]