It is frighteningly easy to dupe the public with statistics, since most journalists and readers aren’t trained to spot the subtle changes pollsters can make to research that produce gigantic differences in the outcome. Recently, a near-paranoid picture of the American public made its way around the media cycle, thanks to a privacy “study” commissioned by TrustE, a web security company. Since we know you want to be intelligent readers, and you can’t always depend on journalists to recognize a bad statistic, we thought we’d show you how to spot a misleading study using the report’s glaringly bad methodology.
1. Beware of really high or low percentages. A sizable chunk of the American electorate is shockingly ignorant. As of 2008, 30% still believe Saddam Hussein had weapons of mass destruction and 18% believe the sun revolves around the earth. Any truly representative survey will reflect uneducated respondents and therefore will have a buffer away from 100% and 0%, cushioned by these citizen-ostriches.
So, when TrustE reports that 94% are concerned about privacy, 79% delete cookies or browser cache and 64% use ad-blocking or anti-tracking systems, ready your bullshit meter. I seriously doubt 79% of users even know they use a browser, let alone understand the intricacies of its settings. As quasi-evidence, see this funny man-on-the-street interview clip of pedestrians scratching their noodles at the question, “What is a browser?”
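Sampling noise, by the way, can’t explain a number like 94%. As a back-of-the-envelope check (the sample size n = 1,000 here is our assumption; the report doesn’t give one), the standard margin-of-error formula shows a well-run poll of that size is only off by a point or two, so an implausibly high figure points to question design, not chance:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a simple random sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical n = 1,000 with TrustE's reported 94% "concerned" figure.
moe = margin_of_error(0.94, 1000)
print(f"94% +/- {moe * 100:.1f} points")  # roughly +/- 1.5 points
```

In other words, even a perfectly sampled poll of 1,000 people would land within about a point and a half of the true figure, so a suspiciously high result is far more likely a design problem than bad luck.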
2. Beware of non-peer-reviewed research or surveys not conducted by an established authority (like Pew): Research has found that industry-funded studies are biased towards the outcome that makes the sponsoring business the most money (e.g. smoking really is unhealthy, despite what tobacco-funded studies long claimed).
3. Word order and phrasing matter: In surveys, the exact same question can elicit substantially different results if questions are asked in a different order. In one study, dissatisfaction with President Bush jumped 10 points merely because the question was asked after respondents rated their “overall satisfaction” with his performance. Similarly, in Iyengar and Kinder’s famous experiment, the researchers found that they could manipulate which issues voters found most important by changing the topic of TV shows they watched right before a survey.
Phrasing matters too: In 1989, a poll found that far more respondents seemed to support interracial marriage when they were asked whether the government should “allow” the marriage (32%) vs. “forbid” such marriage (19%), even though, legally speaking, the two questions are exactly the same. This polling irregularity, known as the “forbid/allow asymmetry,” showed how seemingly innocuous changes in wording can cause a big difference in the outcome.
Similarly, the phrasing in TrustE’s study is all about privacy, and primes unsuspecting respondents to be scared about the potential ways privacy can be abused. So, of course, it appears as though respondents think privacy is the most important bloody issue on the planet. It’s the equivalent of asking, “Did you know that your neighborhood had 10 violent break-ins last year? So, how likely are you to buy a security alarm?” A more responsible poll intersperses neutral or irrelevant questions and randomizes question order.
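That fix is simple enough to sketch. Here’s a minimal illustration of per-respondent question shuffling with neutral fillers mixed in (all question text below is invented for illustration, not taken from any real survey):

```python
import random

# Hypothetical question pools, for illustration only.
PRIVACY_QUESTIONS = [
    "How concerned are you about online privacy?",
    "Do you delete cookies or clear your browser cache?",
]
NEUTRAL_FILLERS = [
    "How often do you read news online?",
    "Which device do you browse on most?",
]

def build_questionnaire(rng=random):
    """Mix neutral fillers into the topic questions, then shuffle,
    so each respondent sees a different order and priming is diluted."""
    questions = PRIVACY_QUESTIONS + NEUTRAL_FILLERS
    rng.shuffle(questions)
    return questions
```

The shuffle keeps any one respondent from being walked down a scripted path of scary privacy questions, and averaging over many random orders washes out order effects in the aggregate.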
4. Beware trend research based on respondent memory. TrustE claims that concern for privacy is on the rise, based on the number of respondents who said they were more concerned now about privacy than a year ago. Bullshit. Respondents are known to have crazy-bad memory recall: they forget heart attacks, cancer, the date of a relative’s death, and major life episodes. We have a hard enough time remembering what we had for breakfast yesterday, let alone an obscure personal preference 365 days ago.
TrustE (and their polling firm, Harris Interactive) aren’t alone in conducting bad research: it happens in politics, social media, education and countless other fields. So, shield your minds with knowledge and a healthy skepticism; the wielders of bad research want to own your opinions.