During a live Web chat in late January, National Security Agency whistle-blower Edward Snowden explained one of the least discussed dangers of bulk collection. By indiscriminately sweeping up the call records and international communications of Americans, the government gains the ability to conduct retroactive investigations: mining a target’s historical data for any evidence of suspicious, illegal or simply embarrassing activity. It is a disturbing capability that should make even those fully convinced of their own propriety think twice before asking aloud, “What do I have to fear if I have nothing to hide?”
But there’s another danger, one Snowden didn’t mention, inherent in the government’s easy access to the voluminous data we produce every day: It can imply guilt where there is none. When investigators have mountains of data on a particular target, it’s easy to see only the data points that confirm their theories — especially in counterterrorism investigations, where the stakes are so high — while ignoring or downplaying the rest. There need not be any particular malice on the part of investigators or analysts, though prejudice no doubt comes into play; circumstantial evidence and an undue faith in their own intuition are enough. Social scientists call this phenomenon confirmation bias, and people confronted with data overload find it all the easier to weave that data into a narrative that substantiates what they already believe. Criminologist D. Kim Rossmo, a retired detective inspector of the Vancouver Police Department, was so concerned about confirmation bias and the investigative failures it causes that he warned police officers
in Police Chief magazine to always be on guard against it. “The components of confirmation bias,” he wrote, “include failure to seek evidence that would disprove the theory, not utilizing such evidence if found, refusing to consider alternative hypotheses and not evaluating evidence diagnosticity.”
To get a better sense of the dangers, consider the case of Brandon Mayfield.
Mistaken identity
On March 11, 2004, Al-Qaeda-inspired terrorists coordinated a massive bombing of the Madrid commuter train system during the morning rush hour, killing 193 people and wounding approximately 1,800. Two latent fingerprints that the Spanish National Police (SNP) recovered from a bag of detonators during the investigation were shared with the FBI through Interpol. When the prints were run through the bureau’s database, the search returned 20 possible matches for one of the fingerprints; one of those matches belonged to Brandon Mayfield. A former U.S. Army platoon leader, Mayfield was by then an attorney specializing in child custody, divorce and immigration law in Portland, Ore. His prints were in the FBI system because of his military service and an arrest two decades earlier that stemmed from a misunderstanding; the charges were later dropped.
Despite finding that Mayfield’s print was not an identical match to the print left on the bag of detonators, FBI fingerprint examiners rationalized away the differences,
according to a report by the Department of Justice’s Office of the Inspector General (OIG). Under the one-discrepancy rule, the FBI lab should have concluded Mayfield did not leave the print found in Madrid — a conclusion the SNP reached and repeatedly communicated to the FBI. Instead, the FBI’s Portland field office opened an investigation, placed Mayfield under covert surveillance and arrested him as a material witness, holding him for roughly two weeks until the SNP matched the print to Algerian national Ouhnane Daoud. Only then did Mayfield’s traumatic journey into the belly of the national security state end.
Cautionary tale
Mayfield’s ordeal is a cautionary tale of what can happen when the government locks onto a suspect and refuses to loosen its grip. Mayfield was fortunate: the government eventually let him go, but only after turning his life upside down.
Nearly a decade later, the government’s secret surveillance capabilities have become only more powerful, thanks to social media, smartphones and other technologies. The bulk collection of Americans’ personal data makes it more likely that false positives — innocent Mayfields coming under government scrutiny — will occur. And when that false positive is an American Muslim or an anarchist or an aggressive environmental activist, will government agents and analysts have the ability to set aside their prejudices and excitement and weigh all information, particularly contradictory evidence, before condemning those unfortunate few to bogus charges and public suspicion?
Confirmation bias should make us skeptical of this possibility.