Commentary By Heather Mac Donald

Common Sense and Computer Analysis


Irrational paranoia about computer technology threatens to shut down an entire front in the war on terror.

A prestigious advisory panel has just recommended that the Defense Department get permission from a federal court any time it wants to use computer analysis on its own intelligence files. It would be acceptable, according to the panel, for a human agent to pore over millions of intelligence records looking for al Qaeda suspects who share phone numbers, say, and have traveled to terror haunts in South America. But program a computer to make that same search, declares the advisory committee, and judicial approval is needed, because computer analysis of intelligence databanks allegedly violates “privacy.”

This nonsensical rule is the latest development in the escalating triumph of privacy advocacy over common sense. Unfortunately, the privacy crusade is jeopardizing national security as well. The privacy advocates’ greatest victory to date was in shutting down the Pentagon’s Total Information Awareness program. That research was testing whether computers can spot terrorist activity by sifting through reams of electronic data. In the wake of the TIA’s demise last September, the defense secretary appointed a panel of Washington stalwarts, including Floyd Abrams and Lloyd Cutler, to advise the Pentagon on future intelligence technology research. But rather than clarifying the issues around computer analysis, privacy and national security, the panel’s recent report has made a bad situation worse.

At stake is a young technology known as data mining. Data mining responds to the explosion of information in scientific, government and commercial databanks. Through complex algorithms, it uncovers significant patterns in computer data whose sheer volume defeats human analysis. TIA researchers hoped to use data mining on intelligence databases, and possibly commercial databanks as well, to find the electronic footprints terrorists leave as they plan and rehearse their next attack.

Misinformation swirled around the TIA almost from the moment it was announced. Privacy advocates claimed that pattern analysis represents a radically new and unconstitutional approach to law enforcement. It does not. Police officers search for patterns every time they observe a city street looking for suspicious behavior -- someone casing a jewelry store, say, or trolling for drug buyers. Looking for suspicious behavior in computer databases -- connecting the dots, for example, between purchases of large volumes of bomb-making chemicals, phone calls to known Islamic radicals, travel to Sudan and the rental of a Ryder truck -- differs only in the medium of observation, not the technique.
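The "connecting the dots" described above is, at bottom, rule-based matching over records -- the automation of what a human analyst does by eye. A minimal sketch of that idea follows; the field names, thresholds and sample records here are invented purely for illustration and are not drawn from the TIA or any actual intelligence system:

```python
# Illustrative rule-based pattern matching over records.
# All fields, rules and data below are hypothetical examples.

def matches_pattern(record, rules):
    """Return True if a record satisfies every (field, predicate) rule."""
    return all(predicate(record.get(field)) for field, predicate in rules)

# Hypothetical rules mirroring the article's example: large chemical
# purchases, calls to a watched number, travel to a flagged destination.
rules = [
    ("chemical_kg", lambda v: v is not None and v > 100),
    ("called_watched_number", lambda v: v is True),
    ("travel_destination", lambda v: v == "Sudan"),
]

records = [
    {"chemical_kg": 150, "called_watched_number": True,
     "travel_destination": "Sudan"},
    {"chemical_kg": 5, "called_watched_number": False,
     "travel_destination": "Canada"},
]

flagged = [r for r in records if matches_pattern(r, rules)]
print(len(flagged))  # only the first record satisfies all three rules
```

The point of the sketch is the article's own: the computer applies the same observational rules a human would, only faster and across far more records.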

TIA critics also charged that it would inevitably violate privacy. But information in commercial databanks is probably the least private thing about us: It is routinely sold to marketers and is often available by Internet search. The government already has legal access to such data without obtaining a warrant. Nevertheless, TIA researchers were developing cutting-edge privacy protections that would keep electronic records anonymous until a sufficiently suspicious pattern suggesting terrorist activity emerged.

The facts about the TIA were lost under nonstop charges that the project represented an Orwellian plot to spy on every American. And now the Pentagon advisory committee has taken the hysteria about data mining to a whole new level.

The committee demands that counterterrorism analysts seek court approval to mine the Pentagon’s own lawfully acquired intelligence files, if there is a chance that they might contain information on U.S. citizens or resident aliens -- basically all intelligence files. Eyeball scrutiny of those same files, however, requires no such judicial oversight. This rule suggests a bizarre conceit that the automation of human analysis, which is all data mining is, somehow violates privacy more than the observation of those same items by a person. In fact, the opposite is true. A computer has no idea what it is “reading,” but merely selects items by rule.

The advisory committee’s technophobia does not end with intelligence analysis. It would also require the defense secretary to give approval for, and certify the absolute necessity of, Google searches by intelligence agents. Even though any 12-year-old with a computer can freely surf the Web looking for Islamist chat rooms, defense analysts may not do so, according to the panel, without strict oversight.

The defense secretary should reject the panel’s recommendations, which are based neither in logic nor in law. The government receives 126 million intelligence intercepts a day. Humans cannot possibly keep up with this intelligence tidal wave; anti-terror agents miss connections between suspects, places and events every day. Computer analysis of intelligence data is not merely optional but virtually required if the government is to have any hope of extracting evidence of terrorist activity from the tsunami of possibly relevant information. To demand a laborious court appeal every time the government wants to sift that data electronically would bring our intelligence efforts to a halt, and leave us vulnerable to the next terror attack.

This piece originally appeared in The Washington Post