
How the 9/11 Commission helped Edward Snowden

- February 10, 2014

The New York Times article on how former NSA contractor Edward Snowden got data from the National Security Agency has drawn considerable ridicule from tech people, who find its breathless references to exotic software such as “wget” (a commonly used Unix tool) hilarious. However, there is some quite interesting information in the piece. If the article is correct, Snowden used a Web crawler to trawl for links to relevant documents, targeting the NSA’s shared “wikis,” easily modifiable Web sites that use the same rough architecture as Wikipedia. So what are Wikipedia equivalents doing hidden in the internal systems of the NSA?
Much of the answer lies in the 9/11 Commission report. The 9/11 Commission was established to identify the reasons why the United States had not anticipated and stopped the Sept. 11 attacks. One of the key problems it identified was the lack of information sharing across the U.S. government. Different parts of the government held different pieces of information which, had they been put together in time, could have revealed the plot. However, the agencies did not communicate well with one another, thanks both to organizational culture and to official barriers to sharing. The commission strongly recommended that intelligence and law enforcement move toward far better sharing of information across and within agencies.
These recommendations acquired new political urgency after Iraq. As it became clear that the decision to invade Iraq had been based in part on faulty intelligence, the various intelligence agencies came under heavy fire from politicians for providing dubious information. Intelligence officials were furious (because they felt they were getting the blame for decisions that had really been taken by politicians, who had in some cases plausibly pressured the agencies to get the analysis they wanted) and terrified that the intelligence agencies would be forced to undergo major reorganization. They needed a plausible response to the critics: something that would show they had a plan to address the problem.
They found it in new technologies such as wikis (which many in the intelligence community had already been pushing, because of their innate advantages). Wikis, blogs and social networking would allow intelligence analysts from different agencies to work together, without the ‘stovepiping’ that had previously plagued the sharing of valuable information. They would provide new kinds of work products that could be quickly updated to take account of new information. And they would allow better coordination of information within agencies, too, by cutting down on the usual organizational overhead and lengthy vetting processes.
This article (pdf) by Andrew Chomik provides a useful overview of the results. The intelligence community built a number of internal tools, including Intellipedia, a Wikipedia-like compendium of information across the intelligence community, and A-Space, a social networking site intended to allow analysts both to identify others with shared interests and to converse with them. These tools were eagerly taken up by many analysts, although they are only partly compatible with existing bureaucratic structures. As Chomik describes it:

Intellipedia now has over 1.28 million pages, used by over 180,000 users contributing content. A-Space has also achieved significant adoption rates and usage among USIC analysts. Intellipedia was also integral to information sharing during the 2008 Mumbai terrorist attacks, and won Homeland Security Awards in 2009 for the improvements it made in information sharing among analysts (Wu, 2010). This, however, does not mean that Web 2.0 has been effective for producing intelligence. An attempt to produce a National Intelligence Estimate solely on Intellipedia itself was ultimately rejected and sent back into the conventional stream of intelligence analysis and dissemination. Having this particular product revert back to conventional bureaucratic processes suggests that using Intellipedia and other social computing tools as channels for building and disseminating intelligence are problematic, and lacking in a cohesive, fluid workflow of collaboration among agencies.

What tools like Intellipedia have been useful for is organizing relevant content. They allow analysts not only to summarize the state of knowledge on a given topic, but also to link to other pages elsewhere in Intellipedia and to useful supporting material. This plausibly makes search quicker and more efficient than in traditional databases.
However, as the Snowden leaks show, efficient search can have its downside. If the Times is right that Snowden targeted some Intellipedia-type structure within the NSA, his task was made far, far easier by the existence of a wiki with organized links, which an automated crawler could traverse, visiting every relevant page and downloading all content linked from those pages.
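The Times does not say what software Snowden actually ran, so any reconstruction is guesswork. Purely as an illustration of the technique described above, here is a minimal sketch in Python of a breadth-first wiki crawl: it follows every link on the pages it visits, stays on the wiki’s host, and records any non-HTML file (a linked PDF or slide deck, say) for later download. The start URL and host name are hypothetical placeholders, not real addresses.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=1000):
    """Breadth-first crawl: follow wiki links, record document links.

    Stays on the start page's host. Any linked URL that turns out
    not to be an HTML page (e.g. a .pdf attachment) is recorded as
    a document rather than parsed for further links.
    """
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    documents = []

    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            with urlopen(url) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    documents.append(url)  # a linked file, not a wiki page
                    continue
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page; move on

        # Queue every same-host link we have not yet visited.
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            link = urljoin(url, href)
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

    return documents


if __name__ == "__main__":
    # "intelwiki.example" is a placeholder; no real internal host is known.
    for doc in crawl("http://intelwiki.example/wiki/Main_Page"):
        print(doc)
```

The point of the sketch is that the crawler never has to guess at file names or search the file system: the wiki’s own link structure serves as the index, since analysts have already organized the material page by page.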
As a result of Sept. 11 and subsequent scandals, the U.S. intelligence community has become far more open internally than it used to be, and has introduced new technologies aimed at making it easier for analysts to post, change and exchange information, and to organize internal material in easy-to-find ways. This, plausibly, has had advantages in allowing information sharing within institutions that are typically heavily bureaucratic and that often fight with each other. However, open information sharing can have trade-offs, especially if some insiders have motivations different from the organization's.
In the wake of the WikiLeaks scandal, intelligence officials strongly defended information sharing practices, and claimed that it was possible to reconcile these practices with strong security. They are likely about to come under renewed political pressure, as a result of Sunday’s revelations.