The former National Security Agency (NSA) contractor Edward Snowden used widely available, inexpensive software to mine the NSA's networks, according to intelligence officials investigating the breach.
Officials told the New York Times that whistleblower Snowden, who leaked thousands of documents to the press, used "web crawler" software, designed to search, index and back up websites, to "scrape" highly classified files.
Snowden is believed to have accessed about 1.7 million documents by hopping from website to website and following the links within each document. The web crawler software, said to be similar to Googlebot, copies everything it encounters, and investigators believe the process was automated.
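The behaviour described, fetching a page, copying it, then following its links to new pages, is the basic loop of any web crawler. The sketch below is purely illustrative of that generic technique over a toy in-memory "site"; nothing is publicly known about the actual tool Snowden used, and all names here are invented for the example.

```python
from collections import deque

# Toy "site": page URL -> (content, outgoing links).
# Invented for illustration only.
SITE = {
    "/index":  ("home page",       ["/docs", "/wiki"]),
    "/docs":   ("docs page",       ["/wiki", "/files"]),
    "/wiki":   ("wiki page",       ["/index"]),
    "/files":  ("archived files",  []),
}

def crawl(site, start):
    """Breadth-first crawl: fetch each page, copy ("scrape") its
    content, then queue every link it contains. Once started, the
    loop needs no human interaction -- which is what officials mean
    by the downloads being automated rather than done "in sequence"
    by a person at a machine."""
    seen, queue, copied = set(), deque([start]), {}
    while queue:
        url = queue.popleft()
        if url in seen or url not in site:
            continue
        seen.add(url)
        content, links = site[url]
        copied[url] = content   # copy everything encountered
        queue.extend(links)     # follow links to further pages
    return copied

archive = crawl(SITE, "/index")
```

Starting from a single page, the crawler reaches every page linked from anywhere it has visited, which is how one entry point can yield a very large archive.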
"We do not believe this was an individual sitting at a machine and downloading this much material in sequence," an unnamed official told the Times.
The NSA, which acts to thwart cyber-attacks from other nation states, failed to detect the less sophisticated insider breach by Snowden, investigators found. Although there was a brief exchange between agency officials and Snowden about the unusual activity, he convinced them that it was legitimate and continued to mine data.
Officials went so far as to say that web crawlers are almost never used on the NSA's internal systems, which raises the question of why Snowden's activity did not set off alarms.
This is the second report to raise questions about security measures at the NSA. In November, a Reuters report citing unnamed sources found that Snowden had persuaded NSA colleagues to give him their login details and passwords, which he later used to gain access to classified information.
This was all the more surprising because Snowden was a contractor rather than an official employee at the organisation.