Officials Say Snowden Used Web Crawling Software to Collect NSA Data

Intelligence officials who spoke with The New York Times said that Edward Snowden used a standard web crawler, software that automatically indexes web pages and is typically used by search engines, to automatically collect all the information he wanted.
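
To make the mechanism concrete, here is a minimal sketch of a breadth-first web crawler in Python, the same basic technique search engines use: fetch a page, pull out its links, and queue each new link to be fetched in turn. This is an illustration only; the start URL and page limit are placeholders, not details from the report.

```python
import urllib.parse
import urllib.request
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    """Breadth-first crawl: fetch a page, then queue every link it contains."""
    seen = {start_url}
    queue = deque([start_url])
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url) as response:
                html = response.read().decode("utf-8", errors="replace")
        except (OSError, ValueError):
            continue  # skip unreachable pages and non-HTTP links
        fetched += 1
        print("fetched:", url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urllib.parse.urljoin(url, link)
            if absolute.startswith(("http://", "https://")) and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

if __name__ == "__main__":
    crawl("https://example.com")  # placeholder start page
```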

“He was either very lucky or very strategic,” one intelligence official said. A new book, “The Snowden Files,” by Luke Harding, a correspondent for The Guardian in London, reports that Mr. Snowden sought his job at Booz Allen because “to get access to a final tranche of documents” he needed “greater security privileges than he enjoyed in his position at Dell.”


The only internal defense he had to get past was a login prompt, and he had the right credentials. Because the NSA was not restricting which data insiders could reach, the web crawler could in theory collect anything. Snowden's Hawaii office also had no activity monitors in place that would have caught his bot in the act.
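
A crawler can only operate behind a login if it presents credentials on every request. As a minimal sketch, assuming plain HTTP Basic authentication (the host, username, and password below are invented placeholders), Python's standard library can attach credentials globally so a crawl loop like the one above needs no changes:

```python
import urllib.request

# Hypothetical example: register Basic-auth credentials so that every
# subsequent urlopen() call authenticates automatically.
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, "https://intranet.example.com",
                          "analyst", "s3cret")  # placeholder credentials
auth_handler = urllib.request.HTTPBasicAuthHandler(password_mgr)
opener = urllib.request.build_opener(auth_handler)
urllib.request.install_opener(opener)  # urlopen() now sends credentials
```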
