Whistleblower Edward Snowden used cheap software to hack the National Security Agency's networks, American intelligence officials say.
The senior officials said Snowden used 'web crawler' software, a tool designed to search, index and back up websites, to pull confidential information from the agency's systems.
"We do not believe this was an individual sitting at a machine and downloading this much material in sequence," an unnamed official told The New York Times, Sunday.
This is surprising given that the U.S. stores its military and intelligence secrets on high-end computer systems built to withstand cyber attacks, particularly those from Russia and China. Investigators also found that the NSA should have easily detected an intrusion carried out with such inexpensive and simple software.
The 30-year-old whistleblower had nearly complete access to NSA files because he was working as a technology contractor for the agency in Hawaii, helping to manage computer systems at a facility that focused mainly on China and North Korea.
According to 'The Snowden Files,' a book by Guardian journalist Luke Harding, Snowden took a job in Honolulu with the contractor Booz Allen Hamilton because it gave him greater access privileges.
A web crawler, also called a spider, moves from website to website, following the links embedded in each document, and can be programmed to copy everything in its path. The NYT reports that Snowden appears to have set the parameters for the searches himself, including which subjects to look for and how deeply to follow links, as the crawler moved through the NSA's internal networks. According to US intelligence officials, Snowden accessed around 1.7 million files.
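To illustrate the generic technique described above, here is a minimal sketch of a depth-limited crawler written in Python using only the standard library: it fetches a page, keeps it if the text matches a list of subject keywords, and follows the page's links down to a set depth. The start URL, keyword list and depth limit are hypothetical placeholders for illustration; this is not a reconstruction of the actual tool Snowden is said to have used.

# Minimal sketch of a depth-limited, keyword-filtered web crawler (illustrative only).
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect the href targets of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, keywords, max_depth=2):
    """Breadth-first crawl: fetch a page, save its URL if it matches a keyword,
    then follow its links until the depth limit is reached."""
    seen = set()
    queue = [(start_url, 0)]
    matches = []

    while queue:
        url, depth = queue.pop(0)
        if url in seen or depth > max_depth:
            continue
        seen.add(url)

        try:
            page = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip pages that fail to load

        # "Which subjects to look for": keep pages whose text matches a keyword.
        if any(word.lower() in page.lower() for word in keywords):
            matches.append(url)

        # "How deeply to follow links": enqueue linked pages one level deeper.
        parser = LinkExtractor()
        parser.feed(page)
        for link in parser.links:
            queue.append((urljoin(url, link), depth + 1))

    return matches


if __name__ == "__main__":
    # Hypothetical example: crawl a public site two levels deep for one keyword.
    print(crawl("https://example.com", ["example"], max_depth=2))

The two parameters the officials describe map directly onto the sketch: the keyword list plays the role of the "subjects to look for", and max_depth limits how far the crawler follows links from its starting point.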