Snowden Used Common, Low-Cost Tool To Get NSA Files: Report

by John O'Connor
Transcript
Feb 9, 2014

(Image source: The Guardian)


A new report detailing how easily whistle-blower Edward Snowden was able to download troves of highly classified NSA files is raising further concerns about the agency's internal security measures. 

The New York Times cites a senior intelligence official in reporting that Snowden used a common, inexpensive "webcrawler" tool to "scrape" data out of NSA systems. 

The official did not name the crawler program Snowden used, but explained that it functioned similarly to Googlebot, a web crawler developed by Google to find and index web pages. (Via Fox News)

CNET explains Googlebot "can be programmed with various search phrases; it then travels automatically from web page to web page, following links, and going ever deeper in search of relevant documents."

Apparently, Snowden used his crawler in the same way, setting parameters for searches, including what NSA files to look for and how deeply to follow links to documents on the agency's servers. 
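As an illustration only, here is a minimal Python sketch of the general technique the report describes: breadth-first link-following with a depth limit and a keyword filter. The `fetch` callback, page names, and keywords are hypothetical stand-ins; nothing here reflects the unnamed tool Snowden used or the NSA's actual systems.

```python
from collections import deque
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, fetch, keywords, max_depth=2):
    """Breadth-first crawl from start_url, following links up to
    max_depth hops, returning pages that mention any keyword.

    `fetch` is a caller-supplied function mapping a URL to its HTML
    (or None on failure) -- hypothetical, for illustration."""
    seen = {start_url}
    queue = deque([(start_url, 0)])
    matches = []
    while queue:
        url, depth = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        if any(k.lower() in html.lower() for k in keywords):
            matches.append(url)
        if depth < max_depth:
            parser = LinkExtractor()
            parser.feed(html)
            for link in parser.links:
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
    return matches


# Usage with a fake in-memory "web" of three linked pages:
pages = {
    "a": '<a href="b">next</a> nothing of interest here',
    "b": '<a href="c">next</a> secret report',
    "c": "more secret files",
}
found = crawl("a", pages.get, ["secret"], max_depth=2)
```

The two search parameters the article mentions map directly onto `keywords` (what to look for) and `max_depth` (how deeply to follow links).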

At a House committee hearing last week, intelligence officials disclosed Snowden had accessed roughly 1.7 million restricted files. (Via C-SPAN)

That information, coupled with the disclosure Snowden was able to use rudimentary methods to thwart NSA internal security, has raised concerns about the agency's systems. 

The concern is particularly acute because the agency is tasked with maintaining U.S. cyber security. If the NSA could not detect Snowden's insider attack, there are serious doubts about whether its systems could hold up against a sophisticated foreign cyber attack. (Via ABC)

But Snowden's movements apparently were detected several times, and NSA officials challenged him over his actions on more than one occasion.

He apparently gave supervisors legitimate-sounding explanations: as a systems administrator, he was tasked with conducting routine network maintenance, such as backing up files and moving information. (Via Al Jazeera)

In fact, the senior intelligence official told the Times Snowden's crawler likely would have been caught had he worked at any facility other than the one in Oahu, Hawaii, which was the last to receive an internal security upgrade. Investigators are unsure whether Snowden planned for that gap when he sought a job there. (Via NBC)

Snowden released a response to the report through his lawyer in the U.S. While he did not confirm or deny intelligence officials' theory on how he obtained the files he leaked, he said: 

"It's ironic that officials are giving classified information to journalists in an effort to discredit me for giving classified information to journalists. The difference is that I did so to inform the public about the government's actions, and they're doing so to misinform the public about mine." (Via The Guardian)

The NSA has declined the Times' request to provide details about specific security changes the agency has made since the Snowden leaks last year. 
