The accused NSA leaker used "web crawler" software, which is widely available, to methodically harvest a vast trove of data while he went about his duties as an IT contractor for the super-secret agency.
A senior intelligence official told the Times the process was "quite automated" and probably could have been easily detected. "We do not believe this was an individual sitting at a machine and downloading this much material in sequence," the official said.
Web crawler software, also known as a "spider," moves from website to website by following the links embedded in documents. The programs can be directed to copy whatever information they come across.
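The technique is simple enough to sketch in a few dozen lines. The example below is a minimal, hypothetical illustration in Python using only the standard library; it is not the tool Snowden used, which has not been publicly identified, and the seed URL and page limit are placeholders.

```python
# Minimal web-crawler ("spider") sketch: fetch a page, copy its contents,
# then follow the embedded links breadth-first. This illustrates the
# general technique only; it is not the software Snowden used.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

MAX_PAGES = 10  # placeholder safety limit for this sketch


class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=MAX_PAGES):
    """Visit pages breadth-first, copying each one and queuing new links."""
    seen = {seed_url}
    queue = deque([seed_url])
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except (OSError, ValueError):
            continue  # skip unreachable or malformed URLs
        fetched += 1
        print(f"copied {len(html):,} bytes from {url}")
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com")  # placeholder seed URL
```

Left to run unattended, a loop like this can copy large volumes of material without a person clicking through pages one by one, which is why officials described the collection as "quite automated."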
Officials also told the Times Snowden was able to snoop around sensitive areas because he was working out of a small NSA branch office in Hawaii that had not yet been upgraded with modern security measures.
Officials said Snowden's activities were questioned by his supervisors "a few times," but he avoided trouble by saying he was performing basic maintenance in line with his position as a system administrator.