SKYNET (surveillance program)
SKYNET is a program of the U.S. National Security Agency that performs machine-learning analysis on communications data to extract information about possible terror suspects. The tool is used to identify targets, such as al-Qaeda couriers, who move between GSM cellular networks. These couriers often swap SIM cards in handsets that retain the same ESN, MEID, or IMEI number. The tool uses classification techniques such as random forest analysis. Because the data set contains an overwhelming proportion of true negatives and only a very small set of known positives to train on, there is a risk of overfitting. Bruce Schneier argues that a false positive rate of 0.008% would be acceptably low for commercial applications, where "if Google makes a mistake, people see an ad for a car they don't want to buy", but not here, because "if the government makes a mistake, they kill innocents."
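The base-rate problem behind Schneier's objection can be sketched with a little arithmetic. The 0.008% false positive rate is the figure quoted above; the population of 55 million monitored mobile subscribers and the size of the true positive class are assumptions for scale, not figures from this article:

```python
# Illustrative base-rate sketch. Only the 0.008% false positive
# rate comes from the text; the population and the number of
# true positives below are assumed purely for illustration.

population = 55_000_000             # monitored subscribers (assumed)
false_positive_rate = 0.008 / 100   # 0.008% expressed as a fraction
true_positives = 100                # hypothetical tiny positive class

# Even a seemingly low error rate, applied to a huge population,
# flags thousands of innocent people.
false_positives = population * false_positive_rate

# Precision: fraction of flagged individuals who are actual positives,
# assuming (generously) that every true positive is also flagged.
precision = true_positives / (true_positives + false_positives)

print(f"Expected false positives: {false_positives:,.0f}")
print(f"Precision of the flag: {precision:.1%}")
```

Under these assumptions roughly 4,400 innocent people are flagged, so only about 2% of flagged individuals would be genuine targets, which is the asymmetry Schneier's quote turns on.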
Participation and partnerships
NSA directorates participating:
- Signals Intelligence: S21, S22, SSG
- Research: R6
- Technology: T12, T14
Criticism
Al-Jazeera's bureau chief in Islamabad, Ahmad Zaidan, was wrongly identified by SKYNET as the most probable member of both al-Qaeda and the Muslim Brotherhood, an implausible result given that membership in both groups at once is unlikely. Zaidan is widely and publicly known for traveling to meet with radical groups as part of his journalism; he appears to have been flagged because mobile phone surveillance placed him in rural locations. Critics cite this as a failure of the system: it misidentified a journalist conducting legitimate, public business as a potential terrorist, while also harming freedom of the press and breaching US law on the surveillance of journalists.