
User:Showtime oski/sandbox


Evaluate an article activity (responses for two articles included)

[edit]

This is where you will complete your article evaluation. Please use the template below to evaluate your selected article.

  • Name of article: 1) Information privacy ; 2) United States v. Jones
  • Briefly describe why you have chosen this article to evaluate.
    • 1) I selected this article to evaluate because it is mandatory to evaluate this specific article.
    • 2) I selected this article to evaluate because the Supreme Court case was of particular interest to me, especially the Court's division despite its unanimous ruling.

Lead evaluation

[edit]

1) The Lead includes an introductory sentence that is concise and clear about the article's topic. While the lead does briefly hint at some of the article's major sections, such as political and legal issues existing in the field of information privacy, it does not specifically foreshadow the sections on "Information types" or the "United States Safe Harbor program and passenger name record issues." Beyond the introductory sentence, the Lead is also very concise.

2) The Lead includes an introductory sentence that concisely and clearly describes the article's topic, and includes brief descriptions of the article's major sections (while not including any of the arguably "minor" sections, such as "Oral Argument," "Reception," or "Subsequent developments"). The Lead only includes information that is present in the article and is written concisely.

Content evaluation

[edit]

1) The article's content is relevant to the topic and up-to-date; some of the sources are as recent as 2018 and 2019. There does not seem to be any content that is out-of-place. However, there are many topics on information privacy that are not discussed, but instead linked to related fields (e.g., criminal justice investigations and health care records). The article does not seem to deal with any of Wikipedia's equity gaps.

2) The article's content is relevant and up-to-date; it is neither missing content nor including content that is out of place. The article does, by quoting Justice Sonia Sotomayor's concurring opinion, allude to historically underrepresented populations and topics, including AIDS, the gay community, abortion, and Muslims.

Tone and balance evaluation

[edit]

1) The article is, on the whole, neutral, with no claims that appear heavily biased toward a particular position. However, within the "United States Safe Harbor program and passenger name record issues" section, it claims that the US and EU's Passenger Name Record agreement was "controversial," and the following paragraph includes mostly criticism (from the EU and Brussels, not the writer) of the agreement. The absence of the US point of view on the agreement -- whether US officials supported the agreement or not -- can be seen as underrepresentation. But it does, at its closing, take a neutral, fact-based stance on the issue (explaining that the source of tensions between the US and Brussels is the relatively weak data protection in the US).

2) The article is neutral, without any overtly biased claims or overrepresentation of certain viewpoints. It does not attempt to persuade the reader towards any particular viewpoint.

Sources and references evaluation

[edit]

1) All the facts in the article are backed up by reliable secondary sources; the sources are thorough and current (as previously mentioned) and written by a diverse spectrum of authors. However, there are no mentions of historically marginalized individuals. The links I checked worked.

2) All of the facts in the article are backed up by a reliable secondary source of information. The sources are thorough and current, with most being published around the period the case was decided, in 2012. While much of the article's information is taken directly from the Court's decision, it also includes a variety of sources, including articles from The Wall Street Journal, The Washington Post, Wired.com, and legal blogs. The links I checked worked.

Organization evaluation

[edit]

1) The article is well-written, although it is also too informal at times -- for example, the "Protecting privacy on the internet" section includes the phrase "users give away a lot of information about themselves" -- but is otherwise grammatically correct, with correct spelling, and is well-organized.

2) The article is well-written, contains no grammatical or spelling errors, and is very well-organized, being broken down into relevant sections: "Background," "Oral Argument," "Opinion of the Court," "Reception," and "Subsequent developments," with subsections in the major sections of "Background" and "Opinion of the Court" (the latter of which includes subsections for the majority opinion and concurring opinions of the Court).

Images and media evaluation

[edit]

1) The article does not include images.

2) The article does include images that enhance the understanding of the topic -- it includes images of the justices who authored the opinions, accompanied by captions that succinctly summarize the justices' opinions. All of the images adhere to Wikipedia's copyright regulations (they are official portraits which belong in the public domain) and are laid out in a visually appealing way (the images are spread out, in an inverted V-shape, <, such that the reader's eyes are taken across both sides of the page, from right to left).

Talk page evaluation

[edit]

1) Some of the conversations occurring on the talk page are about possible additions (such as privacy protection in India and China, or critiquing search engine data). The article is within the scope of several WikiProjects, including those on Computing, Internet, and Mass surveillance. The article is rated "C" for all three projects. Wikipedia discusses this topic in a much more opinionated way than any of the discussion on privacy we've done in class; this includes the user who submitted the suggestion for a section on "Privacy Protection in India and China" cursing to emphasize how important they feel such a section would be to the page.

2) The talk page is mostly filled with suggestions for additions that had been addressed, as well as a review for a "GA" ("Good Article") application that was approved. This article is within the scope of the WikiProject Law and is a part of WikiProject U.S. Supreme Court cases. The article discusses the legal aspects of privacy, which we have not talked about in class.

Overall evaluation

[edit]

1) The article is well-written, although a bit informal at times, but it can be improved greatly, on the whole, by developing more of its sections, many of which mention today's information privacy issues only to link them to other Wikipedia pages instead of discussing them. The treatment that the "United States Safe Harbor program and passenger name record issues" section received -- relatively in-depth and detailed writing -- should be extended to issues like search engine privacy or the status of privacy rights in, for example, China.

2) The article is very strong. It provides, in a neutral manner, succinct summaries of the case's background, opinion, and subsequent developments. The article could be improved by expanding the "Subsequent developments" section, perhaps by including subsequent applications of the case or whether the case has actually curtailed law enforcement's surveillance capabilities. Besides a potential expansion, the article is complete and well-developed.

What I will add to my Privacy+ article page

[edit]

I plan on creating a new article on the Hancock programming language. No such page currently exists. It is like a missing piece of history -- there is a Wired article on the language and AT&T's reported usage of it for data mining of phone records, but not much is known about it otherwise. I hope to create a Wikipedia page that objectively details what the Hancock programming language was, how it came into existence, and what it was used for, and to write about the surrounding era of government surveillance in relation to similar data mining programs, presenting multiple perspectives on the issue.

Hancock (programming language)

[edit]
AT&T researchers created Hancock in 1998. They used it to write data mining programs that analyzed the company's U.S. long-distance phone call streams.[1]

Hancock is a C-based programming language, first developed by researchers at AT&T Labs in 1998 to analyze data streams.[1] Its creators intended the language to improve the efficiency and scale of data mining. Hancock works by creating profiles of individuals, using data to provide behavioral and social network information.

The development of Hancock was part of the telecommunications industry's use of data mining to detect fraud and to improve marketing. However, following the September 11, 2001 attacks and the increased government surveillance of individuals, Hancock and similar data mining technologies came under public scrutiny, especially regarding their perceived threat to individual privacy.[2]

Background

[edit]

Data mining research, including Hancock, grew during the 1990s, as scientific, business, and medical interest in massive data collection, storage, and management increased.[3] During the early 1990s, transactional businesses became increasingly interested in data warehousing, which provided storage, query, and management capabilities for the entirety of recorded transactional data. Database-oriented data mining research concentrated on creating efficient data structures and algorithms, particularly for data located outside main memory, for example on disk. Padhraic Smyth believed that data mining researchers aimed to write algorithms that could scale to massive amounts of data in shorter amounts of time.[3]

Researchers at AT&T Labs, including Corinna Cortes, developed the Hancock programming language between 1998 and 2004. Hancock, a C-based domain-specific programming language, was intended to make program code for computing signatures from large transactional data streams easier to read and maintain, thus serving as an improvement over the complex data mining programs written in C. Hancock also managed issues of scale for data mining programs.[1]

Hancock programs were intended to handle hundreds of millions of signatures daily, making them well suited for transactional data streams like telephone calls, credit card purchases, or website requests.[1] At the time Hancock was developed, such data was usually amassed for billing or security purposes and, increasingly, to analyze how transactors behaved.[1] Data mining can also be useful for identifying atypical patterns in transactor data. With regard to anti-terrorist activities, data mining's assistance in pattern-finding can help uncover links between terrorist suspects, through funding or arms transfers, for example.[4]

Data stream applications also include network monitoring, financial monitoring such as security derivative pricing,[5] prescription drug effect monitoring,[5] and e-commerce.[6] Data mining can be used by firms to find their most profitable consumers or to conduct churn analysis.[5] Data mining can also help firms make credit-lending decisions by designing models that determine a customer's creditworthiness.[7] These models are intended to minimize risky credit-lending while maximizing sales revenue.[7]

Besides Hancock, other data stream systems in existence by 2003 included Aurora, Gigascope, Niagara, STREAM, Tangram, Tapestry, Telegraph, and Tribeca.[6]

Processes

[edit]

Databases

[edit]

Hancock is a language for data stream mining programs. Data streams differ from traditional stored databases in that they carry very high volumes of data and allow analysts to act upon that data in near-real time. Stored databases, on the other hand, hold data that has been loaded for offline querying.[6] Data warehouses, which consolidate data from different systems, can be costly to build and lengthy to implement; even simplified data warehouses can take months to build.[5]

The scale of massive data stream mining poses problems to data miners. For example, internet and telephone network data mining might be tasked with finding persistent items, which are items that regularly occur in the stream.[8] However, these items might be buried in a large amount of the network’s transactional data; while the items can eventually be found, data miners aim for increased time efficiency in their search.[8]
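
The idea of a persistent item can be made concrete with a small illustration. The following Python sketch is not drawn from Hancock or from the cited paper; it assumes the stream arrives as per-day batches of item identifiers and simply counts the number of distinct days on which each item appears, reporting items seen on at least a chosen fraction of days. (The space-efficient algorithms cited above are designed to avoid exactly this kind of per-item bookkeeping.)

    from collections import defaultdict

    def persistent_items(daily_batches, threshold=0.8):
        # daily_batches: iterable of per-day iterables of item identifiers
        # (e.g., phone numbers or IP addresses). Illustrative sketch only.
        days_seen = defaultdict(int)   # item -> number of days it appeared on
        num_days = 0
        for batch in daily_batches:
            num_days += 1
            for item in set(batch):    # count each item at most once per day
                days_seen[item] += 1
        cutoff = threshold * num_days
        return {item for item, count in days_seen.items() if count >= cutoff}

    # "b" appears on all three days, so it is the only persistent item here.
    stream = [["a", "b", "c"], ["b", "d"], ["b", "c"]]
    print(persistent_items(stream))    # {'b'}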

In database technology, users do not necessarily know where the data they are searching for is located. These users only have to issue queries for data, which the database management system returns. In a large data set, data can be contained in random-access memory (RAM), which is the primary storage, or disk storage, which is secondary storage. In 2000, Padhraic Smyth estimated that, using the most recent technology, data located in RAM could be accessed relatively quickly, "on the order of 10⁻⁷–10⁻⁸ seconds," while secondary storage data took significantly longer to access, "on the order of 10⁴–10⁵" seconds.[3]

Data mining

[edit]

Data mining can be broken down into the processes of input, analysis, and the reporting of results; it uses algorithms to find patterns and relationships among the subjects and has been used by commercial companies to find patterns in client behavior.[9] Data analysts are needed to collect and organize data and train algorithms.[4]

KianSing Ng and Huan Liu opine that even with straightforward data mining goals, the actual process is still complex. For example, they argue that real-world data mining can be challenged by data fluctuations, which would render prior patterns “partially invalid.” Another complication is that most databases in existence in 2000 were characterized by high dimensionality, which means that they contain data on many attributes. As Ng and Liu note, high dimensionality produces long computing times; this can be solved by data reduction in the pre-processing stage.[10]
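
As a rough illustration of the pre-processing step Ng and Liu describe (the Python sketch below is hypothetical, not taken from their paper, and the attribute names are invented), one simple form of data reduction is to drop attributes that are nearly constant across the records before mining, which lowers dimensionality and therefore computing time.

    def reduce_dimensions(rows, min_distinct=2):
        # rows: list of dicts mapping attribute name -> value.
        # Drop attributes with fewer than `min_distinct` distinct values,
        # since near-constant attributes carry little pattern information.
        # Hypothetical pre-processing sketch only.
        if not rows:
            return rows
        distinct = {key: {row[key] for row in rows} for key in rows[0]}
        keep = {key for key, values in distinct.items() if len(values) >= min_distinct}
        return [{key: row[key] for key in keep} for row in rows]

    records = [
        {"region": "east", "plan": "basic", "minutes": 120},
        {"region": "east", "plan": "plus", "minutes": 300},
        {"region": "east", "plan": "basic", "minutes": 45},
    ]
    # "region" is constant across all rows, so it is removed.
    print(reduce_dimensions(records))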

Hancock's process is as follows:

  • Hancock programs analyze data as it arrives, in real-time, into data warehouses.[2]
  • Hancock programs compute signatures, or behavioral profiles, of transactors in the stream.[1]
    • Data stream transactors include telephone numbers or IP addresses.[1]
    • Signatures enable analysts to discover patterns hidden in the data.[1]
  • Telecommunications data streams consist of call records, which include information on callers' locations and the times of calls, and sometimes include recordings of conversations.[11]
    • Hancock was used to process signatures based on data like the length of phone calls and the number of calls to a particular area over a specified interval of time.[12]
  • Hancock programs used link analysis to find "communities of interest," which connected signatures based on similarities in behavior.[12] Link analysis requires that linkages between data be continually updated, and is used to detect fraud networks.[12]
    • Link analysis, which can be considered a form of association data mining, aims to find connections between relationships.[13] One such relationship is call patterns in telecommunications.[13] Association data mining aims to find relationships between variables. For example, one research paper[13] suggested that a market could use association analysis to find the probability that a customer who purchases coffee also purchases bread; the market could then use that information to influence store layout and promotions (see the worked sketch after this list).
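
The coffee-and-bread example can be made concrete with a short worked Python sketch (the transactions below are invented and the code is not from the cited paper): the support of the rule "coffee -> bread" is the fraction of all transactions containing both items, and its confidence is the fraction of coffee-containing transactions that also contain bread.

    def rule_support_confidence(transactions, antecedent, consequent):
        # transactions: list of sets of purchased items.
        # Returns (support, confidence) for the rule antecedent -> consequent.
        # Illustrative sketch of association analysis only.
        total = len(transactions)
        with_antecedent = [t for t in transactions if antecedent in t]
        with_both = [t for t in with_antecedent if consequent in t]
        support = len(with_both) / total
        confidence = len(with_both) / len(with_antecedent) if with_antecedent else 0.0
        return support, confidence

    baskets = [
        {"coffee", "bread"},
        {"coffee", "milk"},
        {"coffee", "bread", "eggs"},
        {"bread"},
    ]
    # Rule "coffee -> bread": support 2/4 = 0.5, confidence 2/3 = 0.67 (rounded).
    print(rule_support_confidence(baskets, "coffee", "bread"))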

Because Hancock code performed efficiently, even with large amounts of data, the AT&T researchers claimed that it allowed analysts to create applications "previously thought to be infeasible."[1]
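
Actual Hancock source code is not reproduced in the sources cited here. Purely as a hypothetical Python illustration of what a signature computation involves (the field names, update rule, and data are assumptions, not the AT&T implementation), the sketch below maintains a small per-number behavioral profile as call records stream in.

    def update_signatures(signatures, call_records, alpha=0.2):
        # signatures: dict mapping phone number -> profile dict.
        # call_records: iterable of (caller, callee, duration_seconds) tuples.
        # Each profile keeps an exponentially weighted average call duration
        # and a set of recently contacted numbers. Hypothetical sketch only.
        for caller, callee, duration in call_records:
            profile = signatures.setdefault(
                caller, {"avg_duration": 0.0, "contacts": set()}
            )
            profile["avg_duration"] = (1 - alpha) * profile["avg_duration"] + alpha * duration
            profile["contacts"].add(callee)
        return signatures

    sigs = {}
    day_one = [("555-0100", "555-0199", 60), ("555-0100", "555-0123", 300)]
    update_signatures(sigs, day_one)
    print(sigs["555-0100"])   # running profile for this number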

Applications

[edit]

The AT&T Labs researchers analyzed telecommunications data streams, including the company's entire long distance stream, which comprised around 300 million records from 100 million customer accounts daily.[1] By 2004, the entirety of AT&T's long-distance phone call record signatures were written in Hancock[1] and the company used Hancock code to peruse nine gigabytes of network traffic nightly.[2]

Telecommunications companies share information derived from data mining network traffic for research, security, and regulatory purposes.[14]

Marketing

[edit]

Hancock programs assisted in AT&T's marketing efforts.[2] In the 1990s, large data stream mining and the increased automation of government public record systems allowed commercial corporations in the United States to personalize marketing.[15] Signature profiles were developed from both transaction records and public record sources.[15] Ng and Liu, for example, applied data mining to customer retention analysis, and found that mining of association rules allowed a firm to predict departures of influential customers and their associates. They argued that such knowledge subsequently empowers the company’s marketing team to target those customers, offering more attractive pitches.[10]

Data mining assisted telecommunications companies in viral marketing, also known as buzz marketing or word-of-mouth marketing, which uses consumer social networks to improve brand awareness and profit.[16] Viral marketing depends on connections between consumers to increase brand advocacy, which can either be explicit, such as friends recommending a product to other friends, or implicit, such as influential consumers purchasing a product.[16] For firms, one of the goals of viral marketing is to find influential consumers who have larger networks. Another method of viral marketing is to target the neighbors of prior consumers, known as “network targeting.”[16] Using Hancock programs, analysts at AT&T were able to find "communities of interest," or interconnected users who featured similar behavioral traits.[12]
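
A minimal Python sketch of network targeting follows (the consumer graph and names are invented, and this is not AT&T's method): the targets are the not-yet-subscribed neighbors of existing customers in the communication graph.

    def network_targets(neighbors, current_customers):
        # neighbors: dict mapping each consumer to the set of consumers
        # they communicate with. Returns non-customers directly connected
        # to at least one existing customer. Illustrative sketch only.
        targets = set()
        for customer in current_customers:
            targets.update(neighbors.get(customer, set()))
        return targets - set(current_customers)

    graph = {
        "ann": {"bob", "carol"},
        "bob": {"ann"},
        "carol": {"ann", "dave"},
    }
    print(network_targets(graph, {"ann"}))   # {'bob', 'carol'}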

One of the issues viral marketing promoters encountered was the large size of marketing data sets, which, in the case of telecommunication companies, can include information on transactors and their descriptive attributes and transactions.[16] Marketing data sets, when amounting in the hundreds of millions, can exceed the memory capacity of statistical analysis software.[16] Hancock programs addressed data scaling issues and allowed analysts to make decisions as the data flowed into the data warehouses.[2]

While the development of wireless communication devices allowed law enforcement to track the location of users, it also allowed companies to improve consumer marketing, such as by sending messages according to wireless users' proximity to particular businesses.[15] Through cell site location data, Hancock programs were capable of tracking wireless users' movements.[2]

According to academic Alan Westin, the increase of telemarketing during this period also increased consumer annoyance.[15] Statisticians Murray Mackinnon and Ned Glick hypothesized in 1999 that firms hid their use of commercial data mining because of potential consumer backlash for mining customer records.[5] As an example, Mackinnon and Glick cited a June 1999 lawsuit in which the state of Minnesota sued US Bancorp for releasing customer information to a telemarketing firm; Bancorp promptly responded to the lawsuit by restricting its usage of customer data.[5]

Fraud detection

[edit]

AT&T researchers, including Cortes, showed that Hancock-related data mining programs could be used for finding telecommunications fraud.[14]

Telecommunications fraud detection includes subscription fraud, unauthorized calling card usage, and PBX fraud.[17] It is similar to mobile communications and credit card fraud detection: in all three, firms must process large amounts of data in order to obtain information; they must deal with the unpredictability of human behavior, which makes finding patterns in the data difficult; and their algorithms must be trained to spot the relatively rare cases of fraud among the many legitimate transactions.[17] According to Daskalaki et al., in 1998 telecommunications fraud incurred billions of dollars in annual losses globally.[17]

Because fraud cases were relatively few compared to the hundreds of millions of daily telephone transactions that occurred, algorithms for data mining of telecommunication records needed to provide results quickly and efficiently.[12] The researchers showed that communities of interest could identify fraudsters, since data nodes from fraudulent accounts are typically located closer to each other than to nodes from legitimate accounts.[14]

Through social network analysis and link analysis, they also found that the numbers targeted by fraudulent accounts that were later disconnected were often called again by fraudsters from different numbers; such connections could be used to identify fraudulent accounts. Link analysis methods are based on the assumption that fraudsters rarely deviate from their calling habits.[12]
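
A hedged Python sketch of this link-analysis heuristic is given below (the data, names, and scoring rule are invented, not the researchers' algorithm): a new account looks suspicious when a large share of the numbers it calls were previously called by accounts already disconnected for fraud.

    def fraud_overlap_score(called_numbers, fraud_callees):
        # called_numbers: set of numbers dialed by the account under review.
        # fraud_callees: set of numbers previously dialed by accounts that
        # were disconnected for fraud. Illustrative heuristic only.
        if not called_numbers:
            return 0.0
        return len(called_numbers & fraud_callees) / len(called_numbers)

    known_fraud_targets = {"555-0101", "555-0102", "555-0103"}
    new_account_calls = {"555-0101", "555-0103", "555-0200"}
    score = fraud_overlap_score(new_account_calls, known_fraud_targets)
    print(round(score, 2))   # 0.67 -> could be flagged for review above a chosen threshold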

Relation to surveillance

[edit]

In 2007, Wired magazine published an online article claiming that Hancock was created by AT&T researchers for "surveillance purposes." The article highlighted research papers written by Cortes et al., particularly the researchers' concept of "communities of interest." The article connected Hancock's concept with then-recent public findings that the Federal Bureau of Investigation (FBI) had been making warrantless requests to telecommunication companies for records of "communities of interest" under the USA PATRIOT Act.[2] (For more on FBI telecommunications surveillance, see below.)

The article claims that AT&T "invented the concept and the technology" of creating "community of interest" records, citing the company's ownership of related data mining patents. Finally, the article noted how AT&T, along with Verizon, was, at the time, being sued in federal court for providing the National Security Agency (NSA) with access to billions of telephone records belonging to Americans. The NSA, the article claims, obtained such data with the intention of data mining it to find suspected terrorists and warrantless wiretapping targets.[2] (For more on NSA telecommunications surveillance, see below.)

FBI telecommunication records surveillance

[edit]

Federal telecommunications surveillance is not a recent historical development in the United States. According to academic Colin Agur, telephone surveillance by law enforcement in the United States became more common in the 1920s.[18] In particular, telephone wiretapping became a prevalent form of evidence collection by law enforcement officials, especially federal agents, during Prohibition.[18] Agur argues that the Communications Act of 1934, which established the Federal Communications Commission (FCC), reined in law enforcement abuse of telephone surveillance.[18] Under the act, telecommunications companies could keep records of illegal telecommunications interception requests and report them to the FCC.[18] After the Federal Wiretap Act of 1968 and the Supreme Court's decision in Katz v. United States, which both extended Fourth Amendment protections to telephone communications, federal telecommunications surveillance required warrants.[18]

According to academics William Bendix and Paul Quirk, the FBI was first authorized to obtain national security letters (NSLs) for communication billing records, including those from telephone services, after Congress passed the Electronic Communications Privacy Act of 1986. The letters forced telephone companies to provide the FBI with customer information, such as names, addresses, and long-distance call records. Congress would eventually expand NSL authority to cover local call records as well.[19]

After the September 11, 2001 attacks, Congress passed the USA PATRIOT Act, which made it easier for FBI investigators to be issued NSLs for terrorism investigations. Bendix and Quirk contend that the PATRIOT Act allowed the FBI to access and collect the private data of many citizens without the approval of a judge. The FBI was allowed to keep a collection of records, with no time limit for possession. It could also force NSL recipients to remain silent through the use of gag orders.[19]

The Wired article claimed that the FBI began making warrantless requests to telecommunication companies for "communities of interest" records of suspects under the USA PATRIOT Act. The article claimed that law enforcement discovered the existence of such records based on research by Hancock's creators.[2]

In 2005, government leaks revealed the FBI’s abuse of NSLs. In 2006, when the PATRIOT Act was renewed, it included provisions that required the Justice Department’s inspector general to annually review NSL usage. The first inspector general report found that 140,000 NSL requests, on nearly 24,000 U.S. persons, were granted to FBI agents from 2003 to 2005. The data was then added to databanks available to thousands of agents.[19]

NSA telecommunication records surveillance

[edit]

The public-private relationship of telecommunication companies extends into the homeland security domain. Telecommunication companies, including AT&T, Verizon, and BellSouth, cooperated with NSA requests for access to transactional records.[20] Telecommunications companies, including AT&T, have maintained partnerships with government agencies, like the Department of Homeland Security, to collaborate on sharing information and solving national cybersecurity issues. AT&T representatives sit on the board of the National Cyber Security Alliance (NCSA), which promotes cybersecurity awareness and computer user protection.[21]

Analysts at the NSA, under authority of the secret Terrorist Surveillance Program, also used data mining to find terrorist suspects and sympathizers. In this search, the NSA intercepted communications, including telephone calls, leaving and entering the United States. Agents screened the information for possible links to terrorism, such as the desire to learn to fly planes or specific locations of the communication’s recipients, like Pakistan.[20]

In 2005, the New York Times reported on the existence of the program, which the Bush administration defended as necessary in its counterterrorism efforts and limited to terrorist suspects and associates.

However, in 2007, the Wired article noted how AT&T and Verizon were being sued in federal court for providing the NSA with access to billions of telephone records belonging to Americans for anti-terrorism activities, such as using data mining to locate suspected terrorists and warrantless wiretapping targets.[2]

In 2013, following the Snowden leaks, it was revealed that the program had mined the communications not just of terrorist suspects but also of millions of American citizens. A 2014 independent audit by the Privacy and Civil Liberties Oversight Board found that the program had limited counterterrorism benefits.[20]

See also

[edit]

References

[edit]
  1. ^ a b c d e f g h i j k Cortes, Corinna; Fisher, Kathleen; Pregibon, Daryl; Rogers, Anne; Smith, Frederick (2004-03-01). "Hancock: A language for analyzing transactional data streams". ACM Transactions on Programming Languages and Systems. 26 (2): 301–338. doi:10.1145/973097.973100. ISSN 0164-0925.
  2. ^ a b c d e f g h i j Singel, Ryan (2007-10-29). "AT&T Invents Programming Language for Mass Surveillance". Wired. ISSN 1059-1028. Retrieved 2020-11-08.
  3. ^ a b c Smyth, P. (2000-08-01). "Data mining: data analysis on a grand scale?". Statistical Methods in Medical Research. 9 (4): 309–327. doi:10.1191/096228000701555181.
  4. ^ a b Kim, Won (2005). "On U.S. Homeland Security and Database Technology". Journal of Database Management. 16 (1): 1–17. doi:10.4018/jdm.2005010101. ISSN 1063-8016.
  5. ^ a b c d e f Mackinnon, Murray J.; Glick, Ned (Sep 1999). "Data Mining and Knowledge Discovery in Databases - An Overview". Australian & New Zealand Journal of Statistics. 41 (3): 255–275. doi:10.1111/1467-842X.00081. ISSN 1369-1473.
  6. ^ a b c Babcock, Brian; Babu, Shivnath; Datar, Mayur; Motwani, Rajeev; Thomas, Dilys (2004-12-01). "Operator scheduling in data stream systems". The VLDB Journal. 13 (4): 333–353. doi:10.1007/s00778-004-0132-6. ISSN 1066-8888.
  7. ^ a b Sarantopoulos, Georgios (2003). "Data mining in retail credit". Operational Research. 3 (2): 99–122.
  8. ^ a b Lahiri, Bibudh; Tirthapura, Srikanta; Chandrashekar, Jaideep (2014). "Space-efficient tracking of persistent items in a massive data stream: Small-Space Algorithm to Detect Temporal Persistence". Statistical Analysis and Data Mining: The ASA Data Science Journal. 7 (1): 70–92. doi:10.1002/sam.11214.
  9. ^ Guzik, Keith (2009-10-13). "Discrimination by Design: predictive data mining as security practice in the United States' 'war on terrorism'". Surveillance & Society. 7 (1): 3–20. doi:10.24908/ss.v7i1.3304. ISSN 1477-7487.
  10. ^ a b Ng, KianSing; Liu, Huan (2000-12-01). "Customer Retention via Data Mining". Artificial Intelligence Review. 14 (6): 569–590. doi:10.1023/A:1006676015154.
  11. ^ Clarke, Roger (2001-01-01). "Person location and person tracking ‐ Technologies, risks and policy implications". Information Technology & People. 14 (2): 206–231. doi:10.1108/09593840110695767. ISSN 0959-3845.
  12. ^ a b c d e f Bolton, Richard J.; Hand, David J. (2002). "Statistical Fraud Detection: A Review". Statistical Science. 17 (3): 235–255. doi:10.1214/ss/1042727940.
  13. ^ a b c Koh, Hian Chye; Chan, Kin Leong Gerry (2002). "Data Mining and Customer Relationship Marketing in the Banking Industry". Singapore Management Review. 24 (2): 1–27 – via Berkeley Library.
  14. ^ a b c Bonchi, Francesco; Castillo, Carlos; Gionis, Aristides; Jaimes, Alejandro (2011-05-06). "Social Network Analysis and Mining for Business Applications". ACM Transactions on Intelligent Systems and Technology. 2 (3): 1–37. doi:10.1145/1961189.1961194. ISSN 2157-6904.
  15. ^ a b c d Westin, Alan F. (2003). "Social and Political Dimensions of Privacy". Journal of Social Issues. 59 (2): 431–453. doi:10.1111/1540-4560.00072.
  16. ^ a b c d e Hill, Shawndra; Provost, Foster; Volinsky, Chris (2006). "Network-Based Marketing: Identifying Likely Adopters via Consumer Networks". Statistical Science. 21 (2): 256–276 – via JSTOR.
  17. ^ a b c Daskalaki, S.; Kopanas, I.; Goudara, M.; Avouris, N. (2003-03-01). "Data mining for decision support on customer insolvency in telecommunications business". European Journal of Operational Research. 145 (2): 239–255. doi:10.1016/S0377-2217(02)00532-5. ISSN 0377-2217.
  18. ^ a b c d e Agur, Colin (2013). "Negotiated Order: The Fourth Amendment, Telephone Surveillance, and Social Interactions, 1878-1968". Information & Culture. 48 (4): 419–447 – via ProQuest.
  19. ^ a b c Bendix, William; Quirk, Paul J. (2016). "Deliberating Surveillance Policy: Congress, the FBI, and the Abuse of National Security Letters". Journal of Policy History. 28 (3): 447–469. doi:10.1017/S0898030616000178. ISSN 0898-0306.
  20. ^ a b c Theoharis, Athan (2016). "Expanding U.S. Surveillance Powers: The Costs of Secrecy". Journal of Policy History. 28 (3): 515–534. doi:10.1017/S0898030616000208. ISSN 0898-0306.
  21. ^ Busch, Nathan (2012). "Public-private partnerships in homeland security: Opportunities and challenges". Homeland Security Affairs. 8 (1): 1–25.

Peer review (Bobalily)

[edit]

General info

[edit]

Lead

[edit]

Guiding questions:

  • Has the Lead been updated to reflect the new content added by your peer? Yes
  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic? Yes
  • Does the Lead include a brief description of the article's major sections? No
  • Does the Lead include information that is not present in the article? No
  • Is the Lead concise or is it overly detailed? Concise

Lead evaluation

[edit]

The lead is easy to read and understand. It seems the major sections could be described in more detail in the lead.

Content

[edit]

Guiding questions:

  • Is the content added relevant to the topic? Yes
  • Is the content added up-to-date? Yes
  • Is there content that is missing or content that does not belong? No
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics? Yes

Content evaluation

[edit]

Content is retrieved from up-to-date sources listed at the end of the draft, and all content belongs. I like how you dedicated a section to introduce the background of Hancock programming.

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral? Yes
  • Are there any claims that appear heavily biased toward a particular position? No
  • Are there viewpoints that are overrepresented, or underrepresented? No
  • Does the content added attempt to persuade the reader in favor of one position or away from another? No

Tone and balance evaluation

[edit]

The tone is neutral, and no opinionated words are used.

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information? Yes
  • Are the sources thorough - i.e. Do they reflect the available literature on the topic? Yes
  • Are the sources current? Yes
  • Are the sources written by a diverse spectrum of authors? Do they include historically marginalized individuals where possible? Yes
  • Check a few links. Do they work? Yes

Sources and references evaluation

[edit]

The sources are mostly from the early 2000s; if there are more current sources, it might be better to include them. But it all seems good.

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read? Yes
  • Does the content added have any grammatical or spelling errors? Minor issues
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic? Yes

Organization evaluation

[edit]

The sections are separated neatly and all sections have a reasonable length without having a super lengthy or short section. Here are some grammatical suggestions I have:

Grammatical Errors/ Suggestions

Background:

Paragraph1: “increasingly interest in” to “increasingly interested in”, “the aim of data mining researchers was” to “data mining researchers aimed”

Paragraph2: “data miningprograms” to “data mining programs”

Paragraph3: “credit card purchase” to ”credit card purchases”

Marketing:

Paragraph4: “users movements” to “user’s movements”

NSA telecommunication records surveillance:

Paragraph4: “based on by” to “based on”

Images and Media

[edit]

Guiding questions: If your peer added images or media

  • Does the article include images that enhance understanding of the topic? Yes
  • Are images well-captioned? Yes
  • Do all images adhere to Wikipedia's copyright regulations? Yes
  • Are the images laid out in a visually appealing way? Yes

Images and media evaluation

[edit]

For New Articles Only

[edit]

If the draft you're reviewing is a new article, consider the following in addition to the above.

  • Does the article meet Wikipedia's Notability requirements - i.e. Is the article supported by 2-3 reliable secondary sources independent of the subject? Yes
  • How exhaustive is the list of sources? Does it accurately represent all available literature on the subject? Exhaustive, Yes
  • Does the article follow the patterns of other similar articles - i.e. contain any necessary infoboxes, section headings, and any other features contained within similar articles? Yes
  • Does the article link to other articles so it is more discoverable? Yes

New Article Evaluation

[edit]

Hyperlinks are added when keywords are mentioned in the draft, and there is also a "See Also" section that links to other relevant Wikipedia pages.

Overall impressions

[edit]

Guiding questions:

  • Has the content added improved the overall quality of the article - i.e. Is the article more complete? Yes
  • What are the strengths of the content added? Each section is well described and organized in a way readers can go through easily.
  • How can the content added be improved? Some minor grammatical suggestions and maybe more relevant sources.

Overall evaluation

[edit]

Great job on the article. Your hard work is seen in your writing!

Article Feedback (Leadership)

[edit]

Great job with your article! I really like how you broke down your topic into key sections, and I also found the information you included about data mining very interesting. Great job linking your article to multiple other articles and including a “See also” section. It is evident that you did great, thorough research. Also, great job maintaining a neutral tone throughout your article. I also liked the image that you chose to include in your article. Overall, very good job taking a complex topic and making it easy to follow and learn about!

Suggestions:

  • I would link “data mining” the first time you use it in your lead to the respective page. Also, I would remove the hyphen between data and mining each time you use it.
  • “Data mining research, including Hancock, grew during the 1990s, as scientific, business, and medical interest in massive data collection, storage, and management increased.”
    • I would add a citation after this sentence.
  • I would also suggest making your subheadings for the Application section like “Marketing” and “Fraud detection” sub-heading 1. I would also do the same for the subheadings in the Relation to surveillance section.
  • et al. should be italicized whenever used, and I would also suggest maybe introducing specific researchers you mention

Peer review - Jennifer (lilmeowmeow3161)

[edit]

General info

[edit]

Lead

[edit]

Guiding questions:

  • Has the Lead been updated to reflect the new content added by your peer? Yes
  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic? Yes
  • Does the Lead include a brief description of the article's major sections? Yes
  • Does the Lead include information that is not present in the article? Yes
  • Is the Lead concise or is it overly detailed? No

Lead evaluation

[edit]

The lead section is interesting, and I like that your caption describes AT&T's role as well! I like your point about Hancock coming under increased scrutiny post-9/11, which was a key detail for the background later on.

Content

[edit]

Guiding questions:

  • Is the content added relevant to the topic? Yes
  • Is the content added up-to-date? Yes
  • Is there content that is missing or content that does not belong? Not really, you have some interesting information on the processes (such as the databases, which you could link to other Wikipedia pages)
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics? Yes, it brings up anti-terrorist activities with regard to data mining, so it does address a specific niche that would have otherwise been overlooked.

Content evaluation

[edit]

The content is all relevant, especially the background section on Hancock. I like your sections on the "processes" because these categories actually use Hancock! I think that your section on NSA telecommunication surveillance was especially interesting and I thought that was an important addition.

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral? Somewhat; it leans toward privacy, in that the content concentrates on homeland security and its connection with Hancock.
  • Are there any claims that appear heavily biased toward a particular position? NA
  • Are there viewpoints that are overrepresented, or underrepresented? NA
  • Does the content added attempt to persuade the reader in favor of one position or away from another? Not really, it is primarily factual.

Tone and balance evaluation

[edit]

The tone is very neutral! The descriptions and sentences are somewhat technical, but that's good! No biases.

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information? Yes
  • Are the sources thorough - i.e. Do they reflect the available literature on the topic? You have around 21 sources which is good!
  • Are the sources current? Yes
  • Are the sources written by a diverse spectrum of authors? Do they include historically marginalized individuals where possible? Yes
  • Check a few links. Do they work? Links work

Sources and references evaluation

[edit]

The links work! I like how diverse your sources were!

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read? Yes
  • Does the content added have any grammatical or spelling errors? Not particularly, the sources that were complex had citations.
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic? Yes

Organization evaluation

[edit]

I think the organization was straightforward: it had the relation to surveillance followed by subsections and links that supported your article!

Images and Media

[edit]

Guiding questions: If your peer added images or media

  • Does the article include images that enhance understanding of the topic? Yes
  • Are images well-captioned? Yes, well captioned
  • Do all images adhere to Wikipedia's copyright regulations? I'm assuming so?
  • Are the images laid out in a visually appealing way? Yes

Images and media evaluation

[edit]

I liked your picture with AT&T because it was neutral, but if you want to add some more pictures of the language itself, that may help readers visualize it.

Overall evaluation

[edit]

Overall, I think this article is pretty good! I like that you were able to maintain neutrality even with your topic focusing on national security, the NSA, and AT&T without having a bend in a particular direction. Overall, you included a lot of links and that was helpful as well! Good job!

Peer review - James Wang

[edit]

General info

[edit]

Lead

[edit]

Guiding questions:

  • Has the Lead been updated to reflect the new content added by your peer?
  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic?
  • Does the Lead include a brief description of the article's major sections?
  • Does the Lead include information that is not present in the article?
  • Is the Lead concise or is it overly detailed?

Lead evaluation

[edit]

The lead section is great and includes clear sentences about the programming language! It includes everything that's presented in the article! To improve, maybe consider adding a few sentences about the applications of Hancock.

Content

[edit]

Guiding questions:

  • Is the content added relevant to the topic?
  • Is the content added up-to-date?
  • Is there content that is missing or content that does not belong?
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics?

Content evaluation

[edit]

The content is all relevant, especially the background section on Hancock. I like your sections on data mining and databases because these categories actually use Hancock! The applications part is good as well. One thing to consider might be to tie together how exactly the Hancock programming language connects to the surveillance activities of the FBI and NSA. Another thing to consider is the weaknesses and shortcomings of Hancock.

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral?
  • Are there any claims that appear heavily biased toward a particular position?
  • Are there viewpoints that are overrepresented, or underrepresented?
  • Does the content added attempt to persuade the reader in favor of one position or away from another?

Tone and balance evaluation

[edit]

The tone is very neutral! The descriptions and sentences are somewhat technical, but that's good! No biases.

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information?
  • Are the sources thorough - i.e. Do they reflect the available literature on the topic?
  • Are the sources current?
  • Are the sources written by a diverse spectrum of authors? Do they include historically marginalized individuals where possible?
  • Check a few links. Do they work?

Sources and references evaluation

[edit]

The links work! Also, good job on linking the article to a lot of different topics, examples, and cases/laws!

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read?
  • Does the content added have any grammatical or spelling errors?
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic?

Organization evaluation

[edit]

I think the contents are well organized, but maybe some topics could be broken down into subtopics, for example marketing and data mining!

Images and Media

[edit]

Guiding questions: If your peer added images or media

  • Does the article include images that enhance understanding of the topic?
  • Are images well-captioned?
  • Do all images adhere to Wikipedia's copyright regulations?
  • Are the images laid out in a visually appealing way?

Images and media evaluation

[edit]

Consider adding some images related to examples of Hancock codes, how it's used in marketing or fraud detection. Maybe also show pictures of FBI investigation and such.

For New Articles Only

[edit]

If the draft you're reviewing is a new article, consider the following in addition to the above.

  • Does the article meet Wikipedia's Notability requirements - i.e. Is the article supported by 2-3 reliable secondary sources independent of the subject?
  • How exhaustive is the list of sources? Does it accurately represent all available literature on the subject?
  • Does the article follow the patterns of other similar articles - i.e. contain any necessary infoboxes, section headings, and any other features contained within similar articles?
  • Does the article link to other articles so it is more discoverable?

New Article Evaluation

[edit]

It meets the notability requirement! The sources are all up-to-date and represent a good range of topics related to the Hancock coding language, data mining, etc. The article also does a good job linking to other articles on Wikipedia.

Overall impressions

[edit]

Guiding questions:

  • Has the content added improved the overall quality of the article - i.e. Is the article more complete?
  • What are the strengths of the content added?
  • How can the content added be improved?

Overall evaluation

[edit]

Overall, I think this article is pretty great! Alex, you have a lot of good content on here and you divide it up into nice topics as well! I like how you have a category for surveillance, but perhaps include how the Hancock language is applied in these surveillance situations. Also consider adding some images and breaking up some long sections to make paragraphs shorter and easier to read. Overall, nice job!

Peer review (Nankingaszz)

[edit]

General info

[edit]

Lead

[edit]

Guiding questions:

  • Has the Lead been updated to reflect the new content added by your peer?
  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic?
  • Does the Lead include a brief description of the article's major sections?
  • Does the Lead include information that is not present in the article?
  • Is the Lead concise or is it overly detailed?

Lead evaluation

[edit]

The Lead includes a great introduction for the article's topic! It also mentions all major sections in the article and it's concise.

Content

[edit]

Guiding questions:

  • Is the content added relevant to the topic?
  • Is the content added up-to-date?
  • Is there content that is missing or content that does not belong?
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics?

Content evaluation

[edit]

The content added is all relevant to the topic and up-to-date. No specific Wikipedia equity gaps or underrepresented groups are addressed in the article.

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral?
  • Are there any claims that appear heavily biased toward a particular position?
  • Are there viewpoints that are overrepresented, or underrepresented?
  • Does the content added attempt to persuade the reader in favor of one position or away from another?

Tone and balance evaluation

[edit]

The content is neutral and no particular points are overrepresented. All the information is presented in a neutral way.

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information?
  • Are the sources thorough - i.e. Do they reflect the available literature on the topic?
  • Are the sources current?
  • Are the sources written by a diverse spectrum of authors? Do they include historically marginalized individuals where possible?
  • Check a few links. Do they work?

Sources and references evaluation

[edit]

There are abundant citations and hyperlinks in the article and they are thorough. All the citations are from around 2000-2019 so they are current. The links I checked all worked and there is a diversity of authors.

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read?
  • Does the content added have any grammatical or spelling errors?
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic?

Organization evaluation

[edit]

The article is very well-organized! I really like how you not only mention its applications but also its potential harms (surveillance); the different sections of the topic make your article more reliable and more useful. The background section also provides readers with an overall insight, which is great.

Images and Media

[edit]

Guiding questions: If your peer added images or media

  • Does the article include images that enhance understanding of the topic?
  • Are images well-captioned?
  • Do all images adhere to Wikipedia's copyright regulations?
  • Are the images laid out in a visually appealing way?

Images and media evaluation

[edit]

The article has images that help understanding of its topic and the caption under the image explains well. The image is very straightforward and provides the most direct information to readers.

For New Articles Only

[edit]

If the draft you're reviewing is a new article, consider the following in addition to the above.

  • Does the article meet Wikipedia's Notability requirements - i.e. Is the article supported by 2-3 reliable secondary sources independent of the subject?
  • How exhaustive is the list of sources? Does it accurately represent all available literature on the subject?
  • Does the article follow the patterns of other similar articles - i.e. contain any necessary infoboxes, section headings, and any other features contained within similar articles?
  • Does the article link to other articles so it is more discoverable?

New Article Evaluation

[edit]

The article meets all the points above!

Overall impressions

[edit]

Guiding questions:

  • Has the content added improved the overall quality of the article - i.e. Is the article more complete?
  • What are the strengths of the content added?
  • How can the content added be improved?

Overall evaluation

[edit]

Your article looks great! I like how you first introduce the background of the concept and provide its applications in addition to its relation to surveillance. The varied information helps readers treat the concept more critically than they would if only the surveillance section were included.

Peer Review (Lolabaylo)

[edit]

This is where you will complete your peer review exercise. Please use the following template to fill out your review.

General info

[edit]

Lead

[edit]

Guiding questions:

  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic? Yes - the introductory sentence is succinct, clear, and gives an overview of the Hancock programming language and its developers.
  • Does the Lead include a brief description of the article's major sections? Yes - it briefly touches upon how Hancock works, the uses of Hancock, and its surveillance concerns.
  • Does the Lead include information that is not present in the article? No.
  • Is the Lead concise or is it overly detailed? The Lead is concise, but touches upon each major section listed in the rest of the article.

Lead evaluation

[edit]

Overall, a really strong Lead that provides a succinct and comprehensive overview of Hancock programming language.

Content

[edit]

Guiding questions:

  • Is the content added up-to-date? Content is up-to-date: all content is relevant and pertains to events from the early 2000s and 2010s.
  • Is there content that is missing or content that does not belong? - No.
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics? - Historically underrepresented populations are not explicitly addressed, but this is difficult to do given the article's subject. It does discuss how vulnerable populations are targeted and surveilled with Hancock, however.

Content evaluation

[edit]

Overall, the content is informative, clear, and well-detailed.

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral? Yes - there are no opinionated statements included.
  • Are there any claims that appear heavily biased toward a particular position? No.
  • Are there viewpoints that are overrepresented, or underrepresented? No. It discusses some benefits to Hancock - such as assisting AT&T to detect fraud - while also presenting its negative aspects - such as its use in surveillance.
  • Does the content added attempt to persuade the reader in favor of one position or away from another? No - maintains a neutral, unbiased tone throughout.

Tone and balance evaluation

[edit]

Overall, the tone is neutral and formal. The writing style seeks to inform rather than persuade.

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information? Yes - all information has an in-text citation.
  • Are the sources thorough and current? - i.e. Do they reflect the available literature on the topic? Yes - the sources are all from reputable academic journals. These journals are varied and reflect the diversity of available literature on the topic: there are statistics, artificial intelligence, and public policy academic journals cited. These sources are current: all were published in the early 2000s or 2010s.
  • Check a few links. Do they work? All links I tried work.

Sources and references evaluation

[edit]

Overall, the sources look reputable, strong, and relevant. Adding your last 4 sources to the bibliography would help strengthen the article.

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read? Yes - content is easy to understand and succinct, but still maintains a formal tone.
  • Does the content added have any grammatical or spelling errors? No grammar or spelling errors found.
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic? Yes - I think the organization and flow is very intuitive. I like how the first sections provide a history of Hancock and the way it works in the "Background" and "Processes" sections. Then, the next sections delve into the surveillance issues associated with Hancock.

Organization evaluation

[edit]

Overall, organization is intuitive and clear.

Images and Media

[edit]

N/A - no images or media provided.

For New Articles Only

[edit]

If the draft you're reviewing is a new article, consider the following in addition to the above.

  • Does the article meet Wikipedia's Notability requirements - i.e. Is the article supported by 2-3 reliable secondary sources independent of the subject? Yes - there are a sufficient number of reliable, secondary resources to support this article, and the article addresses the topic of Hancock directly and in detail.
  • How exhaustive is the list of sources? Does it accurately represent all available literature on the subject? There are 16 sources supporting this article. They do represent a wide variety of available literature on this subject, drawing from statistics, artificial intelligence, and public policy academic journals.
  • Does the article follow the patterns of other similar articles - i.e. contain any necessary infoboxes, section headings, and any other features contained within similar articles? - Headings and organization are similar to other articles. Infoboxes and media are not provided; including these things could make the article a bit more engaging and comprehensive.
  • Does the article link to other articles so it is more discoverable? Yes - hyperlinks to other Wikipedia articles are provided throughout.

Overall impressions

[edit]

Guiding questions:

  • Has the content added improved the overall quality of the article - i.e. Is the article more complete? This original article provides detailed content on the Hancock programming language.
  • What are the strengths of the content added? The content is thorough, neutral, and addresses positive and negative aspects of Hancock programming language.
  • How can the content added be improved? Possibly adding media and infoboxes.

Overall evaluation

[edit]

Overall, a great start to creating an original Wikipedia article. It seems super fleshed out and professional!

Peer Review (Nicholas100000)

[edit]

Lead

[edit]

Overall, the lead is concise and gives a good definition of Hancock in the introductory sentence. It also concisely explains its history, use, and public scrutiny, all topics that are further explained later in the article. The only thing I would mention is that there is no explanation of what a data stream is, but there is a wiki article on it, so you can probably just link to it.

Content

[edit]

The content is relevant, but I am looking forward to seeing how Hancock is connected to surveillance and more information on why it was scrutinized. The most recent event in the article is the 2014 independent audit, so I am not sure how up to date it is. However, you do a good job of listing relevant legislation such as the USA Patriot Act.

Tone and Balance

[edit]

The content added is neutral. The article sticks to facts and events that occurred, and when an opinion is mentioned it is attributed to the person who said it, so it is not an unclaimed opinion.

Sources and References

[edit]

All 10 sources seem to come from reliable journals. In addition, I tested 4 of them and the links work. There are only 2 articles from 2016, but the use of older articles reflects the large time frame that the topic deals with. The sources also seem to be diverse, with authors of different genders and ethnicities. I am not sure if you need to, but none of the statements in the lead are backed up by a secondary source.

Organization

[edit]

The content logically flows from background and processes to applications and relations to surveillance. This is logical because it allows the reader to first understand what Hancock is, how it works, and then its impact. The content is also concisely written, and brief explanations of terminology help readers understand it. I found no grammatical or spelling errors.

For New Articles Only

[edit]

The article is supported by links to Wikipedia's data mining and C (programming language) articles, among others. The list of sources is not exhaustive, but it is a strong start that spans multiple bodies of literature. The article also effectively uses section headings, links, and an introductory sentence.

Overall impressions

[edit]

Overall, I am impressed with the article. It covers a topic that I am not familiar with and have little expertise in, but I was able to read and comprehend the content. Accordingly, the article does a good job of concisely explaining complex terminology. The article also has good organization of its sections. The content is still being written, but it seems like you know what needs to be added.

Peer Review (Madssnake)

[edit]

Lead

[edit]

Your lead is well written, with a clear opening sentence that informs the reader about the Hancock language. I like how you preface a little bit about surveillance and why this topic has become more relevant in the past decade.

“The language was intended by its creators to improve the efficiency and scale of data-mining.” How was the efficiency improved? Maybe briefly include how it is more efficient (this will preface the process section)

copy edit: “Hancock works by creating profiles of individuals”

Content

[edit]

Your content is relevant and covers the whole range of Hancock’s timeline. I think there are some areas where you can switch from past tense to present tense, because although Hancock emerged two decades ago, the process is still current. Your content is encyclopedic and covers all aspects of Hancock programming.

copy edit: (Marketing section) “...location of users. But it also allowed...” connect the sentences or rephrase the start of the second sentence (don’t start a sentence with But)

Tone and Balance

[edit]

This article is written with a neutral tone. You do a good job of neutrally presenting both viewpoints of a situation. There is also a good balance between topics, with each section getting enough content. Since you have not yet added why Hancock is related to government access to telecommunication records, the last section may get a bit long compared to the rest of the article, so I would make sure to add subsections to make that section easier to navigate.

Sources and References

[edit]

I know you have a See also section, but I would also add hyperlinks to Wikipedia articles for terms early on and throughout the article so people can quickly catch up on definitions if they don't know much about certain topics. Also, add some citations to your lead. Other than that, you have references in every paragraph and use reliable sources. Although your topic may be a bit dated, you also include more recent sources from 2016, which is good. Are there any sources more recent than 2016? It would be nice to have a 2020 perspective if you can find any. Great job citing sources.

Organization

[edit]

This article is really well organized, and the transitions from background and process to applications and security concerns flow very smoothly. Your sections are easy to follow and fairly concise. Maybe see if you can break down the surveillance section into a few more subsections?

For New Articles Only

[edit]

Your new article passes Wikipedia's notability requirements, as you have gathered a lot of reliable sources on your subject. It is well formatted, with a lead, sections, a See also section, and a reference list. I would link to other Wikipedia articles within this article, and then, when your article is completed, go to those related articles and link the Hancock article on those pages so it can be discovered.

Overall impressions

[edit]

Overall, this is a really well written article with ample content in all sections. The writing is encyclopedic, keeping a neutral tone and presenting both points of view. I really like your lead, as I think it does a great job of prefacing the article. I also like how you provided a background section for further clarification on Hancock's history before diving into the more complicated matters. My last suggestion would be, after hyperlinking Wikipedia articles throughout your article, to remove some of the less related See also links. Great article so far! Madssnake (talk) 00:58, 31 October 2020 (UTC)

Peer Review (Hiiisparks)

[edit]

Lead

[edit]

The lead does a good job of introducing what Hancock programming is and the reason behind its existence. Your Background section provides really good information on the history of data-mining research as it relates to Hancock programming; readers will be able to learn a lot and follow along with the history of its development. Also, I think you should introduce the author before just stating his name as "Smyth."

copy edit: Maybe try to change this wording: "which Hancock can be included in"

Content

[edit]

The content is very relevant to the topic, and everything seems to be written in good faith to better inform the readers about the topic. I enjoyed reading about the applications and liked how you brought up how Hancock solved the issues viral marketing promoters face. I think it would be a good idea to bring up more applications if possible.

copy edit: First sentence part: "subjects, and has" doesn't need a comma.

Tone and Balance

[edit]

The article is written in a neutral tone and doesn't appear to have any bias. Any time there is an opinion, you make sure to attribute it to whoever said it. It seems very factual and does not try to push the reader in favor of one view.

Sources and References

[edit]

In the FBI telecommunication records surveillance section, I think you can probably just cite the article once at the end since you are constantly referring to it. There is no need to be redundant when the passage refers to only one source. You do not miss a citation, which is good. The only other thing would be to look over the article to see if you are missing any possible wiki links, such as e-commerce.

Organization

[edit]

In the Background section, the part where it talks about how Hancock is used is a bit confusing. I think a better way to introduce this information would be to describe how the program was used in each time period; that way there is a clearer structure to the changes in how the data was used. The second paragraph in your Processes section can be moved to your lead, as I feel it helps introduce the concept rather than the process. The Processes section is a bit hard to follow; using bullet points or numbers could help readers identify the steps in the process. Overall there is no use of bullet points or numbered lists, and finding ways to implement them would make the article easier to read than a solid chunk of words. For the Processes section, you could also create a subsection to collect all the opinions so they do not take away from the main focus of describing the process.

For New Articles Only

[edit]

There is a long list of reliable sources that back up the subject. The article follows a structure similar to other articles, with a history/background section, an application section, a section on how it actually works, and its relation to surveillance. There are also a decent number of wiki links, and everything has a reference link, which is good. I think there could be more additional sections, but I know those will be added in the long run.

Overall impressions

[edit]

It is very evident you did a lot of research and provide the readers with a lot to learn. I think the tone and content are good, and I assume you have more that you plan on adding. One of my main suggestions would be to use bullet points to organize your paragraphs better, since not all of the information ties together; a large block of information is hard to process as a reader. Keep up the good work, I look forward to seeing how your article improves!

Peer review (Exploredragon)

[edit]

General info

[edit]

Lead

[edit]

Guiding questions:

  • Has the Lead been updated to reflect the new content added by your peer? Yes.
  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic? Yes.
  • Does the Lead include a brief description of the article's major sections? Yes.
  • Does the Lead include information that is not present in the article? No.
  • Is the Lead concise or is it overly detailed? Concise.

Lead evaluation

[edit]

I really like your lead!

Content

[edit]

Guiding questions:

  • Is the content added relevant to the topic? Yes
  • Is the content added up-to-date? Yes
  • Is there content that is missing or content that does not belong? No
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics? Yes

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral? Yes
  • Are there any claims that appear heavily biased toward a particular position? No
  • Are there viewpoints that are overrepresented, or underrepresented? No
  • Does the content added attempt to persuade the reader in favor of one position or away from another? No

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information? Yes
  • Are the sources thorough - i.e. Do they reflect the available literature on the topic? Yes
  • Are the sources current? Yes
  • Are the sources written by a diverse spectrum of authors? Do they include historically marginalized individuals where possible? Yes
  • Check a few links. Do they work? Yes

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read? Concise
  • Does the content added have any grammatical or spelling errors? No
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic? Yes

For New Articles Only

[edit]

If the draft you're reviewing is a new article, consider the following in addition to the above.

  • Does the article meet Wikipedia's Notability requirements - i.e. Is the article supported by 2-3 reliable secondary sources independent of the subject? Yes
  • How exhaustive is the list of sources? Does it accurately represent all available literature on the subject? Yes
  • Does the article follow the patterns of other similar articles - i.e. contain any necessary infoboxes, section headings, and any other features contained within similar articles? Yes
  • Does the article link to other articles so it is more discoverable? Yes

Overall impressions

[edit]

Overall evaluation

[edit]

I feel this article is really ready to be presented in the main space. Well done!

Plusoneplusone Peer review

[edit]

This is where you will complete your peer review exercise. Please use the following template to fill out your review.

General info

[edit]

Lead

[edit]

Guiding questions:

  • Has the Lead been updated to reflect the new content added by your peer? Yes.
  • Does the Lead include an introductory sentence that concisely and clearly describes the article's topic? Yes, the first sentence.
  • Does the Lead include a brief description of the article's major sections? Yes.
  • Does the Lead include information that is not present in the article? No.
  • Is the Lead concise or is it overly detailed? It is pretty concise.

Lead evaluation

[edit]

Content

[edit]

Guiding questions:

  • Is the content added relevant to the topic? All the sections are relevant.
  • Is the content added up-to-date? The content in this article is up to 2016.
  • Is there content that is missing or content that does not belong? No.
  • Does the article deal with one of Wikipedia's equity gaps? Does it address topics related to historically underrepresented populations or topics? Since this topic is a programming language, I think it's pretty difficult to address Wikipedia's equity gaps.

Content evaluation

[edit]

Tone and Balance

[edit]

Guiding questions:

  • Is the content added neutral? Yes.
  • Are there any claims that appear heavily biased toward a particular position? No.
  • Are there viewpoints that are overrepresented, or underrepresented? No.
  • Does the content added attempt to persuade the reader in favor of one position or away from another? No.

Tone and balance evaluation

[edit]

Sources and References

[edit]

Guiding questions:

  • Is all new content backed up by a reliable secondary source of information? Yes.
  • Are the sources thorough - i.e. Do they reflect the available literature on the topic? Yes, I can see a variety of sources here.
  • Are the sources current? Yes.
  • Are the sources written by a diverse spectrum of authors? Do they include historically marginalized individuals where possible? It's written by a diverse spectrum of authors.
  • Check a few links. Do they work? Yes.

Sources and references evaluation

[edit]

Organization

[edit]

Guiding questions:

  • Is the content added well-written - i.e. Is it concise, clear, and easy to read? Yes.
  • Does the content added have any grammatical or spelling errors? There are rarely any grammatical or spelling errors.
  • Is the content added well-organized - i.e. broken down into sections that reflect the major points of the topic? Yes, I can see the content flows in a comprehensive way.

Organization evaluation

[edit]

Images and Media

[edit]

Guiding questions: If your peer added images or media

  • Does the article include images that enhance understanding of the topic? Yes.
  • Are images well-captioned? Yes.
  • Do all images adhere to Wikipedia's copyright regulations? Yes.
  • Are the images laid out in a visually appealing way? Yes.

Images and media evaluation

[edit]

For New Articles Only

[edit]

If the draft you're reviewing is a new article, consider the following in addition to the above.

  • Does the article meet Wikipedia's Notability requirements - i.e. Is the article supported by 2-3 reliable secondary sources independent of the subject? Yes, it is supported by more than 3 reliable sources.
  • How exhaustive is the list of sources? Does it accurately represent all available literature on the subject? Yes.
  • Does the article follow the patterns of other similar articles - i.e. contain any necessary infoboxes, section headings, and any other features contained within similar articles? Yes.
  • Does the article link to other articles so it is more discoverable? Yes.

New Article Evaluation

[edit]

Overall impressions

[edit]

Guiding questions:

  • Has the content added improved the overall quality of the article - i.e. Is the article more complete? The content has delivered a comprehensive overview of the topic.
  • What are the strengths of the content added? In general, this is already a good piece of work. The content added is all relevant and the structure flows in a comprehensive way. There is also no problem with the general tone and the sources.
  • How can the content added be improved? I would suggest adding citations in the lead section and expanding a little more on the fraud detection section. Great work!

Overall evaluation

[edit]