Amazon Mechanical Turk
Alexa rank: 8,419 (January 2013)
The Amazon Mechanical Turk (MTurk) is a crowdsourcing Internet marketplace that enables individuals and businesses (known as Requesters) to coordinate the use of human intelligence to perform tasks that computers are currently unable to do. It is one of the services of Amazon Web Services. Requesters can post tasks known as HITs (Human Intelligence Tasks), such as choosing the best among several photographs of a storefront, writing product descriptions, or identifying performers on music CDs. Workers (called Providers in Mechanical Turk's Terms of Service, or, more colloquially, Turkers) can then browse existing tasks and complete them for a monetary payment set by the Requester. To place HITs, requesting programs use an open application programming interface (API) or the more limited MTurk Requester site. Requesters are restricted to US-based entities.
Requesters can require that Workers hold certain Qualifications before accepting a task, and they can set up a test to verify a Qualification. They can also accept or reject the result a Worker submits, which affects the Worker's reputation. Workers can have an address anywhere in the world. Payments for completed tasks can be redeemed on Amazon.com via gift certificate (the only payment option available to international Workers, apart from those in India) or transferred to a Worker's U.S. bank account. Requesters, which are typically businesses, pay Amazon a commission of 10 percent of the price of successfully completed HITs.
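The cost structure this implies is simple to sketch. The following Python snippet is an illustrative calculation, not Amazon's billing code; the 10 percent rate mirrors the commission described above, and actual fee schedules have varied over time:

```python
def requester_cost(reward_per_hit, completed_hits, fee_rate=0.10):
    """Total cost to a Requester: Worker rewards plus Amazon's commission.

    fee_rate mirrors the 10 percent figure described in the text; it is
    an assumption for illustration, since fee schedules have changed.
    """
    rewards = reward_per_hit * completed_hits
    fee = rewards * fee_rate
    return rewards + fee

# 1,000 HITs at $0.05 each: $50.00 in rewards plus a $5.00 commission.
total_cost = requester_cost(0.05, 1000)
```

For 1,000 five-cent HITs the Requester's outlay is $55.00: $50.00 paid out to Workers and $5.00 retained by Amazon.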
The name Mechanical Turk comes from "The Turk", a chess-playing automaton of the 18th century, which was made by Wolfgang von Kempelen. It toured Europe, beating the likes of Napoleon Bonaparte and Benjamin Franklin. It was later revealed that this "machine" was not an automaton at all, but was in fact a chess master hidden in a special compartment controlling its operations. Likewise, the Mechanical Turk web service allows humans to help the machines of today perform tasks for which they are not suited.
History, HIT types, and user demographics
The service was initially invented for Amazon's in-house use by Peter Cohen, to find duplicates among its web pages describing products.
MTurk was launched publicly on November 2, 2005 and, as of January 2014, was still in beta. Following its launch, the Mechanical Turk user base grew quickly. In early- to mid-November 2005, the system listed tens of thousands of HITs, all of them uploaded by Amazon itself for internal tasks that required human intelligence; most were related to music CD items. Web traffic rose sharply near the beginning of December.
However, the number of HITs in the system soon decreased, and by December 20, fewer than 100 groups of HITs appeared on an average page load. In January, new types of HITs were introduced, such as top-three-list rankings for the (now defunct) Amazon Unspun site, and third-party HITs began to appear as well. By April 2006, only occasional batches of about 25 HIT groups were being offered, and activity on the service had slowed to a crawl.
As of January 2007, new kinds of HITs were being offered, including podcast transcription, rating, and image tagging. The transcription HITs, mostly offered by CastingWords, were still being posted regularly as of May 2013. Other common HIT types ask Turkers to write or rewrite sentences, paragraphs, or whole articles, with rewards ranging from one cent to about $10. HITs that reward people for linking to or commenting on a blog, or for friending a person on Facebook, are also often encountered, as are surveys.
In March 2007, there were reportedly more than 100,000 workers in over 100 countries; this grew to over 500,000 workers from over 190 countries by January 2011. In the same year, techlist published an interactive map pinpointing the locations of 50,000 MTurk workers around the world.
Worker and Requester Interfaces
From the home page of the Mechanical Turk website, users can sign up either as a Worker or as a Requester; the two interfaces differ to accommodate the needs of each role. Workers have access to a dashboard with three tabs: Total Earnings, Your HIT Status, and HIT Totals. Total Earnings displays the amount a Worker has earned from completing HITs, the amount earned from bonuses (which any HIT can award), and the sum of the two. Your HIT Status lists daily activity and daily earnings, along with the number of HITs submitted, approved, rejected, or pending that day. HIT Totals summarizes accepted and submitted HITs, including the percentages of HITs that were submitted, returned, or abandoned, and the percentages of submitted HITs that were approved, rejected, or pending.
Workers find work by browsing the large database of HITs available through Mechanical Turk. Some HITs require Qualifications, which Workers earn either by passing a qualification test or by reaching benchmarks such as completing a certain number of HITs or maintaining an approval rate above a certain percentage. Mechanical Turk Masters are Workers who have demonstrated consistent accuracy on specific types of HITs across a wide variety of Requesters. As of September 2012, there were two types of Masters: Data Categorization Masters and Photo Moderation Masters. Masters are granted access to higher levels of work and to a private forum.
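A benchmark-style Qualification of the kind described above can be modeled as a simple predicate. The thresholds below are hypothetical examples chosen for illustration, not Amazon's actual requirements:

```python
def meets_qualification(approved_hits, submitted_hits,
                        min_hits=1000, min_approval_rate=0.95):
    """Return True if a Worker clears a benchmark-style Qualification.

    min_hits and min_approval_rate are hypothetical thresholds;
    Requesters configure their own values on the real platform.
    """
    if submitted_hits < min_hits:
        return False
    return approved_hits / submitted_hits >= min_approval_rate

# A Worker with 980 approvals out of 1,000 submissions qualifies here;
# one with 99 approvals out of 100 does not, for lack of volume.
```

The point of gating on both volume and approval rate is that a high percentage over a handful of HITs says little about reliability.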
All payments on Mechanical Turk are processed through Amazon Payments. Requesters use a separate interface designed to help them create tasks for other users to complete. A Requester can choose among many templates for designing a HIT, including writing, survey, translation, and categorization templates, and has wide flexibility: it is entirely up to the Requester how to design, publish, and manage a HIT. Once published, a HIT is added to the database of HITs available for Mechanical Turk Workers to complete.
Missing persons searches
In 2007, the service began to be used to search for prominent missing individuals. It was first suggested during the search for James Kim, but his body was found before any technical progress was made. That summer, computer scientist Jim Gray disappeared on his yacht, and Amazon's Werner Vogels, a personal friend, arranged for DigitalGlobe, which provides satellite data for Google Maps and Google Earth, to put recent imagery of the Farallon Islands on Mechanical Turk. A front-page story on Digg attracted 12,000 searchers, who worked alongside imaging professionals on the same data. The search was unsuccessful.
In September 2007, a similar arrangement was repeated in the search for aviator Steve Fossett. Satellite imagery was divided into 85-square-meter sections, and Mechanical Turk users were asked to flag images with "foreign objects" that might be a crash site or other evidence warranting closer examination. This search was also unsuccessful, partly due to the limited search area: the satellite imagery covered mostly a 50-mile radius, while the crash site was eventually found by hikers about a year later, 65 miles away.
Social science experiments
Beginning in 2010, numerous researchers have explored the viability of Mechanical Turk for recruiting subjects of social-science experiments. In general, they found that while the sample of respondents obtained through Mechanical Turk does not perfectly match the characteristics of the U.S. population, it does not present a wildly inaccurate view either. They determined that the service works best for random population sampling and is less successful with studies that require more precisely defined populations. Overall, the US MTurk population is mostly female and white, and is somewhat younger and more educated than the US population as a whole.
The cost of MTurk was considerably lower than other means of conducting surveys, with workers willing to complete tasks for less than half the US minimum wage.
Artistic and educational research
In addition to growing interest from the social sciences, MTurk has been used as a tool for artistic and educational exploration. Artist Aaron Koblin has used MTurk's crowdsourcing power to create a number of collaborative artworks, such as The Sheep Market and Ten Thousand Cents, the latter combining thousands of individual drawings into an image of a US$100 bill. The work functions as a sort of reverse exquisite-corpse drawing.
Inspired by Koblin's collaborative artworks, Concordia University graduate researcher Scott McMaster turned to MTurk to see whether crowdsourcing could also serve educational research. McMaster conducted two pilot projects that used HITs to request drawings, but unlike in Koblin's work, the Turkers (or participants) knew exactly what the drawings were being used for. The HITs required participants to represent sets of words visually in drawings and to fill out a short demographic survey. Although the research is in its infancy, McMaster reported several findings suggesting that a globalizing effect is taking place within visual cultural representations. It is also perhaps one of the first published instances of this type of online research into visual culture.
Programmers have developed various browser extensions and scripts designed to simplify the process of completing HITs. According to the Amazon Web Services Blog, however, Amazon disapproves of those that fully automate the process and remove the human element, and accounts using such automated bots have been banned.
Amazon makes available an API to give users another access point to the MTurk system. The MTurk API lets a programmer submit HITs to MTurk, retrieve completed work, and approve or reject that work. Web sites and web services can use the API to integrate MTurk work into other web applications, providing users with alternatives to the interface Amazon has built for these functions.
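As a concrete sketch of that workflow, the snippet below builds the two pieces a HIT-creation request needs: a Question payload (here in the HTMLQuestion XML format) and the request parameters. The schema URL and parameter names follow the MTurk API documentation as best understood here and should be checked against the current reference; actually submitting the request requires AWS credentials (e.g. via boto3's `create_hit`), which is out of scope for this sketch:

```python
import xml.etree.ElementTree as ET

# Schema URL as documented for the HTMLQuestion data structure (assumption:
# verify against the current MTurk API reference before use).
HTML_QUESTION_SCHEMA = (
    "http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/"
    "2011-11-11/HTMLQuestion.xsd"
)

def html_question(html, frame_height=450):
    """Wrap task HTML in the HTMLQuestion XML envelope MTurk expects."""
    return (
        f'<HTMLQuestion xmlns="{HTML_QUESTION_SCHEMA}">'
        f"<HTMLContent><![CDATA[{html}]]></HTMLContent>"
        f"<FrameHeight>{frame_height}</FrameHeight>"
        "</HTMLQuestion>"
    )

def hit_request(title, description, reward, question_xml):
    """Assemble parameters for a HIT-creation call (illustrative values)."""
    return {
        "Title": title,
        "Description": description,
        "Reward": reward,                    # a string, in US dollars
        "MaxAssignments": 3,                 # distinct Workers per HIT
        "LifetimeInSeconds": 86400,          # how long the HIT stays listed
        "AssignmentDurationInSeconds": 600,  # time a Worker has to finish
        "Question": question_xml,
    }

question = html_question("<p>Which photo shows the storefront best?</p>")
request = hit_request(
    "Pick the best photo", "Choose among several photos", "0.05", question
)
# With credentials configured, the request could then be submitted, e.g.:
#   boto3.client("mturk").create_hit(**request)
```

Retrieving completed assignments and approving or rejecting them follow the same pattern: further API operations taking the HIT and assignment identifiers returned by the service.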
Amazon coined the term "artificial artificial intelligence" for processes that outsource some parts of a computer program to humans, for tasks that humans carry out much more readily than computers. Jeff Bezos was responsible for the concept, which led to the development of Mechanical Turk to realize the process.
MTurk is comparable in some respects to the now discontinued Google Answers service. However, the Mechanical Turk is a more general marketplace that can potentially help distribute any kind of work tasks all over the world. The Collaborative Human Interpreter (CHI) by Philipp Lenssen also suggested using distributed human intelligence to help computer programs perform tasks that computers cannot do well. MTurk could be used as the execution engine for the CHI.
Because HITs are typically simple, repetitive tasks for which users are often paid only a few cents, some critics have called Mechanical Turk a "digital sweatshop". At the same time, Workers set their own hours and are under no obligation to accept work they do not wish to do. Because Workers are paid as independent contractors rather than employees, Requesters do not have to file payroll paperwork or pay payroll taxes, and they avoid laws regarding minimum wage, overtime, and workers' compensation. Workers, though, must report their income as self-employment income. In addition, some Requesters have taken advantage of Workers by having them do tasks and then rejecting their submissions in order to avoid paying. However, at least some Workers on Mechanical Turk do the work for fun. If the many microtasks assigned are done quickly, the average wage is about one dollar an hour, with each task paying a few cents on average.
Amazon.com does not monitor the service and refers all complaints to the poster of the HIT.
- "mturk.com Site Info". Alexa Internet. Retrieved 2013-01-26.
- "Overview | Requester | Amazon Mechanical Turk". Requester.mturk.com. Retrieved 2011-11-28.
- "Can international Requesters use Amazon Mechanical Turk to get tasks completed". mturk.com. Retrieved 2012-06-13.
- Artificial Intelligence, With Help From the Humans, The New York Times, 25 March 2007
- "Turker Index - Turkers 2.0 (beta)". Turkers.castingwords.com. Retrieved 2011-11-28.
- "AWS Developer Forums". Retrieved 14 November 2012.
- Tamir, Dahn. "50000 Worldwide Mechanical Turk Workers". techlist. Retrieved 8 November 2011.
- Panos Ipeirotis (March 19, 2008). "Mechanical Turk: The Demographics". New York University. Retrieved 2009-07-30.
- Panos Ipeirotis (March 16, 2009). "Turker Demographics vs Internet Demographics". New York University. Retrieved 2009-07-30.
- "What is a Mechanical Master?". Frequently Asked Questions. Amazon. Retrieved 26 September 2012.
- Steve Silberman (July 24, 2007). "Inside the High-Tech Search for a Silicon Valley Legend". Wired magazine. Retrieved 2007-09-16.
- "AVweb Invites You to Join the Search for Steve Fossett". Avweb.com. Retrieved 2011-11-28.
- "Official Mechanical Turk Steve Fossett Results". Amazon.com. 2007-09-24. Retrieved 2012-08-14.
- Jim Christie (October 1, 2008). "Hikers find Steve Fossett's ID, belongings". Reuters. Archived from the original on 20 December 2008. Retrieved 2008-11-27.
- "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk". JournalistsResource.org. Retrieved June 18, 2012.
- Paolacci, Gabriele.; Chandler, Jesse; Ipeirotis, Panos (2010). "Running Experiments on Amazon Mechanical Turk". Judgment and Decision Making.
- Buhrmester, Michael; Kwang, Tracy; Gosling, Sam (2011). "Amazon's Mechanical Turk A New Source of Inexpensive, Yet High-Quality, Data?". Perspectives on Psychological Science. doi:10.1177/1745691610393980.
- Berinsky, Adam J.; Huber, Gregory A.; Lenz, Gabriel S. (2012). "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk". Political Analysis. doi:10.1093/pan/mpr057.
- Horton, John; Chilton, Lydia (2010). "The Labor Economics of Paid Crowdsourcing". Proceedings of the 11th ACM conference on Electronic commerce. doi:10.1145/1807342.1807376.
- Koblin, Aaron: http://www.aaronkoblin.com/work.html
- McMaster, S. (2012). New Approaches to Image-based Research and Visual Literacy. In Avgerinou, Chandler, Search and Terzic (Eds.), New Horizons in Visual Literacy: Selected Readings of the International Visual Literacy Association (122-132). Siauliai, Lithuania: SMC Scientia Educologica: http://concordia.academia.edu/SCOTTMCMASTER
- "Amazon Web Services Blog: Amazon Mechanical Turk Status Update". Aws.typepad.com. 2005-12-06. Retrieved 2011-11-28.
- "Documentation Archive : Amazon Web Services". Developer.amazonwebservices.com. Retrieved 2011-11-28.
- "Artificial artificial intelligence". The Economist.
- Harris, Mark (2008-12-21). "Email from America". London: Sunday Times.
- Mieszkowski, Katharine (2006-07-24). "I make $1.45 a week and I love it". Salon.com. Archived from the original on 14 July 2008. Retrieved 2008-06-15.
- Cushing, Ellen. "Amazon Mechanical Turk: The Digital Sweatshop". Utne Reader, January–February 2013.
- Official website
- Wired Magazine story about "Crowdsourcing," June 2006.
- Business Week article on Mechanical Turk by Rob Hof, November 4, 2005.
- New York Times article on Mechanical Turk by Jason Pontin, March 25, 2007.
- Salon.com article on Mechanical Turk by Katharine Mieszkowski, July 24, 2006.
- Technology Review article on Mechanical Turk, "How Mechanical Turk is Broken," by Christopher Mims, January 3, 2010.
- Requester Best Practices Guide, Updated June 2011.