- Developer(s): Amazon, Amazon Web Services
- Initial release: 30 November 2016
- Type: Software as a service
Amazon Rekognition is a cloud-based software as a service (SaaS) computer vision platform launched in 2016. It has been sold to and used by a number of United States government agencies, including U.S. Immigration and Customs Enforcement (ICE) and the Orlando, Florida, police, as well as private entities.
Rekognition provides a number of computer vision capabilities, which can be divided into two categories: algorithms that are pre-trained on data collected by Amazon or its partners, and algorithms that a user can train on a custom dataset.
Algorithms pre-trained on data collected by Amazon or its partners
- Celebrity recognition in images
- Facial attribute detection in images, including gender, age range, emotions (e.g. happy, calm, disgusted), whether the face has a beard or mustache, whether the face has eyeglasses or sunglasses, whether the eyes are open, whether the mouth is open, whether the person is smiling, and the location of several markers such as the pupils and jaw line.
- People Pathing enables tracking of people through a video. An advertised use-case of this capability is to track sports players for post-game analysis.
- Text detection and classification in images
- Unsafe visual content detection
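The facial attribute detection described above returns structured per-face data. The following sketch parses a hand-made stand-in for such a response; the field names mirror the shape of Rekognition's face-detection output, but the specific values and the `summarize_face` helper are invented for illustration.

```python
# Hand-made stand-in for a face-attribute response; all values invented.
mock_face = {
    "Gender": {"Value": "Female", "Confidence": 96.2},
    "AgeRange": {"Low": 25, "High": 35},
    "Emotions": [
        {"Type": "CALM", "Confidence": 88.0},
        {"Type": "HAPPY", "Confidence": 7.5},
    ],
    "Beard": {"Value": False, "Confidence": 99.1},
    "EyesOpen": {"Value": True, "Confidence": 98.7},
}

def summarize_face(face):
    """Collapse an attribute dict into a short summary of key fields."""
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "gender": face["Gender"]["Value"],
        "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
        "top_emotion": top_emotion["Type"],
    }

print(summarize_face(mock_face))
# {'gender': 'Female', 'age_range': (25, 35), 'top_emotion': 'CALM'}
```

Note that each attribute carries its own confidence score, which is central to the accuracy debates discussed later in this article.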
Algorithms that a user can train on a custom dataset
- SearchFaces enables users to import a database of images with pre-labeled faces, to train a machine learning model on this database, and to expose the model as a cloud service with an API. Then, the user can post new images to the API and receive information about the faces in the image. The API can be used to expose a number of capabilities, including identifying faces of known people, comparing faces, and finding similar faces in a database.
- Face-based user verification
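The SearchFaces workflow above (import a labeled face database, then query it with new images) returns candidate matches ranked by a similarity score. This sketch filters a hand-made response of that shape by a similarity threshold; the face IDs, image names, and the `filter_matches` helper are invented for illustration, not part of the actual API.

```python
# Sketch of filtering a face-search-style response by similarity score.
# The response dict is a hand-made stand-in; IDs and scores are invented.

def filter_matches(response, min_similarity=99.0):
    """Keep only face matches at or above the given similarity score."""
    return [
        m for m in response.get("FaceMatches", [])
        if m["Similarity"] >= min_similarity
    ]

mock_response = {
    "FaceMatches": [
        {"Similarity": 99.3, "Face": {"FaceId": "face-001", "ExternalImageId": "alice.jpg"}},
        {"Similarity": 84.1, "Face": {"FaceId": "face-002", "ExternalImageId": "bob.jpg"}},
    ],
}

confident = filter_matches(mock_response, min_similarity=99.0)
print([m["Face"]["ExternalImageId"] for m in confident])
# ['alice.jpg']
```

The choice of threshold determines how many candidate matches a caller sees, which is why threshold settings feature prominently in the disputes described below.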
History and use
In late 2017, the Washington County, Oregon Sheriff's Office began using Rekognition to identify suspects' faces. Rekognition was marketed as a general-purpose computer vision tool, and an engineer working for Washington County decided to use it for facial analysis of suspects. Rekognition was offered to the department for free, and Washington County became the first US law enforcement agency known to use it. In 2018, the agency logged over 1,000 facial searches. According to the Washington Post, by 2019 the county was paying about $7 a month for all of its searches. The relationship was unknown to the public until May 2018. In 2018, Rekognition was also used to help identify celebrities during a royal wedding telecast.
In April 2018, it was reported that FamilySearch was using Rekognition to enable their users to "see which of their ancestors they most resemble based on family photographs." In early 2018, the FBI also began using it as a pilot program for analyzing video surveillance.
In May 2018, the ACLU reported that Orlando, Florida was running a pilot using Rekognition for facial analysis in law enforcement; that pilot ultimately ended in July 2019. After the report, on June 22, 2018, Gizmodo reported that Amazon workers had written a letter to CEO Jeff Bezos requesting that he cease selling Rekognition to US law enforcement, particularly ICE and Homeland Security. The ACLU also sent a letter to Bezos. On June 26, 2018, it was reported that the Orlando police force had ceased using Rekognition after its trial contract expired, while reserving the right to use it in the future. The Orlando Police Department said that it had "never gotten to the point to test images" due to old infrastructure and low bandwidth.
In July 2018, the ACLU released a test showing that Rekognition had falsely matched 28 members of Congress with mugshot photos, disproportionately Congresspeople of color. Twenty-five House members afterwards sent a letter to Bezos expressing concern about Rekognition. Amazon responded that the test had used the default 80 percent confidence threshold, whereas it recommended law enforcement only use matches rated at 99 percent confidence. The Washington Post reported that Oregon officers instead pick a "best of five" result, rather than adhering to that recommendation.
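The dispute above turns on which confidence threshold is applied to candidate matches. A minimal illustration of how the threshold changes outcomes, using invented similarity scores rather than data from the ACLU test:

```python
# Invented similarity scores for candidate matches; not real test data.
scores = [99.4, 97.8, 92.5, 85.3, 81.0, 79.6]

def matches_at(threshold, scores):
    """Count candidate matches that survive a confidence threshold."""
    return sum(1 for s in scores if s >= threshold)

default_hits = matches_at(80.0, scores)      # an 80% default threshold
recommended_hits = matches_at(99.0, scores)  # the recommended 99% threshold
print(default_hits, recommended_hits)
# 5 1
```

With these invented scores, five candidate matches survive at 80% confidence but only one at 99%, which is why the two sides drew such different conclusions from the same software.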
On December 1, 2018, it was reported that 8 Democratic lawmakers had said in a letter that Amazon had "failed to provide sufficient answers" about Rekognition, writing that they had "serious concerns that this type of product has significant accuracy issues, places disproportionate burdens on communities of color, and could stifle Americans’ willingness to exercise their First Amendment rights in public."
In January 2019, MIT researchers published a peer-reviewed study asserting that Rekognition had more difficulty identifying dark-skinned females than competitors such as IBM and Microsoft. In the study, Rekognition misidentified darker-skinned women as men 31% of the time, but made no mistakes for light-skinned men. Amazon said the report reflected "misinterpreted results" based on an improper "default confidence threshold."
In January 2019, Amazon's shareholders "urged Amazon to stop selling Rekognition software to law enforcement agencies." Amazon in response defended its use of Rekognition, but supported new federal oversight and guidelines to "make sure facial recognition technology cannot be used to discriminate." In February 2019, it was reported that Amazon was collaborating with the National Institute of Standards and Technology (NIST) on developing standardized tests to improve accuracy and remove bias with facial recognition.
In April 2019, the Securities and Exchange Commission told Amazon that shareholders had to be allowed to vote on two proposals seeking to limit Rekognition. Amazon had argued that the proposals concerned an "insignificant public policy issue for the Company" unrelated to Amazon's ordinary business, but its appeal was denied, and the vote was set for May. On May 24, 2019, the first proposal, to stop selling Rekognition to government agencies, received 2.4% of the shareholder vote, while a second proposal calling for a study into Rekognition and civil rights received 27.5% support.
In August 2019, the ACLU again used Rekognition on members of government, with 26 of 120 lawmakers in California flagged as matches to mugshots. Amazon stated the ACLU was "misusing" the software in the tests, by not dismissing results that did not meet Amazon's recommended accuracy threshold of 99%. By August 2019, there had been protests against ICE's use of Rekognition to surveil immigrants.
Controversy regarding facial analysis
Racial and gender bias
In 2018, MIT researchers Joy Buolamwini and Timnit Gebru published a study called Gender Shades. The study collected a set of images and labeled each face with position, gender, and skin-tone information. The images were then run through SaaS facial recognition platforms from Face++, IBM, and Microsoft. On all three platforms, the classifiers performed best on male faces (with error rates on female faces 8.1% to 20.6% higher than on male faces) and worst on dark-skinned female faces (with error rates ranging from 20.8% to 30.4%). The authors hypothesized that this discrepancy was due principally to Megvii (Face++'s developer), IBM, and Microsoft having more light-skinned males than dark-skinned females in their training data, i.e. dataset bias.
In January 2019, researchers Inioluwa Deborah Raji and Joy Buolamwini published a follow-up paper that ran the experiment again a year later, on the latest versions of the same three SaaS facial recognition platforms, plus two additional platforms: Kairos and Amazon Rekognition. While the systems' overall error rates improved over the previous year, all five systems again performed better on male faces than on dark-skinned female faces.
Classification of gender minorities
Rekognition's gender identification technology categorizes faces as only male or female, with no other options. Critics have identified a number of disadvantages of this approach. First, there is no category for individuals of nonbinary gender. Second, in an experiment conducted by a journalist, Rekognition was found to be less accurate at identifying the gender of transgender individuals than of cisgender individuals. Amazon updated its website in September 2019 to say that Rekognition was not designed to "categorize a person's gender identity", and suggested using its gender data only in aggregate.
See also
- Amazon Lex
- Amazon Mechanical Turk
- Amazon Polly
- Amazon SageMaker
- Amazon Web Services
- Facial recognition system
- Timeline of Amazon Web Services
References
- Lardinois, Frederic (2016-11-30). "Amazon launches Amazon AI to bring its machine learning smarts to developers". TechCrunch. Retrieved 2019-07-21.
- "What Is Amazon Rekognition?". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
- "What is the Celebrity Recognition API? Is that the same or different than doing a face search?". AWS. Retrieved 2019-07-21.
- Lardinois, Frederic (2016-06-08). "Amazon Rekognition can now recognize celebrities". TechCrunch. Retrieved 2019-07-21.
- "Detecting Faces in an Image". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
- "Amazon Rekognition launches enhanced face analysis". Planet Biometrics. 2019-03-19. Retrieved 2019-07-21.
- "People Pathing". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
- "Detecting Text". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
- O'Brien, Chris (2018-09-13). "Mapillary will use Amazon Rekognition in effort to ease urban parking crunch". Venture Beat. Retrieved 2019-07-21.
- "Detecting Unsafe Content". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
- "Searching Faces in a Collection". Amazon Rekognition Developer Guide. Retrieved 2019-07-21.
- "Amazon's facial-recognition technology is supercharging Washington County police". Oregon Live. Retrieved 2019-07-21.
- "Amazon Rekognition Customers". AWS. Retrieved 2019-07-21.
- Glaser, April (July 19, 2019). "How to Not Build a Panopticon". Slate. Retrieved August 27, 2019.
- Harwell, Drew (April 30, 2019). "Oregon became a testing ground for Amazon's facial-recognition policing. But what if Rekognition gets it wrong?". The Washington Post. Retrieved August 27, 2019.
- Pasternack, Alex (April 4, 2019). "Amazon says face recognition fears are "insignificant." The SEC disagrees". Fast Company. Retrieved August 27, 2019.
- "Amazon Rekognition Improves Accuracy of Real-Time Face Recognition and Verification". AWS. 2018-04-02. Retrieved 2019-07-21.
- Brandom, Russell (2018-05-22). "Amazon is selling police departments a real-time facial recognition system". The Verge. Retrieved 2019-07-21.
- Statt, Nick (2019-07-18). "Orlando police once again ditch Amazon's facial recognition software". The Verge. Retrieved 2019-07-21.
- Zhou, Marrian (June 26, 2018). "Orlando stops using Amazon's controversial facial recognition tech". CNET. Retrieved August 27, 2019.
- Keane, Sean (June 22, 2018). "Amazon employees protest sale of face recognition software to police". CNET. Retrieved August 27, 2019.
- Singh Guliani, Neema (October 24, 2018). "Amazon Met With ICE Officials to Market Its Facial Recognition Product". ACLU. Retrieved August 27, 2019.
- Statt, Nick (November 8, 2018). "Amazon told employees it would continue to sell facial recognition software to law enforcement". The Verge. Retrieved August 27, 2019.
- Day, Matt (October 23, 2018). "Amazon Officials Pitched Their Facial Recognition Software to ICE". The Seattle Times. Retrieved August 27, 2019.
- Boyce, Jasmin (December 1, 2018). "Lawmakers demand answers from Amazon on facial recognition tech". NBC News. Retrieved August 27, 2019.
- Crist, Ry (March 19, 2019). "Amazon's Rekognition software lets cops track faces: Here's what you need to know". CNET. Retrieved August 27, 2019.
- Lacy, Lisa (February 19, 2019). "Amazon Rekognition May Finally Be Audited and Ranked Alongside Other Vendors". Adweek. Retrieved August 27, 2019.
- Hale, Kori (March 12, 2019). "Auditing Amazon's 'Rekognition' A.I. Could Remove Bias". Forbes. Retrieved August 27, 2019.
- Singer, Natasha (May 5, 2019). "Amazon Faces Investor Pressure Over Facial Recognition". The New York Times. Retrieved August 27, 2019.
- Whittaker, Zack (May 20, 2019). "Amazon under greater shareholder pressure to limit sale of facial recognition tech to the government". TechCrunch. Retrieved August 27, 2019.
- Dastin, Jeffrey (May 24, 2019). "Amazon facial recognition ban won just 2% of shareholder vote". Reuters. Retrieved August 27, 2019.
- Wehner, Mike (August 14, 2019). "Amazon's facial recognition system flags dozens of California lawmakers as criminals". BGR. Retrieved August 27, 2019.
- Protalinski, Emil (August 16, 2019). "ProBeat: Breakthrough or BS, Amazon's Rekognition is dangerous". VentureBeat. Retrieved August 27, 2019.
- Menegus, Bryan (August 13, 2019). "Amazon Rekognition Can Now Identify the Emotion It Provokes in Rational People". Gizmodo. Retrieved August 27, 2019.
- Crowe, Michael (August 15, 2019). "Amazon says facial recognition can detect fear, raising concern for some privacy advocates". King5. Retrieved August 27, 2019.
- Mihalcik, Carrie (August 15, 2019). "Amazon's Rekognition software can now spot fear". CNET. Retrieved August 27, 2019.
- Buolamwini, Joy; Gebru, Timnit (2018). "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification" (PDF). Proceedings of Machine Learning Research.
- Quach, Katyanna (2018-02-13). "Facial recognition software easily IDs white men, but error rates soar for black women". The Register. Retrieved 2019-07-21.
- Raji, Inioluwa Deborah; Buolamwini, Joy (2019-01-27). "Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products" (PDF). AAAI/ACM Conference on Artificial Intelligence, Ethics, and Society.
- Wiggers, Kyle (2019-01-24). "MIT researchers: Amazon's Rekognition shows gender and ethnic bias (updated)". Venture Beat. Retrieved 2019-07-21.
- Johnson, Khari (2019-04-24). "A transgender AI researcher's nightmare scenarios for facial recognition software". Venture Beat. Retrieved 2019-07-21.
- Merlan, Anna; Mehrotra, Dhruv (2019-06-27). "Amazon's Facial Analysis Program Is Building A Dystopic Future For Trans And Nonbinary People". Jezebel. Retrieved 2019-07-21.
- Khalid, Amrita. "Facial recognition AI can't identify trans and non-binary people". Quartz. Retrieved December 9, 2019.