Joy Buolamwini

From Wikipedia, the free encyclopedia

Joy Buolamwini
Born: 1989/1990 (age 30–31)
Education: Georgia Institute of Technology (BS); Jesus College, Oxford (MSc); Massachusetts Institute of Technology (MS)
Known for: Algorithmic Justice League
Awards: Rhodes Scholarship; Fulbright Fellowship
Fields: Computer science; algorithmic bias
Institutions: MIT Media Lab
Thesis: Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers (2017, Master's)
Doctoral advisor: Ethan Zuckerman[1]

Joy Adowaa Buolamwini is a Ghanaian-American computer scientist and digital activist based at the MIT Media Lab. She founded the Algorithmic Justice League, an organization that seeks to challenge bias in decision-making software by blending art and research to highlight the social implications and harms of AI.[2]

Early life and education

Buolamwini was born in Edmonton, Alberta, grew up in Mississippi and attended Cordova High School.[3] At age 9, she was inspired by Kismet, the MIT robot, and taught herself XHTML, JavaScript and PHP.[4][5] She was a competitive pole vaulter.[6]

As an undergraduate, Buolamwini studied computer science at Georgia Institute of Technology, where she researched health informatics.[7] Buolamwini graduated as a Stamps President's Scholar from Georgia Tech in 2012,[8] and was the youngest finalist of the Georgia Tech InVenture Prize in 2009.[9]

Buolamwini is a Rhodes Scholar, a Fulbright fellow, a Stamps Scholar, an Astronaut Scholar and an Anita Borg Institute scholar.[10] As a Rhodes Scholar, she studied learning and technology at Jesus College, Oxford.[11][12] During her scholarship she took part in the first formal Service Year, working on community-focused projects.[12][13] She was awarded a master's degree from MIT in 2017 for research supervised by Ethan Zuckerman.[1]

Career and research

In 2011, she teamed up with the Carter Center's trachoma program to develop an Android-based assessment system for use in Ethiopia, aiding efforts to eradicate the disease worldwide.[14][4]

Joy Buolamwini at Wikimania 2018 in Cape Town

As a Fulbright fellow, in 2013 Buolamwini worked with local computer scientists in Zambia to empower Zambian youth to become technology creators.[15] On September 14, 2016, Buolamwini appeared at the White House summit on Computer Science for All.[citation needed]

She is a researcher at the MIT Media Lab, where she identifies bias in algorithms and develops practices for accountability during their design;[16] at the lab, Buolamwini is a member of Ethan Zuckerman's Center for Civic Media group.[17][18] During her research, Buolamwini showed 1,000 faces to facial recognition systems and asked the systems to identify whether each face was female or male; she found that the software had difficulty identifying dark-skinned women.[19] Her project, Gender Shades, attracted significant media attention and became part of her MIT thesis.[1][20] Her 2018 paper, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,[21] prompted responses from IBM and Microsoft, which swiftly improved their software.[22][23]

She also created the Aspire Mirror, a device that lets users see a reflection of themselves based on what inspires them.[24] Her organization, the Algorithmic Justice League, aims to highlight bias in code that can lead to discrimination against underrepresented groups.[25] She has created two films, Code4Rights and Algorithmic Justice League: Unmasking Bias.[26][27] She is Chief Technology Officer of Techturized Inc., a hair-care technology company.[7]

Buolamwini's research has been cited as an influence on Google and Microsoft in addressing gender and racial bias in their products and processes.[28]


Buolamwini founded the Algorithmic Justice League (AJL) in 2016 to promote equitable and accountable artificial intelligence (AI).[29] The organization combines art and research to illuminate the societal implications and harms of AI. Its mission is to raise public awareness of AI's impacts, equip society with research, give a voice to the most vulnerable communities, and galvanize the tech industry and Congress.

The AJL website operates as a place to locate information and as a live blog.[30] There are several sections on the site where people can share their stories and take action to donate or write to congressional representatives.

Voicing Erasure

Voicing Erasure is one of the newer sections of research featured on the AJL website.[31] Joy Buolamwini, Allison Koenecke, Safiya Noble, Ruha Benjamin, Kimberlé Crenshaw, Megan Smith, and Sasha Costanza-Chock produced a spoken-word piece about bias in voice systems.[32] Buolamwini and Koenecke are the lead researchers studying these biases. They found that speech recognition systems perform worst for speakers of African American Vernacular English, even as such systems increasingly listen in on everyday conversations. The piece also addresses the harmful gender stereotypes perpetuated by the subservient personas of Siri, Alexa, and Cortana.

The Coded Gaze: Unmasking Algorithmic Bias

The Coded Gaze is a mini documentary that debuted at the Museum of Fine Arts, Boston in 2016 and is currently available via YouTube. Buolamwini uses the film to examine the bias embedded in how artificial intelligence functions. The inspiration for the documentary and her research came while she was at MIT, creating her art project Aspire Mirror, which uses facial recognition to reflect the face of someone who inspires the viewer onto their own.[33] Buolamwini anticipated having Serena Williams, another dark-skinned woman, reflected onto her face, but the technology did not recognize her. Investigating why this occurred led Buolamwini to conclude that the exclusion of people who look like her resulted from what she dubbed the "coded gaze."[34] She further discusses this concept in the documentary, which explores how AI is subject to racial and gender biases that reflect the views and cultural backgrounds of those who develop it.[35]

Coded Bias

Coded Bias is a documentary film directed by Shalini Kantayya that features Buolamwini's research on AI inaccuracies in facial recognition technology and automated assessment software.[36][30] It focuses on the lack of regulation of facial recognition tools sold by IBM, Microsoft, and Amazon that perpetuate racial and gender bias. The film tells the story of a battle between Brooklyn tenants and a building management company that tried to introduce facial recognition for building entry. Those featured in the film include Weapons of Math Destruction author Cathy O'Neil and members of the London-based organization Big Brother Watch, including Silkie Carlo. On April 5, 2021, the documentary became available to stream on Netflix.[37]

Awards and recognition

In 2017, Buolamwini was awarded the grand prize in the professional category in the Search for Hidden Figures contest, tied to the release of the film Hidden Figures in December 2016.[38] The contest, sponsored by PepsiCo and 21st Century Fox, was intended to "help uncover the next generation of female leaders in science, technology, engineering and math,"[39] and attracted 7,300 submissions from young women across the United States.[8]

Buolamwini delivered a TEDx talk at TEDxBeaconStreet entitled "How I'm fighting bias in algorithms".[40][41][42] In 2018 she appeared on TED Radio Hour[43] and was featured on Amy Poehler's Smart Girls.[3] Fast Company magazine listed her as one of four "design heroes who are defending democracy online".[44] She was listed as one of the BBC's 100 Women in 2018.[45]

In 2019, Buolamwini was included in Fortune magazine's list of the World's Greatest Leaders; the magazine also termed her "the conscience of the A.I. revolution".[46] She also made the inaugural Time 100 Next list in 2019.[47] In 2020, she was featured in a Levi's women's empowerment campaign for International Women's Day (March 8).[48] She was also featured in the documentary Coded Bias.[49]

Personal life

Buolamwini has lived in Ghana; Barcelona, Spain; Oxford, United Kingdom; and, in the U.S., Memphis, Tennessee, and Atlanta, Georgia.[9]


References

  1. ^ a b c Buolamwini, Joy Adowaa (2017). Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers (MS thesis). MIT. hdl:1721.1/114068. OCLC 1026503582.
  2. ^ "Algorithmic Justice League - Unmasking AI harms and biases". Algorithmic Justice League - Unmasking AI harms and biases. Retrieved May 15, 2021.
  3. ^ a b "The Future of Computer Science and Tech: 12 Young Women to Watch — Part 2". Amy Poehler’s Smart Girls. February 19, 2018. Retrieved March 24, 2018.
  4. ^ a b "Joy Buolamwini |". Archived from the original on March 25, 2018. Retrieved March 24, 2018.
  5. ^ "Meet The Digital Activist That's Taking Human Prejudice Out of Our Machines". June 26, 2017. Retrieved March 24, 2018.
  6. ^ "CHS Pole Vaulting - Joy Buolamwini". Archived from the original on March 25, 2018. Retrieved March 24, 2018.
  7. ^ a b "Tech Startup of The Week: Techturized Wins With Hair Care Company". Black Enterprise. March 15, 2013. Retrieved March 24, 2018.
  8. ^ a b "Joy Buolamwini wins national contest for her work fighting bias in machine learning". MIT News. Retrieved March 24, 2018.
  9. ^ a b "Admissions Conquered | InVenture Prize". Retrieved September 25, 2021.
  10. ^ "Scholar Spotlight: Joy Buolamwini | Astronaut Scholarship Foundation". Retrieved September 25, 2021.
  11. ^ Buolamwini, Joy Adowaa (2014). Increasing participation in graduate level computer science education : a case study of the Georgia Institute of Technology's master of computer science. (MSc thesis). University of Oxford. OCLC 908967245.
  12. ^ a b "Joy Buolamwini Profile". The Rhodes Project. Retrieved March 24, 2018.
  13. ^ "Oxford Launchpad: Confessions of an Entrepreneur: Joy Buolamwini | Enterprising Oxford". Archived from the original on March 25, 2018. Retrieved March 24, 2018.
  14. ^ "Scholar Spotlight: Joy Buolamwini | Astronaut Scholarship Foundation". Retrieved March 24, 2018.
  15. ^ ZamrizeMedia (April 28, 2013), Joy Buolamwini | Fulbright Fellow 2013 | Zambia, retrieved March 24, 2018
  16. ^ "Project Overview ‹ Algorithmic Justice League – MIT Media Lab". MIT Media Lab. Retrieved March 24, 2018.
  17. ^ "interview: joy buolamwini | MIT Admissions". Retrieved March 24, 2018.
  18. ^ "Group People ‹ Civic Media – MIT Media Lab". MIT Media Lab. Retrieved March 24, 2018.
  19. ^ "Photo Algorithms ID White Men Fine—Black Women, Not So Much". WIRED. Retrieved March 24, 2018.
  20. ^ Kleinman, Zoe (April 14, 2017). "Is artificial intelligence racist?". BBC News. Retrieved March 24, 2018.
  21. ^ Buolamwini, Joy (2018). "Gender shades: Intersectional accuracy disparities in commercial gender classification". Conference on Fairness, Accountability and Transparency. 81: 77–91.
  22. ^ "Mitigating Bias in Artificial Intelligence (AI) Models -- IBM Research". May 16, 2016. Retrieved March 24, 2018.
  23. ^ "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification" (PDF). Proceedings of Machine Learning Research. 2018. Retrieved March 24, 2018.
  24. ^ "Aspire Mirror". Aspire Mirror. Retrieved March 24, 2018.
  25. ^ International, Youth Radio-- Youth Media (February 28, 2017). "A Search For 'Hidden Figures' Finds Joy". Huffington Post. Retrieved March 24, 2018.
  26. ^ "Filmmakers Collaborative | Code4Rights". Retrieved March 24, 2018.
  27. ^ "Filmmakers Collaborative | Algorithmic Justice League: Unmasking Bias". Retrieved March 24, 2018.
  28. ^ Burt, Chris (March 2, 2020). "Tech giants pressured to follow Google in removing gender labels from computer vision services". Biometric Update. Retrieved March 9, 2020.
  29. ^ "Mission, Team and Story - The Algorithmic Justice League". Retrieved May 9, 2021.
  30. ^ a b "Spotlight - Coded Bias Documentary". Retrieved May 9, 2021.
  31. ^ "Voicing Erasure". Retrieved May 9, 2021.
  32. ^ Voicing Erasure - A Spoken Word Piece Exploring Bias in Voice Recognition Technology, retrieved May 9, 2021
  33. ^ "The Coded Gaze: Unpacking Biases in Algorithms That Perpetuate Inequity". Rockefellerfoundation. Retrieved May 15, 2021.
  34. ^ "The Coded Gaze: Unpacking Biases in Algorithms That Perpetuate Inequity". The Rockefeller Foundation. Retrieved June 20, 2021.
  35. ^ "Here's AOC calling out the vicious circle of white men building biased face AI". Fastcompany. May 22, 2019. Retrieved May 15, 2021.
  36. ^ "Coded Bias | Films | PBS". Independent Lens. Retrieved May 9, 2021.
  37. ^ "RAND Review: March-April 2021". 2021. doi:10.7249/cpa682-4. Cite journal requires |journal= (help)
  38. ^ "Hidden No More: STEM Spotlight Shines On 'Hidden Figures' Like MIT's Joy Buolamwini". Youth Radio. February 27, 2017. Retrieved March 24, 2018.
  39. ^ ""Hidden Figures" Inspires A Scholarship Contest For Minority STEM Aspirants". Fast Company. January 19, 2017. Retrieved March 24, 2018.
  40. ^ "Speaker Joy Buolamwini: How I'm Fighting Bias in Algorithms". Retrieved March 24, 2018.
  41. ^ Buolamwini, Joy. "How I'm fighting bias in algorithms – MIT Media Lab". MIT Media Lab. Retrieved March 24, 2018.
  42. ^ TED (March 29, 2017), How I'm fighting bias in algorithms | Joy Buolamwini, retrieved March 24, 2018
  43. ^ Joy Buolamwini: How Does Facial Recognition Software See Skin Color?, retrieved March 24, 2018
  44. ^ Schwab, Katharine (July 3, 2018), Meet 4 design heroes who are defending democracy online, Fast Company Magazine, retrieved July 21, 2018
  45. ^ "BBC 100 Women 2018: Who is on the list?". BBC News. November 19, 2018. Retrieved November 21, 2018.
  46. ^ "Joy Buolamwini". Fortune. Retrieved November 26, 2019.
  47. ^ "TIME 100 Next 2019: Joy Buolamwini". Time. Retrieved December 16, 2019.
  48. ^ "She's Rewriting the Code". Off The Cuff. Retrieved March 9, 2020.
  49. ^ "New Documentary 'Coded Bias' Explores How Tech Can Be Racist And Sexist : Code Switch". NPR. Retrieved December 12, 2020.