Sandra Wachter

Sandra Wachter
Sandra Wachter speaks at the Berkman Klein Center for Internet & Society in 2018
Born: Austria
Alma mater: University of Vienna; University of Oxford
Scientific career
Institutions: Oxford Internet Institute; Alan Turing Institute; Royal Academy of Engineering

Sandra Wachter is an Associate Professor and Senior Research Fellow in data ethics, artificial intelligence, robotics and internet regulation at the Oxford Internet Institute, University of Oxford.[1] She is also a Fellow at The Alan Turing Institute.[2]

Early life and education[edit]

Wachter grew up in Austria.[3] She studied law at the University of Vienna.[4] Wachter has said that she was inspired to work in technology by her grandmother, who was one of three women admitted to the Austrian technical university.[3] She completed her master's degree in law in 2009, before starting as legal counsel at the Austrian Federal Ministry of Health. During this time she joined the faculty at the University of Vienna, where she began a doctoral degree in technology, intellectual property and regulation. She completed her PhD in 2015, while simultaneously earning a master's degree in social sciences at the University of Oxford. After earning her doctorate, Wachter joined the Royal Academy of Engineering, where she worked in public policy. She then returned to the University of Vienna, where she worked on the ethical aspects of innovation.[5]

Research and career[edit]

Her work covers the legal and ethical issues associated with big data, artificial intelligence, algorithms and data protection.[6][7] She believes that there needs to be a balance between technical innovation and personal control of information.[8] Wachter was made a Research Fellow at the Alan Turing Institute in 2016, and in this capacity has evaluated the ethical and legal aspects of data science. She has argued that artificial intelligence should be more transparent and accountable, and that people have a "right to reasonable inferences".[9][10][11] She has highlighted cases in which opaque algorithms have produced racist and sexist outcomes, such as the discrimination against applicants to St George's Hospital Medical School in the 1970s and the overestimation of reoffending risk for black defendants by the COMPAS program.[9] While Wachter acknowledges that it is difficult to eliminate bias from training data sets, she believes it is possible to develop tools that identify and remove it.[9][12] She has looked at ways to audit artificial intelligence to tackle discrimination and promote fairness.[4][13] In this capacity she has argued that Facebook should continue to use human moderators.[14]
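
The kind of algorithmic audit described above can be illustrated with a small, generic example. The following Python sketch (the decision data, the group labels and the 80% rule of thumb are assumptions invented for the illustration; it does not represent Wachter's own methods or any particular auditing tool) simply compares the rate of favourable outcomes a model produces for two demographic groups:

    # Illustrative sketch only: one simple check an algorithmic audit might run,
    # comparing favourable-outcome rates across demographic groups.
    # The data and the 0.8 threshold are invented for the example.
    import numpy as np

    # Model decisions (1 = favourable outcome) and a protected attribute per person
    decisions = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0])
    group = np.array(["a"] * 6 + ["b"] * 6)

    rate_a = decisions[group == "a"].mean()
    rate_b = decisions[group == "b"].mean()
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

    print(f"Favourable-outcome rate, group a: {rate_a:.2f}")
    print(f"Favourable-outcome rate, group b: {rate_b:.2f}")
    print(f"Disparate impact ratio: {ratio:.2f}")  # ratios well below 0.8 are a common warning sign

A large gap between the two rates does not by itself prove discrimination, but it is the kind of signal an audit can surface for further investigation.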

She has argued that the General Data Protection Regulation (GDPR)[15] is in need of reform, because while it pays considerable attention to the input stage, when data is collected, it devotes far less to how that data is subsequently assessed.[16][17] She believes that privacy must mean more than data protection, focusing on how data is evaluated and on giving people ways to control how information about them is stored and shared.[16][18]

Working with Brent Mittelstadt and Chris Russell, Wachter proposed counterfactual explanations: statements of how the world would have had to differ for an algorithm to produce a different outcome. When decisions are made by an algorithm it can be difficult for the people affected to understand why, especially when a full explanation would reveal trade secrets about the algorithm. Counterfactual explanations allow algorithms to be interrogated without such secrets being revealed. The approach was adopted by Google in the What-If Tool, a feature of TensorBoard, Google's open-source web application for visualising machine learning models.[3] The paper "Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR",[19] written by Wachter, Mittelstadt and Russell, has been featured in the press[3] and is widely cited in the scholarly literature.
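
A counterfactual explanation of this kind can be sketched in a few lines of code. The example below (the loan-approval features, the toy training data and the brute-force search are assumptions made for illustration; it is not the algorithm from the paper nor the What-If Tool's implementation) treats a trained classifier purely as a black box and looks for the nearest input that would have led to a different decision:

    # Minimal sketch of a counterfactual explanation: query the model only as a
    # black box and search for a nearby input that flips its decision.
    # The features, data and grid search are illustrative assumptions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy training data: [income in £1000s, existing debt in £1000s] -> loan approved?
    X = np.array([[20, 15], [25, 10], [30, 20], [45, 5], [50, 12], [60, 3]])
    y = np.array([0, 0, 0, 1, 1, 1])
    model = LogisticRegression().fit(X, y)

    def counterfactual(model, x, target, step=1.0, max_radius=30.0):
        """Grid-search for the closest input (L2 distance) predicted as `target`."""
        best, best_dist = None, np.inf
        deltas = np.arange(-max_radius, max_radius + step, step)
        for d_income in deltas:
            for d_debt in deltas:
                candidate = x + np.array([d_income, d_debt])
                if model.predict(candidate.reshape(1, -1))[0] == target:
                    dist = np.linalg.norm([d_income, d_debt])
                    if dist < best_dist:
                        best, best_dist = candidate, dist
        return best

    applicant = np.array([28.0, 18.0])  # likely refused by the toy model
    print("Decision:", model.predict(applicant.reshape(1, -1))[0])
    print("Counterfactual (income, debt):", counterfactual(model, applicant, target=1))

The output can be read as a statement of the form "the loan would have been approved had income and debt been these values", which conveys what mattered to the decision without exposing the model's parameters or training data.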

Academic service[edit]

She was made an Associate Professor at the University of Oxford in 2019.[20] She is a Visiting Professor at Harvard University from Spring 2020.[4] Wachter is a member of the World Economic Forum Council on Values, Ethics and Innovation, an affiliate at the Bonavero Institute of Human Rights and a member of the European Commission Expert Group on Autonomous Cars.[21][22]

Awards and honours[edit]

  • 2017 CognitionX AI Superhero Award[23]
  • 2019 The Next Web Most Influential People in AI[24]
  • 2019 Privacy Law Scholars Conference Junior Scholars Award[25]
  • 2019 Business Insider Nordic AI Trailblazer[26]
  • 2019 Business Insider UK Tech 100[27]

References[edit]

  1. ^ "Dr Sandra Wachter — Oxford Internet Institute". www.oii.ox.ac.uk. Retrieved 2019-03-10.
  2. ^ "Sandra Wachter". The Alan Turing Institute. Retrieved 2019-03-10.
  3. ^ a b c d Katwala, Amit (2018-12-11). "How to make algorithms fair when you don't know what they're doing". Wired UK. ISSN 1357-0978. Retrieved 2019-03-10.
  4. ^ a b c Harvard Law School. "Sandra Wachter | Harvard Law School". Retrieved 2019-10-30.
  5. ^ Admin, QEPrize (2016-06-06). "Robots: Faithful servants or existential threat?". Create the Future. Retrieved 2019-10-30.
  6. ^ "Why it's totally unsurprising that Amazon's recruitment AI was biased against women". nordic.businessinsider.com. 2018-10-13. Retrieved 2019-03-10.
  7. ^ Baraniuk, Chris. "Exclusive: UK police wants AI to stop violent crime before it happens". New Scientist. Retrieved 2019-03-10.
  8. ^ CPDP 2019: Profiling, microtargeting and a right to reasonable algorithmic inferences. Retrieved 2019-10-30.
  9. ^ a b c Hutson, Matthew (2017-05-31). "Q&A: Should artificial intelligence be legally required to explain itself?". Science | AAAS. Retrieved 2019-10-30.
  10. ^ "OII London Lecture: Show Me Your Data and I'll Tell You Who You Are — Oxford Internet Institute". www.oii.ox.ac.uk. Retrieved 2019-10-30.
  11. ^ Privacy, identity, and autonomy in the age of big data and AI - Sandra Wachter, University of Oxford. Retrieved 2019-10-30.
  12. ^ "Subscribe to read". Financial Times. Retrieved 2019-10-30.
  13. ^ "What Does a Fair Algorithm Actually Look Like?". Wired. ISSN 1059-1028. Retrieved 2019-10-30.
  14. ^ Vincent, James (2019-02-27). "AI won't relieve the misery of Facebook's human moderators". The Verge. Retrieved 2019-10-30.
  15. ^ Artificial Intelligence: GDPR and beyond - Dr. Sandra Wachter, University of Oxford. Retrieved 2019-10-30.
  16. ^ a b Shah, Sooraj. "This Lawyer Believes GDPR Is Failing To Protect You - Here's What She Would Change". Forbes. Retrieved 2019-03-10.
  17. ^ Wachter, Dr Sandra (2018-04-30). "Will our online lives soon become 'private' again?". Retrieved 2019-10-30.
  18. ^ "Privacy, Identity, & Autonomy in the age of Big Data and AI". TechNative. 2019-06-03. Retrieved 2019-10-30.
  19. ^ Wachter, Sandra; Mittelstadt, Brent; Russell, Chris (2017). "Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR". arXiv:1711.00399. Bibcode:2017arXiv171100399W. doi:10.2139/ssrn.3063289.
  20. ^ "Professor Sandra Wachter — Oxford Internet Institute". www.oii.ox.ac.uk. Retrieved 2019-10-30.
  21. ^ "Sandra Wachter". World Economic Forum. Retrieved 2019-10-30.
  22. ^ "Academic Affiliates of the Bonavero Institute of Human Rights". Oxford Law Faculty. 2018-01-25. Retrieved 2019-10-30.
  23. ^ "Turing partners with Cog X London 2017 to explore the impact of AI across sectors". The Alan Turing Institute. Retrieved 2019-10-30.
  24. ^ Greene, Tristan (2019-02-28). "Here's who has the most juice in Twitter's AI influencer community". The Next Web. Retrieved 2019-03-10.
  25. ^ "PLSC Paper Awards". Berkeley Law. Retrieved 2019-10-30.
  26. ^ Hamilton, Isobel Asher. "3 female AI trailblazers reveal how they beat the odds and overcame sexism to become leaders in their field". Business Insider. Retrieved 2019-10-30.
  27. ^ Hanbury, Mary; Hamilton, Isobel Asher; Wood, Charlie. "UK Tech 100: The 30 most important, interesting, and impactful women shaping British technology in 2019". Business Insider. Retrieved 2019-10-30.