Wikipedia:Community health initiative on English Wikipedia
Community health initiative
Helping the Wikimedia volunteer community to reduce the level of harassment and disruptive behavior on our projects.
- 1 The Community health initiative
- 2 More background about the Community health initiative
- 3 The team
- 4 How to get involved
The Community health initiative
Over the past several years the Wikimedia Foundation has researched how harassment affects participation on Wikipedia, and has received numerous requests from the English Wikipedia community for better tools and preparation to deal with negative behavior. In January[year needed] we received funding via a Newmark Foundation grant to address this problem over the next two years.
We're calling these efforts the 'Community health initiative', and our work comprises four equally important parts:
- Researching the causes of harassment and workflows of those who report and resolve harassment
- Building anti-harassment tools for the MediaWiki platform
- Fostering an environment for the Wikipedia community to evaluate and grow their policies on harassment
- Identifying and training Wikipedia contributors to properly handle reports of harassment
The Research team's Anti-Harassment Research project will aim to understand and model the characteristics of harassment in Wikimedia projects in order to inform the development of anti-harassment tools and recommendations for community-specific behavioral policies and enforcement processes.
We want to build software that empowers contributors and administrators to make timely, informed decisions when harassment occurs. Four focus areas have been identified where new tools could be beneficial in addressing and responding to harassment:
- Detection: We want to make it easier and more efficient for editors to identify and flag harassing behavior. We are currently exploring how harassment can be prevented before it begins, and how minor incidents can be resolved before they snowball into larger uncivil problems.
- Reporting: According to Detox research, harassment is underreported on English Wikipedia. No victim of harassment should abandon editing because they feel powerless to report abuse. We want to provide victims with improved ways to report incidents that are more respectful of their privacy, and less chaotic and stressful than the current workflow. Currently the burden of proof is on victims to demonstrate their own innocence and the harasser's fault; we believe the MediaWiki software should perform the heavy lifting.
- Evaluating: We want to build tools to help volunteers better understand and evaluate harassment cases, and to inform the best way to respond. Current processes are time-consuming, and a high level of proficiency is necessary for a user to analyze and evaluate the true sequence of events in a conduct dispute. We want to reduce the workload on people evaluating cases.
- Blocking: We want to improve existing tools and create new tools, if appropriate, to remove troublesome actors from a wiki, or certain areas within a wiki, and to make it more difficult for someone who's blocked from the site to return.
We will work with the community to research and analyze how behavioral issues on English Wikipedia are a) covered in policy, and b) enforced in the community, particularly on noticeboards where problems are discussed and actioned. We will provide research on alternative ways of addressing specific issues, evaluating their effectiveness and identifying approaches that have found success on other Wikimedia projects. We believe this will help the Wikipedia community make informed changes to existing policies and guidelines.
To help functionary and community governance groups better coordinate their work, we will facilitate the development of a training platform and will guide the establishment of modules based around the critical area of addressing harassment.
After consultation with functionaries (stewards, global admins, Arbitration Committees, admins), community members, and outside experts, an initial group of modules about Online harassment and Keeping events safe was created and is now available for training. We will further collaborate with the community on the development of future training modules.
More background about the Community health initiative
For more background about the Community health initiative, see Community health initiative on Meta-Wiki.
The Anti-Harassment Tools team includes four Wikimedia Foundation employees, partnering with members of the Wikipedia community who want to participate. The software we build will be useless if it doesn't address real-world workflow problems for the existing Wikipedia community, so we will heavily rely on your input to make our efforts a success.
Sydney Poore (User:SPoore (WMF)) is a Trust & Safety Specialist on the Trust and Safety team at the Wikimedia Foundation. Sydney is a long-time English Wikipedia contributor and administrator and previously served on the FDC and ArbCom. As Trust & Safety Specialist, Sydney is responsible for facilitating the crucial lines of communication between this team and the Wikimedia community and representing the community during every team decision.
Trevor Bolliger (User:TBolliger (WMF)) is a Product Manager at the WMF. Trevor previously worked for six years on MediaWiki contribution and moderation features at Wikia. As Product Manager, Trevor is responsible for prioritization of projects and tickets, project documentation, and setting project goals and scope.
David Barratt (User:DBarratt (WMF)) is a Software Developer at the WMF. David previously worked as a Software Engineer for iHeartMedia and Golf Channel. David is also a core contributor to the Drupal project. As software developer, David is responsible for writing and testing the Anti-Harassment Tools software.
Dayllan Maza (User:DMaza (WMF)) is a Software Developer at the WMF. Dayllan previously worked as a Software Engineer for various organizations in New York and Florida. As software developer, Dayllan is responsible for writing and testing the Anti-Harassment Tools software.
Thalia Chan is a Software Developer at the Wikimedia Foundation. Thalia is responsible for writing and testing the Anti-Harassment Tools software.
Key affiliated WMF staff
- Community Tech: Alex Ezell, Danny Horn, Ryan Kaldari, Moriel Schottlender
- Trust and Safety: Patrick Early, Joe Sutherland, and Christel Steigenberger
How to get involved
We're just getting started, and we look forward to your participation every step of the way. As we prepared for the grant and on-boarded the new team members, we collated some notes on Meta, most of which we've moved here to Wikipedia:Community health initiative. These plans and notes will almost certainly change based on the Wikipedia community's input.
Want updates, or to learn more about how to participate? Sign up for the Community health initiative mailing list or the Community health initiative Newsletter, or follow our progress on the Community health initiative on English Wikipedia work space.
We'd love to hear your initial thoughts at Wikipedia talk:Community health initiative on English Wikipedia. There's a lot to discuss, and we hope to hear from you. Thank you!
- meta.wikimedia.org — Community health initiative#Harassment on Wikimedia projects
- meta.wikimedia.org — Community health initiative#Community requests for new tools
- meta.wikimedia.org — Community health initiative#External funding
- meta.wikimedia.org — Research:Detox
- meta.wikimedia.org — Community health initiative