User:Thorncrag/Workshop/Counter-abuse


Wikipedia Counter-Abuse Strategy Initiative (WCSI)[edit]

Welcome to the Wikipedia Counter-abuse Strategy Initiative workshop. This workshop aims to:

  • Define types of abuse faced by Wikipedia.
  • Analyze the counter-abuse activities currently in place.
  • Conduct a threat assessment and consider which threats are not currently being adequately addressed.
  • Develop a strong, cohesive, comprehensive strategy set that encompasses each type of abuse and how the community will counter each of these abuses.

Rationale[edit]

To establish the need for this workshop, we must first establish the need for a reorganization. In its current state, counter-abuse work lacks a cohesive plan that binds it into a singular, coordinated strategy. This leads to potential confusion, duplication of effort, loss of vital information, misidentification of abusers, and abuse that goes unchecked. As things currently stand, the proverbial left hand frequently does not know what the right hand is doing, owing to this lack of overall coordination. In other scenarios, some counter-abuse activities receive dutiful attention from the community, while others go virtually forgotten. Some philosophies, such as deterrence, may also become largely ineffectual due to inconsistent application of these activities. A sturdy foundation needs to be laid upon which all of these activities can take place, interlinked and complementing each other, a condition ostensibly not met amid the current disharmony.

While still in an organizational phase, this workshop aims to gain a wide breadth of participation given the nature and impact of the potentially sweeping proposed changes. Any community member is invited to contribute to this workshop at any point.

Workshop below this line


Workshop organizational plan[edit]

I. Analyze and define each type of abuse[edit]

Phase I of the workshop is to analyze and define all of the types of abuse currently faced by Wikipedia.

 Not done
II. Analyze and define Wikipedia's current counter-abuse activities[edit]

Phase II of the workshop is to analyze and define all of the counter-abuse activities currently employed on Wikipedia. This is an important step because it is not possible to efficiently strategize improvements without first defining and understanding the current state of affairs.

 Not done
III. Identify shortcomings[edit]

Phase III of the workshop is to identify, outline, and thoroughly describe the shortcomings of the current counter-abuse strategies, explaining the harm experienced and why they need to be improved upon.

 Not done
IV. Threat assessment[edit]

Phase IV of the workshop involves analyzing those shortcomings and the potential threats that Wikipedia faces from particularly sophisticated abuse, such as agenda-driven editing.

 Not done
V. Development of strategy[edit]

Phase V of the workshop will be to take into account all of the preceding findings and to develop a single unified strategy for countering each type of abuse faced by Wikipedia.

 Not done
VI. Proposal of strategy[edit]

Phase VI of the workshop is to present the developed strategy to the community in order to generate consensus for its approval.

 Not done

Describe ALL types of abuse currently faced by Wikipedia[edit]

In this section we will describe and define every type of abuse currently faced by Wikipedia.

  • Simple vandalism
  • Complex vandalism
  • Trolling
  • Agenda-driven editing (NPOV violations, COI, etc.)
    • Paid image-upkeep editing
    • Subsets
  • Open proxy
  • Sockpuppetry
  • Meatpuppetry

Describe ALL aspects of Wikipedia's current counter-abuse activities[edit]

In this section we will describe and define every single counter-abuse activity that takes place on Wikipedia. This covers what actually takes place now, not what should be or what is proposed. NOTE: Stage 1 covers incidents dealt with immediately, but which may still form part of a pattern of abuse leading to Stage 2.

Stage 1[edit]

Acute symptoms of abuse that can be remedied relatively simply and straightforwardly.

Revert & Warn[edit]

We revert the edit that constitutes vandalism and then issue the editor a warning.

Edit Filter[edit]

The purpose of the edit filter is to tag suspicious edits and warn the editors making them; if they do not stop, the edit filter has the power to block the user.
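For illustration only, a rule in the edit filter's condition syntax might look like the following sketch. This is a hypothetical example, not an actual deployed filter; the thresholds are invented, and the action taken when the rule matches (tag, warn, disallow, or escalate) is configured separately from the conditions:

```
/* Hypothetical filter: large blankings of articles by very new accounts.
   Matches when a non-autoconfirmed user with few edits removes a large
   amount of content from a mainspace page. */
user_editcount < 10 &
page_namespace == 0 &
edit_delta < -2000 &
!("autoconfirmed" in user_groups)
```

A matching edit could first trigger a warning shown to the editor; only repeated attempts would then be tagged or disallowed, keeping the filter's response proportionate.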

Administrator Intervention against Vandalism[edit]

Open Proxies[edit]

Abuse Bots[edit]

Pending Changes[edit]

Stage 2[edit]

Chronic forms of abuse that require thorough tracking and investigation to be effectively countered.

Sockpuppet Investigations[edit]

Long Term Abuse[edit]

Abuse Response[edit]

Questions[edit]

General Questions[edit]

What is abuse as far as Wikipedia is concerned?[edit]

According to Wiktionary, abuse is "improper treatment or usage; application to a wrong or bad purpose; misuse; perversion." In terms of Wikipedia, the community considers any behavior which deliberately impedes the goal of building an encyclopedia to be abuse. Note that mere disregard of established guidelines and customs is not necessarily abuse; only when it obstructs this goal in a deliberate and malicious fashion is it considered abuse.
Reply 2
...

Describe all of the abuses currently faced by Wikipedia[edit]

  • Vandalism
  • Open Proxy
  • Multiple username
  • Multiple IP
  • Username
Reply 2
...

Outline what processes are needed to respond to abuse on Wikipedia. Note: not what is, as discussed above.[edit]

Prevention
Prevent abusers from being able to abuse Wikipedia by removing their choice in the matter.
Deterrence
Deter abusers from abusing by making known to them that their abuse may incur a response.
Rectification
Reverting abuse and restoring content to its non-abused state.
Remedial
Remove the choice of abusers by removing their ability to edit.
Tracking abuse and trends
Monitor disruptive users' edits.
Deciphering abuse and trends
Once you have gained information from monitoring, you then have to analyze it to discover the disruptive editor's habits and the kinds of disruptive edits they make.
Reporting abuse to interested parties
Sending a report to the responsible organization regarding disruptive editing.
(Subquestion) How do the current systems in place fulfill those needs?[edit]
Prevention
The edit filter prevents abuse before it is committed by blocking the action by the abuser. The pending changes system prevents abuse from being seen by the public at large.
Deterrence
The edit filter helps deter abuse by warning editors prior to acting. Reporting abusers to their responsible organization is intended to help deter abuse.
Rectification
Contributors and automated bots revert, roll back, or otherwise restore content to its non-abused state. Automated tools help contributors rectify abuse.
Remedial
Blocking editors from editing removes their choice. This is accomplished by a combination of administrators acting directly and contributors reporting via AN/I. Protecting pages accomplishes the same.

When abuse from a user becomes chronic and needs to be tracked, how should that case be filed? (e.g., under a title of the user, under the IP address, and so forth)[edit]

Reply 1
Reply 2
...

For cases filed at any of the current counter-abuse units, which are filed on a per-incident basis, and which span multiple incidents?[edit]

Reply 1
Reply 2
...

At what point is abuse to Wikipedia considered egregious?[edit]

Reply 1
Reply 2
...

In general, under the current setup, which counter-abuse cases tend to serve as starting places for other activities?[edit]

Reply 1
Reply 2
...

Activity-specific questions[edit]

Long Term Abuse[edit]

What is LTA?[edit]
LTA is a project which aims to log the actions and sockpuppets of long-term abusers of Wikipedia. It is only for the most egregious and obvious forms of abuse.
Reply 2
...

What is the objective of LTA and what is its primary function?[edit]

The objective of LTA is to have a single database where users can list long-term Wikipedia vandals/trolls and their sockpuppets, so that users and administrators have easy access to reports on troublesome users. This helps in watching out for sneaky sockpuppet masters by matching their behavioral patterns against the reports.
Reply 2
...
Describe what LTA is, and what it is not.[edit]
LTA is a centralized database of the most egregious and obvious long-term Wikipedia vandals, trolls, and sockpuppet masters. LTA is not a place to report an individual instance of vandalism, or to report subtle, hard-to-confirm vandalism.
Reply 2
...
How does LTA fit into Wikipedia's overall counter-abuse strategy?[edit]
Reply 1
Reply 2
...
Who is the main consumer of LTA cases? Be specific.[edit]
Reply 1
Reply 2
...
How does LTA assist the community in combating abuse?[edit]
Reply 1
Reply 2
...

Sock Puppet Investigations[edit]

Q[edit]
Reply 1
Reply 2
...
Q[edit]
Reply 1
Reply 2
...


Propositions[edit]

The following propositions have come, or will come, as a result of the workshop.


Proposed Proposition #1[edit]

  • All stage 2 abuse-related activities should revolve around a single case, spinning-off secondary cases depending on the nature of the abuse.
    • Rationale...
    • Background...
    • Details...


Proposal below this line


Wikipedia-EN
Counter-Abuse Strategy
Report & Proposal



http://en.wikipedia.org/wiki/WP:ABUSE/Strategy

Wikipedia Abuse Response Team
irc #wikipedia-en-abuse
<wikimedia-en-abuse@lists.wikimedia.org>

Counter-Abuse Strategy
English Wikipedia

Prepared for: Wikipedia-EN
Prepared by: Wikipedia Abuse Response Team, et al.

August 6, 2010
Wikipedia-EN Counter-Abuse Strategy Report & Proposal
1

Executive Summary[edit]

Objective[edit]

While largely considered successful (depending, that is, on which benchmarks one uses to define success), the current organization of the English Wikipedia's (herein "Wikipedia") counter-abuse strategy conveys a sense of organization which is not actually present. It lacks a unified vision to bring into focus the underlying purpose and goal of each of its constituent counter-abuse projects.

Goals[edit]

The goal of this report is to articulate the current functionality of Wikipedia's anti-abuse activities, to analyze those activities, and then to propose a unified strategy and vision that current and future counter-abuse projects should be molded to.

Rationale[edit]

In realizing that the strategy is inadequate...

Singular Purpose[edit]

Discuss how each project might go in different directions.

Accessibility[edit]

Many editors and administrators might not be aware of the resources available to them. Discuss.

Efficiency[edit]

Wikipedia’s counter-abuse activities are not as efficient as they could be. Discuss duplication of effort.

Modularity & Adaptability[edit]

Discuss how the strategy and the counter-abuse activities need to be able to adapt to future changes to Wikipedia policy, such as pending changes, according to community consensus.

Laying It All Out[edit]

The first step is to analyze and articulate each of the counter-abuse activities currently deployed.

Editors reverting and warning[edit]

On the front line is the everyday faithful editor, noticing trouble, reverting, and then warning the offender. It perhaps goes without saying that this is the most crucial counter-abuse activity; it likely makes up ninety percent of the counter-abuse activity that occurs.

Administrator intervention[edit]

The next level of activity picks up where warnings prove ineffective: administrator intervention. This occurs mainly at Wikipedia:AIV, Wikipedia:UAA, and [the other various noticeboards]. Abusers found to have violated pertinent policy are blocked.

Sock-puppet investigation[edit]

Where blocking has failed, abusers who have abused using multiple accounts or IP addresses are investigated by checkusers and blocked by administrators.

Long-term abuse[edit]

Where multiple blocks on multiple accounts and IP addresses have failed, the community turns to documenting the patterns of abusers so that they can be identified and dealt with appropriately.

Abuse response[edit]

When severe abuse is not curtailed by multiple blocks, abusers are reported to their internet service provider, or other responsible organization.

Figure 1. A visual representation of current counter-abuse activities.
