
Wikipedia:Editorial oversight and control

From Wikipedia, the free encyclopedia

This page summarizes the various processes and structures by which Wikipedia articles and their editing are editorially controlled, and the processes which are built into that model to ensure quality of article content.

Rather than one sole form of control, Wikipedia relies upon multiple approaches, and these overlap to provide more robust coverage and resilience.


Overview of editorial structure

There are tens of thousands of regular editors – everyone from expert scholars to casual readers. With the exception of blocked users, anyone who visits the site can edit it, and this fact has encouraged contribution of a tremendous amount of content. There are mechanisms that help community members watch for bad edits, a few hundred administrators with special powers to enforce good behavior, and a judicial-style arbitration committee that considers the few situations remaining unresolved, and decides on withdrawal or restriction of editing privileges or other sanctions when needed, after all other consensus remedies have been tried.

As it's a wiki, anyone can contribute to Wikipedia, and everyone is encouraged to. Overall, Wikipedia gets hundreds of times more well-meaning editors than bad ones, so problematic editors rarely obtain much of a foothold. In the normal course of events, the primary control over editorship is the effective utilization of the large number of well-intentioned editors to overcome issues raised by the much smaller number of problematic editors. It is inherent in the Wikipedia model's approach that poor information can be added, but that over time those editing articles reach strong consensus, and quality improves in a form of group learning, so that substandard edits will rapidly be removed. This assumption is still being tested and its limitations and reliability are not yet a settled matter – Wikipedia is a pioneer in communal knowledge building of this kind.

Balancing this, there is also a wide range of resources for editors seeking to improve articles within their areas of interest. These include several routes for general and specialist peer review; thousands of editors in a wide variety of focus groups working on specific types of issues; reference desks and copyright resources to help source missing information; expert groups in various subjects for technical input; and subject-related 'WikiProjects' which provide a comprehensive, unified approach to editorial quality control and article rating in their respective subject areas.

The Wikipedia community is largely self-organizing, so that anyone may build a reputation as a competent editor and become involved in any role they may choose, subject to peer approval. Individuals often choose to become involved in specialized tasks, such as reviewing articles at others' request, watching current edits for vandalism, or watching newly created articles for quality control. Editors who find that administrator responsibilities would benefit their ability to help the community may ask their peers for agreement to undertake such roles, a structure which reinforces meritocracy and communal standards of editorship and conduct. At present, around a 75–80% approval rating after a communal "no holds barred" inquiry is considered the requirement for such a role, a standard which tends to ensure a high level of experience, trust, and familiarity across a broad range of projects within Wikipedia.

(Such rights are stringently restricted, ensuring that editorial and administrative matters are separated powers and only rarely lead to editorial conflict of interest.)

Wikipedia's editorial control process

Wikipedia has somewhat more formal systems of editorial control than are apparent to a newcomer: roughly ten overlapping controls, grouped into three main areas (with a further control under development):

Core community level controls
  • The degree of oversight possible with tens of thousands of bona fide editors.
  • The wiki system itself, which, as operated, appears to strongly select for the robust, collaborative knowledge of many people (even on contentious topics), rather than the unrepresentative viewpoint or negative impact of a few.
Editorial panels and processes
  • Widely respected and enforced policies which provide all editors with a solid basis to take matters into their own hands in addressing both deliberate and innocent bad edits.
  • A consensus-based ethos, which beneficially impacts the decision-making process.
  • Escalation processes whereby poor conduct or articles being problematically edited will tend to come to the attention of a wider range of editors with authority or willingness to act on them, making vandalism very short term and ultimately somewhat futile.
  • A wide range of fine-grained editorial processes such as dispute resolution, third-party opinion, and requests for comment and consultation within the wider Wikipedia community.
Software-facilitated controls
  • Systems built into its editing software that make it easy for a large number of editors to watch for vandalism, monitor recent changes, and check activity in articles in personalised watchlists, in real time.
  • Design decisions in the software that make identifying and reverting any number of bad edits possible at the click of a button, whereas vandalism itself takes longer to do.
  • Ability to set fine-grained software blocks on problematic editors, and partially or fully protect targeted articles.
  • Standardized alerts, known as tags, which can be added to any fact or article, and which allow individual facts (or entire sections and articles) to be highlighted as questionable or brought immediately to others' attention.
Controls under development
  • The control known as flagged revisions began rolling out in 2007. It aims to differentiate the version shown to most readers from the draft "cutting edge" version being edited, showing readers a draft revision only once it has been checked for reasonableness. Once fully operational, this system is expected to provide a powerful way to prevent most vandalism and poor-quality edits from being seen by readers.
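The flagged-revisions idea described above can be sketched in a few lines: readers are served the most recent reviewed revision, while editors always work on the newest draft. This is a minimal illustration with hypothetical names, not MediaWiki's actual implementation.

```python
# Sketch of the flagged-revisions model. All class and method names here
# are illustrative assumptions, not MediaWiki's real code.

class Page:
    def __init__(self, title):
        self.title = title
        self.revisions = []  # oldest first; each is {"text": ..., "reviewed": bool}

    def edit(self, text):
        """Any editor may add a new draft revision."""
        self.revisions.append({"text": text, "reviewed": False})

    def review_latest(self):
        """A trusted reviewer marks the newest revision as checked."""
        if self.revisions:
            self.revisions[-1]["reviewed"] = True

    def reader_view(self):
        """Readers see the last reviewed revision, if any; else the draft."""
        for rev in reversed(self.revisions):
            if rev["reviewed"]:
                return rev["text"]
        return self.revisions[-1]["text"] if self.revisions else ""

    def editor_view(self):
        """Editors always see the cutting-edge draft."""
        return self.revisions[-1]["text"] if self.revisions else ""

page = Page("Example")
page.edit("Good version.")
page.review_latest()
page.edit("Sneaky vandalism!")
print(page.reader_view())   # readers still see "Good version."
print(page.editor_view())   # editors see the unreviewed draft
```

The point of the design is visible in the last two lines: an unreviewed edit exists and can be worked on, but it never reaches ordinary readers until a reviewer flags it.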

Types of control

User oversight

Wikipedia's primary editorial control, that ensures the bulk of its quality, is simply the sheer volume of well-intentioned editors who regularly and constantly watch over its articles. At any given time, a large number of the thousands of active Wikipedians will be using, checking, or editing the articles held. Each of these has their own watchlist, a special page that lists changes to the articles they have worked on or are otherwise choosing to watch. Hundreds of Wikipedians use automated software tools (described below) to watch edits en masse. On average, only a few minutes lie between a blatantly bad or harmful edit, and some editor noticing and acting on it. Repeated edits tend to lead rapidly to escalation of the process, further safeguards and actions, and the involvement of others, including possible use of administrator powers or dispute resolution depending on the situation.
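The watchlist mechanism described above is, at heart, a filter over the site-wide stream of recent changes. The following is an illustrative sketch (not MediaWiki's real code, and the sample data is invented) of how a personal watchlist narrows that stream to the articles one editor has chosen to watch.

```python
# Hypothetical recent-changes feed: every edit site-wide, in order.
recent_changes = [
    {"title": "Global warming", "user": "Arnold19",  "comment": "added text"},
    {"title": "Elvis",          "user": "FanEditor", "comment": "fixed date"},
    {"title": "Global warming", "user": "Raymond",   "comment": "rv vandalism"},
]

def watchlist_view(changes, watched_titles):
    """Return only the edits made to pages on this editor's watchlist."""
    return [c for c in changes if c["title"] in watched_titles]

my_watchlist = {"Global warming"}
for change in watchlist_view(recent_changes, my_watchlist):
    print(change["user"], "edited", change["title"])
```

With thousands of editors each watching a different subset of articles, most pages end up under someone's eye even though no single person watches everything.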

The primary control therefore is not so much that "only approved editors" can update and improve articles. Even bad editors can edit – but any vandalism and errors they add rarely get much of a foothold and their bad edits are rapidly spotted and reversed by others. This is different from traditional knowledge and publishing, which attempts to limit content creation to a relatively small circle of approved editors in an attempt to exercise strong hierarchical control.

A 2004 study by IBM researchers found that, as a result of this process, most vandalism on the English Wikipedia is reverted within a few minutes:

We've examined many pages on Wikipedia that treat controversial topics, and have discovered that most have, in fact, been vandalized at some point in their history. But we've also found that vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects.[1]

On Wikipedia, truth tends to prevail because everyone can correct the articles; as John Stuart Mill observed in On Liberty:

"It is a piece of idle sentimentality that truth, merely as truth, has any inherent power denied to error, of prevailing against the dungeon and the stake. Men are not more zealous for truth than they often are for error, and a sufficient application of legal or even of social penalties will generally succeed in stopping the propagation of either. The real advantage which truth has, consists in this, that when an opinion is true, it may be extinguished once, twice, or many times, but in the course of ages there will generally be found persons to rediscover it, until some one of its reappearances falls on a time when from favourable circumstances it escapes persecution until it has made such head as to withstand all subsequent attempts to suppress it."

User collaborative knowledge-building

Unusually, Wikipedia relies for a large part of its editorial work upon editors drawn from the general public, who may well lack relevant qualifications in the subjects they edit. Experience suggests that any appearance of weakness which may be created is deceptive.

It turns out that in some ways, analytic skills and neutrality often play a greater role than specialisation; editors who have worked for a time on a variety of articles usually become quite capable of making good quality editorial decisions regarding specialist material, even on unfamiliar technical subjects.[2] Again, questionable edits will usually be caught and explained by others more experienced.

In general, the role of Wikipedia editors is guided by two principles. 1) Most editors will choose to edit subjects where they have personal interest, knowledge, and familiarity. 2) The editorial role in Wikipedia is not to produce original research so much as to collate and source existing reputable knowledge in an encyclopedic form, under strict policies of neutrality of viewpoint and verifiability of information thus added.

Attempts to add information which is of poor quality or questionable are easy to spot by the many other editors reviewing a given topic, who generally arrive with different viewpoints and understandings. For a fact to remain in an article requires consensus amongst (often dozens or hundreds of) diverse editors with an interest in the article that the fact is accurate, neutrally and appropriately presented in a balanced manner, and that any statement considered to require citation is properly sourced. The editors of most articles will collectively cover a range of viewpoints on the subject, and will often include a number of specialists.

"Although it depends a bit on the field, the question is whether something is more likely to be true coming from ... a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived."

— Joi Ito, technology figure [3]

In addition, one should not overlook the effect of reader involvement – the millions of readers of articles are themselves encouraged to be bold and correct or improve any article they read.

Experience suggests that, as a result of this collaboration on a large scale, articles usually do rise to this general standard over time, and many long-standing articles, having survived this process of examination over the years, are stable, robust, and well written as a result. Controversial articles often highlight the success of this approach: the process of developing a wording that satisfies a consensus of often-opposed editors is not a trivial one, and it can be watched repeatedly playing out on articles over time.

The Wiki structure

It is possible that this selectivity for collaboration is in part due to the wiki structure itself. Editors who disagree cannot write alternative articles or versions to express their differing viewpoints; ultimately there is only one page upon which all must edit. Since other aspects of the editorial process tend to reduce sustained "edit warring", and strong, universally accepted policies describe how opposing views are to be neutrally included and presented, there is great pressure in the long term for a common agreed version to emerge on that one page. Once it has done so, the usual stance of the editors who have worked towards this goal, whatever their viewpoint, is that it should only be replaced by a better version.

Another aspect is that, because of the wide-open nature of the editorial process, there is no bottleneck of control through which the content can readily be controlled or massaged by any given individual or interest group. As well, all edits and actions, including past historical versions, are visible to all editors. The wiki model itself militates strongly against articles being manipulated by any one interest group, as there is no obvious point of weakness or "approved circle" through which editorial decisions must pass. As a result, maintaining vandalism or a specific slant is all but impossible in the long term, and Wikipedia is extremely resilient against bias, censorship, or manipulation of its articles.

An article examining Wikipedia's approach and outcome in depth, for the Canadian Library Association (CLA) commented that in controversial topics, "what is most remarkable is that the two sides actually engaged each other and negotiated a version of the article that both can more or less live with".[4]

Respect for policies and principles

Rules and policies must strike a fine balance between good and necessary practice and abuse or game-playing in order to be effective in dealing with would-be disruptive contributors. Wikipedia's policies reflect this dynamic tension quite strongly, with policies on user conduct and appropriate editorial approach, and also meta-policies: policies and guidelines which provide guidance on how policy itself is to be used, in order to ensure common sense prevails over both disruptive editing and gaming the system.

Examples of the former include core policies on neutral presentation and balance, proper verifiability and citation of sources, and policies on editorial conduct, dispute and disruption, and types of acceptable content. These policies are substantially agreed by the entire community as the basis for the entire editorial approach, and have very high "buy in".

Examples of the latter include guidelines on how policy should and should not be used, such as Don't disrupt Wikipedia to prove a point, The rules are principles, Don't be a fanatic, Ignore all rules (for exceptional cases where some rule inhibits good quality and appropriate work), Avoid instruction creep, Avoid wikilawyering (That is, follow the spirit of the policies rather than fuss over minor technicalities), and the quite to-the-point Wikipedia:Don't be a jerk.

These meta-policies, in turn, are unlikely to be accepted as justification if the perception is that their use is motivated by a wish to game the system rather than by bona fide reasons.

(Full lists: Official policies and Official guidelines).

Consensus based ethos

The community has a very strong buy-in to consensus decision-making, underscored by guidelines such as Wikipedia:Consensus and Wikipedia:Polling is not a substitute for discussion. Consensus is not the same as majority rule; it signifies that the concerns and views of minorities should be taken into account in the attempt to reach a decision which reflects community values and which most can live with to some extent. Most policies and procedures also develop and become refined in this same manner.

The time taken to reach some decisions is often considered to be outweighed by the wide agreement once decisions are reached. Editorially, article by article, Wikipedia's editing ethos strongly encourages the incorporation of views in a policy-compliant encyclopedic style when they meet content criteria, and the seeking of independent input when consensus is unclear. Even in the event of dispute and escalation, the process remains the same: even Arbitration Committee decisions are based upon communal input, consensus, and transparency.

Escalation processes and dispute resolution

There are a number of escalation processes inherent in the Wikipedia model. Some function autonomously, others are accessible to anybody who notes a concern.

Autonomous escalation includes, as a simple example, the fact that repeated vandalism of an article will tend to gather attention from more editors, who will begin to specifically watch that article for changes, or who will add it to their anti-vandalism software (if in use) to flag every edit as needing checking.

Articles in good order and lacking obvious problems also have a comprehensive review system, in this case one which obtains communal input and addresses quality and standards compliance, including quality-based peer review at progressively higher standards.

Other editor-instigated escalation processes include the entirety of the dispute resolution process (e.g., Request for comment and Request for formal arbitration). Editorial decisions such as page deletions likewise have fine-grained policies and escalation processes, with speedy deletion for obvious nonsense and proposed deletion ("prodding") for almost-certain violations; these can be escalated into the full communal review system of Articles for deletion, in which articles and their justifications are discussed communally for up to a week in order to reach consensus on their treatment.

As well as editors' own pages, pages such as the Administrators' noticeboard/Incidents are used to report problems and current situations to interested users in general, and serve as a noticeboard for developments worth watching.

An arbitration committee sits at the top of all editorial and editor conduct disputes.[5] Its members are elected in three regularly rotated tranches by an established inquiry and decision-making process in which all regular editors can equally participate.

Edit monitoring and software facilitation

Wikipedia:List of Wikipedians by number of edits gives some statistics on editorial involvement. However, this page only counts edits made by the 3 million or so editors; it does not reflect editors' monitoring of articles and edits in cases where no correction was deemed necessary.

Reputable editors who decide to monitor recent edits more seriously will often use software such as VandalProof, a program written for Wikipedia by AmiDaniel, as well as functionality that automatically flags changes by known problem editors; with these tools they can watch hundreds of recent edits in "real time" as they happen. Other corrections, such as fixing bad links, typographic and spelling errors, identifying unused fair-use images, and reverting some forms of vandalism, are handled automatically by bots, automated programs written by Wikipedians and operated with authorisation. There are also large user groups dedicated to the rapid reversal of vandalism, such as the Recent changes patrol and the Counter-Vandalism Unit.
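The kind of routine correction delegated to bots can be sketched with a tiny rule table: a list of known typo patterns applied mechanically to article text. This is a deliberately simplified illustration; real Wikipedia bots are far more careful, respecting context, quotations, and opt-out templates.

```python
import re

# A hypothetical bot-style fixer: each rule is a whole-word pattern and
# its replacement. The rule list here is an invented example.
TYPO_RULES = [
    (re.compile(r"\brecieve\b"), "receive"),
    (re.compile(r"\bteh\b"), "the"),
]

def bot_fix(text):
    """Apply each whole-word typo rule in turn; return corrected text."""
    for pattern, replacement in TYPO_RULES:
        text = pattern.sub(replacement, text)
    return text

print(bot_fix("the editor will recieve teh warning."))
# → the editor will receive the warning.
```

The word-boundary anchors (`\b`) are what keep such a bot from corrupting words that merely contain the pattern, one of many safeguards automated editing requires.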

These systems are often near-immediate. For example, the article on the United Kingdom, vandalised at 06:55 on 10 January 2007, was detected and repaired by AntiVandalBot within the same minute.

Other tools and user groups focussing on monitoring edits as they happen or subsequently, are listed at: Category:Wikipedia counter-vandalism tools.

Blocking and protection systems

A variety of timed and untimed controls for blocking problematic editors and protecting pages from poor editors are accessible within the Wikipedia software. These can intelligently filter out combinations of accounts, IPs or named users, and protect pages from IP, new or non-established editors.

They are used to enforce both short and long term blocking decisions, and to lock pages and deter vandalism, as necessary, if lesser steps seem to be inappropriate.
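The way blocks and page protection combine can be illustrated with a simple model: an edit goes through only if the editor is not blocked and also meets the page's protection level. The names and rank values below are assumptions for illustration, not MediaWiki internals.

```python
# Hypothetical protection levels and editor groups, ranked numerically.
PROTECTION_RANK = {"none": 0, "semi": 1, "full": 2}
EDITOR_RANK = {"ip": 0, "new": 0, "established": 1, "admin": 2}

def may_edit(editor, blocked_users, page_protection):
    """Return True if this editor may edit a page at this protection level."""
    if editor["name"] in blocked_users:
        return False  # blocks apply regardless of page settings
    return EDITOR_RANK[editor["group"]] >= PROTECTION_RANK[page_protection]

blocked = {"Arnold19"}
print(may_edit({"name": "Amos Han", "group": "established"}, blocked, "semi"))  # True
print(may_edit({"name": "Arnold19", "group": "established"}, blocked, "none"))  # False
print(may_edit({"name": "NewUser", "group": "new"}, blocked, "semi"))           # False
```

Note how semi-protection filters out new and unregistered accounts while leaving established editors untouched, matching the "fine-grained" character described above.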

Tagging of information

Articles and individual facts can also be brought to others' attention by means of a wide range of inline and article tags, used to flag individual statements and citations, or articles as a whole, to request checking or citation, and to indicate to other editors and readers that a fact or presentation is unsupported or questionable as it stands. A number of editors deliberately look for such tagged articles to work on them. For example: Category:Articles needing expert attention, and the assistance with neutrality user-group.
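Because these tags are plain templates inside the wikitext, they can also be found mechanically, which is how tag-driven worklists are possible. The sketch below (the tag list and sample text are invented for illustration) scans a page's wikitext for common inline cleanup templates.

```python
import re

# Match a few well-known inline cleanup templates, e.g.
# {{citation needed|date=January 2007}} or {{dubious}}.
TAG_PATTERN = re.compile(
    r"\{\{(citation needed|dubious|clarify)[^}]*\}\}", re.IGNORECASE
)

def find_tags(wikitext):
    """Return the (lower-cased) names of cleanup tags present in wikitext."""
    return [m.group(1).lower() for m in TAG_PATTERN.finditer(wikitext)]

sample = ("The drug cures everything.{{citation needed|date=January 2007}} "
          "It was invented in 1850.{{dubious}}")
print(find_tags(sample))   # ['citation needed', 'dubious']
```

A crawl of such matches across many pages is essentially what populates maintenance categories like Category:Articles needing expert attention.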

Effects of control systems

The average time to revert a bad edit is usually a few minutes on most articles, and if an article is hit with repeated vandalism then more editors will tend to notice and start to actively watch the article to reduce the risk of recurrence (or "lock" it if that becomes necessary).

Popular articles (especially on current affairs) might get hundreds of edits a day, and be reviewed by dozens of editors out of the several hundred thousand on Wikipedia. This degree of watchfulness around the clock makes it hard for vandalism to get established in most articles.

Types of access

There are various permissions within the MediaWiki software allowing users to perform various communal functions. The most commonly known of these are:

1. Any editor, whether with an account or otherwise. Editors are encouraged to be bold and become involved at all levels. In the early days of Wikipedia all editors acted as administrators, and in principle they are encouraged to act with similar responsibility today as well.
2. Administrators (also known as 'admins' or 'sysops') are users trusted with a range of Wikipedia's blocking, deletion and protection tools, to review and close various forms of discussion and to enforce rulings and policies. Any editor in good standing with a strong track record of experience can be nominated for adminship, a process based upon communal approval by editors at large, in which any established editor may express an opinion. Significantly, administrators do not have a 'privileged voice' or overriding status in any editorial matter. Unless an actual administrative issue arises, administrators edit like any other user. Respect is not gained as a result of being an administrator; rather, being an administrator is a result of respect gained, combined with a wish to undertake more responsibility for janitorial tasks.
3. Bureaucrats are administrators who are entrusted to effect the decision of the community in appointing and removing administrators, and in granting and removing bot rights on the advice of the Bot Approvals Group. They have few other additional rights.
4. CheckUser and Oversight access relate, respectively, to the examination of users' activities when sock-puppetry is suspected, and to situations where access to certain historic versions of pages must be removed for legal and safety purposes. As these roles require a high level of trust, they are granted only to users approved by the Arbitration Committee or by direct appointment, and their actions are subject to regular monitoring.

Individual editors' power to control and correct poor editorship

A typical case inquired about by a reader: a vandalistic edit was added to the article Global warming on January 7, 2007 (UTC). It was noticed and reported by a reader, but when the reader went to check it again, it seemed to have vanished.

In this case study, the reader had noticed vandalism added by user Arnold19 at 04:55, January 7, 2007 (UTC). The vandalism had been reversed by Raymond arritt at 05:11, 16 minutes later. A vandalism warning was separately added to Arnold19's user talk page at User_talk:Arnold19 just three minutes later, at 05:14, by another user, Amos Han, who also spotted it. By the time the original reader sought to quote it in their vandalism report, the vandalism had already been fully removed and the user warned, by two separate people.

One can actually see the "differences" of those two edits, known as "diffs", here (vandalistic edit) and here (fixing edit), which highlight the changes made in the vandalistic edit, and in the rectifying edit, respectively. These diffs are the authoritative version of "who changed what with which edit". If there is ever any question of bad editorship, one will see people requesting (or citing) "diffs" as evidence of who did what to an article.

In the two diffs linked, it can be seen that the vandalistic text was added in the first edit and then removed in the second.

The editing history of an article, and the list of edits to date, can be looked up by any user by clicking the "history" tab at the top of the article page. Clicking "diff" next to any edit will show the details of the changes made at that time, old text on the left, new text on the right.
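A diff of this kind can be produced with standard tools. The sketch below uses Python's difflib to show, line by line, what changed between two revisions; the revision text is invented for illustration, not the actual Global warming edit.

```python
import difflib

# Two hypothetical revisions of an article, as lists of lines.
old_revision = ["Global warming is the rise in average temperature.",
                "It is measured over long periods."]
new_revision = ["Global warming is the rise in average temperature.",
                "THIS ARTICLE IS RUBBISH!!!",
                "It is measured over long periods."]

# unified_diff marks inserted lines with "+" and removed lines with "-",
# which is the same information a wiki diff page presents side by side.
for line in difflib.unified_diff(old_revision, new_revision,
                                 fromfile="before", tofile="after",
                                 lineterm=""):
    print(line)
```

The vandalism appears as a single `+` line; reverting is just applying the diff in reverse, which is why a bad edit can be undone at the click of a button.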

All users have a "watchlist". It's a way to keep an eye on articles in which they are interested: it lists changes to those articles, and editors can add and remove articles from it according to their own interests.

Since this form of editorial control is probably not much seen outside electronic collaboration systems, the following pages contain further information and resources: About Wikipedia, Researching with Wikipedia, and Reliability of Wikipedia may all be useful. For dealing with vandalism, see Wikipedia:Vandalism. For editing Wikipedia oneself to fix obvious vandalism and errors, see the section Contributing to Wikipedia on the 'About' page.

Note that editors are encouraged to fix errors themselves; if a mistake is made, however, other more experienced users will usually step in to help fix it if the original editor does not.

Editorial quality review and article improvement

As well as systems to catch and control low quality contributions, Wikipedia also has a variety of positive systems for article review and improvement. Examples of the processes involved include:

  • Quality-based peer review, where editors who have not been involved in the article are invited to review and comment upon its quality, balance, readability, citation of sources, and other policy-compliance and content issues.
  • Wikipedia:Good articles - a system whereby articles can be rated and broadly established as being of reasonable quality, while being commented upon by independent review.
  • Wikipedia:Featured articles - a rigorous review of articles which are desired to meet the highest standards and showcase Wikipedia's capability to produce high quality work.

Additionally, specific types of article or fields often have their own specialized and comprehensive supervisory projects (such as the WikiProject on Military History), assessment processes (such as biographical article assessment), or are the subject of specific focus under projects such as the Neutrality Project, or covered under editorial drives by user groups such as the Cleanup Taskforce.

Examples

Some examples of Wikipedia's editorial control system at work:

  1. AFD ('Articles for Deletion') discussions, in which editors of all views can examine an article critically to discuss (independent of subject matter) whether it is policy compliant, or should be removed for failure to meet content criteria. AFD:Liza Wright, AFD:Stephanie Sarkis.
  2. An article talk page discussion, to which anyone may contribute, in which interested editors consider a question from the point of view of best practice.
  3. A talk page on an aspect of a technical subject, illustrating specialist and non-specialist editors working together to develop an article that is both technically accurate and also useful to lay-readers.
  4. A talk page on a technical subject, showing how editors lacking relevant technical skills can competently understand and improve an article.
  5. A current vandalism alert posted onto the administrators noticeboard.
  6. An arbitration committee review of an editorial dispute on the Elvis article, and the associated discussion by committee members.
  7. An example of fine grain page protection - the About Wikipedia page is semi-protected to prevent edits by new or inexperienced editors but allow edits by established editors.
  8. Examples of the tags which the Wikipedia software allows editors to add to articles.

See also

For dealing with vandalism see Wikipedia:Vandalism.
For editing Wikipedia yourself to fix obvious vandalism and errors, see Wikipedia:Contributing to Wikipedia.

References

  1. ^ Fernanda B. Viégas; Martin Wattenberg; Kushal Dave (2004). "History flow: results". IBM Collaborative User Experience Research Group. Retrieved July 7, 2016.
  2. ^ Because once material is added, a major role of editors is to locate and adjudicate the value of sources, faithfully summarize the differing views, and review uncertainties dispassionately and logically together -- all of which are types of analytic skill. Further, most articles of a technical nature have at least some editors with specialised knowledge watching for errors of principle.
  3. ^ Joi Ito, "Wikipedia attacked by ignorant reporter", Joi Ito's Web, August 29, 2004.
  4. ^ Peter Binkley, "Wikipedia Grows Up", Feliciter 52 (2006), no. 2, 59–61.
  5. ^ The founder of Wikipedia is the sole individual empowered to override this process, but has stated in public that extreme circumstances aside, he will not do so. In 2007 he added that he will consider himself bound in the event of a ruling of the Arbitration Committee.[verification needed]