Wikipedia:Censorship and official point of view


Censorship on Wikipedia is formally against policy, and in fact the English Wikipedia frequently resists external requests to remove information. This is often taken as an indication that the site is free, crowdsourced, and offers a neutral point of view to which any editor can contribute. However, Wikipedia has developed an extensive hierarchy in which those with higher-level permissions can act on their own initiative to exclude contributors and contributions, without needing to defend these actions under any stated policy. As a result, certain governments and individuals have broad power to exclude unwanted information by means that are little known to the reader and immune to contradiction or criticism by most editors.

Mechanisms

Edit filters

The first level of censorship, still relatively infrequent, is automated through the use of an "abuse filter". Ostensibly created as a method of trapping spam and common forms of vandalism, the abuse filter is indeed frequently used for such legitimate ends. However, some abuse filters exist whose contents are not viewable by ordinary editors, and they are presently being used to defeat the addition of content to the encyclopedia. For example, the name of the current ISIS hostage threatened during the murder of Steven Sotloff is apparently subject to such a filter,[1] and edits containing it are likely to be removed shortly afterward, even though it is very widely covered in the media. (The filter is not accessible to ordinary users, and the logs do not reveal its use.) Edit filters can be set to notify editors, who then remove the content, often citing unrelated policy reasons, using WP:POPUPS and WP:Twinkle.[Verify] They can also be used to reject the edit automatically; if the editor then tries to repeat the edit several times, it may be classed as "long-term abuse". The filters can also automatically block editors from further contributions.
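For illustration only, the sketch below models the general idea of a private filter: added text is matched against a hidden pattern, and a matching edit triggers actions such as tagging, warning, or disallowing. It is written in Python for readability; the real AbuseFilter extension uses its own rule language, and the pattern, action names, and function shown here are hypothetical.

import re

# Toy model of a private edit filter. The pattern, actions, and function
# name are illustrative; this is not the AbuseFilter rule language.
FILTER = {
    "description": "Disallow additions containing a suppressed name",
    "pattern": re.compile(r"\bExample Name\b", re.IGNORECASE),  # placeholder pattern
    "actions": ["tag", "warn", "disallow"],  # outcomes applied when the rule matches
    "private": True,  # rule text hidden from ordinary editors
}

def check_edit(added_text: str, filter_def: dict) -> list[str]:
    """Return the list of actions this edit would trigger, or an empty list."""
    if filter_def["pattern"].search(added_text):
        return list(filter_def["actions"])
    return []

actions = check_edit("According to reports, Example Name appears in the video.", FILTER)
print(actions)  # ['tag', 'warn', 'disallow'] -- the edit would be rejected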

Reverts

Removal of information usually starts with the humble revert. Performed by ordinary editors, this involves no special power, and when employed by those with no special pull it may lead only to embarrassment. Nonetheless, it can be effective for a time,[2] and often serves as the basis for further action by establishing an "edit war", which gives administrators the opportunity to decide how content will be presented.

Automated reversion

In addition to the semi-automated reverts with WP:POPUPS and WP:Twinkle, bots are given the opportunity to revert contributions that appear problematic. In the case of the David S. Rohde kidnapping, User:XLinkBot was used to revert several contributions based on the presence of a typepad.com link among them.[3][4][5]
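A minimal sketch of the kind of check such a bot might perform follows, assuming a simple revert-on-sight domain list. The domain set, regular expression, and function name are illustrative assumptions, not XLinkBot's actual configuration or code.

import re
from urllib.parse import urlparse

# Hypothetical revert-on-sight list; the real bot's list is maintained on-wiki.
REVERT_DOMAINS = {"typepad.com"}

URL_RE = re.compile(r"https?://[^\s\]\}]+")

def should_revert(added_text: str) -> bool:
    """Revert if any external link added by the edit resolves to a listed domain."""
    for url in URL_RE.findall(added_text):
        host = urlparse(url).hostname or ""
        if any(host == d or host.endswith("." + d) for d in REVERT_DOMAINS):
            return True
    return False

print(should_revert("See http://example.typepad.com/post on the kidnapping."))  # True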

Page protection and Pending Changes

WP:Page protection and WP:Pending Changes are the workhorses of Wikipedia administrators, used to prevent short-term editors from changing an article or to prevent all editing of it. Page protection is nominally applied to stop further editing without regard for the version current at the time, but this is very rarely how it works in practice; often admins will edit the page to their liking after preventing anyone else from editing. Protecting a page in the right version frequently serves to establish the all-important "status quo": since anyone can claim that WP:consensus has not been reached, and no action is taken where there is no consensus, this usually gives the admin the ability to determine how the page will read even after protection ends.
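For illustration, a toy model of protection levels is sketched below: each level names the user groups still able to edit. The group names follow common MediaWiki conventions, but the mapping and the permission check are a simplification assumed for this example, not the software's actual configuration.

# Toy model: which user groups can still edit a page at each protection level.
PROTECTION_LEVELS = {
    "none": set(),                        # anyone may edit
    "semi": {"autoconfirmed", "sysop"},   # new and unregistered users locked out
    "full": {"sysop"},                    # only administrators may edit
}

def can_edit(user_groups: set[str], protection: str) -> bool:
    """An empty requirement set means open editing; otherwise require a shared group."""
    required = PROTECTION_LEVELS[protection]
    return not required or bool(user_groups & required)

print(can_edit({"autoconfirmed"}, "semi"))  # True
print(can_edit({"autoconfirmed"}, "full"))  # False -- only admins can change the protected "status quo"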

Pending Changes adds a layer of "reviewers" who manually examine edits on behalf of admins; the reviewers' permission to do so is granted and revoked according to their compliance with both formal and undeclared policy.

Blocks and bans

Nothing serves so well as the practice of blocking to demonstrate that censorship is not merely a priority, but the priority, at Wikipedia. Simply adding a link from an actress to a porn movie she made can result in very long-term sanctions if the right subject has asked the right people to keep it suppressed, no matter how prominently the material appears in her search results. Because the same administrators who protect articles make ban decisions, the threat of banning greatly increases the reach of changes that they make under article protection.

Revision deletion and "oversight"

WP:Revision deletion and WP:Oversight are two separate mechanisms developed to conceal portions of an article's history. With little regard for the formal limits on their usage, they are not infrequently used to conceal material which is not obviously problematic, such as the name of the hostage mentioned above, which is in all the news, removed from a vandal edit of Template:User wikipedia/Oversighter. The membership of the "oversight team" can be obtained from a list of software permissions,[6] but there is little clarity on who is involved in a given oversight decision or how to appeal one.
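To illustrate the general mechanism, the sketch below models per-revision hiding as a bitfield in which each flag conceals one facet of a revision, loosely mirroring MediaWiki's rev_deleted field; the names, values, and visibility rule here are an approximation for illustration, not the software's actual implementation.

# Approximate model of per-revision hiding flags.
DELETED_TEXT = 1        # hide the revision text
DELETED_COMMENT = 2     # hide the edit summary
DELETED_USER = 4        # hide the editor's name
DELETED_RESTRICTED = 8  # suppression: hidden even from ordinary admins

def text_visible_to(flags: int, is_admin: bool, is_oversighter: bool) -> bool:
    """Can this viewer see the revision text under the given flags?"""
    if not flags & DELETED_TEXT:
        return True
    if flags & DELETED_RESTRICTED:
        return is_oversighter
    return is_admin

rev_flags = DELETED_TEXT | DELETED_USER | DELETED_RESTRICTED
print(text_visible_to(rev_flags, is_admin=True, is_oversighter=False))   # False -- suppressed even from admins
print(text_visible_to(rev_flags, is_admin=False, is_oversighter=True))   # True -- only the oversight team can see it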

Office actions

In theory, WP:Office is the one acknowledged form of censorship, restricted to cases in which Wikimedia executives and counsel find compelling legal reason. In practice, any decision by WMF employees appears to have the same status as a formal office action; as with the imposition of the Media Viewer, such decisions are beyond evaluation or negation by the community. WMF personnel have recently granted themselves a power of "superprotection" to overrule admins hierarchically, but in the first month this was not observed in use on specific articles.