Wikipedia:Why is Wikipedia losing contributors - Thinking about remedies
|This unofficial guidance essay contains the comments and advice of one or more Wikipedia contributors. It is not a Wikipedia policy or guideline, though it may be consulted for assistance. It may contain opinions shared by few or no other editors; a potential measure of how the community views this essay may be gained by consulting the history and talk pages, and checking What links here.|
|This page is related to preservation of Wikipedia itself.|
|This is a constructive page: the aim here is remedies, not criticism.|
|This idea is in the brainstorming stage. Feel free to add new ideas; improve, clarify and classify the ideas already here; and discuss the merits of these ideas on the talk page.|
Contents
- 1 Introduction
- 2 Examples of problems
- 3 Possible solutions
- 3.1 Slowing down
- 3.2 User dashboards
- 3.3 Channel deletionists
- 3.4 Users vs. Admins
- 3.5 Software innovation
- 3.6 Automated user conduct handling
- 3.7 WP:OR/WP:RS and common sense
- 3.8 Create a better introductory system
- 3.9 Facts vs. Truth, Inclusionism vs. Deletionism
- 3.10 Article arbiter: all are equal but some more equal than others
- 4 From Wikipedia Signpost
- 5 Articles of interest
- 6 See also
- 7 References
Wikipedia has worldwide influence: "Dr. Wikipedia," for instance, is now the number-one source for healthcare information. Wikipedia is regularly among the ten most visited websites. It has become the arbiter of who is who, and what is what.
Yet Wikipedia is losing editors. While the number of internet users continues to grow, the number of new editors on Wikipedia is falling. This may be a long-term risk for Wikipedia. This page looks at why Wikipedia is losing contributors, and what possible remedies could help.
Please contribute to this page...
- Try, whenever possible, to integrate your comment into an existing section
- Keep it as short as possible, but not too short!
- Pointing out a problem is only half the work - you are also invited to suggest a solution!
- Add first-person comments to the talk page
- Add general comments to the main page
- GOAL 1: let's try to keep cool and objective/impersonal
- GOAL 2: a simple, not-too-long, structured article is much more readable
- GOAL 3: let's find multiple solutions for each problem
- GOAL 4: the best solutions are described by short use cases
- Think about the big picture, but try to make specific suggestions which can be realistically implemented by the existing community.
Who contributes to Wikipedia?
Contributions come from diverse demographic and ethnographic segments:
What motivates contribution?
Several motivations lead people to contribute:
Examples of problems
Interaction-related problems discourage new editors.
- Too many editors are flamed, new edits automatically reverted...
- There is no real recourse against incivility (only endless RfC's)
- I want to be free to call someone X when he deserves it!
- There are aggressive users, ready to bite me at every step I take...
- There are frustrated users looking for somebody to witch-hunt or stalk...
- Technically difficult
- How can I possibly correct the error in this table if I can't see it?
- Why do we need to learn a markup in 2012?
- Shouldn't signing posts be done automatically like on news sites?
- Confusing policies
- Editing is too stressful...
- There's too much to learn and too many guidelines to read!
- I tried but no one would help me...
- There are users who love to apply any existing WP:* rule against me or against the article, and who are proud to think of themselves as defensor fidei
- Some articles are too biased! WP is slowly turning into a newspaper/a blog/...
Lack of collaboration and "lone wolf" culture repels new editors
The MediaWiki platform, at present, is fair to poor at supporting collaboration, and moribund WikiProject corpses are increasingly blocking the path forward toward a new paradigm for a more collaborative Wikipedia. WikiProject History's collaboration of the month is listed as October...October 2007, from the previous decade.
Some attempts to revive teamwork and restart collaboration that have never been answered by anyone:
We must put ourselves in the shoes of a brand-new, yet talented, editor approaching en.wikipedia to collaborate on an article rewrite. Among editors, especially newer editors, WikiProjects create the impression that collaboration is ongoing when it often isn't. This impression helps prevent new blood from launching new collaborations, stifling the collaborative environment that improves articles, fosters peace and understanding, and retains talented writers. The absence of a cordial, supportive, collaborative platform hurts retention, and I think it's fair to say this absence often leaves behind a caustic "lone wolf" culture that can repel women from the project.
The Wikimedia team has understood for years that, in order to close the startling gender gap, editing must become a much more social experience. The "whittling down" each year of our pool of talented writers is the greatest threat to Wikipedia: we must increase the number of editors, especially expert editors, to be able to fix the sprawling hellscape of weak, inaccurate and incomplete articles that drag down the project (especially in the social sciences and humanities, as User:Sue_Gardner correctly pointed out at the 2011 Wikipedia in Higher Education summit). Failing to retain good people is the greatest danger to en.wikipedia's success; though edit warring gets more attention, WP:DONTBEADICK and dispute resolution are crucial to the extent they affect editor retention. We really need to keep good editors around, and I believe a more social, collaborative platform would go a long way toward that goal. We must also transition to what the Wikimedia Strategic Plan to 2015 foresees as "topical groups" based on editing interests.
The philosophy of deletionism has the side-effect of killing too much good along with the bad. Nobody gets excited to join a project when they write up something meaningful only to see someone scrap it all, then use "dispute resolution" to bring them to a halt.
There is a risk of editors ignoring WP:NOR or WP:BLP when taking material out of an article because they "don't think it's important", or right, based on a three-minute thumbnail knowledge of the topic, without regard to the sources which do think it's important.
Deletionists often don't fix things. They revert your entire edit, and may write many paragraphs into a talk or discussion page.
On the other hand, overenthusiastic fans often bloat articles with trivia and minutiae, often poorly researched and cited, if cited at all, that are of no use to the general public and best belong on fan sites. If deletionism keeps those kinds of editors away, it may be a good thing. What gets deleted, in most cases, is content that fails to live up to core policies such as no original research, neutral point of view, and reliable-source verification.
Abuse of power (mainly by Deletionists)
Wikipedia editors can hold a few types of power over newbies:
- detailed knowledge of Wikipedia guidelines, procedures and biases - Everyone knows intuitively what an encyclopedia is -- it's a place to look up information -- but people should not be smacked down for being confused about all the things listed at Wikipedia:What Wikipedia is not, or oddities like its inclusion of Pokémon but exclusion of most academics, or the peculiar guidelines of Wikipedia:Biographies of living persons. There's still too steep a learning curve for newbies to get the Wikipedia approach. There should be at least 100 templates or article guides for various types of topics. It's not uncommon for multiple policies to be tossed into deletion discussions, and the editor has no chance to defend their article without taking a crash course in Wikipedia war tactics.
- understanding of the conflict style, which is semi-legalese with a punchy "alpha-male" style of writing - Effective Wikipedia arguing in AfD or other discussions is very curt. As in a courtroom, the arguments become quite mathematical/logical, and the subjective/qualitative feel of the discussion is often missing. Politeness and courtesy are often missing too, or they are added insincerely ("thanks for your good-faith edit, but I just deleted it per WP:XYZ").
- patience and persistence - Newbie or casual editors just want to help out; it's easy to out-wait an editor if you spend two weeks dedicated to getting something deleted or changed. The perception of another editor was "Deletionists can't be bargained with. They can't be reasoned with. They don't feel pity, or remorse, or fear. And they absolutely will not stop, ever, until your contributions are reverted."
- administrator rights - 1,545 editors are actually administrators - but I don't think admin abuse is that common, nor is it the focus of this discussion.
- bots - Bots or Firefox extensions create an imbalance because it can take someone 30-60 minutes to work on new content, but only 1-3 seconds for it to be tagged for deletion or other problems. This makes it too easy for fly-by tagging (e.g., WP:user warning templates) that alienates editors. If tagging were more difficult, editors would only tag if they really meant it.
- friends - You will (eventually) make friends on Wikipedia and you will (eventually) find it very hard to rule against them even when they deserve it.
Facts & Truth
At the very center of Wikipedia is a rejection of thousands of years of philosophy holding that truth has an independent existence. Without that basic principle at the center, any notions of "knowledge," "authority," and "reference" fall apart. And they collapse not just in the airy abstract, but in the gritty day-to-day. Most "edit wars" here are between people who value facts, and others who do not. Wikipedia's organizing principle holds that fact and truth are whatever a consensus of editors says they are. That is how Wikipedia works, both philosophically and practically. The result is that all editors are equal, except those who have somehow obtained administrative rights, in which case they are equal to other administrators; that an editor or administrator need not be familiar with a subject to edit or administer an article; and that all issues will be decided by consensus, which by definition at Wikipedia is correct.
Any expert or authority on a topic immediately learns that appeals to fact are meaningless. If a majority at Wikipedia decided that 2 + 2 = 5, then that's what Wikipedia will publish. Similarly, if a majority decides that "4" shall not be mentioned, then it will not be mentioned. On Wikipedia, fact and truth have no validity in themselves. This means that they are subject to political determination, i.e., the process of obtaining agreement. Those who believe that 2 + 2 = 4 will need to lobby for 4. Wikipedia has an exhaustive set of rules that purport to guide such discussions, but those rules are routinely ignored. For instance, debates will rage within an article over whether a bit of information is "notable" enough for inclusion. Wikipedia's clear rule is that "NOTABILITY" applies only to whether an assertion has significant coverage in verifiable sources, not to whether or not it is true.
Editorial Equality is Unrealistic, Perhaps Unsustainable
Wikipedia regards all editors as equal. Editors enjoy equal ownership of contributions, which is to say: none. Articles enjoy equal permanence, which is to say: none. What a PhD contributed today a high school kid reverts tomorrow, or next week, or next year.
An editor who has worked hard develops a proprietary interest, becomes protective of his work. This is human nature, this is how we behave in the real world--never mind Wikipedia's "no ownership" ideal. An editor who knows something about a subject is infuriated to find a know-nothing undoing his contribution. And that good editor is battling not only the know-nothings of today, but the never-ending future crops of know-nothings. He can only protect a good article for so long; there comes a time when finally all editors do become equal, the only genuine equality, the equality of the grave.
So good editors after some experience of know-nothing reversions, and abuse, and watching yesterday's perfect page become today's shambles, decide it just is not worth the effort. Editorial equality; universal transience--the policy drives away experienced editors, by the very experience of it.
It is not a policy that afflicted the print encyclopedias of yesteryear. They did not make the mistake of assuming all contributors, and contributions, are equal. They vetted very carefully. And it was prestigious to be invited to contribute. Perhaps it is time for Wikipedia to create a mechanism for separating the wheat from the chaff.
A presiding judge keeps order in his court with the contempt power, an instant strike of his gavel. There is no judge keeping order in a Wikipedia discussion. With all editors having equal power (which is to say: none) to stop abuse, naysaying, and escalating hostility, there is no effective recourse, especially not against wily editors who have learned what they can get away with. Reasonable, well-meaning editors flee, driven away; unreasonable, power-seeking editors concentrate, like an acid. As good editors leave, bad ones multiply; new editors find savagery, a deadly acid bath awaits them, and a death spiral develops. Wikipedia withers.
Any solutions need to acknowledge that Wikipedia, for all its flaws, is unparalleled as a huge, productive crowdsourcing machine. But can the machine be tweaked? Any tweaks must work for the vast majority of contributors, support the motivations people have to contribute, and reduce the turnoffs.
Wikipedia could evolve to simplify and automate some user interaction and avoid the most common sources of stress. This is consistent with Wikipedia's main goal, since editors are not here to run a social forum.
Most edit warring takes place within a short time; the worst behaviour emerges from the very short editing life cycle. Users able to commit to the same article only once an hour/a day/... would be encouraged towards first-think-then-post. An intermediate sandbox tool could be very helpful. For example, Linux operating systems provide a full per-user view of the system; similarly, everybody could commit whenever they want to an intermediate sandbox (and view it immediately as user X). Once an hour/a day/... the system would merge the intermediate sandbox with a simple algorithm: edits to distinct sections are merged immediately; edits to a common section follow a policy (first in, first out / last in, first out / ...).
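The merge step described above could be sketched roughly like this. This is a toy model, not existing MediaWiki code: the function name, the (user, section, text) tuple shape, and the two policies are illustrative assumptions.

```python
from collections import defaultdict

def merge_sandbox_edits(pending_edits, policy="FIFO"):
    """Merge queued sandbox edits into one revision, section by section.

    pending_edits: list of (user, section, new_text) tuples, in the order
    they were committed to the sandbox. Edits to distinct sections all
    survive; when several edits touch the same section, a simple policy
    picks the winner.
    """
    by_section = defaultdict(list)
    for user, section, text in pending_edits:
        by_section[section].append((user, text))

    merged = {}
    for section, edits in by_section.items():
        if policy == "FIFO":        # first committed edit wins
            merged[section] = edits[0]
        elif policy == "LIFO":      # last committed edit wins
            merged[section] = edits[-1]
        else:
            raise ValueError(f"unknown policy: {policy}")
    return merged

# Two users touch "History"; one touches "Plot".
edits = [
    ("alice", "History", "alice's history text"),
    ("bob",   "Plot",    "bob's plot text"),
    ("carol", "History", "carol's history text"),
]
print(merge_sandbox_edits(edits, "FIFO"))
```

A real implementation would merge at the wikitext-diff level rather than replacing whole sections, but the section-level model shows why non-overlapping edits never conflict under this scheme.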
People love dashboards. e.g., Google Analytics, Bitly stats, and various Twitter statistics.
Currently, only experts can find their way through the maze of wikilinks to statistics, like these.
WikiMedia could recruit interface designers to come up with ideas for key graphs and statistics (this could be crowdsourced; there would be an outpouring of designs), and also improve the speed of data from toolserver.org. The tools chosen would need to be computationally realistic.
Statistics should include smart breakdowns, perhaps including how many reverts are successful or undisputed. We'd have to think about the metrics -- but set it up so that what's measured and readily viewed directly from a user's user page supported constructive work.
This could include a visualization of the results of a rating system in the edit history itself (from user Adjwilley in the discussion). For example, there could be little thumbs up buttons, so when you see an edit you like in the edit history, you hit the button, which records your action. It would be a means of giving some validation to editors who go around making good edits.
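The thumbs-up idea above could work something like the following toy model. The class and method names are invented for illustration; nothing like this exists in MediaWiki today, and the one-vote-per-editor rule is an assumption about how such a feature might prevent gaming.

```python
from collections import Counter

class EditHistory:
    """Toy model of per-edit 'thumbs up' validation in the edit history."""

    def __init__(self):
        self.thumbs = Counter()   # revision id -> number of thumbs up
        self.voters = set()       # (voter, revision) pairs already counted

    def thumb_up(self, voter, revision_id):
        """Record one endorsement; each voter may endorse an edit only once."""
        if (voter, revision_id) in self.voters:
            return False
        self.voters.add((voter, revision_id))
        self.thumbs[revision_id] += 1
        return True

    def score(self, revision_id):
        """The validation count shown next to the edit in the history."""
        return self.thumbs[revision_id]
```

The point of the sketch is the data model: a per-revision counter, surfaced in the history view, gives editors who make good edits the small, visible validation the proposal asks for.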
A priori deletionism. Deletionists probably act in good faith, and they play an incredibly important role in keeping spam out of Wikipedia. The problem arises when they become immune to the value of new content, or insensitive to the motivations of other contributors. One type of solution might be to help deletionists differentiate between valuable and reckless edits. This could be done via statistics (see above): deletionists should see a visual overview of how many of their deletions are good (e.g., removing spam) versus unnecessary or harmful.
On the other hand, Wikipedia places a high value on edits containing no original research, a neutral point of view, and reliable-source verification. It may be counter-intuitive to make it harder to delete than to put questionable content in in the first place.
Users vs. Admins
Some users are concerned about a lack of freedom; others are concerned about an excess of it. Freedom, like human nature itself, is something very difficult to investigate. Other wiki parameters are much simpler to talk about:
- There are millions of users and millions of articles
- Users come from different cultures, with different personal/interaction attitudes
- Newbies need to learn a lot - through interaction - before becoming Wikipedians
The learning and interaction process has, over the years, proven to scale well and could perhaps work even with billions of users and billions of articles. This works perfectly from an abstract point of view. Other web platforms also scale well, and become wiki competitors, because of:
- The lack of initial training: newbies are likely to shift towards simpler websites
- The lack of strict user/admin interaction: newbies are not likely to be pleased by a there's somebody watching over you feeling...
Strict human interaction is very likely to provoke rage and conflict. Being prevented by Mr. X from accessing a page is likely to be felt as a dumb decision, a revenge or a misunderstanding. Being prevented by automatic filtering from accessing a page (because you posted a well-known bad link, because you called somebody a 'nazi', ...) is more impersonal. Mr. X is still the one who reads the filter log and applies the ban or penalizes you with -1 AL, but you won't take it as a personal or direct offence.
Too much freedom really tempts problematic users. A lot of people feel powerful being able to open accusations against other users. RfC - a simple and polite invitation to other editors to join a talk-page discussion and review - should be all that users need to refer to. Administrative actions should be requested from, and performed by, admins only. Somebody may feel this is an authoritarian evolution, but it could be the chance to dramatically simplify interaction and attract a lot of new users - users now shifting towards social networks.
Too much freedom also really tempts problematic admins. Some admins may be involved in too strict an editor/admin conflict of interest. Editor and admin should be kept as separate roles, according to a non-cumulative group policy: trusted people willing to help as admins should, after being elected, give up their editor access.
Lack of response
Getting no response. One problem could be that a lot of users get no response when they ask a question on a talk page, ask for feedback on articles they created, or ask for an editor review. This is very discouraging, and creates the impression that their input is not important to the community. In many ways, this is unavoidable for a talk page where there is no active watcher of the article, but in other contexts it could be avoided.
- 1 - Every user is granted the right not to interact with you, even if asked on their talk page, unless you're pointing out some serious misconduct...
- 2 - Every user has the right to ask other users for a review/RfC
- 3 - Discussion about articles taking place on a personal talk page is one-to-one, not many-to-many; other users will not see it and get involved...
- 4 - There are so many bureaucratic pages that users, in their spare time, are not very likely to read the RfC page...
- 5 - Getting rid of all those bureaucratic pages and having editors refer ONLY to a main, categorized RfC page would greatly help interaction...
- 6 - No matter who answers you - getting answered and reviewed is the goal
A big categorized RfC page, and nothing more... - Application 1 : Lack of response
Categories will play a growing role. A single big RfC page is like a message in a bottle thrown into the sea... Editors would "subscribe" to some channels and promptly assist with new RfCs. Editors subscribing to RfC channel X would also be able to help review X-related content, not only a user's conduct or written English style.
Solution: categorized RfC channels + users subscribing RfC channels = you're never alone + no more orphan articles
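The channel-subscription model above is essentially publish/subscribe, and can be sketched in a few lines. The class, channel names, and notification-by-inbox mechanism are all illustrative assumptions, not a description of any existing Wikipedia feature.

```python
from collections import defaultdict

class RfcBoard:
    """Toy categorized RfC board: editors subscribe to channels and are
    notified of every new RfC filed in the channels they follow."""

    def __init__(self):
        self.subscribers = defaultdict(set)   # channel -> set of editors
        self.inbox = defaultdict(list)        # editor -> pending RfC notices

    def subscribe(self, editor, channel):
        self.subscribers[channel].add(editor)

    def file_rfc(self, channel, title):
        """Filing an RfC pushes a notice to every subscriber of its channel,
        so no request sits unanswered on an unwatched page."""
        for editor in self.subscribers[channel]:
            self.inbox[editor].append((channel, title))
```

For example, an editor subscribed to a "history" channel would see every history-related RfC appear in their inbox, which is exactly the "you're never alone" property the solution claims.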
Looking up all the template codes is really time-consuming. Another possibly overlooked reason is the way the wiki software works. Even experienced internet users, when looking at a talk-page discussion, give up and walk away. The interface isn't designed for creating clear threaded discussions, and the text really just runs together. Most of us here are used to it, but it took time. Also, on the page-editing side, people have to remember strange codes and templates in order to do certain things.
Wikipedia being recognized by UNESCO (tenwiki:World Heritage) could be a great chance to invest in the required software innovation!
Moving to new platforms for collaboration
New platforms for collaboration. In order to reverse the troubling trend of editors leaving Wikipedia (i.e. to improve the recruitment and retention of new writers), we need to move beyond moribund WikiProjects to new platforms for collaboration. This is already addressed, in part, by the "strategy:Attracting and retaining participants" portion of the current Wikimedia Strategic Plan, which foresees the introduction of more social/collaborative tools to the wiki, including "Users would be able to join topical groups, based on their editing interests (e.g., '18th century American history')". My proposal is about how we get from where we are now (en.wikipedia, littered with moribund WikiProjects) to where the Strategic Plan takes us.
Your thoughts are appreciated!
Automated user conduct handling
Adaptive granular access control (AC)
Let's imagine unix-like access levels.
- User is given an access level (AL): -1 (=banned), 0 (=just created), ..., 10 (=allowed to edit any non-system page)
- Wikipedia articles are given a protection level (PL): -1 (=editable even by banned users), 0 (=edit with AL >= 0), ... 10 (=edit with AL == 10)
- A user automatically gains +1 AL every X time and Y good edits (not triggering any accusation/dicking/wikilove/... filter)
An adaptive and granular access-control system could handle article protection very effectively - like Linux. Say an article rates PL 7; you've just joined Wikipedia at AL 0, so you can only read it. You take an active part in Wikipedia N times without triggering any filter, and after a week/a month/... you gain an extra point... Normal articles rate PL 0, while disputed articles rate PL 10, so they can be edited only by cool-headed editors.
Human interaction is still guaranteed by an admin who reviews the filter report and decides whether to penalize with -1 AL.
True behavioural redemption and conformity to guidelines are ensured by the time and number of good edits required to regain AL points.
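The AL/PL mechanics can be made concrete with a minimal sketch, assuming illustrative thresholds (the 50-edits-per-promotion value stands in for the unspecified "Y good edits"; the class and method names are invented for this example, not part of any existing system).

```python
class AccessControl:
    """Toy model of the adaptive, unix-like access levels described above."""

    MIN_AL, MAX_AL = -1, 10        # -1 = banned ... 10 = any non-system page
    EDITS_PER_PROMOTION = 50       # assumed stand-in for "Y good edits"

    def __init__(self, access_level=0):
        self.al = access_level     # new accounts start at AL 0
        self.good_edits = 0

    def can_edit(self, protection_level):
        """An article with PL p is editable only by users with AL >= p."""
        return self.al >= protection_level

    def record_good_edit(self):
        """Good edits that trigger no filter slowly raise the access level."""
        self.good_edits += 1
        if self.good_edits % self.EDITS_PER_PROMOTION == 0:
            self.al = min(self.al + 1, self.MAX_AL)

    def penalize(self):
        """An admin reviewing the filter log removes one AL point."""
        self.al = max(self.al - 1, self.MIN_AL)
```

Under this model a brand-new account (AL 0) can edit a normal article (PL 0) but not a disputed one (PL 10), and redemption is automatic: enough clean edits restore a lost level without any further human intervention.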
Adaptive AC - Application 1: auto-protecting articles of a certain level
Articles reaching GA, FA or FL status. IP edits and non-autoconfirmed users should be prevented (using protection) from editing GA/... articles. The major factor that makes this more of a chore than a pleasure is having to wade through edit after edit to find out which is vandalism, which is misguided and which actually provides useful information - even on a locked article, since new users can still affect them.
On GA+ articles, the problem becomes bigger because a group or individual has generally put a lot of effort into these pages including research, writing, sourcing, etc. Then, in the case of something like Transformers (film), a new film in the series comes out and people feel it necessary to change these articles, almost always by bloating plot but in other ways as well.
If these articles were given moderate protection that required newer users to request edits, it would go a long way towards preserving editors' work on what should be the flagship articles of the project, and perhaps lessen the impact that negative editing can have on that work. It would also mean there may be an endpoint to your work: you get an article to a certain status and can trust that further edits are almost certainly positive, made by established and/or knowledgeable users.
Solution: granting articles an adaptive PL: Stubs rate PL 0, ... FAs rate PL 10
Having a "work in progress" status - and approval before publication for new editors
I have a little proposal for this whole method of handling articles. Letting anyone publish new articles, and then deleting them if they are not found good enough, seems a bit strange. Many publication services let you submit and edit your work, then get it approved by an editorial panel; if not approved, it keeps the status "work in progress" until you re-submit it for approval. In this way, new authors can go about creating good content without being snapped at, as happens today.
Creating good articles can sometimes be a time-consuming task. There is no way to mark articles as "work in progress, do not list". Having an article in progress, where you yourself decide when it is ready to be evaluated/published, would in general be a good thing. Such articles could be linked for ease of reference when discussing them with the editorial panel, but would not be available for search.
Auto-approval could be a status set by administrators in the user profile, for those who have gained a certain experience level.
WP:OR/WP:RS and common sense
OR/RS and common sense intersect and conflict often. It should be possible to split articles: the first part completely referenced and readable by any inexperienced user; the second part expert-oriented, referring to advanced topics usually taken for granted.
Create a better introductory system
We need a better introduction to editing. While we have several pages designed to help out newcomers, they either are too basic (you can edit this page, see how!) or too detailed (such as Help: Wikipedia: The Missing Manual – a great page, but overly detailed for a casual newbie). The Wikipedia:New contributors' help page, viewed by 500+ people a day, is a particularly bad introduction.
We need a beginner's guide that gives a basic overview of policy as well as formatting and community expectations, written solely to prevent newbies' first edits from being reverted. Most importantly, the page should be made prominent so that new editors will find it before they make their first edit. Perhaps we could link it in bold from the template that pops up when you edit from an IP. When new editors have a basic understanding of our expectations, their edits will be reverted less often and their self-confidence (as a whole) will be boosted. This should (if the recent data on the signpost is correct) lead to a higher retention rate. A page like Wikipedia:Your first article would be good, but written from a broader perspective. Perhaps create a page called Wikipedia:Your first edits? Or revamp and repurpose one of the existing help pages?
Facts vs. Truth, Inclusionism vs. Deletionism
Wikipedia was created according to a few principles, designed like the rules of a complex-adaptive-system game: you put a lot of people together (thousands, millions, ...), you design just a few behavioural guidelines (not even related to principles), and then you start playing. The players (us), after many iterations of the game, will design their own principles as a result of emerging collective behaviour.
The only problem could be a non-homogeneous emerging collective behaviour: this big experiment is clearly split into two secluded macro-areas, inclusionism and deletionism. Inclusionist players are always reluctant to remove anything, because they fear censorship and incompleteness and are not interested in objective truth. Deletionist players are always reluctant to add anything, because they fear losing Wikipedia's public reliability and are not interested in completeness or in extending Wikipedia.
Solution: there are no wrong people, just the right people in the wrong jobs.
Article arbiter: all are equal but some more equal than others
To address the problem that not all editors, or editorial contributions, are really equal - why not allow an article to have a designated Arbiter? That is, one active editor, or maybe three, who by consensus have made a lot of good contributions to it? The Arbiter would have a fixed term of one year, or three years, or whatever. If there is an edit war, the Arbiter would have the power to freeze the page, blocking changes for 24 hours to let things cool, or to block warring editors from contributing to that page - again, just for a specified time. This would have several good effects. It would be a pat on the back for constructive editors. It would help preserve good text; work is no longer flung into the void. Edit warring or incivility would have an instant and effective response, which now it does not. And finally, this is consistent with Wikipedia's general policy of equality, since the Arbiter is appointed by consensus.
From Wikipedia Signpost
There's a ton of outside writing about Wikipedia, much of it related to the challenge of losing contributors. Here are some starting points...
- Wikipedia's gender gap examined further - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-02-14/News_and_notes
- Fine art and Wikimedia - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-07-18/In_the_news
- Meta-research: Trying to survey existing research literature on Wikipedia - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-04-11/Recent_research
- Classifying newbies and veterans as experts, gnomes, vandal fighters or social networkers - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-04-11/Recent_research
- New editors on the English Wikipedia: More warnings and malicious edits, but majority still in good faith - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-05-09/News_and_notes
- Guardian: "Wikipedia wants more contributions from academics" - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-04-04/In_the_news
- "Critical Point of View" book of Wikipedia research published - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-05-09/In_the_news
- U.S. Supreme court hopeful suspected of COI editing - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2010-04-12/In_the_news
- German Wikipedia under fire from inclusionists - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2009-11-09/German_controversy
- tips on Responding to criticism - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2008-06-09/Dispatches
- Is Wikipedia a cult? - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2010-06-14/In_the_news
- Quality of newbie contributions decreases since 2004, but majority still constructive - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-04-18/News_and_notes
- 2010 review of the year - http://en.wikipedia.org/wiki/Wikipedia%3AWikipedia_Signpost/2011-01-03/2010_in_review
- Collaborating with the cultural sector - http://en.wikipedia.org/wiki/Wikipedia%3AWikipedia_Signpost/2011-01-03/2010_in_review
- WikiProject Feminism, and the gender gap. links to two other Signposts. - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-03-07/WikiProject_report
- Anniversary coverage continues - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-01-17/In_the_news
- AEsh case comes to a close - what does the decision tell us? - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-05-09/Arbitration_report
- Newbies and patrollers: "Every now and then a nun or a tourist wanders in front of the rifle sights" - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-02-28/News_and_notes
- The Huggle Experiment: interview with the research team (where do newbies go for help?) - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-08-01/Research_interview
- Citing editor statistics, Foundation presents upcoming product plans - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-03-14/News_and_notes
- Deletion of article about website angers gaming community - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-03-07/Deletion_controversy
- New inclusionist alternative project announced - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2011-01-10/In_the_news
- Inclusionists and deletionists - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2008-01-02/In_the_news
- Deletionists and inclusionists - http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2007-10-15/In_the_news
Articles of interest
- NY Times Article - Here's an interesting article I read today in the NYTimes. It talks about problems with only using printed material as sources. When Knowledge Isn’t Written, Does It Still Count? - Critics of Wikipedia are pushing it to expand beyond the traditional Western model of scholarship and authority: the written word. (originally added by Adjwilley to the discussion for this page.)
- The Atlantic - Wikipedia Adds WikiLove Button in Attempt to Stem Criticism http://www.theatlantic.com/technology/archive/2011/06/wikipedia-adds-wikilove-button-in-attempt-to-stem-criticism/241031/
- http://knol.google.com/k/criticism-of-wikipedia - well cited
- http://knol.google.com/k/wikipedia-criticism-of - lighter
- Wikipedia:Village pump (idea lab)
- Feltman, Rachel (January 21, 2014). "The #1 doctor in the world is Dr. Wikipedia". Quartz. Retrieved 2014-01-23.
- Keith Wagstaff (Jan 20, 2014). "Can you hear him now? Wikipedia's Jimmy Wales gets into the cellphone game". NBC News. Retrieved 2014-01-23.
- Judith Newman (January 8, 2014). "Wikipedia-Mania: Wikipedia, What Does Judith Newman Have to Do to Get a Page?". New York Times. Retrieved 2014-01-23.