Wikipedia:Wikipedia Signpost/2012-03-26/Recent research

A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, edited jointly with the Wikimedia Research Committee and republished as the Wikimedia Research Newsletter.
Revision as of 18:17, 26 March 2012
Coercion or empowerment? Moderation of content in Wikipedia as ‘essentially contested’ bureaucratic rules
In this article, to be published in Ethics and Information Technology, Paul B. de Laat analyzes the debates in the English, German and French Wikipedias about the evolution of the rules governing new edits.[1]

As Butler et al. (2008) explain in their analysis of the English Wikipedia's rules,[2] these rules are numerous, increasing in number and complexity, and range from the more formal and explicit (such as intellectual property rights) to the more informal.

De Laat's work is based on a study of the discussions around the proposal to introduce a system of reviewing edits before they appear on screen (flagged revisions). It spotlights the permanent debate around the construction of knowledge commons, as theorized by Elinor Ostrom:[3] being a collective, open project, Wikipedia must be accessible to as many people as possible, but as its production becomes important to its "owners" (readers and producers), boundaries have to be set to protect its integrity. De Laat's article describes and analyzes the tensions and the permanent adjustments needed to manage these apparently opposed goals.
How editors evaluate each other: effects of status and similarity
A team of social computing researchers based at Stanford and Cornell University studied how users evaluate each other in social media.[4] The paper, presented at the 5th ACM Web Search and Data Mining Conference (WSDM '12), focuses on three main case studies: Wikipedia, StackOverflow and Epinions. User-to-user evaluations, the authors note, are jointly influenced by the properties of the evaluator and the target; as a result, we should expect differences in properties between the target and the evaluator to affect the evaluation. The study looks specifically at how differences in topic expertise and status affect peer evaluations.

The Wikipedia case focuses on requests for adminship (RfA), the most prominent example of peer evaluation in Wikipedia and a topic that has attracted considerable attention in the literature (see previous coverage in the research newsletter: September 2011, October 2011, January 2012). Similarity is measured based on article co-authorship, and status as a function of an editor's number of contributions. Previous research by the same authors showed that the probability that the evaluator provides a positive evaluation of the target user drops dramatically when the status of the two users is very similar, and there is general evidence that homophily and similarity in editing activity have a strong influence on peer evaluation in RfAs. The study identifies two effects that jointly account for this singular finding:
- “Elite” or high-status users are more likely to participate in evaluations of other users who are active in their areas of interest or expertise.
- Low-status users tend to be judged differently than those with moderate or high status.
In a direct application of these results, dubbed ballot-blind prediction, the authors show how the outcome of an RfA election can be accurately predicted by a model that simply considers the first few participants in a discussion and their attributes, without looking at their actual evaluations of the target.
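The similarity and status measures described above, and the ballot-blind idea of predicting from who shows up rather than how they vote, can be illustrated with a toy sketch. The function names, the Jaccard formula for co-authorship overlap, and the weighting scheme below are all illustrative assumptions, not the authors' actual model.

```python
# Toy illustration of similarity/status-based peer evaluation.
# All names, weights and thresholds are hypothetical, not from the paper.

def jaccard_similarity(articles_a, articles_b):
    """Similarity as overlap between the sets of articles two editors touched."""
    a, b = set(articles_a), set(articles_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def ballot_blind_score(early_voters, candidate, baseline=0.5, weight=0.3):
    """Predict an RfA outcome from WHO participates early, ignoring their votes.

    High-status editors with overlapping interests showing up is treated
    here as a (hypothetical) positive signal; low-status ones as negative.
    """
    score = baseline
    for voter in early_voters:
        sim = jaccard_similarity(voter["articles"], candidate["articles"])
        status = voter["edit_count"] / (voter["edit_count"] + candidate["edit_count"])
        score += weight * sim * (status - 0.5)
    return max(0.0, min(1.0, score))

candidate = {"articles": ["A", "B", "C"], "edit_count": 3000}
voters = [
    {"articles": ["A", "B"], "edit_count": 20000},
    {"articles": ["C", "D"], "edit_count": 500},
]
print(round(jaccard_similarity(["A", "B"], ["A", "B", "C"]), 2))  # 0.67
print(ballot_blind_score(voters, candidate))
```

The actual study fits such a model on real RfA data; the point of the sketch is only that the predictor never reads the evaluations themselves.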
Dialog acts in Simple English Wikipedia talk pages

In a paper published by the European chapter of the Association for Computational Linguistics,[5] User:Oliver.ferschke and coauthors describe a study of talk pages on the Simple English Wikipedia. The paper uses speech act theory and dialog acts as a theoretical framework for studying how authors use discussion pages to collaborate on article improvement. The authors have released a freely downloadable corpus of 100 segmented and annotated talk pages, called the Simple English Wikipedia Discussion Corpus, based on a new annotation schema for coordination-related dialog acts. Their schema uses 17 categories, grouped into four top-level categories: Article Criticism, Explicit Performative Announce, Information Content, and Interpersonal. The authors use their corpus to develop a machine-learning-based UIMA pipeline for dialog act classification, which they describe but which is not freely available.
They also provide a useful discussion of conversational implicature theory and good pointers to seminal and recent research on dialog acts. A longer, editable summary is available on AcaWiki.
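The classification task can be illustrated with a minimal keyword-matching sketch. The label names follow the paper's four top-level categories, but the keyword lists and scoring below are invented for illustration; the authors' actual system is a trained machine-learning pipeline, not a rule-based one.

```python
# Minimal keyword-based sketch of dialog-act labeling for talk-page turns.
# Category names follow the paper's four top-level groups; the keywords
# and the scoring rule are illustrative, not the authors' classifier.

KEYWORDS = {
    "Article Criticism": ["unsourced", "biased", "unclear", "wrong"],
    "Explicit Performative Announce": ["i will", "i have added", "i removed"],
    "Information Content": ["according to", "the source says", "reference"],
    "Interpersonal": ["thanks", "agree", "please", "welcome"],
}

def label_turn(text):
    """Return the top-level label whose keywords occur most often in the turn."""
    lowered = text.lower()
    scores = {
        label: sum(lowered.count(kw) for kw in kws)
        for label, kws in KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    # Fall back to a default label when nothing matches at all.
    return best if scores[best] > 0 else "Interpersonal"

print(label_turn("This paragraph is unsourced and biased."))  # Article Criticism
print(label_turn("Thanks, I agree with the change."))         # Interpersonal
```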
Briefly
Other recent publications that could not be covered in detail are listed in the references below,[6][7][8][9][10][11][12][13][14][15][16][17] among them a paper on building a biomedical semantic network in Wikipedia with the Gene Wiki project and semantic wiki links (Template:SWL).[18]
Two short papers to be presented at a workshop titled "Searching 4 Fun!", collocated with the upcoming European Conference on Information Retrieval, concern Wikipedia. "Serendipitous Browsing: Stumbling through Wikipedia"[19] examines which Wikipedia articles are being featured to users of the social bookmarking site StumbleUpon. Based on a sample consisting of a random selection of half of the articles from the October 2011 dump, 15.13% of the articles of the English Wikipedia are contained in StumbleUpon's index (as opposed to less than 1% of both the French and the German Wikipedia, according to an initial investigation). The 100 articles with the most views by StumbleUpon users contained only one featured article, but twelve lists, among them the top-ranked entry, the list of unusual deaths, which belongs to the "Bizarre/Oddities" category on StumbleUpon, as do four other of the top ten articles. A second position paper, "Searching Wikipedia: Learning the Why, the How, and the Role Played by Emotion",[20] proposes to examine users' search behavior employing diary studies and a custom-built Firefox extension asking Wikipedia readers to "record details about his information need and the motivating situation surround the search".
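The StumbleUpon coverage figure (15.13% of a random half of the dump found in the index) is a straightforward sample-based estimate, sketched below. The data, the lookup function, and the numbers here are made up for illustration; only the sampling procedure mirrors the study's description.

```python
import random

# Illustrative sketch of estimating index coverage from a random sample,
# as described in the StumbleUpon study (all data here is synthetic).

def estimate_coverage(all_articles, is_indexed, sample_fraction=0.5, seed=42):
    """Sample a fraction of articles and report the share found in the index."""
    rng = random.Random(seed)
    k = int(len(all_articles) * sample_fraction)
    sample = rng.sample(all_articles, k)
    hits = sum(1 for title in sample if is_indexed(title))
    return hits / len(sample)

articles = [f"Article_{i}" for i in range(1000)]
indexed = set(articles[:150])  # pretend roughly 15% are in the index
coverage = estimate_coverage(articles, lambda t: t in indexed)
print(f"{coverage:.1%}")  # close to 15%, up to sampling noise
```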
References
- ^ de Laat, P. B. (2012). Coercion or empowerment? Moderation of content in Wikipedia as ‘essentially contested’ bureaucratic rules. Ethics and Information Technology, to be published. PDF: http://www.springerlink.com/content/1k7h3t03507011l3/fulltext.pdf
- ^ Butler, B., Joyce, E., & Pike, J. (2008). Don't look now, but we've created a bureaucracy: the nature and roles of policies and rules in Wikipedia. Proceedings of the twenty-sixth annual SIGCHI conference on Human factors in computing systems. New York, NY, USA: ACM. PDF: http://hci.uma.pt/courses/socialweb08F/5/butler.pdf
- ^ Hess, C., & Ostrom, E. (2006). A framework for analyzing the knowledge commons. In Hess, C., & Ostrom, E. (Eds.), Understanding Knowledge as a Commons: From Theory to Practice (pp. 41-81).
- ^ Anderson, A., Huttenlocher, D., Kleinberg, J., & Leskovec, J. (2012). Effects of user similarity in social media. Proceedings of the fifth ACM international conference on Web search and data mining - WSDM '12 (p. 703). New York, New York, USA: ACM Press. DOI • PDF
- ^ Ferschke, O., Gurevych, I., & Chebotar, Y. (2012). Behind the Article: Recognizing Dialog Acts in Wikipedia Talk Pages. Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2012). PDF
- ^ Britt, B. C. (2011). System-level motivating factors for collaboration on Wikipedia: A longitudinal network analysis. Thesis, Purdue University. HTML
- ^ Tzekou, P., Stamou, S., Kirtsis, N., & Zotos, N. (2011). Quality assessment of Wikipedia external links. PDF
- ^ Laat, P. B. (2012). Coercion or empowerment? Moderation of content in Wikipedia as 'essentially contested' bureaucratic rules. Ethics and Information Technology, 1-13. Springer Netherlands. DOI
- ^ Ferschke, O., Gurevych, I., & Chebotar, Y. (2012). Behind the Article: Recognizing Dialog Acts in Wikipedia Talk Pages. Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2012). PDF
- ^ Morsey, M., Lehmann, J., Auer, S., Stadler, C., & Hellmann, S. (2012). DBpedia and the Live Extraction of Structured Data from Wikipedia. Program: Electronic library and information systems, 46(2), 2. Emerald Group Publishing Limited. PDF
- ^ Okoli, C., Mehdi, M., Mesgari, M., Nielsen, F. Å., & Lanamäki, A. (2012). The people's encyclopedia under the gaze of the sages: A systematic review of scholarly research on Wikipedia. SSRN eLibrary. SSRN. HTML
- ^ Hall, M. M., Clough, P. D., Lopez de Lacalle, O., Soroa, A., & Agirre, E. (2012). Enabling the Discovery of Digital Cultural Heritage Objects through Wikipedia. PDF
- ^ Knight, C., & Pryke, S. (2012). Wikipedia and the University, a case study. Teaching in Higher Education, 1-11. Routledge. DOI
- ^ Florin, F., Fung, H., Halfaker, A., Keyes, O., & Taraborelli, D. (2012). Helping readers improve Wikipedia: First results from Article Feedback v5. Wikimedia Foundation blog. HTML
- ^ Goodwin, D. (2012). Bing, Not Google, Favors Wikipedia More Often in Search Results. Search Engine Watch. HTML
- ^ Priem, J., Piwowar, H. A., & Hemminger, B. H. (2012). Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact. ArXiV. PDF
- ^ http://www.zerogeography.net/2012/03/few-days-ago-i-blogged-about-map-that-i.html
- ^ Good, B. M., Clarke, E. L., Loguercio, S., & Su, A. I. (2012). Building a biomedical semantic network in Wikipedia with Semantic Wiki Links. Database: The Journal of Biological Databases and Curation, 2012. DOI
- ^ Hauff, C., & Houben, G.-J. (2012). Serendipitous Browsing: Stumbling through Wikipedia. In D. Elsweiler, M. L. Wilson, & M. Harvey (Eds.), Proceedings of the “Searching 4 Fun!” workshop collocated with the annual European Conference on Information Retrieval (ECIR2012) Barcelona, Spain, April 1, 2012. (pp. 21-24)
- ^ Knäusl, H. (2012). Searching Wikipedia: Learning the Why, the How, and the Role Played by Emotion. In D. Elsweiler, M. L. Wilson, & M. Harvey (Eds.), Proceedings of the “Searching 4 Fun!” workshop collocated with the annual European Conference on Information Retrieval (ECIR2012) Barcelona, Spain, April 1, 2012. (pp. 14-15).