Wikipedia:Wikipedia Signpost/2020-08-02/In focus
WikiLoop DoubleCheck, reviewing edits made easy
- Macruzbar formerly worked for the Wikimedia Foundation as the communications lead in the Community Engagement department. She now works in Google's Open Source Programs Office, where she leads the community engagement program.
An authority, according to Clay Shirky, "is a person or institution who has a process for lowering the likelihood that they are wrong to acceptably low levels." Do you want to develop into such an authority? Do you want to review other people's edits, using scores from ORES, while helping to improve the ORES edit quality prediction model? WikiLoop DoubleCheck helps you do this by making the peer review of Wikipedia edits a collaborative effort that anyone, including unregistered users, can join. It is an open-source, crowd-sourced counter-vandalism tool for Wikipedia and Wikidata. WikiLoop DoubleCheck is built on web technology and can be launched quickly from a desktop or mobile browser without installing any software. Its goal is to lower the barriers for editors who want to help patrol Wikipedia revisions.
What is WikiLoop?
WikiLoop is an umbrella program for a series of technical projects intended to contribute datasets and editor tools from the tech industry back to the open knowledge world. The program was originally conceived as a virtuous circle: providing data and tools that enhance human editors' productivity, and making Wikipedia's editorial input more machine-readable for the open knowledge institutions, academics, and researchers interested in advancing machine-learning technology. It originated at Google as the missing link in that data loop: the Knowledge Graph depends on healthy open knowledge sources. This is why the program focuses its efforts on editor tools that can improve the content quality of Wikipedia.
Learn more about the program on its page on Meta. You can also try the tool now, and leave feedback or comments on the tool's talk page on English Wikipedia.
The encyclopedia that anyone can review
WikiLoop DoubleCheck (WLDC) works on a different premise than tools like STiki and Huggle, both of which require rollback permission to use. WLDC intends to move to a tiered trust model: just as Wikipedia aspires to be the encyclopedia that anyone can edit, with permissions granted to editors based on account seniority and editing activity, WikiLoop DoubleCheck explores how to let everyone review and label a revision with their opinion, while allowing higher-tiered (trusted) editors, such as admins or those with WP:Rollback permission, to take faster and more powerful actions (e.g., direct revert) within the tool. Anonymous users and less-experienced (not-yet-trusted) editors can review and act with lower risk, gradually building up their reputation as they use the tool.
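To make the tiering concrete, here is a minimal sketch in TypeScript of how such a trust model could gate actions by reviewer standing. This is an invented illustration, not WLDC's actual code: the tier thresholds, action names, and function are all hypothetical.

```typescript
// Hypothetical sketch of a tiered trust model like the one described above.
// Tier thresholds and action names are invented for illustration; they are
// not taken from the WikiLoop DoubleCheck codebase.

type Action = "label" | "flag-for-review" | "direct-revert";

interface Reviewer {
  username: string | null; // null for anonymous users
  accountAgeDays: number;
  editCount: number;
  hasRollback: boolean;
}

// Everyone can label a revision; trusted editors unlock stronger actions.
function allowedActions(r: Reviewer): Action[] {
  const actions: Action[] = ["label"];
  if (r.username !== null && r.accountAgeDays >= 30 && r.editCount >= 100) {
    actions.push("flag-for-review");
  }
  if (r.hasRollback) {
    actions.push("direct-revert");
  }
  return actions;
}

// Example: an anonymous visitor can only label a revision.
console.log(
  allowedActions({ username: null, accountAgeDays: 0, editCount: 0, hasRollback: false })
); // -> ["label"]
```

The design point is that low-risk actions (labeling) are open to all, while destructive ones (direct revert) stay behind an existing on-wiki permission.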
Using DoubleCheck also helps to improve the ORES prediction models. While the tool displays scores from ORES and from other anti-vandalism tools such as STiki and Huggle, there is also a feedback loop: the tags editors add in the tool are sent back through a route called JADE, improving the machine-learning model with each reviewed revision.
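For the curious, ORES exposes its edit-quality models through a public web API, and a reviewing tool can fetch damaging/goodfaith scores for any revision. The sketch below follows the publicly documented ORES v3 endpoint; the revision ID in the usage example is arbitrary, and error handling is simplified.

```typescript
// Minimal sketch: fetch ORES edit-quality scores for a revision, roughly the
// way a reviewing tool might. Uses the public ORES v3 API
// (https://ores.wikimedia.org); response handling is simplified.

interface OresScore {
  prediction: boolean;
  probability: { true: number; false: number };
}

async function fetchEditQuality(revid: number): Promise<void> {
  const url =
    `https://ores.wikimedia.org/v3/scores/enwiki/?models=damaging|goodfaith&revids=${revid}`;
  const resp = await fetch(url);
  const data = await resp.json();

  // The v3 response nests scores under wiki -> scores -> revid -> model.
  const scores = data.enwiki.scores[String(revid)];
  const damaging: OresScore = scores.damaging.score;
  const goodfaith: OresScore = scores.goodfaith.score;

  console.log(
    `revision ${revid}: ` +
      `P(damaging)=${damaging.probability.true.toFixed(2)}, ` +
      `P(goodfaith)=${goodfaith.probability.true.toFixed(2)}`
  );
}

// Example usage (any real English Wikipedia revision ID works here):
fetchEditQuality(963456789).catch(console.error);
```

A high damaging probability combined with a low goodfaith probability is the pattern a patroller would typically prioritize for review.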
Visit the WikiLoop DoubleCheck web app to start reviewing content on Wikipedia. For the direct-revert feature, which is available to more experienced editors, visit DoubleCheck on WMF Labs, hosted on the Wikimedia Foundation's Cloud VPS.
Building WikiLoop DoubleCheck together
WikiLoop DoubleCheck didn't always have that name. About a year ago, when a prototype of the tool was launched and shared with the English Wikipedia community, several editors raised concerns about its original name: Battlefield. With new name ideas from users Sadas, Xinbenlv, ElanHR, Nizil Shah, ToBeFree, Nick Moyes, and others, followed by a community vote, the tool was recently renamed DoubleCheck.
If you would like to get involved and contribute to WikiLoop DoubleCheck, here are two things you can do:
- Translate the user interface by editing its messages on GitHub, for Arabic and other languages.
- Become a code contributor on GitHub: https://github.com/google/wikiloop-doublecheck
Discuss this story
Initial reactions
It sounds like a useful tool, but sorry to say, the article is rather incomprehensible for a layman. A dense combination of PR babble with techtalk. Taking it seriously, I have re-read it 3 times but could not make heads or tails of it: how can I use it, and how specifically will it allow me to improve Wikipedia? If the author wishes, I can comment on the text nearly line by line, but I have to be sure that I was heard; otherwise I'd rather waste my time on something equally useless, such as writing up something like "Administrative-command system" nobody seems to care about :-) Staszek Lem (talk) 22:38, 2 August 2020 (UTC)
I noticed the introduction mentions ORES' article quality model, but from reading the whole piece it seems it instead uses ORES' edit quality prediction models? The latter is what predicts reverts and bad faith edits (depending on the model), whereas the former predicts article quality classes (such as the English Wikipedia's content assessment ratings). Cheers, Nettrom (talk) 02:56, 4 August 2020 (UTC)
"ORES scores Considered Harmful"
This seems to be another interface for recent changes. I tried it a couple of times. The first time, the ORES prediction was wrong, saying an edit was bad faith when it wasn't. The second time, it was some sort of Wikidata change, which was incomprehensible. What makes the tool useless for me is that there's no context or filter – it's just a stream of arbitrary, random changes. As it takes time to digest the context for each change, this is not efficient. Only button-pushing gnomes are likely to use this, and the result seems likely to be low value-added. Andrew🐉(talk) 20:22, 4 August 2020 (UTC)
"Rat race" against bots
During prolonged usage, several times when I clicked "revert" I arrived at a page showing that someone else had already done it. I do not mind if some quicker-minded Wikipedian beats me to the punch, but I hate the idea of competing with artificial intelligences :) Why don't you filter the feed through the existing anti-vandal 'bots before pushing it to the live meat? That way I waste less of my editing time. Staszek Lem (talk) 17:31, 4 August 2020 (UTC)
"WMF" part of tool unreachable
Xinbenlv: As of this moment, the link you provided for the version of the tool for trusted users is unreachable. Asaf (WMF) (talk) 02:16, 12 August 2020 (UTC)