Wikipedia:Village pump (proposals)

Yes, I would accept this compromise. It does not really matter in the long run, as long as the WMF will switch the feature off when Wikipedians decide that the local descriptions are adequate. It will be up to Wikipedians to get the descriptions populated. Anyone who wants to get Wikidata descriptions shut down sooner can make it happen by adding more short descriptions. · · · Peter (Southwood) (talk): 18:11, 6 January 2018 (UTC)
It still seems like a waste of time to me. Just use the Wikidata descriptions, and improve them there. Mike Peel (talk) 19:05, 6 January 2018 (UTC)
As we discussed below, the WMF plan is to switch from a Wikidata fallback to full enwiki control when there are 2 million non-blank short descriptions on enwiki, which is roughly comparable to the number of existing descriptions on Wikidata. That will help to ensure that the readers and editors who use these descriptions won't notice a sudden degradation of the feature. -- DannyH (WMF) (talk) 19:16, 6 January 2018 (UTC)


Other discussion



New ideas and proposals are discussed here.


RfC: Populating article descriptions magic word

In late March - early April 2017, Wikipedia:Village pump (proposals)/Archive 138#Rfc: Remove description taken from Wikidata from mobile view of en-WP ended with the WMF declaring[1] "we have decided to turn the wikidata descriptions feature off for enwiki for the time being."

In September 2017, it was found that, through misunderstanding or miscommunication, this feature was only turned off for one subset of cases, but remained active on enwiki elsewhere (in some apps, search results, ...). The effect is that, e.g., for 2 hours this week, everyone who searched for Henry VIII of England or saw it through those apps or in "related pages" or some such got the description "obey hitler"[2] (no idea how many people actually saw this; this Good Article is viewed some 13,000 times a day and is indefinitely semi-protected here precisely to protect against such vandalism).

The discussion about this started in Wikipedia:Village pump (policy)/Archive 137#Wikidata descriptions still used on enwiki and continued mainly on Wikipedia talk:Wikidata/2017 State of affairs (you can find the discussions in Archive 5 up to Archive 12!). In the end, the WMF agreed to create a new magic word (name to be decided), to be implemented if all goes well near the end of February 2018, which will replace the use of the Wikidata descriptions on enwiki in all cases.

We now need to decide two things. Fram (talk) 09:58, 8 December 2017 (UTC)[reply]

How will we populate the magic word with local descriptions?

  1. Initially, copy the Wikidata descriptions by bot
  2. With a bot, use a stripped version of the first sentence of the article (the method described by User:David Eppstein and User:Alsee in Wikipedia talk:Wikidata/2017 State of affairs/Archive 5#Wikipedia descriptions vs Wikidata descriptions)
  3. With a bot, use information from the infobox (e.g. for people a country + occupation combination: "American singer", "Nepali politician", ...); a rough code sketch of options 2 and 3 follows this list
  4. Start with blanks and fill in manually (for all articles, or just for BLPs)
  5. Start with blanks, allowing to fill in manually and/or by bot (bot-filling after successful bot approval per usual procedures)
  6. Other
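To make the bot-based options a little more concrete, here is a minimal sketch (Python, purely for illustration) of what options 2 and 3 could look like. The helper names, the regexes and the example output are assumptions for the purposes of this discussion, not the method worked out in the State of affairs threads; a real bot would need proper wikitext parsing and the bot approval envisaged in option 5.

```python
import re


def strip_first_sentence(wikitext):
    """Option 2 (very rough): derive a candidate short description from the
    first sentence of an article's wikitext."""
    # Crudely drop templates, references, comments and file links.
    text = re.sub(r"\{\{[^{}]*\}\}", "", wikitext)
    text = re.sub(r"<ref[^>]*>.*?</ref>", "", text, flags=re.DOTALL)
    text = re.sub(r"<!--.*?-->", "", text, flags=re.DOTALL)
    text = re.sub(r"\[\[(?:File|Image):[^\]]*\]\]", "", text)
    # Resolve [[target|label]] and [[target]] links to their display text.
    text = re.sub(r"\[\[(?:[^|\]]*\|)?([^\]]+)\]\]", r"\1", text)
    text = text.replace("'''", "").replace("''", "").strip()
    first_sentence = text.split(". ")[0].strip()
    # "X is/was a Y ..." -> "Y ...", the usual shape of a short description.
    match = re.match(r".+?\b(?:is|was|are|were)\b\s+(?:an?\s+|the\s+)?(.+)", first_sentence)
    return (match.group(1) if match else first_sentence).rstrip(".")


def infobox_person_description(country, occupation):
    """Option 3 (trivial case): combine infobox fields, e.g. "American singer"."""
    return "%s %s" % (country, occupation)


if __name__ == "__main__":
    lead = ("'''Henry VIII''' (28 June 1491 - 28 January 1547) was "
            "[[King of England]] from 1509 until his death.")
    print(strip_first_sentence(lead))                         # King of England from 1509 until his death
    print(infobox_person_description("American", "singer"))   # American singer
```

As the discussion below makes clear, the difficulty with option 3 is not this formatting step but knowing for which articles the infobox actually contains usable fields.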

Discussion on initial population

  • #5 – allows bot operations for larger or smaller sets of articles per criteria that don't have to be decided all at once, and manual overrides at all times. --Francis Schonken (talk) 10:28, 8 December 2017 (UTC)[reply]
  • #5 is my preference following the reasoning below:
    Option 1, copying from Wikidata, will populate Wikipedia with a lot of really bad descriptions, which will remain until someone gets around to fixing them. My initial rough estimates are that there are more bad/nonexistent Wikidata descriptions than good ones. I strongly oppose this option unless and until someone comes up with solid data indicating that it will be a net gain.
    Option 2, extracting a useful description from the first sentence or paragraph, seems a nice idea at first glance, but how will it be done? Does anyone promoting this option have a good idea of how effective it would be, how long it would take, and whether it would on average produce better descriptions than option 1? This option should be considered unsuitable until some evidence is provided that it is reasonably practicable and will do more good than harm.
    Option 3, copying from the infobox, may work for some of the articles that actually have an infobox with a useful short description, or components that can be assembled into one. This may work for a useful subset of articles, but it is not yet known how many. I would guess well under half, so not a good primary option.
    Option 4, start with blanks and fill in manually, is probably the only thing that can be done for a large proportion of articles, my guess being on the order of half. It will have to be done, and is probably the de facto default. It is easy, quick and will do no harm. It is totally compatible with option 5, for which it is the first step.
    Option 5 is starting with option 4 and applying ad hoc local solutions which can be shown to be useful. Any harm is localised, Wikidata descriptions can be used when they are appropriate, extracts from leads can be used when appropriate, mashups from infoboxes can be used when appropriate, and manual input from people who actually know what the article is about can be used when appropriate. I think there is no better, simpler, and more practical option than this, and suggest that projects should consider how to deal with their articles. WPSCUBA already has manually entered short descriptions ready for use for more than half of its articles, which I provided as an experiment. It is fairly time consuming, but gets easier with practice. Some editors may find that this is a fun project, others will not, and there will inevitably be conflicts, which I suggest should be managed by BRD as simple content disagreements, to be discussed on talk pages and finalised by consensus. In effect, option 5 is the wiki way. It is simple and flexible, and likely to produce the best results with the least amount of damage. · · · Peter (Southwood) (talk): 11:26, 8 December 2017 (UTC)[reply]
    (There was an edit conflict here and I chose to group all my comments together · · · Peter (Southwood) (talk): 11:26, 8 December 2017 (UTC))[reply]
  • #5. Whether a Wikidata description is suitable or not is very different across many groups of articles. It should be decided (and possibly bot populated) per group, sometimes per small group, and for that we need to start from blank descriptions.--Ymblanter (talk) 11:23, 8 December 2017 (UTC)[reply]
  • Start with not using it anywhere, only use it as override per situation. —TheDJ (talkcontribs) 12:09, 8 December 2017 (UTC)[reply]
    TheDJ, To clarify, is this an Option 6: Other that you are proposing here? i.e. Only add the magic word to articles where the Wikidata short description is unsuitable, and use Wikidata description as default in all cases until someone finds a problem and adds a magic word, after which the short description will be taken from the magic word? If this is the case, what is your opinion on reverting to Wikidata description for any reason at a later date? · · · Peter (Southwood) (talk): 14:53, 8 December 2017 (UTC)[reply]
    @Pbsouthwood: Correct. I have no opinions on reverting at a later moment. —TheDJ (talkcontribs) 13:42, 11 December 2017 (UTC)[reply]
  • #5. That doesn't deal with all the issues, but it comes closest to my views, given the choices. See also my comments at WT:Wikidata/2017 State of affairs/Archive 12 and WT:Wikidata/2017 State of affairs. - Dank (push to talk) 14:06, 8 December 2017 (UTC)[reply]
    • Peter asked me for clarification. If people have specific questions, or if they want a summary of my previous posts, I'll do my best to answer. - Dank (push to talk) 15:30, 8 December 2017 (UTC)[reply]
  • #5 but don't wait too long to fill in where possible. Fram (talk) 14:14, 8 December 2017 (UTC)[reply]
    Fram Are you recommending a massive short term drive to produce short descriptions to make the system useful? · · · Peter (Southwood) (talk): 15:09, 8 December 2017 (UTC)[reply]
    • Yes, although this isn't in my view necessary to proceed with this, only preferable. No descriptions is better than the current situation, but decent enwiki-based descriptions are in many cases better than no descriptions. No need to throw out the baby (descriptions to be shown in search and so on) with the bathwater. Fram (talk) 15:21, 8 December 2017 (UTC)[reply]
  • #6 - don't use it. There has been no consensus to have this magic word in the first place - that is the question that should have been asked in this RfC (see discussion here). I personally think it is a bad idea and a waste of developer time. It's better to focus on improving the descriptions on Wikidata instead. Mike Peel (talk) 15:27, 8 December 2017 (UTC)[reply]
  • #6 — Find a solution that monitors and updates Wikidata descriptions — If a description is good enough for Wikipedia(ns), it should be on Wikidata. If vandalism is blocked on Wikipedia, it should be simultaneously reverted on Wikidata. Wikidata is the hub for interwiki links and a storage site for both descriptions and structured data that then are harvested by external knowledge-based search engines (think Siri, Alexa, and Google's Knowledge Graph). For interwiki purposes, we should want to ensure that short descriptions at Wikidata are accurate, facilitating other language Wikipedias when they interlink to en.wiki. For external harvesting, we should want to prevent vandalism from being propagated. The problems regarding vandalism and sourcing on Wikidata are real, but the solution is for Wikipedians and our anti-vandalism bots to be able to easily monitor and edit the relevant Wikidata material. Possible solutions would include: (a) Implementing a pending changes-like functionality for changes to descriptions on high-traffic or contentious pages; (b) Make changes to short descriptions prominently visible on Wikipedia watchlists, inside the VisualEditor, and as a preference option for Wikipedia editors; (c) Develop and implement in-Wikipedia editing of Wikidata short descriptions using some kind of click-on-this-pencil tool.--Carwil (talk) 15:46, 8 December 2017 (UTC)[reply]
    • After those solutions are implemented, you are free to ask for an RfC to overturn the consensus of the previous RfC which decided not to have these descriptions. This RfC is a discussion to get solutions which give you what you want on enwiki (descriptions in VE, mobile, ...) without interfering in what Wikidata does (they are free to have their own descriptions or to import ours). Fram (talk) 16:01, 8 December 2017 (UTC)[reply]
    Carwil, How do you propose that Wikipedia controls access by vandals to Wikidata? Are you suggesting that Wikipedia admins should be able to protect Wikidata items and block Wikidata users?
    The easy options are… "undo" functionality for Wikidata descriptions in Wikipedia watchlists, and option (a) I proposed above, something like pending-changes that protects pages on Wikipedia from unreviewed changes from Wikidata. Transferring anti-vandalism bots from Wikipedia to Wikidata would also be helpful.--Carwil (talk) 16:47, 8 December 2017 (UTC)[reply]
    Cluebot already runs on Wikidata. ChristianKl (talk) 15:19, 17 December 2017 (UTC)[reply]
  • Strongly oppose 4 and strongly oppose 5—Let's reject any solution that mass-blanks short descriptions: these are a functional part of mobile browsing and of the VisualEditor. As an editor and a teacher who brings students into editing Wikipedia, the latter functionality is a crucial timesaver. Wikipedia is increasingly accessed by mobile devices and short descriptions prevent clicking through to a page only to find it's not the one you are looking for.--Carwil (talk) 15:46, 8 December 2017 (UTC)[reply]
    Have you analysed the overall usefulness of Wikidata descriptions and found that there are more good descriptions than bad, or found a way to find all the bad ones so they can be changed to good? If so please point to your methods and results, as they would be extremely valuable. What methods have you used to indicate the comparative harm done by bad descriptions versus the good done by good descriptions? · · · Peter (Southwood) (talk): 16:14, 8 December 2017 (UTC)[reply]
Yes, I have analyzed a sample here. I found that 13 of 30 had adequate descriptions (though 7 of them could be improved), 13 had no descriptions at all, 1 was incorrectly described (not vandalism), 2 were redundant with the article title (i.e., they should be overridden with a blank), and 1 represented a case where the Wikipedia article and the Wikidata entity were not identical and shouldn't share the same description. The redundant descriptions would cause no harm. Mislabelling "Administrative divisions of Bolivia" with the subheading "administrative territorial entity of Bolivia" would cause mild confusion. The legibility provided by descriptions easily outweighs the harms. (The only compelling harm is due to vandalism, which should be addressed by improving vandalism tools, not by forking the descriptions between the projects.)--Carwil (talk) 16:47, 8 December 2017 (UTC)[reply]
Options 4 and 5 are not to blank anything; they are to put short descriptions, which are text content, into the article they describe, where they can be properly (or at least better) maintained by people who may actually know what the article is about. Wikidata can use them if their terms of use allow, and if they are actually better for Wikidata's purposes, which is by no means clear at present. · · · Peter (Southwood) (talk): 16:14, 8 December 2017 (UTC)[reply]
Options 4 and 5 involve starting with blanks everywhere. The whole proposal assumes that we should fork a dataset describing Wikipedia articles into two independently editable versions. Forking a dataset always creates inconsistencies and reduces the visibility of problems by splitting the number of eyes watching for them. Better to make Wikipedians' eyes more powerful at spotting problems (which are unusual) than to throw up a wall between the two projects. My sample suggests that 90% of the time, or more, the two projects are working towards the same goal here.--Carwil (talk) 16:47, 8 December 2017 (UTC)[reply]
They do, but it is unlikely the WMF will switch until the Wikipedia results are no worse than the Wikidata results, though I have no idea how they would measure that, since they don't seem to have much idea of the quality they will be comparing against, or if they do, are not keen on sharing it.
The dataset does not suit Wikipedia. We should not be forced to use it. A dataset that suits Wikipedia may not suit Wikidata. Should we force it on them? Two datasets means Wikipedia can look after their own, and Wikidata can use what they find useful from it, and Wikipedians are not coerced into editing a project they did not sign up for. Using shitty quality data on Wikipedia to exert pressure on Wikipedians to edit Wikidata may have a backlash that will harm either or both projects, not a risk I would be willing to take, if it could affect my employment, unless of course I was being paid to damage the WMF, but that would be conspiracy theory, and frankly I think it unlikely.
I also did a bit of a survey; my results do not agree with yours, and they are also from such a small sample as to be statistically unreliable. I also wrote short descriptions for about 600 articles in WPSCUBA, but did not keep records. Most (more than half) articles needed a new description, as the Wikidata one either did not exist or was inappropriate. There were some which were perfectly adequate, but less than half of the ones that actually existed, from memory. It would be possible to go back and count, but I think it would be a better use of my time to do new ones, if anyone is willing to join such a project. Maybe Wikiproject Medicine, or Biography, where quality actually may have real life consequences, but I don't usually work much in those fields and hesitate to move into them without some project participation. I have already run into occasional unfriendly reactions where projects overlap, but fortunately very few. · · · Peter (Southwood) (talk): 18:18, 8 December 2017 (UTC)[reply]
  • @Pbsouthwood: I don't think there's much daylight between Wikipedia's purpose for these descriptions (which hasn't been written yet), the value of them for the mobile app, the value of them for the VisualEditor (as disambiguators for making links), and the value for Wikidata as discussed here. There, the requirements include: "a short phrase designed to disambiguate items with the same or similar labels"; avoiding POV, bias, promotion, and controversial claims; and avoiding "information that is likely to change." Only the last one seems likely to differ from the ideal Wikipedia description and only marginally: e.g., "current president of the United States" would have to be replaced with "45th president of the United States."--Carwil (talk) 22:11, 12 December 2017 (UTC)[reply]
  • There have been extensive discussions between community and WMF on the description issue. I wish this RFC had gone through a draft stage before posting. There may be other options or issues that may need to be sorted out, potentially affecting the outcome here. A followup RFC might be needed.
    The previous RFC[3] consensus was clearly to eliminate wikidata-descriptions, and that is definitely my position. An alternate option would be to skip creating a description-keyword at all, and just take the description from the lead sentence. That has the benefits of (1) ensuring all articles automatically have descriptions, (2) avoiding any work to create and maintain descriptions, and (3) avoiding creating a new independent issue of description-vandalism. The downside is that the lead sentence doesn't always make for a great short description.
    If we go with a new description keyword, #5, #2 and #1 are all reasonable. (#3 and #4 are basically redundant to bot approval in #5.) However, as I note in the question below, #5 can be implemented with a temporary wikidata-default. This gives us time to start filling in local-descriptions before the wikidata-descriptions are shut off. This would avoid abruptly blanking descriptions. Alsee (talk) 21:49, 8 December 2017 (UTC)[reply]
  • #2, with #5 as a second preference. The autogenerated descriptions look like they're good enough for most purposes. Sandstein 16:14, 10 December 2017 (UTC)[reply]
    Sandstein, How big was your test sample, and how were the examples chosen? · · · Peter (Southwood) (talk): 16:36, 10 December 2017 (UTC)[reply]
  • 5. Mass-importing WD content defeats the purpose of getting rid of WD descriptions. James (talk/contribs) 16:30, 11 December 2017 (UTC)[reply]
    Only true for a limited period, until someone gets round to changing them where necessary. If the problem is big enough, there will be bot runs to do fixes, so over the medium term it does not make much difference: once the descriptions are in Wikipedia we can fix them as fast as we can make arrangements to do so, and will no longer be handicapped by WP:ICANTHEARTHAT obfuscations from WMF. The important part is to get them where we have the control, so we can start work getting them right. · · · Peter (Southwood) (talk): 07:47, 13 December 2017 (UTC)[reply]
  • 5 Basically what Peter said. In some areas, the wikidata descriptions will be good. In others the first sentence stripping will be good. In some, data from infoboxes can be used. Etcetera. Galobtter (pingó mió) 16:20, 19 December 2017 (UTC)[reply]
  • 5 - Having just read the discussions on this I'm absolutely astounded that so much vandalism has taken place. Anyway, back on point: Wikidata is beyond useless when it comes to dealing with vandalism, and as such 5 is the best way of dealing with it. –Davey2010 Merry Xmas / Happy New Year 23:24, 27 December 2017 (UTC)[reply]
  • Combination of 1 and 5. Import, but keep hidden until reviewed on WP. Doc James (talk · contribs · email) 06:19, 30 December 2017 (UTC)[reply]
  • #6 Retain Wikidata descriptions and bypass only those not needed. Eventually all Wikipedias will have to use Wikidata. Moving back and forth makes not much sense. The only thing we could do is possibly add functions to update Wikidata directly and retain functionality to bypass the magic word locally. -- Magioladitis (talk) 15:56, 30 December 2017 (UTC)[reply]

What to do with blanks

What should we do when there is no magic word, or the magic word has no value?

  1. Show the Wikidata description instead
  2. Show no description
  3. Show no description for a predefined list of cases (lists, disambiguation pages, ...) and the Wikidata one otherwise (this is the solution advocated by User:DannyH (WMF) at the moment)
  4. Other
  5. A transition from #1 to #2. In the initial stage, any article that lacks a local description will continue to draw a description from Wikidata. We deploy the new description keyword and start filling in local descriptions which override Wikidata descriptions. Once we have built a sufficient base of local descriptions, we finalize the transition by switching off Wikidata descriptions completely. (Note: Added 16:34, 6 January 2018 (UTC). Previous discussion participants have been pinged to discuss this new option in subsection Filling in blanks: option #5.)
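Expressed as a decision rule, option #5 amounts to something like the following minimal sketch (Python is used purely for illustration; the real behaviour would live in MediaWiki, and the function name, the flag, and the treatment of an explicitly blank magic word are assumptions, the last of these being exactly what is debated below):

```python
WIKIDATA_FALLBACK_ENABLED = True  # to be switched off once enough local descriptions exist


def description_to_show(local_description, wikidata_description):
    """Option #5 for blanks: prefer the local (magic word) description and
    fall back to the Wikidata one only during the transition period."""
    if local_description is not None and local_description.strip():
        return local_description.strip()      # enwiki text always wins
    # No usable local description. Whether an explicitly blank magic word
    # should also suppress the fallback is one of the disputed points below;
    # this sketch simply follows the option text and falls back.
    if WIKIDATA_FALLBACK_ENABLED:
        return wikidata_description or ""
    return ""                                 # final state: blank stays blank


# Examples (strings are illustrative only):
# description_to_show("Kazakhstani boxer", "obey hitler")  -> "Kazakhstani boxer"
# description_to_show(None, "Kazakhstani boxer")           -> "Kazakhstani boxer" while the fallback is on
```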

Discussion on blanks

  • #2 – comes closest to having no description per initial aborted RfC; those who want them can write them, or fill in automatically (per usual bot approval procedures). --Francis Schonken (talk) 10:28, 8 December 2017 (UTC)[reply]
  • #5 as a reasonable compromise per various discussions below. · · · Peter (Southwood) (talk): 18:24, 6 January 2018 (UTC) #2 The Wikidata description should not be allowed as a default where there is no useful purpose to be served by a short description. An empty parameter to the magic word must be respected as a Wikipedia editorial decision that no short description is wanted. This decision can always be discussed on the talk page. Under no circumstances should WMF force an unwanted short description from Wikidata as a default. Nothing stops anyone from manually adding a description which is also used by Wikidata, but that is a personal decision of the editor and they take personal responsibility as for any other edit. Automatically providing no description for a predefined list of classes has problems, in that those classes may not be as easily defined as some people might like to think. For example, most list articles don't need a short description, but some do. The same may be true for disambiguation pages. Leaving them blank as the first stage and not displaying a short description until a (hopefully competent) editor has added one is easy to manage for the edge cases, and may be managed by other methods per option 5 of population. It is flexible and can deal with all possibilities. There is no need to make it more complicated and liable to break some time. Ideally the magic word could be given a comment in place of a parameter where an explanation of why there should not be a short description would be useful. In this case the comment should not be displayed and is there to inform editors who might wonder if it had been missed. · · · Peter (Southwood) (talk): 11:43, 8 December 2017 (UTC)[reply]
  • #1 - Show the Wikidata description instead. —TheDJ (talkcontribs) 12:10, 8 December 2017 (UTC)[reply]
  • #2. No magic word (and magic word with no parameter) should result in no description, not some non-enwiki data being confusingly shown to readers (while being missed by most vandalism patrollers apparently). Today, for 8 hours, we had this blatant BLP violation on a page with 10,000 pageviews per day. Using these descriptions by default (or at all) is a bad idea, and was rejected at the previous RfC. Fram (talk) 14:21, 8 December 2017 (UTC)[reply]
  • #1 - From the WMF: We're proposing using Wikidata as the fallback default if there isn't a defined magic word on Wikipedia, because short descriptions are useful for readers (on the app in search results, in the top read module, at the top of article pages) and for editors (in the Visual Editor link dialog). For example: in the top read module from September pictured here, 3 of the 5 top articles benefit from having a short description -- I don't know who Gennady Golovkin and Canelo Álvarez are, and having them described as "Kazakhstani boxer" and "Mexican boxer" tells me whether I'm going to be interested in clicking on those. (The answer on that is no, I'm not really a boxing guy.) I know that Mother! is a 2017 film, but I'm sure there are lots of people who would find that article title completely baffling without the description. Clicking through to the full list of top read articles, there are a lot of names that people wouldn't know -- Amber Tamblyn, Arjan Singh, Goran Dragić. This is a really popular feature on the apps, and it would be next to useless without the descriptions.
We want to create the magic word, so that Wikipedia editors have editorial control over the descriptions, which they should. But if the magic word is left blank on Wikipedia -- especially in the cases where Wikipedia editors haven't written a description yet -- then for the vast majority of cases, showing the description from Wikidata is better than not showing anything at all. As a reader looking at that top read module, I want to know who Gennady Golovkin is, and the module should say "Kazakhstani boxer," whether that text comes from Wikipedia or Wikidata.
I know that a big reason why people are concerned about showing the Wikidata descriptions is that the Wikidata community may sometimes be slower than the Wikipedia community to pick up on specific examples of vandalism. The example that Fram cites of Henry VIII of England showing "obey hitler" for two hours is disappointing and frustrating. However, I think that the best solution there should be to improve the community's ability to monitor the short descriptions, so that vandalism or mistakes can be spotted and reverted more quickly. The Wikidata team has been working on providing more granular display in watchlists on Wikipedia, so that Wikipedia editors can see edits to the descriptions for the articles that they're watching, without getting buried by other irrelevant edits made to that Wikidata item. That work is being tracked in this Phabricator ticket -- phab:T90436 -- but I'm not sure what the current status is. Ping for User:Lydia Pintscher (WMDE) -- do you know how this is progressing?
Sorry for only getting back to this now. It slipped through. So we have continued working on improving which changes from Wikidata show up in the recent changes and watchlist here. Specifically we have put a lot of work into scaling the current system, which is a requirement for any further improvements. We have made the changes we are sending smaller, and we have made it so that fewer changes are sent from Wikidata to Wikipedia. We have also rolled out fine-grained usage tracking on more wikis (cawiki, cewiki, kowiki, trwiki) to see how it scales. With fine-grained usage tracking you will no longer see changes in recent changes and watchlist that do not actually affect an article, as happens now. The roll-outs on these wikis so far look promising. In January we will continue rolling it out to more wikis and see if it scales enough for enwiki. At the same time we will talk to various teams at the developer summit in January to brainstorm other ways to make the system scale better or overhaul it. --Lydia Pintscher (WMDE) (talk) 09:31, 19 December 2017 (UTC)[reply]
We've talked in the previous discussions about types of pages where the Wikidata descriptions aren't useful for article display, because they're describing the page itself, rather than the subject of the article. The examples that I know right now are category pages (currently "Wikimedia category page"), disambiguation pages ("Wikimedia disambiguation page"), list pages, and the main page. Those may be helpful in the case of the VE link dialog, especially "disambiguation page", but there's no reason to display those at the top of the article page, where they look redundant and kind of silly. We're proposing that we just filter those out of the article page display, and anywhere else where they're unnecessary. I'd like to know more examples of pages where short descriptions aren't useful, if people know any.
For article pages, I don't know of any examples so far where a blank description would be better for the people who need them (people reading, searching or adding links on VE). If we're going to build the "show a blank description" feature, then we need to talk about specific use cases where that would be the best outcome. That's how product development works -- you don't build a feature, if you don't have any examples for where it would be useful. If people have specific examples, then that would help a lot. -- DannyH (WMF) (talk) 14:58, 8 December 2017 (UTC)[reply]
"For article pages, I don't know of any examples so far where a blank description would be better " Check the two examples of vandalism on pages with 10K+ pageviews per day I gave in this very discussion, including one very blatant BLP violation which lasted for 8 hours today. In these examples, a blank description would have been far preferable over the vandalized one, no? Both articles, by the way, are semi-protected here, so that vandalism couldn't have done by the IPs here (and would very likely have been caught much earlier). "specific use cases where that would be the best outcome." = all articles, and certainly BLPs. Fram (talk) 15:19, 8 December 2017 (UTC)[reply]
[Image caption: "Better than no description?"]
If you want another example of where no description would be preferable over the Wikidata one, look to the right. This is what people who search for WWII (or have it in "related articles", the mobile app, ...) see right now, and have seen for more than 5 hours (it will undoubtedly soon be reverted now that I have posted this here). This kind of thing happens every day, and way too often on some of our most-viewed pages. Fram (talk) 15:38, 8 December 2017 (UTC)[reply]
I agree that the vandalism response rate on Wikidata is sometimes too slow. I think the solution to that is to make that response rate better, by making it easier for Wikipedia editors to monitor and fix vandalism of the descriptions. I disagree that the best solution is to pre-emptively blank descriptions because we know that there's a possibility that they'll be vandalized. I'm asking for specific examples where editors would make the choice to not show a description on the article page, because a blank description is better than the majority of good-to-adequate descriptions already on Wikidata. -- DannyH (WMF) (talk) 16:10, 8 December 2017 (UTC)[reply]
And I am saying that this is a red herring. Firstly, you claim that there exists a majority of good-to-adequate descriptions on Wikidata, without any convincing evidence that this is the case. I am stating that out of several hundred short descriptions that I produced, there were a non-zero number of cases where a short description made no apparent improvement over the article title by itself. · · · Peter (Southwood) (talk): 16:21, 8 December 2017 (UTC)[reply]
DannyH (WMF), Filtering descriptions out of the article page view means that they will be invisible for maintenance, which is very bad, unless they are filtered out based on content, not on page type, which may be technically problematic - you tell me, I don't write filter code. Can you guarantee that no vandalism can sneak through by this route? As long as they are visible anywhere in association with the Wikipedia article they are a Wikipedia editorial issue. · · · Peter (Southwood) (talk): 15:39, 8 December 2017 (UTC)[reply]
We are not asking for a development feature to leave out descriptions that don't exist; it is the simplest possible default. Please try to accept that simply displaying whatever content is in the magic word parameter is the simplest and most versatile solution, and that if we leave it blank, that is because we prefer it to be left blank. If anyone prefers to have a short description in any of these cases, they can edit Wikipedia to put in the one they think is right, and if anyone disagrees strongly enough to want to remove it, they can follow standard procedure for editorial disagreement, which is to get consensus on the talk page. It is not rocket science, it is the Wikipedia way of doing these things. If it is difficult for the magic word to handle a comment in the parameter we can simply put the comment outside. There may be a few more cases where people will fail to notice that it is there, but probably not a train smash. Is there any reason why a comment in the parameter space should not be parsed as equivalent to no description? I have asked this before, and am still waiting for an answer. · · · Peter (Southwood) (talk): 15:39, 8 December 2017 (UTC)[reply]
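For what it's worth, the behaviour Peter asks about is easy to state precisely. The sketch below is purely illustrative: "SHORTDESC" is a placeholder for the then-undecided magic word name and syntax, and nothing here describes the parser that was actually built.

```python
import re


def normalize_description_param(raw):
    """Treat a parameter that contains only an HTML comment (and whitespace)
    as equivalent to no description, e.g. {{SHORTDESC:<!-- deliberately
    blank, see talk -->}}. Placeholder name and syntax, not the real thing."""
    if raw is None:
        return ""
    without_comments = re.sub(r"<!--.*?-->", "", raw, flags=re.DOTALL)
    return without_comments.strip()


# normalize_description_param("<!-- no short description wanted -->")  -> ""
# normalize_description_param("Kazakhstani boxer")                     -> "Kazakhstani boxer"
```

If memory serves, MediaWiki's preprocessor already strips HTML comments when a template or parser-function argument is expanded, which would give this behaviour for free; but that would need checking by whoever implements the magic word.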
  • #1 - and focus on improving the descriptions on Wikidata. Mike Peel (talk) 15:27, 8 December 2017 (UTC)[reply]
    • See discussion at Wikipedia_talk:Wikidata/2017_State_of_affairs#Circular_"sourcing"_on_Wikidata - I've posted a random sample of 1,000 articles and descriptions, of which only 1 description had a typo and none seemed to be blatantly wrong - although 39% don't yet have a description. So let's add those extra descriptions / improve the existing ones, rather than forking the system. Thanks. Mike Peel (talk) 00:14, 12 December 2017 (UTC)[reply]
      • That sample includes the many typical descriptions which are right on Wikidata and useless (or at least very unclear) for the average enwiki reader: "Wikimedia disambiguation page" (what is Wikimedia, shouldn't that be Wikipedia, and even then, I know I'm on Wikipedia, and we don't use "Wikipedia article" as description for standard articles either...) There are also further typos ("British Slavation Army officer"), useless descriptions ("human settlement", can we be slightly more precise please), redundant ones (Shine On (Ralph Stanley album) - "album by Ralph Stanley")... And the basic issue, that language-based issues shouldn't be maintained at Wikidata but at the specific languages, is not "forking", it is taking back content which doesn't belong at Wikidata but at enwiki. Fram (talk) 05:45, 12 December 2017 (UTC)[reply]
    • You may add "Descriptions not in English" to the problems list from that sample: "Engels; schilder; 1919; Londen (Engeland); 1984". Fram (talk) 06:01, 12 December 2017 (UTC)[reply]
    • And "determined sex of an animal or plant. Use Q6581097 for a male human" is not really suitable for use on enwiki either (but presumably perfect for Wikidata). Neeraj Grover murder case - "TV Executive" seems like the wrong description as well. Stefan Terzić - "Team handball" could also use some improvement. Fram (talk) 07:56, 12 December 2017 (UTC)[reply]
      • OK, so maybe 4/1000 have typos/aren't in English/are wrong - that's still not bad. Most of the rest seems to be WP:IDONTLIKEIT (where I'd say WP:SOFIXIT on Wikidata, but you don't want to do that). Yes, it is forking - the descriptions currently only exist on Wikidata (we've never had them on Wikipedia), and they aren't going away because of this - so you want to fork them, and in a way that means the two systems can't later be unforked (due to licensing issues). That's not helpful, particularly in the long term. Mike Peel (talk) 19:58, 12 December 2017 (UTC)[reply]
        • I gave more than 4 examples, some 40% don't have a description (so can hardly be wrong, even if many of those need a description), and many have descriptions we can't or shouldn't use. Basically, you started with 0.1% problem in your view, when it is closer to 50% in reality. Please indicate which licensing issues you see which would make unforking impossible. It seems that these non-issues would then also make it impossible to import the Wikidata descriptions, no? Seems like a red herring to me. By the way, have you ever complained about forking when Wikidata was populated with millions of items from enwiki (and other languages), where from then on they might evolve separately? Or is forking only an issue when it is done from Wikidata to enwiki, and not the reverse? Fram (talk) 22:28, 12 December 2017 (UTC)[reply]
          • Only a few of your example problems seem to be actual problems, the rest are subjective. You're proposing that we switch to 100% without description, so I can't see how you can argue about the 40% blank descriptions (and they weren't a problem at the start of this discussion). I'm not saying 0.1%, but ~1% seems reasonable here. Enwp descriptions are CC-BY-SA licensed, which means they can't be simply copied to Wikidata as that has a CC-0 license (and yes, this isn't great, and copyrighting the simple descriptions doesn't make any sense, but it is what it is) - although that means that we can still copy from Wikidata to here if needed. I'm complaining that we're forking things here to do the same task (describing topics), and that we're trying to do so using the wrong tool (free text with hacks) rather than a better tool (a structured database). Mike Peel (talk) 23:01, 12 December 2017 (UTC)[reply]
            • Ah, the old "structured database" vs "free text with hacks" claim, I wondered why it wasn't mentioned yet. In Wikidata, you are putting free text in a database field, which then at runtime gets read and displayed. In enwiki, you are putting free text in a "magic word" template, which then at runtime gets read and displayed. Pretending that the descriptions in Wikidata aren't free text while in enwiki they are free text is not really convincing. However, what is the wrong tool for the task is Wikidata, as that is not part of the enwiki page history and wikitext, and thus can't be adequately monitored, protected, ... The only "hack" is the current one, using Wikidata to do something enwiki can do better (and which philosophically also belongs on enwiki, as it is language-based text, not some universally accepted value). Fram (talk) 07:53, 13 December 2017 (UTC)[reply]
  • #2, or transition from #1 to #2. I have engaged in significant discussions with the WMF on the descriptions issue on the Wikidata/2017 State of affairs talk page. The WMF has valid concerns about abruptly blanking descriptions, and we should try to cooperate on those concerns. Temporarily letting a blank keyword default to wikidata (#1) will give us time to begin filling empty local descriptions before shutting off wikidata descriptions (#2). But in the long run my position is definitely #2. Alsee (talk) 21:02, 8 December 2017 (UTC) Adding explicit support for #5, which essentially matches my original !vote. Alsee (talk) 16:36, 6 January 2018 (UTC)[reply]
    This could work. While we are filling in short descriptions, whenever we find an article that should not have a short description, we could put in a non-breaking space to override an unnecessary Wikidata description. We will need to see, on desktop too, the actual display shown on mobile, so we can see what we are doing. As long as there is a display of the short description in actual use on desktop, it might be unnecessary to switch. That would reduce the pressure to rush the process, which may be a good thing, but also may not. · · · Peter (Southwood) (talk): 10:12, 9 December 2017 (UTC)[reply]
    Alsee, thanks. I've been staying out of conversations about if/when/how the magic word gets used/populated, because I think those are the content decisions that need to be made by the English WP community. I want to figure out how we can get to the place where Wikipedia editors have proper editorial control over the short descriptions, without hurting the experience of the readers and editors who are using those descriptions now. -- DannyH (WMF) (talk) 23:29, 11 December 2017 (UTC)[reply]
You can enable a view of the Q-code, short description and alias via this script: [4].--Carwil (talk) 13:01, 9 December 2017 (UTC)[reply]
Carwil, This is exactly the kind of display I had in mind. It is easily visible, but obviously not part of the article per se, as it is displayed with other metadata in a different text size. To be useful it would have to be visible to all editors who might make improvements to poor quality descriptions, so would have to be a default display on desktop. This may not be well received by all, but it would be useful, maybe as an opt-out for those who really do not want to know. It still does not deal with the inherent problems of having the description on Wikidata, in that it is not Wikipedia and we do not dictate Wikidata's content policies, control their page protection, block their vandals etc, but it does let us see what is there, and fixing is actually quite easy, though maybe I am biased as I have done a fair amount of work on Wikidata. I would be interested to hear the opinions of people who have not previously edited Wikidata on using this script. I can definitely recommend it to anyone who wants to monitor the Wikidata description. Kudos to Yair rand.· · · Peter (Southwood) (talk): 16:15, 9 December 2017 (UTC)[reply]
It also does not solve the problem of different needs for the description. When the Wikidata description is unsuitable for Wikipedia, we should not arbitrarily change it if it is well suited to Wikidata's purposes, but if it is going to be used for Wikipedia, we may have to do just that.· · · Peter (Southwood) (talk): 16:21, 9 December 2017 (UTC)[reply]
  • #2. Any Wikidata import should be avoided because that content is not subject to Wikipedia editorial control and consensus. Sandstein 16:16, 10 December 2017 (UTC)[reply]
    Sandstein, My personal preference is that eventually all short descriptions should be part of Wikipedia, and not imported in run time, however, as an interim measure, to get things moving more quickly, I see some value in initially displaying the Wikidata description as a default for a blank magic word parameter, as it is no worse than what WMF are already doing, and in my opinion are likely to continue doing until they think the Wikipedia local descriptions are better on average. If anyone finds a Wikidata description on display that is unsuitable, all they have to do is insert a better one in the magic word and it immediately becomes a part of Wikipedia. If you find a Wikidata description that is good, you can also insert it into the magic word and make it local, as they are necessarily CC0 licensed. The only limitation on getting 100% local content is how much effort we as Wikipedians are prepared to put into it. Supporters of Wikidata can improve descriptions on Wikidata instead if that is what they prefer to do, and as long as a good short description is displayed, it may happen that nobody feels strongly enough to stop allowing it to be used. I predict that whenever a vandalised description is spotted, most Wikipedians will provide a local short description, so anyone in favour of using Wikidata descriptions would be encouraged to work out how to reduce vandalism and get it fixed faster, which will greatly improve Wikidata. Everybody wins, maybe not as much as either side would prefer, but more than they might otherwise. As it would happen, WMF win the most, but annoying as that may be to some, we can live with it as long as we also have a net gain for Wikipedia and Wikidata. · · · Peter (Southwood) (talk): 16:58, 10 December 2017 (UTC)[reply]
  • 2. We have neither the responsibility nor the authority to enforce WP guidelines on a project with diametrically opposed policies. Content outside of WP's editorial control should not appear on our pages, period. James (talk/contribs) 16:34, 11 December 2017 (UTC)[reply]
  • 2 comes closest to my views, given the choices. See my comments at WT:Wikidata/2017 State of affairs/Archive 12 and WT:Wikidata/2017 State of affairs. Also see the RfC from March; most of what was said there is equally relevant to the current question. - Dank (push to talk) 21:01, 11 December 2017 (UTC)[reply]
  • Comment from WMF: I want to say a word about compromise and consensus. I've been involved in these discussions for almost three months now, and there are a few things that I've been consistent about.
First is that I recognize and agree that the existing feature doesn't allow Wikipedia editors to have editorial control over the descriptions, and it's too difficult for Wikipedia editors to see the existing descriptions, monitor changes, and fix problems when they arise. Those are problems that need to be fixed, by the WMF product team and/or the Wikidata team.
Second: the way that we fix this problem doesn't involve us making the editorial decisions about the format or the content. That's up to the English Wikipedia and Wikidata communities, and if there's disagreement between people in those communities, then ultimate control should be located on Wikipedia and not on Wikidata. In other words: when we build the magic word, we're not going to control how it's used, how often, or what the format should be. I think that both of these two points are in line with what most of the people here are saying.
The third thing is that we're not going to agree to a course of action that results in the mass blanking of existing descriptions, for any meaningful length of time. I recognize that that's something that most of the people here want us to build, but that would be harmful to the readers and editors that use those descriptions, and that matters. This solution needs to have consensus with us, too, because we're the ones who are going to build it. I'm not saying that we're going to ignore the consensus of this discussion; I'm saying that we need to be a part of that consensus. -- DannyH (WMF) (talk) 15:13, 12 December 2017 (UTC)[reply]
How many people have actually complained in the 8 months or so that descriptions have now been disabled in mobile view? "readers and editors that use those descriptions": which editors would that be? Anyway, basically you are not going to interfere in content decisions, unless you don't like the result. But at the same time you can't be bothered to provide the necessary tools to patrol and control your features (and your first point is rather moot when this magic word goes live and works as requested anyway). Which is the same thing you did (personally and as WMF) with Flow, Gather, ... which then didn't get changed, improved, gradually accepted, but simply shot down in flames, at the same time creating lots of unnecessary friction and bad blood. Have you actually learned anything from those debacles? Most people here actually want to have descriptions, and these will be filled quite rapidly (likely to a higher percentage than what is provided now at Wikidata). But we will fill them where necessary, and we will leave them blank where we want them to be blank. You could have suggested over the past few months a compromise, where either "no magic word" or "magic word with no description" would mean "take the wikidata description", and the other meant "no description". You could have suggested "after the magic word is installed, we'll take a transitional period of three months, to see if the descriptions get populated here on enwiki; afterwards we'll disable the "fetch desc from wikidata" completely". Instead you insisted that the WMF would have the final say and would not allow blanks unless it was for a WMF-preapproved list of articles (or article groups). Why? No idea. If the WMF is so bothered that readers should get descriptions no matter what (even if many, many articles don't have Wikidata descriptions anyway in the first place), then they should hire and pay some people to monitor these and make sure that e.g. blatant BLP violations don't remain for hours or days. But forcing us to display non-enwiki content against our will and without providing any serious help in patrolling it is just not acceptable. Fram (talk) 15:44, 12 December 2017 (UTC)[reply]
Fram, those compromises are what I'm asking for us to discuss. I'm glad you're bringing them up, that's a conversation that we can have. I'm going to be talking to the Wikidata team next week about the progress on building the patrolling and moderation tools. We don't have direct control over what the Wikidata team chooses to do, but I want to talk with them about how the continued lack of a way to effectively monitor the short descriptions is affecting this conversation, this community, and the feature as a whole. English Wikipedia editors need to have the tools to effectively populate and monitor the descriptions, and you need to have that on a timeline that makes sense. I need to talk to more people, and keep working on how to make that happen. I'm going to talk with people internally about the transitional period that you're suggesting. -- DannyH (WMF) (talk) 16:04, 12 December 2017 (UTC)[reply]
I think the major concern is the lack of control over enwp content. There are currently only two outside sources of enwp content over which the local community has no control: Commons and Wikidata; it has taken some years for Commons to build a level of trust over their content policies and failsafes to prevent abuse at enwp through Commons. The only reason the use of Commons materials here is accepted today is two-fold: 1) they've proven they can handle their business, and 2) there exist local over-rides that are transparent and easy to enact. For Wikidata to be useful and to avoid the kind of acrimony we are seeing here, we would need the SAME thing from Wikidata. Point 1) can only occur over time, and Wikidata is far too new to be proven in that direction. Recent gaffes in allowing vandalism off-site at Wikidata to perpetuate at enwp do not help either. If the enwp community is going to feel good about allowing Wikidata to be useful going forward, until that trust reaches what Commons has achieved, we need point 2 more than anything. Defaulting to local control over off-site control is necessary, and any top-down policy that removes local control, either directly or as a fait accompli by subtly controlling the technology, is unlikely to be workable. If Wikidata can prove their ability to take care of their own business reliably over many years, the local community would feel better about handing some of that local control over to them, as works with Commons now. But that cannot happen today, and it cannot happen if local overrides are not simple, robust, and the default. --Jayron32 17:32, 12 December 2017 (UTC)[reply]
"English Wikipedia editors need to have the tools to effectively populate and monitor the descriptions, and you need to have that on a timeline that makes sense." You know, yiou have lost months doing this by continually stalling the discussions and "misinterpreting" comments (always in the same direction, which is strange for real misunderstandings and looks like wilful obstruction instead). You just give us the magic word, and then we have the tools to monitor the descriptions: recent changes, watchlists, page histories, ... plus tools like semi- or full protection and the like. We can even build filters to check for these changes specifically. We can build bots to populate them. From the very start, everyone or nearly everyone who was discussing these things with you has suggested or stated these things, you were the only one (or nearly the only one) creating obstacles and finding issues with these solutions where none existed. "I'm going to be talking to the Wikidata team next week about the progress on building the patrolling and moderation tools." is totally and utterly irrelevant for this discussion, even though it is something that is sorely needed in general. Patrolling and moderating Wikidata descriptions is something we are not going to do; we will patrol and moderate ENWIKI descriptions, and we have the tools to do so (a conversation may be needed whether the descriptions will be shown in the desktop version or not, this could best be a user preference, but that is not what you mean). Please stop fighting lost battles and get on with what is actually decided and needed instead. Fram (talk) 17:54, 12 December 2017 (UTC)[reply]
I'm talking to several different groups right now -- the community here, the WMF product team, and the Wikidata team -- and I'm trying to get all those groups to a compromise that gives Wikipedia editors the control over these descriptions that you need, and doesn't result in mass blanking of descriptions for a meaningful amount of time. That's a process that takes time, and I'm still working with each of those groups. I know that there isn't much of a reason for you to believe or trust me on this. I'm just saying that's what I'm doing. -- DannyH (WMF) (talk) 18:02, 12 December 2017 (UTC)[reply]
Indeed, I don't. I'm interested to hear why you would need to talk to the Wikidata team to find a compromise about something which won't affect the Wikidata team one bit, unless you still aren't planning on implementing the agreed upon solution and let enwiki decide how to deal with it. Fram (talk) 22:28, 12 December 2017 (UTC)[reply]
Fram, you're saying "us", "we", etc. here rather freely. Please do not speak for all editors here, particularly when putting your own views forward at the same time. There's a reason we have RfC's... Thanks. Mike Peel (talk) 21:11, 12 December 2017 (UTC)[reply]
Don't worry, I'm not speaking for you. But we (enwiki) had an RfC on this already, and it's the consensus from there (and what is currently the consensus at this RfC) I'm defending. There's indeed a reason we have RfC's, and some of us respect the results of those. Fram (talk) 22:28, 12 December 2017 (UTC)[reply]
I'm glad you're not speaking for me - but why are you trying to speak for everyone except for me? What consensus are you talking about? This RfC is still running (although I'm worried that potential participants are being scared off by these arguments in the !vote sections). And what consensuses are you accusing me of disrespecting? Mike Peel (talk) 22:40, 12 December 2017 (UTC)[reply]
FWIW, Fram definitely speaks for me. James (talk/contribs) 23:24, 12 December 2017 (UTC)[reply]
You don't really seem to care about the results of the previous RfC on this, just like you didn't respect the result of the WHS RfC when your solution was not to revert to non-Wikidata versions, but to bot-move the template uses to a /Wikidata subpage which was identical to the rejected template. Basically, when you have to choose between defending Wikidata use on enwiki or respecting RfCs, you go with the former more than the latter. Fram (talk) 07:53, 13 December 2017 (UTC)[reply]
ok... I propose that from this point on, DannyH, User:Mike Peel and User:Fram, cease any further participation in this RfC. You three and your mutual disagreements are again completely dominating the discussion, the exact thing that the Arbcom case was warning against. This is NOT helping the result of this discussion. —TheDJ (talkcontribs) 14:24, 13 December 2017 (UTC)[reply]
  • #3 – This makes the most sense to me for reasons I stated above. I would amend #3 only by saying: Immediately populate a local description for any pages being actively protected from vandalism which could just mean protected pages, or could mean (where appropriate) pages subjected to arbitration enforcement as well.--Carwil (talk) 18:02, 13 December 2017 (UTC)[reply]
  • #1 This whole idea is just adding complexity over a rather small problem. The less duplication of the data, the better. We should focus on ways to follow more projects at a glance and on better tools to follow changes on Wikipedia, rather than splitting the Wikimedian forces across all the different projects. Co-operation and sharing are the essence of these projects, not control, defiance and data duplication. TomT0m (talk) 16:34, 15 December 2017 (UTC)[reply]
  • #1 per DannyH. Additionally, we can configure protected articles to never display data from Wikidata. It's worth noting that this option allows you to run a bot that puts " " as description for a specific class of articles when you don't like the kind of descriptions that Wikidata shows for those articles. ChristianKl (talk) 15:29, 17 December 2017 (UTC)[reply]
    For clarification, is your claim that we can configure protected articles to never display data from Wikidata based on knowing how this could be done, and that it is a reasonably easy thing to do, or a conjecture? Bear in mind how WMF is using the data on the mobile display. I ask because I do not know how they do it, so cannot predict how easy or otherwise it would be to block from the Wikipedia side. Ordinary logic suggests that it may not be so easy, or it would already have been done. · · · Peter (Southwood) (talk): 04:44, 18 December 2017 (UTC)[reply]
    Without the magic keyword being active, it's not possible to easily prevent the import. However, once the feature is implemented, you will be able to run a bot quite easily that creates "magic keyword = ' '" for every article that's protected, or for other classes of articles where there's the belief that the class of article shouldn't import Wikidata and is better off showing the user a ' ' instead of the Wikidata description.
    Additionally, I think the WMF should hardcode a limitation that once a Wikipedia semiprotects an article, the article stops displaying Wikidata-derived information. That would take some work on the WMF side, but if that's what they have to do to get a compromise, I think they would be happy to provide that guarantee. ChristianKl (talk) 12:31, 18 December 2017 (UTC)[reply]
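For illustration only, a minimal sketch (in Python, using pywikibot) of the kind of bot run described just above. The magic word's actual name and syntax have not been decided, so "{{DESCRIPTION: }}" is a made-up placeholder here, as is the edit summary; the filter shown targets fully protected articles and could equally be pointed at semi-protected ones.

# Hypothetical sketch: "{{DESCRIPTION: }}" stands in for the still-unnamed magic word.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

# Iterate over mainspace pages that are edit-protected at sysop level.
for page in site.allpages(namespace=0, protect_type='edit', protect_level='sysop'):
    if '{{DESCRIPTION:' in page.text:
        continue  # a local description already exists; leave it alone
    # Prepend a blank local description so any Wikidata fallback is suppressed.
    page.text = '{{DESCRIPTION: }}\n' + page.text
    page.save(summary='Bot trial: add blank local short description to protected article')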
    I would be both encouraged and a bit surprised to see WMF provide a guarantee. So far they have been very careful to avoid making any commitments to anything we have requested. I will believe it when I see it. I have no personal knowledge of the complexity of coding a filter that checks whether the article is protected or semi-protected, uses that to control whether a Wikidata description is used, and is fast and efficient enough to run every time that a short description may be displayed, but I would guess that this is an additional overhead that WMF would prefer to avoid. Requiring such additional software could also delay getting the magic word implemented, which would be a major step in the wrong direction. This needs to be simple and efficient, so the bugs will be minimised and speed maximised. Putting in a blank string parameter that displays as a blank string is easy and simple and requires no complicated extra coding. This can be done by any admin protecting an article where there is no local short description. · · · Peter (Southwood) (talk): 16:33, 18 December 2017 (UTC)[reply]
    ChristianKl, wouldn't it make the most sense to produce a description for each protected article, rather than produce a blank? We're talking about a considerably shorter list than all articles here. Then we would have functionality (what WMF says they want) and protection from vandalism on Wikidata.--Carwil (talk) 15:23, 19 December 2017 (UTC)[reply]
    I agree with you that writing descriptions in those cases makes sense, but the people who voted #2 seem to have the opinion that this isn't enough protection from vandalism on Wikidata and don't want Wikidata content to be shown even if the 'magic word' is empty. ChristianKl (talk) 17:55, 19 December 2017 (UTC)[reply]
  • #1 Most of the time, for most purposes, the Wikidata descriptions are fine. #1 gives a sensible over-ride mechanism for cases where there is particular sensitivity. Jheald (talk) 20:55, 17 December 2017 (UTC)[reply]
    Unless you have a reasonably robust analysis to base this prediction on, it is speculation. However, it will make little difference in the long run, as it will be easy to override the Wikidata descriptions through the magic word. It is just a question of how tedious it would be, depending on what proportion of 5.5 million will have to be done. · · · Peter (Southwood) (talk): 04:44, 18 December 2017 (UTC)[reply]
  • #2 Agree with Alsee. Most sensible. Wikidata entries can be imported initially, if needed. It's easiest if things are kept in the same area, with the same community and policies. I really don't see the advantage of having things on Wikidata - the description is different for every language. Galobtter (pingó mió) 11:54, 19 December 2017 (UTC) Peter Southwood makes excellent practical points - as an interim measure, to get things moving more quickly, I see some value in initially displaying the Wikidata description as a default for a blank magic word parameter, as it is no worse than what WMF are already doing. Galobtter (pingó mió) 11:58, 19 December 2017 (UTC)[reply]
Galobtter—Wikidata has a data structure that maintains separate descriptions for each language (or at least each language with a Wikipedia).--Carwil (talk) 15:23, 19 December 2017 (UTC)[reply]
I know that; what I'm saying is why have all that when it's mostly useful only to that language's Wikipedia. Other data on wikidata is useful across wikipedias - raw numbers, language links, etc. Galobtter (pingó mió) 15:27, 19 December 2017 (UTC)[reply]
If the description is defined in the article text, then it can only be used in that article, on that language Wikipedia. If the description is on Wikidata, then you can access it from other articles (e.g., list articles), and places like Commons (descriptions for categories on the same topic as the articles), or on Wikivoyage etc. If it's in a data structure like on Wikidata, then it's a lot more easy to automatically reuse it than if it's embedded in free-form text. Thanks. Mike Peel (talk) 15:34, 19 December 2017 (UTC)[reply]
The description on Wikidata will remain available, if other projects or environments want to use it and think it suits their purpose (and can better access it than Enwiki magic words). This is rather out-of-scope for this discussion though, which won't change anything on Wikidata. Fram (talk) 15:47, 19 December 2017 (UTC)[reply]
  • 2 because Wikidata is shite and shouldn't ever be used regardless of how good it is in some articles. WMF have had ample opportunity to patrol that site and do something about it; however, instead they've ignored requests time and time again, so it's about time we did something ourselves. Anyway, back on point: using blanks is better - If someone adds a silly description they'll be reverted and no doubt one will be added. –Davey2010 Merry Xmas / Happy New Year 23:31, 27 December 2017 (UTC)[reply]
  • Combination of 2 and 1: Wikidata descriptions are off (except for maybe a brief run-in) but will possibly be shown ONLY when changes to them can be seen in the watchlist. Doc James (talk · contribs · email) 06:24, 30 December 2017 (UTC)[reply]

Filling in blanks: option #5

Pinging everyone who !voted in the What to do with blanks discussion: Francis Schonken, Peter (Southwood), TheDJ, Fram, DannyH (WMF), Mike Peel, Sandstein, James, Dank, Carwil, TomT0m, ChristianKl, Jheald, Galobtter, Davey2010, Doc James.

If the debate is viewed as a simple option #1 vs option #2 outcome, the situation is rather unpleasant. By my count there is currently a majority for #2; however, it's not exactly overwhelming. On the other hand #1 has substantial minority support, and the WMF is strongly averse to the possibility that descriptions get mass-blanked before we can repopulate with new descriptions.

There has been substantial discussion of an alternative that was not presented in the original RFC choices: a transition from #1 to #2. In the initial stage, any article that lacks a local description will continue to draw a description from Wikidata. We deploy the new description keyword and start filling in local descriptions, which override Wikidata descriptions. Once we have built a sufficient base of local descriptions, we finalize the transition by switching off Wikidata descriptions completely.

I believe multiple people have expressed support above for this kind of compromise. People on opposing sides may consider this less-than-ideal for opposing reasons; however, I hope everyone will consider this plan in an effort to build a collaborative compromise-consensus. Alsee (talk) 16:24, 6 January 2018 (UTC)[reply]

I don't really care how the transition is done (as long as it's done within a few months ish) - main goal is to eventually have everything on enwiki and rely little or not at all on wikidata. Galobtter (pingó mió) 16:58, 6 January 2018 (UTC)[reply]
I would grudgingly support a transition as long as WMF commits to a hard temporal deadline for switching off Wikidata descriptions. James (talk/contribs) 18:06, 6 January 2018 (UTC)[reply]
Yes, I would accept this compromise. It does not really matter in the long run, as long as WMF will switch off when Wikipedians decide that the local descriptions are adequate. It will be up to Wikipedians to get the descriptions populated. Anyone who wants to get Wikidata descriptions shut down sooner can make it happen by adding more short descriptions. · · · Peter (Southwood) (talk): 18:11, 6 January 2018 (UTC)[reply]
It still seems like a waste of time to me. Just use the Wikidata descriptions, and improve them there. Mike Peel (talk) 19:05, 6 January 2018 (UTC)[reply]
As we discussed below, the WMF plan is to switch from a Wikidata-fallback to full enwiki control when there are 2 million non-blank short descriptions on enwiki, which is roughly comparable to the number of existing descriptions on Wikidata. That will help to ensure that the readers and editors who use these descriptions won't notice a sudden degradation of the feature. -- DannyH (WMF) (talk) 19:16, 6 January 2018 (UTC)[reply]

Other discussion

I don't understand any of this. Could someone please explain this to me? Hydra Tacoz (talk) 21:49, 18 December 2017 (UTC)[reply]
@Hydra Tacoz: Some uses of Wikipedia show short descriptions of articles (e.g. in the Wikipedia app or for search engines). Where and how we get or store this info is what is being discussed. A logical place to keep it is Wikidata, but there are some problems with that. ―Justin (koavf)TCM 21:55, 18 December 2017 (UTC)[reply]
@Justin (koavf): Thank you! So, how is Wikidata a safe place? What exactly is Wikidata? Hydra Tacoz (talk) 22:04, 18 December 2017 (UTC)[reply]
@Hydra Tacoz: Wikidata is a sister project of Wikipedia: it is a wiki, but it is not an encyclopedia like this one; rather, it is a place to store structured data. If you aren't familiar with databases, it may seem confusing at first, but imagine a musician (e.g. John Coltrane): at Wikipedia, we would write a biography of him, Wikiquote would have quotes by and about him, Commons would have photos or recordings of or about him, etc. Wikidata would store individual facts such as his birth date, citizenship status, record labels signed to, etc. One function of Wikidata internally for projects like Wikipedia is to store short descriptions of the subject--in this case, something like "American jazz saxophonist and composer". ―Justin (koavf)TCM 22:07, 18 December 2017 (UTC)[reply]
To clarify: The function of the short description on Wikidata is an internal Wikidata function. It is not (or was not originally) intended for use as a description of a Wikipedia article when a Wikipedia article name is displayed by a Wikimedia project other than Wikidata. WMF currently use it for this purpose because they consider it the best available alternative (pretty much the only existing non-zero option). The intention is reasonable, as it helps users to identify which of a selected group of articles is most likely to be useful, but it has the problem that it is outside the direct control of the Wikipedia editors of the articles it is used to describe; there are problems with persistent vandalism and inappropriate or sub-optimal descriptions, which can only be edited on Wikidata; and some Wikipedians are not keen on being coerced into editing and maintaining Wikidata to prevent vandalism appearing in connection with Wikipedia articles. There is also a technical problem in that the short description is currently not visible from desktop view, and does not show up usefully on watchlists, so vandalism can go undetected by Wikipedians. The proposed solution is to provide a short description on Wikipedia for each article which can be drawn on for any purpose where it may be useful, and the WMF devs say that must be done by a new "magic word", which as far as I can make out is like a more efficient template function. Whether the magic word is initially populated with blanks or Wikidata short descriptions is relatively unimportant over the long term, as once it exists, Wikipedians can watch and change the short descriptions as and when editors feel that it is needed, from within the Wikipedia editing environment of the associated article - the short description will be part of the article itself, and changes will show up on watchlists. · · · Peter (Southwood) (talk): 08:22, 19 December 2017 (UTC)[reply]
Hydra Tacoz, if you click this search link and type Bar into the search box (do not press enter) you will see a list of articles. The first entry will probably be Bar: establishment serving alcoholic beverages for consumption on the premises. That text is the short description being discussed here. To help small-screen mobile users, that description appears at the top of an article when it's read in the Wikipedia App. It used to appear on the mobile-browser view as well. It appears in the link-tool in Visual editor, and it might appear elsewhere. That text is not written anywhere at Wikipedia. It comes from the Wikidata entry for bar. You can edit it there. As others noted, Wikidata is a sister project in the Wikimedia family. Wikidata has been creating those descriptions for their own use, and the Foundation decided it would be convenient to re-use those descriptions here. The EnWiki community was rather surprised when we realized it was added and how it works. There are concerns that most EnWiki editors never see those descriptions, and even if they do see them, they often don't know how to fix them. There is concern/debate about how well the Wikidata community can catch and fix vandalism. There are concerns that the descriptions are not subject to EnWiki policies, page-protection, or user-blocks. Edits at wikidata (including vandalism, biased edit-warring, or otherwise) will bypass any page-protection we put on the article, and we can potentially be blocked from editing a description if a Wikidata admin disagrees with EnWiki policies such as BLP. Descriptions intended for Wikidata purposes also may not always be best suited for our purposes. The discussion here is for adding something like {{description|a retail business establishment that serves alcoholic beverages}} at the top of our articles, and using that instead of Wikidata's descriptions. Then the description can be seen, edited, and controlled just like any other article wikitext. The downside is that EnWiki and Wikidata would have parallel systems managing similar descriptions for similar purposes. Alsee (talk) 11:25, 6 January 2018 (UTC)[reply]

WMF two-stage proposal for Wikipedia-hosted descriptions

We've been talking for a long time about how to give Wikipedia contributors editorial control over the short descriptions. I've got a new approach to a solution here, and I'd like to know what you think.

First up, to establish where this approach is coming from:

  • English Wikipedia editors need to be able to see, edit and effectively moderate the short descriptions on desktop and mobile web, without becoming active Wikidata editors. That requires meaningful integration in Wikipedia watchlists and page history. Those are features that the Wikidata team is working on, but they don't currently exist, and they don't have a timeline for it.
  • The short descriptions are very useful for readers of the mobile apps and for editors using VisualEditor, and blanking a significant number of descriptions for a meaningful period of time would harm the experience for those readers and editors.
  • English Wikipedia editors should make the content decisions about how to actually populate the descriptions, including being able to specify pages where a description isn't helpful.

That last point is the one we've been wrestling with for a while. I was asking for examples of article pages that shouldn't have a description, and several people brought up examples where the article had a disambiguation phrase, and the short description was only repeating information from that disambig phrase. Here are some examples:

I said above that I didn't think the redundancy was harmful, but some folks pointed out that I was moving the goalposts -- asking for examples, and then saying they didn't matter. I was trying to make content decisions about the format, and I'm not one of the people who are doing the actual work of writing and moderating them. Fair point.

So I wanted to estimate how many useful descriptions there currently are -- taking out "Wikimedia list page" and "Wikimedia disambiguation page", and also taking out pages where the description is just repeating a disambiguation phrase.

Last week, Mike Peel generated a list of 1,000 random articles and descriptions, which helped us to survey the quality of descriptions on a decent cross-section of pages. I wanted a bigger sample, so we generated a list of 10,000 articles and descriptions.

Here's the breakdown from that larger sample of 10,000 random articles. This is my current definition of "not useful" descriptions:

  • 39.82% have no short description on Wikidata
  • 7.76% have descriptions that include "Wikipedia" or "Wikimedia"
  • 1.16% have a disambiguation phrase in the article title, which is entirely duplicated in the description (116/10,000, marked on the page linked above)

Putting those together, that makes 48.75% blank/not useful descriptions, and 51.25% useful descriptions.

Extrapolating that out to 5.5 million articles, it suggests that about 2.82 million Wikipedia articles have useful descriptions. (I'm open to continuing to iterate on the definition of useful vs not useful, if people have thoughts about that.)

Now, from the WMF side, the thing that we want to avoid is switching suddenly from 2.8 million useful descriptions to a much lower number, which is what would happen if we built a magic word that's blank by default. That would hurt the experience of the readers and editors who rely on the descriptions.

So, the solution that I'm proposing is: WMF builds a magic word that Wikipedia editors can populate with descriptions.

  • Stage 1: Initially, the display of the descriptions pulls from the Wikipedia-hosted magic words -- but if there isn't one on Wikipedia, or the Wikipedia one is some version of blank, then it falls back to showing the Wikidata-hosted description.
  • Stage 2: When there are non-blank Wikipedia-hosted magic words on a number of articles that's roughly comparable to 2.8 million, then WMF switches to only pulling from the Wikipedia-hosted magic words, and we don't fall back to Wikidata-hosted descriptions. At that point, descriptions that Wikipedia editors leave blank will actually be blank on the site.

The decisions about how to generate ~2.8 million descriptions will be made by Wikipedia editors, and the timeline is up to you. I'll be interested to see how that process develops, but it's not our process. Our role is to switch to full Wikipedia-hosted descriptions when there are enough descriptions that the folks who use them won't notice a meaningful degradation. So that's the idea.
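To make the two-stage behaviour concrete, here is a rough sketch of the display logic being described, written in Python with invented names and toy data; it is only an illustration of the proposal, not actual MediaWiki code.

# Illustrative sketch only: invented names and toy data, not the real implementation.
STAGE_TWO = False  # flipped once enwiki hosts roughly 2 million non-blank local descriptions

# Stand-ins for the real lookups.
local_descriptions = {'Douglas Adams': 'English writer and humorist', 'Bar': ''}
wikidata_descriptions = {'Douglas Adams': 'English writer', 'Bar': 'establishment serving alcoholic beverages'}

def short_description(title):
    local = local_descriptions.get(title, '')
    if local.strip():
        return local                              # a non-blank enwiki description always wins
    if STAGE_TWO:
        return ''                                 # Stage 2: blank stays blank, no fallback
    return wikidata_descriptions.get(title, '')   # Stage 1: fall back to the Wikidata description

print(short_description('Bar'))            # Stage 1 fallback: the Wikidata text is shown
print(short_description('Douglas Adams'))  # the local description overrides Wikidata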

One final thing that I want to say is that there are a lot of people in the movement, including the WMF, who believe that more integration and interdependence between Wikidata and Wikipedia is going to be key to the movement's growth and success in the future. We're going to keep working on helping Wikidata to build features that make that integration realistic and practical.

Right now, Wikidata doesn't have the working features that would make short descriptions easy to see and moderate from Wikipedia. In the future, as the Wikidata team builds those kinds of features, we'll want to keep talking to folks about how to encourage productive interdependence between the two projects. I'm hoping that a positive resolution on this short descriptions question helps to keep the door open for those future conversations. What do you think? -- DannyH (WMF) (talk) 00:32, 22 December 2017 (UTC)[reply]

I'm having trouble following this. Admittedly I'm not the sharpest crayon in the box, so could you help by giving the Cliff's Notes version of this proposal? Shock Brigade Harvester Boris (talk) 01:54, 22 December 2017 (UTC)[reply]
The WMF claims, based on a sample of 10,000 articles and some dubious mathematics (116/10,000 is not 0.01% obviously, but 1.16%), that about 2.8 million enwiki articles have useful Wikidata descriptions; based on this, they will continue to use the Wikidata description when there is no enwiki magic word description, until enwiki has populated 2.8 million magic words. At that time, they will switch off the "use the Wikidata description when there is no enwiki description" and switch to "show a blank description when there is no enwiki magic word description". 08:09, 22 December 2017 (UTC)
Your count is a bit optimistic. Apart from the maths error explained above, I see in your sample "12. en:German International School New York - school". I also see that in your count of the disambiguations, you don't count ones that aren't identical, even if the description is less specific and less useful than the disambiguation: "3. en:William McAdoo (New Jersey politician) - American politician", or add something which is hardly useful: "15. en:Matt Jones (golfer) - professional golfer". That's 3 out of the first 15 you don't count as useless descriptions, or some 20% more. Which would drop your 2.88 million to 1.8 million or so... Fram (talk) 08:09, 22 December 2017 (UTC)[reply]
And Fram's math is also wrong here. Just to be precise, DannyH (WMF) found 116 uninformative (because duplicating) parenthetical disambiguations. There are 1197 of those, 116 of which are already counted as faulty, so adding 20% of the remainder gets us to 116+216=332. So the faulty descriptions for parenthetical disambiguations are only 3.32% of the total. We're still at about half valid descriptions. (From my perspective, "professional golfer" is more informative than "golfer" in the same way that "American politician" is less informative than "New Jersey politician.")--Carwil (talk) 12:46, 23 December 2017 (UTC)[reply]
I found 20% of the first 15 overall, not 20% of the remainder. Of course, nothing guarantees that this percentage will remain the same across all 10K, but it is not the calculation you are making. My 20% was not only about disambiguations, e.g. my first example (number 12) is not a disambiguation. Fram (talk) 15:03, 23 December 2017 (UTC)[reply]
I agree with Fram that the above-mentioned statistical analysis does not bear close inspection and claims a significantly inflated number of useful descriptions. Instead of pointlessly arguing the red herring of how many are good and how many are bad, I think we would actually be better off with populating the magic word with Wikidata descriptions at the start, and immediately switching to only showing descriptions from Wikipedia, as then we could get rid of the garbage more easily, and would be free of interference and externally imposed vandalism at the soonest possible date. This path should also reduce the coding to the minimum achievable required complexity, as all it would have to do is produce a description string from Wikipedia, without any conditionals. Blank description => Blank display, Non-blank description => Non-blank display. This should be the lowest possible overhead with the lowest probability of bugs, and should allow us to get started with fixing earlier. If the Wikidata description is good enough that no-one bothers to improve it, then it can stay. Empty descriptions will be empty at Wikidata too, so no disadvantage. Vandalism copied over from Wikidata will be absent from mobile and VE display after being deleted once. Crap copied over from Wikidata can be seen on Wikipedia and eliminated, either by providing a better short description, or simply deleting it. Way better than continuing to pull dubious material from Wikidata until a somewhat arbitrary number of non-blank descriptions have been produced on Wikipedia. Once the magic word syntax has been defined, a single bot run authorised by Wikipedians can populate the articles, even before the display code has been finalised. · · · Peter (Southwood) (talk): 15:01, 22 December 2017 (UTC)[reply]
DannyH (WMF), I don't think your proposal is an improvement on what I have described here; it prolongs the lack of internal control over Wikipedia content unnecessarily and to no advantage. · · · Peter (Southwood) (talk): 15:19, 22 December 2017 (UTC)[reply]
Pbsouthwood and Fram: We can get into the weeds on duplicates, but ultimately I don't think it's going to provide a lot of clarity for the amount of time and work it would take. This is the fairest option that we can provide, and I can't keep coming up with new solutions just because you say it's not an improvement. We need to bring this to a conclusion. -- DannyH (WMF) (talk) 17:56, 22 December 2017 (UTC)[reply]
DannyH (WMF), Firstly I have no idea what "get into the weeds on duplicates" is supposed to mean, so cannot comment on it further. Secondly, I don't think it is a fairer option than the one I have described, depending on how you define "fair" in this case. Thirdly, I don't see that your options are actually "solutions". Partial solutions, perhaps. Compromises, yes, but so are most of the other options discussed. I think you are missing the point again. Please read my suggestion above, and instead of a blanket dismissal, explain where it has technical problems and what they are. · · · Peter (Southwood) (talk): 03:40, 23 December 2017 (UTC)[reply]
DannyH, where did I say "it's not an improvement"? I have only commented on your poor calculations, that's all. If you can't accept any comments, then just shut down this RfC and declare what the WMF will do. Fram (talk) 11:11, 23 December 2017 (UTC)[reply]
Fram and Pbsouthwood: We've been talking about this for months, and I have agreed with many points that both of you have made. This compromise that I'm proposing will result in fully Wikipedia-hosted descriptions, with control over individual pages where WP editors think the description should be blank. This is the thing that you said that you wanted. The only compromise that I'm asking for on your side is for Wikipedia editors to actually write the short descriptions that you said that Wikipedia editors want to write. Do you think that this is an acceptable compromise? -- DannyH (WMF) (talk) 22:44, 23 December 2017 (UTC)[reply]
DannyH (WMF), the compromise you are suggesting requires Wikipedians to produce 2.8 million descriptions before you will agree to turn off Wikidata as the empty magic word default, meaning that the problems of vandalism remain for that period, which may be a long time, as Wikipedia is edited by volunteers who will edit as and where they choose. Fram disputes the validity of this number, and I consider a simpler option preferable which could result in a much earlier shutoff of Wikidata, without any loss of useful functionality for WMF. You have not made any reply to my query regarding technical objections, so should I assume there are none? Have you read and understood my suggestion above, which I have now set in italics so you can find it more easily? These are not rhetorical questions; I ask them in order to get answers which may be relevant to getting closer to a solution. I cannot force you to answer them, but you might find that discussions go more smoothly and productively when you answer questions instead of changing the subject.
In answer to your question, No, until you have answered our questions I cannot accept your proposal as an acceptable compromise because I am lacking what I consider to be important data for making that decision. However, I speak only for myself. Others may agree or disagree with my opinions, and I will go with the consensus. What I am trying to do is get us there by getting the options as clear as possible. I also think that most, if not all of the regular Wikipedians here are doing the same. · · · Peter (Southwood) (talk): 16:57, 24 December 2017 (UTC)[reply]
Pbsouthwood, what you're describing above is absolutely within the bounds of the proposal that I made. We can turn off the Wikidata fallback when there's a comparable number of good descriptions hosted on Wikipedia. We're not going to decide how to populate the magic words; those are content decisions that Wikipedia editors can make. If people want to copy all of the existing Wikidata descriptions, then that would hit the threshold of descriptions, and we could turn off the Wikidata fallback at that point. Does that answer your question? -- DannyH (WMF) (talk) 18:30, 26 December 2017 (UTC)[reply]
DannyH (WMF), That answers my question partly. If WMF is willing to commit to switching off the Wikidata fallback when Wikipedia decides that all the acceptable descriptions from Wikidata have been transferred, or have provided better ones, then I would accept the proposal, but not if WMF plans to hold out for an arbitrary and poorly defined number based on a small sample and a dubious analysis. · · · Peter (Southwood) (talk): 19:01, 26 December 2017 (UTC)[reply]
  • Support for either the 'two phase' solution (filling descriptions first and switching off wikidata later), or copying the wikidata descriptions and shutting off wikidata more quickly.
    After reviewing the 10k random Wikidata descriptions, I see DannyH's calculation of 2.8 million 'useful' descriptions at wikidata is somewhat of an overestimate. It includes a few percent that have zero value, and another few percent with negligible value. However, I expect we all have reasonable flexibility on the vague threshold for local descriptions to credibly substitute for wikidata descriptions. Alsee (talk) 23:02, 27 December 2017 (UTC)[reply]
Pbsouthwood and Alsee: Okay, good. Yeah, we can work together on what the threshold for switching would be. I was hoping at first that we could pull the description for every page (5.5m), but the query would take days to run, and I wanted to go through a sample by hand anyway. That's why I used the random 10,000. If somebody wants to get a better estimate somehow, that's cool, or we just say a round number like 2 million. Or maybe someone has a better idea for how to judge that threshold. What do you think? -- DannyH (WMF) (talk) 21:21, 29 December 2017 (UTC)[reply]
DannyH (WMF), I think that if we can agree on a reasonably objective way to establish that the usefulness of Wikidata as a source for short descriptions has been effectively exhausted, we don't have to agree on any specific number. At some stage Wikipedians will suggest that we have taken as many of the short descriptions as is reasonably practicable and are actually useful, and request a shutdown of the Wikidata fallback. We would establish this point by internal discussion and consensus, as is traditional. At this point I suggest that WMF be allowed a two week period to check, and show statistically convincing evidence that it is worth the effort of finding and extracting more, and a way to find them, or do the shutdown without further delay. There is nothing to stop anyone who thinks that scavenging the last few useful descriptions is worth further effort from doing so at any time after the shutdown. Nobody gains by haggling over a specific number in the absence of reliable evidence. A few weeks or months of actually creating and transferring short descriptions is likely to inspire a whole range of more useful ideas on how to estimate the cutoff point. · · · Peter (Southwood) (talk): 05:08, 30 December 2017 (UTC)[reply]
Pbsouthwood, I think we need some kind of goal to shoot for. If we just say that the community will decide when they feel like it's done, then we're going to find ourselves in exactly the same place, however many months away it is. I'd like to have a clear line that we can agree on, so we can go through the rest of the process in amicable peace. -- DannyH (WMF) (talk) 19:08, 1 January 2018 (UTC)[reply]
DannyH (WMF). There are two problems with this latest proposal:
  1. What makes you think it will be easier to come up with a reasonable, generally acceptable fixed number now than later?
  2. Who is going to produce a credible number without generally acceptable evidence of statistical validity, or an actual count? Your proposals so far have appeared ingenuous. I doubt that Wikipedians would accept your suggestions without fairly convincing evidence, and it is you who is asking for a fixed number, therefore your burden to find one that we can accept. If you are trying to delay things as much as possible, this looks like a very effective method of stonewalling any progress. · · · Peter (Southwood) (talk): 05:16, 2 January 2018 (UTC)[reply]
Please confirm that development of the magic word is not being delayed until a final decision on numbers is reached. · · · Peter (Southwood) (talk): 05:16, 2 January 2018 (UTC)[reply]
Pbsouthwood, we are both operating in good faith here. I posted a list of 10k random Wikidata descriptions, with an explanation of the methodology I used to arrive at an estimate of 2.8 million useful descriptions. I've marked all of the descriptions in that list of 10,000 which are only repeating information from the disambiguation phrase, and not counting them. If somebody else wants to go through it and mark ones that they think I missed, that's fine, and I'll adjust the estimate.
What we're measuring is partially a judgment call -- what counts as a repeat of the disambiguation phrase? -- so it's not a task that a script can do accurately. It needs a person who can go through descriptions and make that judgment call. I've done that for 10,000 descriptions, and it took me a couple of hours. I can't spend more time looking at a bigger sample.
Development of the magic word is not being delayed until a final decision on numbers is reached. We will build the magic word that overrides the Wikidata description. Making the switch to shut off Wikidata descriptions and only pull from the Wikipedia descriptions will depend on the number that we're talking about. I am suggesting 2 million descriptions, which is significantly lower than my estimate of existing descriptions. -- DannyH (WMF) (talk) 18:58, 2 January 2018 (UTC)[reply]
DannyH (WMF), in my personal life I'm not a fan of carving long term specifics in stone, and the wiki-community also generally deals with things on the fly. If the descriptions get filled in quickly then any target number isn't going to matter much. If things proceed too slowly then some sort of examination of what's happening would probably be warranted anyway. However I can understand some people may be more comfortable if there is a clear target in place. If it's important, I guess I could sign onto a 2-million-or-earlier target. If necessary we could always meet that target by copying wikidata descriptions. Alsee (talk) 18:35, 5 January 2018 (UTC)[reply]
Alsee, I think it's helpful to have an estimate to shoot for, so that the community can make the kind of decision that you're referring to -- do we need to copy Wikidata descriptions, or should we try to write them all ourselves? "We need to write 2 million descriptions" is very different from "we need to write a lot of descriptions." -- DannyH (WMF) (talk) 22:40, 5 January 2018 (UTC)[reply]
Alsee, I agree with your most recent analysis and comment.· · · Peter (Southwood) (talk): 05:08, 30 December 2017 (UTC)[reply]

This is the key bit:

English Wikipedia editors need to be able to see, edit and effectively moderate the short descriptions on desktop and mobile web, without becoming active Wikidata editors. That requires meaningful integration in Wikipedia watchlists and page history. Those are features that the Wikidata team is working on, but they don't currently exist, and they don't have a timeline for it.

Integration is not possible until this capability is created. The current situation exposes us to prolonged vandalism and thus harms our reputation. Doc James (talk · contribs · email) 06:12, 30 December 2017 (UTC)[reply]

Free encyclopedia that anyone can edit

I have been editing Wikipedia on and off for more than twelve years now, and I can remember the days when if one read an article in Wikipedia, it would have a sign saying "Wikipedia - the free encyclopedia that anyone can edit" heading the article. My proposal is that we go back to having this sign, as this would inform newcomers to Wikipedia that Wikipedia is a wiki website. Vorbee (talk) 19:52, 17 December 2017 (UTC)[reply]

You want to readd the "anyone can edit" part? L3X1 (distænt write) 14:11, 18 December 2017 (UTC)[reply]
  • That was my proposal, but if anybody thinks that it is well known enough these days that Wikipedia is a wiki website, and that this is therefore not necessary, I shall understand. Vorbee (talk) 09:38, 20 December 2017 (UTC)[reply]
If we're going to re-add that, can we change it to read "anyone can edit (but there's no guarantee that your edit will last more than a nanosecond before someone deletes it)"? Truth in advertising? Better for keeping the newcomers we attract with the motto? Regards, TransporterMan (TALK) 14:58, 18 December 2017 (UTC)[reply]
I see no need to have it atop every page. And surely there's no need for, "and so can everyone else" since it's obvious. Jim.henderson (talk) 15:10, 18 December 2017 (UTC)[reply]
I believe that the "anyone can edit" part was put in place before we had imposed permanent bans on several thousand people for a variety of issues. bd2412 T 22:01, 29 December 2017 (UTC)[reply]
I don't think this is necessary. Natureium (talk) 15:58, 18 December 2017 (UTC)[reply]
See also prior discussions at MediaWiki:tagline. — xaosflux Talk 16:35, 18 December 2017 (UTC)[reply]
Some food for thought. While we take for granted in the English-speaking parts of the world that everyone knows what Wikipedia is (and that you can edit it), research in other regions is showing that knowledge of Wikipedia in general is rather low. I wouldn't assume folks know, "Oh, I can fix this." even where English Wikipedia reaches. While not data, purely anecdotal, as someone who contributes and works for the foundation, when I talk to non-Wikimedians about my contributions/work they are often surprised that anyone can edit. I can't tell you how many people get confused and think my job is editing Wikipedia. ಠ_ಠ Don't worry, I quickly correct them. :) CKoerner (WMF) (talk) 22:20, 21 December 2017 (UTC)[reply]
  • Main page still says "the free encyclopedia that anyone can edit." I think we can keep it shorter on other pages. Doc James (talk · contribs · email) 05:38, 28 December 2017 (UTC)[reply]
  • Thank you for drawing to my attention that it is still on the Main Page - I have had a look and seen it is there. Vorbee (talk) 16:31, 1 January 2018 (UTC)[reply]
  • No point stating the obvious. If an IP heads to an article and sees "Edit" then it's kinda obvious what clicking that button will do. No need for any tagline and all that. –Davey2010 Merry Xmas / Happy New Year 21:54, 29 December 2017 (UTC)[reply]

Three Strikes Rule for AfC submissions and reviews

There have been several recent discussions about problems at Articles for Creation. Here, here, and here.

To address some of these concerns I propose the following:

On the third decline of any AfC submission, a bot shall be programmed to list the draft for discussion at Miscellany for Deletion. At the end of the discussion, if the result is 'keep' or 'no consensus', the draft should be moved immediately to main-space (same as at AfD) and put into the New Page Patrol queue. If the decision is 'delete', it should be deleted and subject to G4 CSD as per usual if recreated. Other potential outcomes such as merging/redirecting/etc. are possible; however, the draft should not be sent back to the draft space following the discussion, regardless of outcome.
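As a rough sketch of what such a bot might check, the snippet below counts decline banners in a draft's wikitext and flags the draft once a third decline is found. It assumes, for illustration, that each decline leaves an "{{AFC submission|d|...}}" template in the page; the regex, the threshold handling, and the actual MfD nomination step are all simplifications, and the draft title is a placeholder.

# Sketch only: assumes each AfC decline leaves an "{{AFC submission|d|...}}" banner in the wikitext.
import re
import pywikibot

DECLINE_RE = re.compile(r'\{\{\s*AFC submission\s*\|\s*d\b', re.IGNORECASE)

def decline_count(page):
    """Count decline banners currently present in the draft's wikitext."""
    return len(DECLINE_RE.findall(page.text))

site = pywikibot.Site('en', 'wikipedia')
draft = pywikibot.Page(site, 'Draft:Example')  # placeholder title
if decline_count(draft) >= 3:
    # A real bot would nominate the page at MfD here; the sketch just reports.
    print(f'{draft.title()} has been declined three or more times; list it at MfD.')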

This proposal is meant to address three primary issues:

  • Concerns that some AfC reviewers are being too strict (or at least more strict than NPP or AfD would be with a similar submission). --The MfD discussion would reveal notable topics through discussion.
  • Repeated submissions of clearly non-notable subjects (or borderline-looking, but non-notable subjects) that waste valuable reviewer time and contribute to a growing backlog. --The MfD discussion would reveal non-notable subjects and delete them so that they would then be subject to G4.
  • Lack of collaboration (some have raised concerns that drafts are only seen by a handful of reviewers, who decline over and over, sometimes on spurious reasoning, and that this is contrary to Wikipedia's goals of collaborative editing). --The eventual MfD discussion after three AfC reviews would create a place where additional eyes would be put on the article.

This is especially important to help resolve borderline topics that tend to languish at AfC: they are not listed for deletion because it is not obvious that they should be deleted (they are borderline after all), and instead they are just resubmitted over and over until the author loses patience and either moves it to main space, where NPP deals with it, or abandons it and it eventually gets put up for G13 CSD. As far as I have seen, there is a significant proportion of abandoned drafts that are suitable for main-space, so this is a relatively common issue.

NOTE: This proposal is not intended as a way to "get rid of" drafts, rather it is as much a way of ensuring quality reviewing from AfC reviewers as it is about getting rid of improper submissions that clog AfC. Even submitters gain from this process, as they will be at least given a firm answer to whether their submission is notable or not and will escape AfC purgatory (for borderline drafts). Drafts that are unfinished, but clearly notable (as determined by editors doing a routine search for sources at the MfD discussion), should be moved to main-space anyway, where they can be improved by more sets of hands and eyes. The only group affected with a slightly larger workload will be contributors to MfD.

My hope is that this is a solution that can help pretty much everyone involved, and streamline several processes at the same time. I hope that you guys also agree.

NOTE#2: Others have suggested that this proposal could overrun MfD, but estimates indicate only 2-3 on average would be expected per day. Well within the current capabilities of MfD. See my comment to GMG below.

NOTE#3: The {{AFC_submission/declined}} template should be updated to indicate that a third decline will result in a discussion at MfD.

Insertcleverphrasehere (or here) 09:25, 21 December 2017 (UTC)[reply]

Discussion - three strikes proposal

Support, as proposer. — Insertcleverphrasehere (or here) 09:25, 21 December 2017 (UTC)[reply]

  • Support Will make the AfC process more manageable and streamline the workflow. Suitable submissions into main space, unsuitable submissions deleted. -- Dlohcierekim (talk) 15:12, 21 December 2017 (UTC)[reply]
  • Support-ish - I do like the idea of doing something about drafts that languish. However, if the draft passes MfD, I see no reason to submit it to another review at NPP, or alternatively, why not submit to NPP directly and bypass MfD? I don't see the point of one review leading to an automatic second review, that's all. Also, why not do this for all G13-eligible drafts? Ivanvector (Talk/Edits) 16:16, 21 December 2017 (UTC)[reply]
There is other valuable stuff that happens at NPP besides article triage: tagging for cleanup, adding WikiProjects, stuff like that. There is no reason why the closer of the MfD discussion can't automatically tick it off if they are an admin though. Basically: our backlog at NPP isn't so bad that we can't support an initiative like this, so I thought, why not? Submitting directly to NPP will get shot down immediately, and I wouldn't support it; most of these are borderline and need discussion to sort them out one way or the other, it would subvert ACTRIAL, and we also shouldn't reward people repeatedly submitting trash drafts with automatic main-space publication. — Insertcleverphrasehere (or here) 17:53, 21 December 2017 (UTC)[reply]
I guess admin involvement is part of the problem here. Usually (but not always) the closer of an MfD will be an administrator, and administrators have the autopatrolled userright, so if moved as part of an admin MfD close the draft will be automatically patrolled and will not appear in the NPP queue. If this is to work, I think the onus is going to be on the MfD participants to reject unsuitable content (which is already done) and also tag drafts which are inclusion-worthy but need work, as NPP does now. I'll comment on the other suggestion below in a bit, I have to run for IRL things. Ivanvector (Talk/Edits) 19:31, 21 December 2017 (UTC)[reply]
  • Comment MfD is nice, except a lot of us NPPers (non-biting page patrollers) don't frequent it, probably because it doesn't show up in our RFA-I'm-so-pretty logs. Why not send it to AFD? I mean, I know it's not an article in the sense of residing in the mainspace, but it's an article wannabe and I think it should be treated as such. If drafts showed up at AFD you could get a lot more eyes on it. Granted, no small portion of those eyes will be long-toothed deletionists who believe "If it's at AFD it should be deleted, ja, sandwich make me strong!" which can spam the !vote count towards delete, but more eyes is better than 3 eyes. L3X1 (distænt write) 17:23, 21 December 2017 (UTC)[reply]
I did consider this; although MfD is the place for drafts, we are de-facto treating the draft like an article for the purposes of discussion. I'm not averse to the idea; however, I was worried that people would shoot it down for being too radical. I'll ask it as an additional question below. — Insertcleverphrasehere (or here) 17:53, 21 December 2017 (UTC)[reply]
  • Support ANYTHING that means drafts by new editors are actually discussed by the community at large rather than the 1 on 1 'gauntlet' at AFC is good. AFC reviewers manage to eliminate the truly irredeemable, but unfortunately they also overreach far too often and condemn good potential drafts to a purgatory eventually leading to g13 deletion when the exasperated editor has given up. This policy would not only allow community oversight, but also allow a much wider range of people to actively help drafts reach a higher quality! I would also say that this policy, if adopted, needs to have a clear warning for draft users, and perhaps the option at MFD or AFD to 'userfy' the article in cases where the writer wants to continue to work on it. Egaoblai (talk) 20:32, 21 December 2017 (UTC)[reply]
  • Support. This is a neat solution to a number of problems, and essentially just formalises the current ad hoc process of taking drafts to MfD when we get fed up of them at AfC. I especially like the idea of getting a definitive community consensus on a draft relatively soon rather than encouraging new editors to go through an endless cycle of accept-and-reject. However, I do agree that MfD may not be the best venue. Perhaps AfD, or given the potential volume, we could even create a new 'Drafts for Discussion' process. That can be decided later, though. – Joe (talk) 20:33, 21 December 2017 (UTC)[reply]
  • Uhh.... I think this proposal needs to come with some numbers. Actually a lot of numbers. The last stats I've seen were from October, and we had about 250 article reviews a day. I have no idea how current that number is, and no idea how many of those we would be pushing into XfD. But probably on the order of a regular 50 would be enough to overwhelm AfD (who all day today has had a grand total of 70 nominations), and MfD (heaven forbid) seldom gets more than a dozen a day. Nevermind the fact that this completely rewrites policy for at least AfD, and allows editors to nominate articles with basically no expectation that they do BEFORE. I have a hard time believing that there isn't a giant unintended consequence sitting out there somewhere. GMGtalk 20:41, 21 December 2017 (UTC)[reply]
Well, the before is that three reviewers thought there was nothing to the draft. I don't think the number that get rejected three times is so ridiculously high. Galobtter (pingó mió) 04:22, 22 December 2017 (UTC)[reply]
Ok. You asked for numbers: I looked at 50 random AfC drafts, of those; 29 had not been reviewed before, 15 had been declined once, 3 had been declined twice, and 3 had been declined 3 times. The three that had been declined three times were: Draft:Tony Parella, Draft:Win the Future, and Draft:WIN-911 Software. So about 6% of the AfC submissions currently in the backlog. If the backlog is ~2700 there should be about 160ish drafts that would qualify right now. Note that I would not recommend sending them all at once (what a disaster that would be). The bot could be programmed to send over a maximum of 5-10 per day (or some other number) until it is caught up. Once caught up I'm suspecting a rate of 2-3 per day is what we would be expecting (the backlog wait is about 2 months (!) at the moment, and some basic math: 160/60=2.7 . Rough estimates, but that is what we are looking at ballpark-wise. Not actually huge numbers. — Insertcleverphrasehere (or here) 08:58, 22 December 2017 (UTC)[reply]
  • Comment - I'm skeptical about this. I would like to see about how many (per month or week) drafts reach these three strikes only to be accepted later, compared to how many reach the three strikes threshold. RileyBugz会話投稿記録 21:18, 21 December 2017 (UTC)[reply]
  • I am worried that this will overwhelm MFD. Enough already would go down with a soft delete, so the persistent submitters would just ask for an undelete and then try again anyway. In some cases the problem is with the decliners, who are claiming problems that are not there or are insignificant. Yet we do not need an automatic report of the last decliner to ANI. Instead we should have a human decision. The page could go to the back of a priority queue; maybe we should have categories for heavily declined pages, and something slightly more persistent than just counting the current templates. Graeme Bartlett (talk) 21:25, 21 December 2017 (UTC)[reply]
  • Question: Would this include user sandbox drafts once they've been tagged by the author for review, or only AfC drafts? I'd be opposed to this if it includes user sandbox drafts. Regards, TransporterMan (TALK) 21:30, 21 December 2017 (UTC)[reply]
I would think Draft-space only per our usual YourUserspaceIsYourCastle policies. L3X1 (distænt write) 23:52, 21 December 2017 (UTC)[reply]
@TransporterMan: What exactly do you mean by "tagged for review"? As far as I'm aware the only way of doing that is to submit it to AfC, and then it's moved to draftspace anyway, so the point is moot. – Joe (talk) 00:15, 22 December 2017 (UTC)[reply]
It's my understanding, perhaps incorrect, that placing {{subst:submit}} on a user sandbox draft queues it for review without moving it to draft namespace. Am I wrong? Regards, TransporterMan (TALK) 04:16, 22 December 2017 (UTC)[reply]
I'm pretty sure that's the case - they have to be manually moved to draft space. Galobtter (pingó mió) 04:22, 22 December 2017 (UTC)[reply]
@TransporterMan: It's not automatic, but the current practice is to move anything with that template on it to draftspace, which somebody is usually quick to do. But moved or not moved, putting {{subst:submit}} on it submits it to AfC and therefore makes it an "AfC draft", as you put it. – Joe (talk) 09:28, 22 December 2017 (UTC)[reply]
Then I oppose this proposal unless the practice of moving user sandbox drafts to draftspace after such tagging is forbidden. It's objectionable to subject userspace drafts to 3-strikes peril merely because someone asks for review. Indeed, I'm not sure how I feel about this entire proposal, but I'm sure I'm opposed so long as that's the case. - TransporterMan (TALK) 14:17, 22 December 2017 (UTC)[reply]
It's not merely because they asked for a review. It's because they asked for a review 3 times. And it's not peril - it's quite possible that the review will figure out that it is indeed notable and put it to mainspace. The user can always move it to mainspace themselves too. Galobtter (pingó mió) 14:24, 22 December 2017 (UTC)[reply]
If the draft is in userspace, I oppose this proposal regardless of how many times they ask for review if the result of that request is that their draft stands any chance of being nominated for deletion because it gets moved to draftspace. Drafts in userspace should not be subject to being automatically nominated for deletion, however that may come to pass. — TransporterMan (TALK) 20:45, 25 December 2017 (UTC)[reply]
@TransporterMan. Because of the way AfC works, adding the submit template to a userspace draft is also de facto requesting that the userspace draft be moved to Draft space. However, this isn't always clear to new users, and perhaps I could add a caveat above that the draft can be userified if requested. G4 has an exemption for userfied content, though I suppose that if the user decided to submit the resulting userspace draft for AfC again (4th time) without substantial changes or move it to main space without substantial changes, it would then qualify for G4.
Would you still oppose if I changed the last line to: ...the draft should not be sent back to the draft space following the discussion, regardless of outcome, but may be userified upon request (subject to G4 if moved to main/submitted to AfC without substantial changes).? — Insertcleverphrasehere (or here) 21:36, 25 December 2017 (UTC)[reply]
Yes, I think I would still object. You say, "adding the submit template to a userspace draft is also de facto requesting that the userspace draft be moved to Draft space" but I cannot find anything in policy or guidelines which allows that (and, indeed, STALE would seem to, if not forbid it, at least not to specifically allow it) and I find nothing on the AFC page or on the {{submit}} template which would inform an editor that that's going to happen merely because they submit-tag their draft. People should be allowed to work on articles in their own userspace for as long as they care to do so, subject to what it says in STALE, and merely asking for a review should not result in it being moved from there and (worse) then being put at risk of being nominated for deletion. If AFC doesn't want to review a userspace draft more than a certain number of times or more than a certain number of times per time period, that's fine (if a bit petty and BITE-y), but not telling them that asking for review may get their draft deleted is just plain wrong. Regards, TransporterMan (TALK) 23:08, 25 December 2017 (UTC)[reply]
@TransporterMan. Well, if you add the {{submit}} template to a non-draft space submission it says "Warning: This page should probably be located at Draft:XXXX" or if put on a sandbox draft it will say "Warning: This page should probably be moved to the Draft namespace." and gives a handy button for easily moving it, so it kind of tells them it will be moved to draft. I agree with you that this process should be transparent to the submitter, and the place to add it is next to the 'Resubmit' button on {{AFC_submission/declined}}. This should be updated with something along the lines of "Please note that if the issues are not fixed, the draft will be rejected again. A draft that is rejected three or more times in a row will be listed for discussion at MfD to determine if the topic is notable and/or appropriate for Wikipedia, and may be deleted or moved to main space as appropriate." (added text underlined). Would you still Oppose if that template is updated so as to make it perfectly clear that submitting a third time could result in deletion (I will add it to the above RfC)? See NOTE#3 above. — Insertcleverphrasehere (or here) 18:35, 26 December 2017 (UTC)[reply]
While that would improve one issue, it would not cure my primary concern and I would (and do) continue to oppose. As I said above, people should be allowed to work on drafts in their own userspace for as long as they care to do so, subject to what it says in STALE, and merely asking for a review should not result in a draft being moved from there and/or (worse) being put at risk of being nominated for deletion. Regards, TransporterMan (TALK) 17:56, 27 December 2017 (UTC)[reply]
  • Support. It's a sound suggestion to address a definite problem. My experience of AFC is recent and limited, but I have seen a significant issue with articles that don't meet the criteria for acceptance being rejected, and then being quickly re-nominated. The suggestion, made here and at Village Pump, that AFC represents a "1 on 1 'gauntlet'" between good-faith newbies and cynical AFC'ers, is wide of the mark. I've been seriously surprised by the volume of articles that are either entirely authored by the article's subject, or are intended to promote a commercial enterprise, or both. An effective gateway process is certainly required and this proposal would work towards that. KJP1 (talk) 22:30, 21 December 2017 (UTC)[reply]
  • Oppose. Based on my own experience working #wikipedia-en-help on IRC, a massive chunk of the users writing drafts haven't even bothered to read about our notability criteria, our neutrality requirement, or even what sources we deem acceptable, and when told about these immediately turn around and try to use "But <foo> exists!" as an argument. These users only see Wikipedia for its Alexa and search engine ranking. There needs to be drastically more education on the matters I linked above before we can even consider taking this step. —Jeremy v^_^v Bori! 23:13, 21 December 2017 (UTC)[reply]
And this proposal will actually solve that problem. Users who make poor quality articles will not only be told by one reviewer that it is poor or not notable, but by a group of reviewers at XFD. Also, the existence of a lot of poor drafters does not exclude the existence of good drafters, even if they might be a minority. Let's not throw the baby out with the bathwater when we talk about article drafters. It would be mean-spirited to go into this thinking that all new drafters are idiots or spammers. There are plenty who write articles in good faith and are let down by the review system. This policy helps both the poor and the good drafters by giving them a much-needed jury rather than one reviewer. Egaoblai (talk) 23:38, 21 December 2017 (UTC)[reply]
This would not help with that step; it would just torque them off. Again, as I said above, these people have zero idea what WP:N, WP:IRS, and WP:NPOV are and have no inclination to find the information for themselves; sending it to a deletion discussion would spike their blood pressure for no benefit. These users need to be explicitly pointed to those pages. —Jeremy v^_^v Bori! 21:33, 22 December 2017 (UTC)[reply]
And how many of those users should we be helping at all to begin with? The biggest problem with both AfC and -help today is that the overwhelming majority of resources goes to help people who are not good faith editors but are motivated to get their article into Wikipedia because of external pressures that good faith editors don't have. This drains resources that could be spent helping the people that AfC declines because the reviewers don't know how high quality non-web based sourcing works or the like. I don't have an opinion on this proposal, but I don't really consider making spammers not want to edit Wikipedia a bad thing. TonyBallioni (talk) 21:43, 22 December 2017 (UTC)[reply]
Reviewers would (and already do/should) point out the existence of WP:N etc. on a first draft. We're talking about a third draft here, so presumably the submitter would have had more than enough time to become acquainted with those pages. The 'explicit pointing' would have already happened.
In my experience, they gloss over any links in the decline reason and ignore comments left by reviewers. —Jeremy v^_^v Bori! 01:28, 23 December 2017 (UTC)[reply]
  • Support, but with caveats: I'm open to the idea, considering many drafts simply don't have a snowball's chance of ever being accepted; what I do is I simply put an IAR G6 tag on them and state that the draft would most likely be speedied if it were an article. However, from experience, since Wikipedia has no deadline, I've observed that some drafts, even those that were rejected multiple times, eventually could end up meeting Wikipedia's guidelines. Thus, while I can see the value in the proposal, I'm not sure if three strikes specifically is a good number; how about five strikes instead? Or perhaps, we could have the three strikes thing, but have some leniency for promising articles, or at the very least allow requests for the deleted drafts to be recreated either at WP:REFUND or WP:DRV. In addition, if three/five/whatever number of strikes is implemented, perhaps on the last decline before the draft would automatically be MfD'd (either the second or the fourth), an automated message should be left on the user's talkpage stating that the draft will be nominated for deletion the next time the draft is declined, while at the same time giving suggestions on how to improve the draft. Narutolovehinata5 tccsdnew 00:04, 22 December 2017 (UTC)[reply]
  • Support, but I think we'll need to make a special section (at MfD) for the drafts that get sent there, so people interested in article creation can browse that and the people who normally frequent MfD can skip it if they want to. Enterprisey (talk!) 05:06, 22 December 2017 (UTC)[reply]
    Either that or just a separate DfD. Galobtter (pingó mió) 05:12, 22 December 2017 (UTC)[reply]
  • Moral Support only moral support as I'm unsure whether MfD, AfD, or a new DfD would be the right venue, and can't support any final proposal yet. Perhaps we could get agreement for a limited test of both processes to see which works better? power~enwiki (π, ν) 06:22, 22 December 2017 (UTC)[reply]
@power~enwiki I am pretty sure that AfD is out as a possibility, and I am personally leaning toward a separate section at MfD for "Three Strikes AfC Drafts" to include auto-submitted ones. After all, slightly special instructions will apply to these articles. I don't think that we need an entirely new DfD for these given the numbers that we are expecting (in my (rough) estimation above I reckon it is in the single digits per day). A new DfD would not have an existing base of users there to help discuss, and could end up with different standards being employed than at MfD/AfD, which is what we are trying to avoid in the first place. — Insertcleverphrasehere (or here) 06:33, 22 December 2017 (UTC)[reply]
  • Support (mostly): This is a well thought out proposal, Insertcleverphrasehere. My only concern is the lack of participation at MfD and the potential of overloading a board that is simply not accustomed to dealing with this level of activity. --Majora (talk) 21:10, 22 December 2017 (UTC)[reply]
  • Oppose. As far as I can tell, this is a complicated scheme to make sure that IPs have as much right (if with some extra hassle) to write bad articles that just barely won't get deleted as autoconfirmed users do. Why is that useful? If they want to write bad articles on notable topics, they can go ahead and register an account. --Trovatore (talk) 21:27, 22 December 2017 (UTC)[reply]
    I'm pretty certain most draft-writers *are* registered users, albeit not autoconfirmed. —Jeremy v^_^v Bori! 21:37, 22 December 2017 (UTC)[reply]
    Really? Then won't they be autoconfirmed by the time they've put an article through three AfC attempts? It's a pretty low bar. --Trovatore (talk) 22:30, 22 December 2017 (UTC)[reply]
    There are two elements to autoconfirmation: Time and edits. If a user creates and submits the draft in one edit and makes adjustments and resubmissions in one edit, or they do so at a rapid-fire pace, it's plausible that they won't be autoconfirmed at that time. —Jeremy v^_^v Bori! 23:06, 22 December 2017 (UTC)[reply]
    Sounds like an unusual case, and in any case easily fixed by making a few edits and waiting a few days. Why would we want to create all this complicated structure just to make it slightly less likely that a new editor will have difficulty creating a bad article? --Trovatore (talk) 23:44, 22 December 2017 (UTC)[reply]
If you are concerned about 'bad articles' then this system would actually help with that, as it would mean that drafts would have the wider attention of the community, who might be more motivated to work on them together, rather than expecting the new editor to do all the work themselves (they'll still do most, but they'll have a hand from others when it comes to the XFD discussion). Egaoblai (talk) 00:47, 23 December 2017 (UTC)[reply]
I'm just wondering why we're bending over backwards to solve a problem related to non-autoconfirmed creations, when becoming autoconfirmed is so easy. --Trovatore (talk) 02:02, 23 December 2017 (UTC)[reply]
@Trovatore, this isn't aimed at any particular type of editor; the vast majority of drafts that I have reviewed were submitted by users, not IPs. I personally used AfC to submit my first article, even though I had many months of tenure and hundreds of edits. AfC is also used by autoconfirmed editors who simply want to contribute and think it is a place where they can get help. — Insertcleverphrasehere (or here) 05:25, 23 December 2017 (UTC)[reply]
Ah. Well, it's possible I haven't fully understood the context, then. I've never hung around AfC; I thought it was mainly there so IPs could create articles. I'll withdraw my objections and let the others figure it out. --Trovatore (talk) 07:21, 23 December 2017 (UTC)[reply]
AfC is primarily for new users who may not understand Wikipedia syntax and for COI editors to submit articles without fear of them being deleted out-of-hand in mainspace. This is part of why they require review: the reviewer is supposed to explain what the issues with the draft are, in general or specifically, when declining, and it's generally one of three things: Notability not shown, non-neutral writing, or the draft covers a topic already handled by an existing article. The overwhelming majority of drafts I have seen have been written by registered editors, not IPs. —Jeremy v^_^v Bori! 22:54, 23 December 2017 (UTC)[reply]
Oppose though not for the reason you think. I agree that repeated resubmissions without improvement should be sent to MFD, but part of entertaining the resubmissions is establishing a pre-existing consensus by multiple AFC reviewers (see Wikipedia:Miscellany for deletion/Draft:The Ding Dongs as an example). One reviewer may be a stickler on RS, one may be a stickler on N, and a third may be a stickler on underlinking/overlinking pages, and a final one might have a conniption fit about REFSPAM. Automatically creating the MFD discussion by bot will only lead to more false positive MFDs and wasting time on things that have the potential to improve. Giving AFC volunteer reviewers the ability to exercise discretion is a much better option. Hasteur (talk) 02:59, 23 December 2017 (UTC)[reply]
Furthermore, I wonder if the better answer would be to stop patrolling the mouth end of the AFC queue so actively (go over the front for quickfail objections (Copyright, attacks, patent nonsense)), and work the tail end more actively. Giving these users feedback so rapidly only reinforces the cycle of "If I make one byte change, I might be able to get it in". If we instil the mentality of "Don't submit until you've really dealt with the issue" we could get a significant reduction in the overall backlog. Hasteur (talk) 03:07, 23 December 2017 (UTC)[reply]
If people are declining for under/over linking, then that is part of the problem this proposal is meant to address. — Insertcleverphrasehere (or here) 03:46, 23 December 2017 (UTC)[reply]
  • Oppose problem AfC drafts being wholesale offloaded onto MfD. It would not be workable. Instead, AfC-interested editors should establish criteria for what should be (1) accepted, (2) rejected for improvement, or (3) rejected outright. Currently, it is pretty vague, with no clear difference between (2) and (3). Also failing is the style of communication with the newcomer authors. Personally, I think that non-autoconfirmed users should not be allowed to start any new page, including a draft. They should spend at least four days and ten edits improving existing content first. "Anyone can edit" is misconstrued as "anyone can start a page on anything". Draft:Researchfish is simply appalling on multiple fronts. The author has not been {{Welcomed}}. No Wikipedian has engaged the author with human-style talk. The many rejections are not in a language a newcomer can understand. The author does not understand Wikipedia. It is an embarrassment all round, and sending it to MfD hides the evidence of failure of the AfC process. --SmokeyJoe (talk) 05:04, 23 December 2017 (UTC)[reply]
    There are a couple things that unambiguously fall into (3): Specifically, any draft that would fall under G11 and anything that an existing article already covers. Other than that, however, barring more information or some wordsmithing most everything else rejected should fall under (2). —Jeremy v^_^v Bori! 23:00, 23 December 2017 (UTC)[reply]
    And this is one thing where AfC is continually failing. (3) is way too thin, and (2) includes a very large amount of stuff that can never become suitable. The AfC review process lacks a template to say this, the reviewers are very hesitant to say so, the draft author only gets an unclear message that they should try a bit harder. Hopeless resubmissions are the obvious consequence. —SmokeyJoe (talk) 06:33, 25 December 2017 (UTC)[reply]
  • Support My only concern is the lack of participation at MfD, but it could draw other AfC contributors to vote. This is a great proposal. JTP (talkcontribs) 17:56, 23 December 2017 (UTC)[reply]
  • Support - sounds as if it could actually contribute to solving two opposite ends of the current spectrum of troubles with AfC - too little breadth of input on the part of reviewers, and too little willingness to get stuff in shape on the part of some submitters. However, as Hasteur notes, MfD isn't exactly bustling. If this was piped into AfD instead, it would at least reach the people who only subscribe to special topic areas (like yours truly), but I guess that's not done for drafts. Related question: would the Deletion Sorting wikiproject be interested in expanding their services to MfD? With 4-6 daily entries, it might not be much of a hassle. --Elmidae (talk · contribs) 08:01, 24 December 2017 (UTC)[reply]
  • Oppose - Maybe I'm misunderstanding the proposal (I haven't read all the comments in this discussion), but I don't see how this would significantly reduce workload at AfC. The previous declines are right there in the reviewer's face and all it takes is a quick diff to learn that it's been resubmitted again without significant improvement. Three strikes is overly bureaucratic and is punitive for no purpose but to be punitive. If it is clear that resubmissions are done in bad faith, we have other processes for dealing with that, such as imposing a block for disruptive editing. ~Kvng (talk) 22:51, 26 December 2017 (UTC)[reply]
This proposal is not aimed at significantly reducing the workload at AfC, and nothing in the proposal claimed as much. It is as much aimed at incorrect reviewing as it is at bad-faith re-submissions. Generally, this proposal is an attempt to address a number of issues that have been raised about a system that creates one-on-one interaction without collaboration.
I have made other suggestions elsewhere to discuss options to deal with the backlog at AfC, but largely it is my understanding that the backlog stems from too few active participants. Outreach and invitations to find new participants for the Wikiproject would likely be the best solution to the backlog and workload. — Insertcleverphrasehere (or here) 23:46, 26 December 2017 (UTC)[reply]
@Insertcleverphrasehere: I'm really not clear on what you're trying to address here; none of the 3 links in your proposal demonstrating the problem lead to anything specifically useful. If we have an "incorrect reviewing" problem at AfC, it is reviewers being too stingy with acceptance, and this proposal does not address this. Truly bad-faith activities can be addressed under WP:DISRUPT if it rises to that level. Superfluous resubmission probably very rarely does. The best way to deal with such actions by newbies and IPs is to politely deflect it. Trying to impose a penalty for an ambiguous action is a waste of everyone's time and has the potential to make the problem worse. ~Kvng (talk) 14:04, 27 December 2017 (UTC)[reply]
Hi Kvng, as someone who shares similar concerns to yours, I was initially opposed to this proposal, but then I realised that it would actually help improve the AFC review process, as eventually rejected drafts would come up at MFD, meaning that the wider community would get to provide oversight to AFC reviewers, without having to change the AFC system at all. Unless you are an AFC reviewer who wants to tediously go through all the rejected drafts, it's nearly impossible for the wider community to check that good potential drafts aren't being quietly deleted. Egaoblai (talk) 17:35, 27 December 2017 (UTC)[reply]
@Egaoblai: I didn't appreciate that a Keep result at MfD would result in the draft being automatically moved to mainspace. I doubt that would be the last word in some cases. This all just feels like we're building a bigger gauntlet. I'm afraid I remain opposed. ~Kvng (talk) 23:26, 27 December 2017 (UTC)[reply]
  • Oppose a robotic system, but certainly support taking no-hope drafts to MfD to get rid of them or for additional editors to review and pass AfC drafts that meet notability criteria, verifiability, and BLP. If there are reviewers who are utterly failing to apply the appropriate criteria at AfC, then they should be talked to gently. If that doesn't work, warnings and removal from the area might be the way to go. I don't think throwing almost-there-but-not-quite drafts into mainspace is productive. I especially worry how this interacts with BLP. What if we have a clearly notable but totally unsourced biography of a living person? We're going to put that in mainspace due to a mechanical process? ~ Rob13Talk 16:52, 27 December 2017 (UTC)[reply]
Most new editors who have the "almost-there-but-not-quite" drafts don't really know how to make them complete. Surely putting them into mainspace, where the community of people who do know how to add the finishing touches can help, would be the best solution? Egaoblai (talk) 17:35, 27 December 2017 (UTC)[reply]
  • (edit conflict)Support Per BU Rob13, actually, the FUD above makes me support placing multideclined drafts in front of MfD, because it allows for some definite broader community input as to what should happen with the draft. I trust the community to not place crap in mainspace, and I trust it better than one person deciding whether to decline/accept a draft. !dave 17:38, 27 December 2017 (UTC)[reply]
  • Support as one of the more active AfC reviewers, an active MfD participant, and processor of thousands of G13-eligible drafts I've got a pretty good handle on the state of Draft space. I can say with confidence that this proposal will benefit both new users and AfC volunteers. There are far too many drafts that have 3, 5 or even 10 or more rejections. One I saw recently had 16 declines. At some point a topic is either suitable as an article or not - we need to decide that. Most of the declines are on subjects that are NEVER going to be suitable so let's tell the creators that definitively, sooner rather than later. Some subjects are suitable but the poor draft creator lacks the Wikipedia experience to get the page past AfC on their own. Such pages would benefit from sending them to mainspace to be expanded or merged by the big world of editors. AfC should not be a permanent holding pen for Drafts - more of a gateway that weeds out the crap and promotes the good. Clear decisions are much better than putting new users off with repeated rejections. Legacypac (talk) 17:54, 27 December 2017 (UTC)[reply]
  • Oppose currently We have tons of spammy articles getting through AfC. How is this going to help? We need more stringent notability requirements and should not become simply another social networking site on which people can promote themselves and their business. Doc James (talk · contribs · email) 06:03, 30 December 2017 (UTC)[reply]
@Doc James, I never claimed that this proposal was the answer to all the problems at AfC. I am not sure how to respond to this oppose as it seems to be a gripe that the proposal is not doing something that it never claimed to do (dealing with promotional articles was never a focus of this proposal). Other solutions may exist to that problem, but I agree that this proposal is not tailored to deal with those, although it might help to salt non-notable topics that are repeatedly submitted. — Insertcleverphrasehere (or here) 19:06, 5 January 2018 (UTC)[reply]
Thanks for the ping. I would support if those closed "no consensus" are deleted rather than moved to main space. Doc James (talk · contribs · email) 06:53, 6 January 2018 (UTC)[reply]
  • Support AfC isn't working well at the moment, so it would be great to have a change in how it works. My own experience of having a biography of a person who is referenced in multiple peer-reviewed papers and books not get accepted because he doesn't hold mainstream views suggests that a lot of valuable content gets withheld from Wikipedia by the current practices of AfC. ChristianKl❫ 22:54, 31 December 2017 (UTC)[reply]
  • Support As someone who used to be very active at AfC, it seems like a win-win. Takes a load off of the AfC backlog, makes difficult borderline drafts get processed faster (since an AfC reviewer might be hesitant to single-handedly approve/reject it, and there's no good mechanism for wider consensus-building), and keeps AfC reviewers in check to make sure they are acting in-line with community consensus. --Ahecht (TALK PAGE) 22:25, 2 January 2018 (UTC)[reply]
  • Oppose I don't think AfC's backlog should be unloaded to MfD, nor should the articles go into mainspace before they are actually ready. The problems include external links, references incorrectly formatted, unreliable sources, COPYVIOs, incomplete sentences, submissions not in English... Yes, some of the subjects may be notable, but the drafts are not ready for mainspace for other reasons. If anyone wants to improve the drafts and get them ready for mainspace they are free to do that; it is a free encyclopedia. What would help is if COI editors stopped submitting to AfC in the dim hope that it will save their article from CSD. As for inexperienced editors and students submitting legitimate articles, I think what could help is additional categories so editors could more easily find drafts on topics they are knowledgeable about. SeraphWiki (talk) 07:24, 6 January 2018 (UTC)[reply]
I wish you would have a bit more thorough reading of the proposal and reconsider your opinion. First: if you had read NOTE#2, you would know that this proposal is by no means a plan to "unload AfC's backlog to MfD" and doesn't even represent that large of a percentage of drafts. Second: the proposal does not advocate moving drafts to mainspace that are inappropriate. Third: COI editors are told specifically that they should submit their articles through AfC (I agree that they shouldn't be submitting articles at all, but that is never going to happen). — Insertcleverphrasehere (or here) 10:50, 6 January 2018 (UTC)[reply]
I read it twice and I understand what it means. Automatically putting drafts that have been rejected three times through MfD and then keeping them if there is "no consensus" is not, in my view, a feasible proposal. Where there is a policy-based justification for nominating a draft for deletion, it should be nominated, but why would that have to be automated? I should explain that what I am concerned about is those articles that are notable but are declined for tone, essay and other reasons. There should be more eyes on the drafts, yes-with a view to improving them. MfD is not cleanup, so if they are notable we will just move them to mainspace before they are ready? Sometimes it takes multiple declines to get there. Some calls, like whether a draft meets one of the deletion criteria, should be made by human beings. SeraphWiki (talk) 11:28, 6 January 2018 (UTC)[reply]

Additional question:

The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


Should AfD be used instead of MfD? -- we are de-facto treating the draft like an article for the purposes of the discussion anyway (in terms of the potential outcomes), why not take it to AfD instead where there are even more eyes? (this is a proposed modification to the RfC above) Proposal by L3X1 above. Pinging previous !voters...Insertcleverphrasehere (or here) 17:53, 21 December 2017 (UTC)[reply]

  • Support Drafts are basically articles and should be dealt with as such. L3X1 (distænt write) 17:56, 21 December 2017 (UTC)[reply]
  • Can just move the article to mainspace and insta nominate to AfD..doesn't really matter tho. Galobtter (pingó mió) 17:58, 21 December 2017 (UTC)[reply]
  • Works for me. The AfC process works in screening out the obviously unencyclopedic and allowing a single author to develop an article as fully as possible. AfD would bring greater review and allow the community to sort the questionable. -- Dlohcierekim (talk) 18:24, 21 December 2017 (UTC)[reply]
  • Oppose. GreenMeansGo makes a very good point above. The volume of drafts could very well overwhelm AfD, and it involves sending them there without any WP:BEFORE, which we are usually strict about. The administrative overhead at AfD, which has separate pages for discussion, logs, deletion lists, etc., is also higher than the other XfDs, and seems like overkill given that most of these will probably be uncontroversial deletes. At this point I'm thinking a new, separate process (DfD) would be best. – Joe (talk) 21:05, 21 December 2017 (UTC)[reply]
re:WP:BEFORE Things like WP:BEFORE and WP:ATD are extremely useful in preventing good quality articles from getting deleted. With drafts, it may not apply, but I'd argue the benefits outweigh the drawbacks. The problem is that the current system means that decent drafts can be deleted in a process where only one person has looked at the article. Some users and admins are blindly batch-dropping and deleting hundreds of drafts at WP:CSD G13 without checking if the article has potential. This solution is the only one at present that solves the problem of a lack of transparency and collaboration at AFC without abolishing it completely. Egaoblai (talk) 23:31, 21 December 2017 (UTC)[reply]
Wait, what?? Drafts aren't being judged on NOTABILITY but on some other merit??? I don't care if it is, frankly, a crappy article. If it is notable, do a basic cleanup to the best of your ability and accept it. If it is, as Mabalu said once, a Promow*nk, then just TNT it! Nothing should be going to AfD sans BEFORE, and if it passes BEFORE (thereby being notable), it can be an article. 23:50, 21 December 2017 (UTC)
  • Oppose using AfD to review drafts. I am a long time AfD participant, and AfD is for articles not drafts. We need more active participants at AfD, and this may break the back of that established review process. The biggest problem with AFC reviewing is the cookie cutter responses and the consistent failure to try to triage notable topics from non-notable ones. At AfD and at the Teahouse, people provide personalized responses instead of cookie cutter templates. If a topic is clearly notable, we should bend over backwards to help whip the draft into shape and add it to main space. If there is no evidence of notability, the kind thing to do is to kill the draft informatively so that the draft author understands why it is unacceptable. Treating poor quality drafts about notable topics the same way as poor quality drafts about non-notable topics is a terrible disservice to the encyclopedia. Cullen328 Let's discuss it 07:59, 22 December 2017 (UTC)[reply]
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.

How about allowing certain CSD criteria to be applied to drafts?

I know that the proper venue for this question is at WT:CSD; this is just to check if there's any support for this (if there is, a formal proposal at WT:CSD could follow). From experience, while not as common as in the article namespace, quite a few drafts would most likely qualify for speedy deletion if they were in the article namespace. In many of these cases, even MfD might be too much considering MfD usually takes a week, so in these cases it's probably better to put these pages out of their misery quickly instead of subjecting them to a long discussion with an inevitable result. So I was thinking: should we allow drafts, if they meet the criteria, to be tagged under some of the CSD criteria for articles? For example, would it be possible to allow drafts to be tagged under A1 or A3, or perhaps more likely, under A7, A9, and A11? Narutolovehinata5 tccsdnew 00:12, 22 December 2017 (UTC)[reply]

Not as part of this proposal. This proposal is not just a proposal to deal with bad AfC submissions. It is also meant to deal with too-strict reviewing by separating notable from non-notable topics via discussion, and CSD does exactly the opposite. That being said, I'm not averse to the idea for A7 and its ilk (I've actually seen quite a few A7-tagged draft articles, appropriate or not). A1 and A3 should not be extended to the draft space IMO. Insertcleverphrasehere (or here) 01:18, 22 December 2017 (UTC)[reply]
I recall having suggested this before, or more broadly that the article criteria should apply to drafts anyway since drafts are article drafts. There was very little support. Ivanvector (Talk/Edits) 14:29, 22 December 2017 (UTC)[reply]
For good reasons. Draft-space is designed to allow people to work on articles until they are ready for mainspace. If we started applying mainspace criteria to them, we could just remove Draft-space altogether, because then what would be the difference? The last thing we need, not least in terms of editor retention, is to unleash a bunch of tagging-happy users (not saying that all taggers are such users, mind you) on pages that are in this namespace exactly for the reason of working on them without having to fear immediate deletion. Regards SoWhy 14:47, 22 December 2017 (UTC)[reply]
Oppose this strongly; CSD-tagging drafts would be a disaster. We already allow editors to tag drafts with G13, and as recent discussions have found out, some editors and admins aren't even bothering to check what is submitted to CSD and are just deleting things that are nominated without checking the article for alternatives. Adding more reasons for editors to use CSD for drafts would create more problems for the wiki and for the hope of keeping new editors. Egaoblai (talk) 17:14, 22 December 2017 (UTC)[reply]
@Narutolovehinata5: with respect, the G series of CSD apply to Draft pages. There has been significant opposition previously to users abusing process by promoting patently inappropriate drafts into mainspace in order to apply A series CSD rules to the page. If you think that the page has no hope and will never have hope, MFD is open for you to make your case as to why the page should be deleted. Hasteur (talk) 03:03, 23 December 2017 (UTC)[reply]
G11 and G12 are still valid deletion criteria for drafts (for G11, though, the draft would have to be so blatantly promotional that no amount of reasonable wordsmithing would fix it). —Jeremy v^_^v Bori! 17:40, 24 December 2017 (UTC)[reply]

*Support as one of the more active AfC reviewers, an active MfD participant, and processor of thousands of G13-eligible drafts I've got a pretty good handle on the state of Draft space. I can say with confidence that this proposal will benefit both new users and AfC volunteers. There are far too many drafts that have 3, 5 or even 10 or more rejections. One I saw recently had 16 declines. At some point a topic is either suitable as an article or not - we need to decide that. Most of the declines are on subjects that are NEVER going to be suitable so let's tell the creators that definitively, sooner rather than later. Some subjects are suitable but the poor draft creator lacks the Wikipedia experience to get the page past AfC on their own. Such pages would benefit from sending them to mainspace to be expanded or merged by the big world of editors. AfC should not be a permanent holding pen for Drafts - more of a gateway that weeds out the crap and promotes the good. Legacypac (talk) 16:42, 27 December 2017 (UTC) Wrong section Legacypac (talk) 17:49, 27 December 2017 (UTC)[reply]

G13 already deals with this, and that itself is controversial. Giving people the power to delete active drafts (a real-world equivalent would be snatching paper out of someone's hand and throwing it in a shredder) is far too much. If people's drafts are being rejected, they are free to keep trying; there is no current problem here that needs this drastic solution. Egaoblai (talk) 17:26, 27 December 2017 (UTC)[reply]
Agree. After all, does it really harm the encyclopedia to just wait in those few(!) cases until the draft is stale? And if a topic is sufficiently notable for mainspace and just needs some work, then send it to mainspace already. After all, the point of AFC is not to only allow GA-class articles to be sent to mainspace, is it? Regards SoWhy 17:35, 27 December 2017 (UTC)[reply]
@Legacypac, Was your comment meant to be in the above discussion section or this section about CSD? — Insertcleverphrasehere (or here) 17:40, 27 December 2017 (UTC)[reply]
Oops, I placed it at the end of the discussion - which turned out to be the subsection not the main section. Thanks. Legacypac (talk) 17:49, 27 December 2017 (UTC)[reply]

A proposal to permanently semi-protect the Template space

Following a series of vandal edits to the template space (here and here, and the ANI about it), MusikAnimal template-protected (without opposition) templates with 5k+ transclusions and semi-protected all those with 1000+ transclusions.

Earlier today a template with 956 transclusions was vandalised.

Due to the fact that the template space isn't widely patrolled, and the potential for great harm to be done in a very short amount of time, I am proposing that all templates be semi-protected. This would likely involve a software change, but I think it's the only way to ensure that major vandalism on a relatively-unused template doesn't sit around for ages, to be seen by the unsuspecting public who might not know how to let us know to fix it.

I briefly discussed this off-wiki with Cyberpower678, who said they would support if there was a minimum transclusion count (10+ was discussed) so I am amenable to that option, but from an administrative standpoint I think it's easier to just batch-protect everything (unless we can get a protect-bot that can autoprotect templates with 10+ transclusions). Primefac (talk) 18:18, 21 December 2017 (UTC)[reply]

NOTE: This would not require a "software" change (to apply to an entire namespace), but would require a configuration change in InitialiseSettings.php. See the example of autoconfirmed being used for the Module: namespace on eswiki (scroll to the bottom of the page). — xaosflux Talk 19:30, 21 December 2017 (UTC)[reply]
  • Support a small transclusion number, but not the entire space, per zzuuzz's argument below. — Insertcleverphrasehere (or here) 18:25, 21 December 2017 (UTC)[reply]
  • Yeah. OK make it so. net positive -- Dlohcierekim (talk) 18:26, 21 December 2017 (UTC)[reply]
  • Support Having a protect-bot is a little ehh, because it means that anyone can get a template with fewer than 10 transclusions semi-protected by transcluding it more... not too much potential for harm there, but still. Galobtter (pingó mió) 18:39, 21 December 2017 (UTC)[reply]
    Other than "winning" an edit war, I don't see the how it would be gaming the system to add a few more transclusions to result in semi-prot. Primefac (talk) 18:50, 21 December 2017 (UTC)[reply]
I mean, that is a vague problem. But rare. Galobtter (pingó mió) 03:58, 22 December 2017 (UTC)[reply]
  • Support I did indeed say I will support this and I can easily create a bot to enforce this. When I did say 10+ transclusions, I did mean in articlespace. So not too much potential for gaming as mentioned above.—CYBERPOWER (Merry Christmas) 18:42, 21 December 2017 (UTC)[reply]
  • Support Net positive. I love IPs and new editors, but they can either log in and wait 4 days or just wait 4 days to edit the Templates. L3X1 (distænt write) 18:53, 21 December 2017 (UTC)[reply]
I don't find zzuuzz's argument convincing. "Autoconfirmed socks are easy to come by" - only if you bother getting your mouse over to the create-an-account button. Drive-by vandals aren't usually socks. And I didn't think this was being proposed as a cure for socking, just vandalism, which can be pretty hard to detect. I'm fine for the base limit to be raised to a thousand transclusions or something like that, but SEMI is a very effective anti-vandal lockdown. Also, "encyclopedia anyone can edit" doesn't apply here. The template space is the maintenance and framework of the encyclopedia, not the content. L3X1 (distænt write) 20:19, 21 December 2017 (UTC)[reply]
Oh the number of autoconfirmed socks I've blocked - these are dedicated regular vandals who do this, not drive-bys. But that's a minor point - I disagree with you strongly about the use of templates. Some are indeed maintenance and framework templates (particularly those targeted), but the vast majority of templates (particularly those edited by unregistered users) contain lists, and names, and numbers, and very much form part of the content. -- zzuuzz (talk) 20:36, 21 December 2017 (UTC)[reply]
  • Comment Primefac, wouldn't it be easier just to roll out an ACTRIAL like technical restriction rather than semi-protect every new template? TonyBallioni (talk) 19:08, 21 December 2017 (UTC)[reply]
    TonyBallioni, I'm not overly concerned about the creation of templates, I'm concerned about vandalism to templates. The vandalism to Template:Redirect-multi was done almost two years after the page was created. Primefac (talk) 19:24, 21 December 2017 (UTC)[reply]
    Right, but what I'm saying is that there may be a way simply to prohibit any editing to templates to non-AC users on en.wiki without having to manually protect them, which would be ideal. TonyBallioni (talk) 19:26, 21 December 2017 (UTC)[reply]
  • Oppose basically because of this. This proposal is fine in theory for long-standing, prominent, widely-used and heavily transcluded templates - typically maintenance templates. However, there's a notable amount of maintenance of less common templates by new and unregistered users, including those with dozens of transclusions. I find it especially notable in sports templates, but also several other topics - politics, media, software, all sorts actually. Many of these templates are almost exclusively maintained by unregistered users. Several hundred transclusions would be my floor. I would prefer instead that the current system of caching and purging is improved to reduce any vandalism's longevity. Also, autoconfirmed socks are incredibly easy to come by. -- zzuuzz (talk) 19:12, 21 December 2017 (UTC)[reply]
I do see that problem. Don't see why it has to be several hundred transclusions though. Most of those have less than 10 transclusions. Galobtter (pingó mió) 03:58, 22 December 2017 (UTC)[reply]
But plenty have more than ten transclusions. It's a figure based on experience. Take some random obscure examples recently edited by unregistered users: Template:Philip K. Dick - 184 transclusions; Template:Syracuse TV - 37 transclusions; Template:Miss International - 64 transclusions; Template:Cryptocurrencies - 109 transclusions; Template:Pixar - 110 transclusions; Template:Flash 99 transclusions. There's a lot like this. -- zzuuzz (talk) 07:22, 22 December 2017 (UTC)[reply]
  • Oppose Wikipedia is the encyclopedia that anyone can edit but there are only 159 editors with template editor right. I don't have this right myself and would resent being considered a suspicious vandal by default. We have far too much creeping lockdown of Wikipedia which is tending to kill the project. If such paranoia had been allowed to prevail in the early days, Wikipedia would never have been successful. Andrew D. (talk) 19:51, 21 December 2017 (UTC)[reply]
    @Andrew Davidson: This proposal is to semi-protect the templates, not template protect them. Just thought I would clarify that.—CYBERPOWER (Merry Christmas) 20:09, 21 December 2017 (UTC)[reply]
  • Oppose zzuuzz's argument is convincing. Jo-Jo Eumerus (talk, contributions) 20:01, 21 December 2017 (UTC)[reply]
  • Sort of oppose, sort of support. 10 transclusions is way too low a limit and the devil is in the details. Maybe 100+, or 1000+ and I'd be OK with this. But I feel a better approach might be to simply be identifying templates that do not need to be edited regularly, and semi-protect those (e.g. semi-protect maintenance templates, infoboxes, etc...). But I do not see a need to semi-protect templates like navboxes, for instance. Headbomb {t · c · p · b} 21:13, 21 December 2017 (UTC)[reply]
  • Comment would it be better, perhaps, to use PC1? Seems to me that it's a useful way to allow IP users and newly registered editors to make productive edits, without them going live immediately. (This may, however, require technical changes.) -- Aunva6talk - contribs 22:32, 21 December 2017 (UTC)[reply]
  • Support semi-protting, but ONLY for templates with 250+ transclusions. The vandals are going to be drawn to those templates that are widely used in order to cause chaos, so the simplest thing to do is to semi-protect only those templates that are used in many articles at once. As per usual I oppose CRASHlock on ideological grounds, not to mention that only an idiot would want that many pages CRASHlocked all at once. —Jeremy v^_^v Bori! 22:57, 21 December 2017 (UTC)[reply]
  • There must be some vague way to figure out if something is a maintenance or content template. Maybe make templates that take parameters semi-protected? Some combination of 10+ transclusions and that. But it'd also have to be designed to prevent abuse. Galobtter (pingó mió) 03:58, 22 December 2017 (UTC)[reply]
  • Oppose. Aside from zzuuzz's points, it would vastly increase the workload on WP:Template editors (and admins who respond to requests that TE's can also do). Yes, non-TEs could respond to template-editing requests on templates that are only semi-protected, but most of them don't know that, and it would likely not be obvious what the protection level was when it came to any particular request. I think a more reasonable proposal would be to semi-protect templates with X number of transclusions. 100? 250? 1000? I'm rather agnostic on the matter. A good thing to do would be to find the sports-specific template that is used on the largest number of pages that is not an infobox (we do not want anons adding or deleting parameters from an infobox, because those templates are a massive WP:DRAMA factory as it is) and not a navbox (because we have rules about them, and anons mostly will not have read them). There are likely football (soccer) templates relating to team roster, uniforms ("kit"), league standings, etc., used on hundreds of articles at least, possibly 1000+, to which anons with some experience could meaningfully contribute. Might give us an idea what number we should be thinking about. Anyway, I would actually like to see automatic semi-protection on somewhat-high-use templates as an anti-vandalism method, and also as a means for reducing the number of templates that have template-editor protection for which semi-protection would actually be sufficient. That will not only waste less TE time, it will get us more template development by anons and non-anons.  — SMcCandlish ¢ >ʌⱷ҅ʌ<  05:07, 22 December 2017 (UTC)[reply]
  • Oppose. Excellent arguments made by zzuuzz. I read through the recent changes link, and surprisingly few of the diffs listed were vandalism. We might want to protect highly-used maintenance templates. Perhaps 5000 transclusions would be a good floor for this (from what I've seen at the most-transcluded templates report). Enterprisey (talk!) 05:16, 22 December 2017 (UTC)[reply]
  • Support Vandalism on a template affects several pages at the same time. Any IP user can propose any meaningful change via an edit request, which normally doesn't go 24 hours without getting a response. –Ammarpad (talk) 09:15, 22 December 2017 (UTC)[reply]
    Based on recent data this would affect around 1,500 edits every week. I think most wouldn't even bother making requests, but if even a proportion did I would expect that to increase. -- zzuuzz (talk) 10:53, 22 December 2017 (UTC)[reply]
  • I would support a proposal for using a bot to give all templates with more than 10 transclusions semi-protection, or pending changes protection if possible; and to remove semi-protection if templates, having been protected by the bot, have their number of transclusions reduced below 10. Automatic semi-protection of all templates is definitely overkill (having a bot find the number of transclusions is definitely possible), and 5000 as a minimum for semi is definitely too high (I'd say even 500 would work for template protection). Note that none of these would increase the workload of template editors, since they are only needed for editing template-protected templates and modules. The bot could also avoid protecting templates consisting solely of navboxes or sidebars, since they are supposed to be transcluded on many pages by design and are relatively easy to edit. Jc86035 (talk) 11:38, 22 December 2017 (UTC)[reply]
  • OP Comment Okay, a few things I've seen that keep popping up:
    First, all templates with 1000+ transclusions are already semi'd (and 5k+ TE'd). This proposal is talking about those with 0-999 transclusions being potentially vandalised.
    Second, the proposal is for semi-protection, which would not increase Template Editors' workload in any way, shape or form. They would of course be welcome to patrol the edit requests for Template-space issues, but are not required to.
    Third (going off the previous note) the TEs aren't exactly hammered under the weight of responsibility. We get maybe three TPER requests a week.
Just felt I should clarify those things going forward. I will admit that IPs aren't all bad with their changes, especially to sports templates, but those types of frequently-updated templates are (for the most part) single-season templates that are rarely used on more than 10 pages (which would mean the 10+ option of semi-protection wouldn't affect them). If Cyberpower says he can make a bot to do it, then it wouldn't involve any software changes. Primefac (talk) 13:22, 22 December 2017 (UTC)[reply]
And for what it's worth, I don't particularly care if we decide on 10+ or 250+ or 500+, but I think there should be some threshold for semi-protecting templates. Primefac (talk) 13:24, 22 December 2017 (UTC)[reply]
So... if we (assuming we can) run the numbers, looking at frequency of edits and number of transclusions, what does that graph look like? Is there some evidence based threshold beneath which most IP editors can happily plod along, while the rest of us can avoid having a cock and balls transcluded on a few hundred pages every few weeks? GMGtalk 13:44, 22 December 2017 (UTC)[reply]
  • Oppose, current protection level works generally well, and vandalism level on templates isn't very high. Openness ("anyone can edit") is more important than restricting vandalism to sleeper accounts. —Kusma (t·c) 18:17, 22 December 2017 (UTC)[reply]
  • Oppose blanket protection. zzuuzz's argument is compelling and this is the free encyclopedia that anybody can edit. At a certain number of transclusions, the tradeoff points in the direction of protecting. So I would support an x+ transclusions rule. Malinaccier (talk) 18:28, 22 December 2017 (UTC)[reply]
  • Oppose, there are templates such as sports competitions, sport team lineups, lists of stations etc., where IP contributions are mainly constructive and helpful. I could support the x+ protection, though 10 inclusions looks like a low bar to me (a football team lineup is at least 23 players + the team + the coach); I would think more along the lines of fifty or so.--Ymblanter (talk) 00:00, 25 December 2017 (UTC)[reply]
  • Oppose This is the encyclopedia anyone can edit. Functionally, semi-protecting everything would be great, but it's not what we do and it's fundamentally against our ideas. Don't do this kneejerk action. Be smart, and judge it on a case-by-case basis. !dave 08:17, 28 December 2017 (UTC)[reply]
  • Oppose. There are many templates that would be perfectly legitimate for newer editors to edit, especially navboxes and the like. I would support semi-protecting everything with 100+ transclusions. We probably should create an edit filter logging newer editors' edits to the template namespace, as an aside. ~ Rob13Talk 08:24, 28 December 2017 (UTC)[reply]
    @BU Rob13: If you use the new filters form on RC/WL/RCL (see beta preferences), this is all non-EC edits to template space. --Izno (talk) 13:39, 28 December 2017 (UTC)[reply]
  • Support on templates used more than 100/200 times but Oppose generally. Doc James (talk · contribs · email) 05:59, 30 December 2017 (UTC)[reply]
  • Oppose entirely per my response to BU Rob13. The majority of changes made in the template space by non-EC users are good changes, with routine updates to navboxes seemingly making up the majority of such changes. --Izno (talk) 06:32, 1 January 2018 (UTC)[reply]
  • Oppose Counting transclusions isn't that useful in template vandalism; you really care about page views. If a template is transcluded 30 times, but one of those pages is Obama, it's a highly viewed template and should be protected. But if a stub template is used on a thousand pages that barely get read, protecting it is of little use. Besides that, protecting an entire namespace is an extremely dangerous road to go down IMO. We should always default to open. Legoktm (talk) 07:11, 2 January 2018 (UTC)[reply]
  • Support Semiprotection is not an insurmountable hurdle. The benefit of this proposal outweighs concerns, imo. James (talk/contribs) 14:43, 2 January 2018 (UTC)[reply]
  • Support with a reasonable threshold of transclusions (say 200?). Peter coxhead (talk) 16:01, 2 January 2018 (UTC)[reply]
  • Support Templates that are used on more than 200 pages should be permanently semi-protected. However, it doesn't seem to be reasonable to semi-protect all templates. I have seen constructive edits made by IP editors to templates. Extended confirmed protection should be used for all the templates that are used to warn users about their edits. Pkbwcgs (talk) 16:49, 2 January 2018 (UTC)[reply]
  • Support as long as a) there is a transclusion threshold, b) the bot removes protection when a page drops below the transclusion threshold, and c) there is a mechanism to request that a template be unprotected (and a flag that the bot will obey). --Ahecht (TALK PAGE) 22:28, 2 January 2018 (UTC)[reply]
  • Oppose per zzuuzz, Kusma et al. Better to apply whatever protection is required manually on an individual basis. At the end of the day, newbie vandals are unlikely to be aware of template space anyway, and those out to deliberately disrupt the project would have no problem waiting until they're autoconfirmed. Optimist on the run (talk) 11:22, 4 January 2018 (UTC)[reply]
  • Oppose per zzuuzz et al. Not all IPs are vandals, and the other point is "We're an encyclopedia that anyone can edit"... we'd be defeating the object if we locked all templates. Whilst in theory I agree with this proposal, unless we ban all IP editing (which sounds great!) I think it's best to just stick with semi-protecting here and there and blocking here and there. –Davey2010Talk 16:47, 5 January 2018 (UTC)[reply]

Discussion

  • If the proposal is adopted, then the transclusion count cut-off point should be somewhere above 200. The vast majority of templates are navboxes: they don't get vandalised often and they do require quite a bit of maintenance that anons are generally able and willing to help out with. – Uanfala (talk) 15:46, 24 December 2017 (UTC)[reply]
  • I haven't observed the editing history of templates much. But Uanfala's comment above makes sense. Also, by doing this we will take away one more thing from the "anybody can edit" thingy. Yes, anybody can edit, but you need an account. Also, you need to wait for 4-5 days, and make 11 edits, before you can edit.
    Also, if a vandal is going to vandalise through templates, I think he is already familiar with the concepts of socks and sleepers. So I don't see much point in implementing these proposals. Courtesy ping to Uanfala to let them know that their comment was moved. usernamekiran(talk) 23:44, 24 December 2017 (UTC)[reply]
  • The Template namespace not only contains templates but also their talk pages. New users should be able to ask for help on the talk pages of templates, so the talk pages shouldn't be semi-protected. ChristianKl❫ 22:57, 31 December 2017 (UTC)[reply]
    • So far two ideas seem to be floating around: one, that a bot protects any template with more than 10 transclusions (uses); or two, a sitewide configuration change for the template namespace. In neither case would talk pages be affected. :) Killiondude (talk) 03:31, 1 January 2018 (UTC)[reply]
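(For illustration only, here is a rough sketch of what the bot idea floated above might look like, assuming Pywikibot, a placeholder 10-transclusion threshold, and placeholder edit summaries. It is a sketch of the concept, not an approved or agreed design; a real task would need bot approval, an account with the protect right, and would work from a transclusion-count report rather than crawling the whole namespace.)

import pywikibot

THRESHOLD = 10  # example figure from this thread; 100, 250 and higher were also floated

site = pywikibot.Site('en', 'wikipedia')

def mainspace_transclusions(template):
    # Count article-space pages that transclude the template (namespaces=0 = articles).
    return sum(1 for _ in template.embeddedin(namespaces=0))

# Illustrative loop only: 'total=50' keeps the sketch cheap; a real bot would not iterate
# the Template namespace page by page.
for template in site.allpages(namespace=10, filterredir=False, total=50):
    count = mainspace_transclusions(template)
    edit_protection = template.protection().get('edit')  # e.g. ('autoconfirmed', 'infinity')
    if count >= THRESHOLD and not edit_protection:
        # 'autoconfirmed' is semi-protection; this call needs the protect right.
        template.protect(reason='High-use template (illustrative sketch)',
                         protections={'edit': 'autoconfirmed', 'move': 'autoconfirmed'})
    elif count < THRESHOLD and edit_protection and edit_protection[0] == 'autoconfirmed':
        # Lift bot-applied semi-protection if usage drops back below the threshold.
        template.protect(reason='Below transclusion threshold (illustrative sketch)',
                         protections={'edit': '', 'move': ''})

Counting only article-space transclusions in the sketch reflects Cyberpower678's point above that the threshold should be measured in articlespace, which also limits the gaming concern raised earlier in the thread.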

At least can we have semi-protection for 100+ transclusions? Another incident occurred today that shows why it really is needed. {{Sidebar person}} is transcluded on 564 pages, including Donald Trump. Yet it was completely unprotected, and was vandalized by someone so that a huge "fuck donald trump" showed for at least 2 hours (based on a reddit post), 2 hours after the vandalism was reverted (only for logged-out users, because apparently they get a cached version of the page - so regular editors did not see it). I'm thinking that, in addition to that automatic semi-protection, maintenance templates that shouldn't be changed much can be preemptively template- or semi-protected at admin discretion. Galobtter (pingó mió) 11:25, 1 January 2018 (UTC) I'm wondering if there's a smarter way to do it. Templates that are transcluded onto other templates (like that one was) should be protected at a lower count. I'm sure there are reasonable ways so that sports stat templates etc. are not protected while ones like these are. Galobtter (pingó mió) 11:52, 1 January 2018 (UTC)[reply]

This sub-transclusion problem is of course a major one not addressed above. What today's vandalism shows, yet again, is that it's not merely templates with tens or a hundred transclusions that are at risk, but those with several hundred transclusions. The templates which generated this thread were around 1,000 transclusions. But also the real problem is not the vandalism but the caching, and the lack of any effective means to bust the caches. -- zzuuzz (talk) 12:03, 1 January 2018 (UTC)[reply]
Another tool might be an admin ability to mass-purge the cache, perhaps caches generated in a certain period of time (when the template was vandalized) linked to a certain template. Galobtter (pingó mió) 12:15, 1 January 2018 (UTC)[reply]
Has anyone confirmed or heard if that was seen on any other page or did it just happen to Donald Trump? Emir of Wikipedia (talk) 16:49, 1 January 2018 (UTC)[reply]
Yes others[5][6]. -- zzuuzz (talk) 16:54, 1 January 2018 (UTC)[reply]
Wikipedia:Purge#forcerecursivelinkupdate is interesting Galobtter (pingó mió) 16:56, 1 January 2018 (UTC)[reply]
Apparently it can force a cache update of all pages that transclude the purged page... Galobtter (pingó mió) 16:58, 1 January 2018 (UTC)[reply]
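(A minimal sketch of the purge approach mentioned above, assuming the standard MediaWiki action API on English Wikipedia; the template name is just the example from this thread.)

import requests

API = "https://en.wikipedia.org/w/api.php"

# Purge the template and queue link updates for every page that transcludes it,
# so logged-out readers stop being served the cached, vandalised rendering.
response = requests.post(API, data={
    "action": "purge",
    "titles": "Template:Sidebar person",
    "forcerecursivelinkupdate": 1,
    "format": "json",
})
print(response.json())

The recursive updates go through the job queue and may be rate-limited for heavily used templates, so this helps shorten the window but is unlikely to be instant.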

Hashing out a number

It looks like about half of the "oppose" votes are opposing the "blanket" semiprot, which I sorta get. That half also mentioned that an alternate option was an "X+ transclusions" option. Seeing as how the % of !votes who would be amenable to that is more than the "hard oppose", I think it's time to flesh out a number. The numbers that were thrown around the most were 10+, 100+, and 250+. So, despite my personal concern that we'll never agree on anything, I'd like to see if we can try. Primefac (talk) 02:21, 4 January 2018 (UTC)[reply]

  • 100+ - it's a high enough bar that the sports-type templates that frequently get updated by the helpful IPs won't be affected, but it would keep "bigger" templates from causing more harm than necessary (and <100 pages is a piece of cake for someone with AWB to null edit in a hurry). Primefac (talk) 02:21, 4 January 2018 (UTC)[reply]
  • 250+; I've made my reasons why clear above. —Jeremy v^_^v Bori! 02:27, 4 January 2018 (UTC)[reply]
  • 250+ or 10+ semi-protected pages. (I'm not sure this suggestion is feasible) Templates like Template:Duke Blue Devils men's basketball navbox should be able to stay unprotected. Templates transcluded on high-profile pages should have a lower threshold. power~enwiki (π, ν) 11:56, 4 January 2018 (UTC)[reply]
    Wait, are we talking semi-protection, or pending-changes protection? I could support pending-changes for the entire namespace. power~enwiki (π, ν) 12:17, 4 January 2018 (UTC)[reply]
    @Power~enwiki: I don't think the software supports pending changes in templates, as the software will always transclude the latest version. I can't find where I read that, though. -- John of Reading (talk) 07:23, 5 January 2018 (UTC)[reply]
    Not to mention that that would be too much of a strain on the hive of idiots that is CRASH. Like I said above, only utter fools would want so many pages CRASHlocked. —Jeremy v^_^v Bori! 20:36, 5 January 2018 (UTC)[reply]
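(Again purely as a sketch: one way a bot or report could count a template's transclusions through the API and compare it against whichever threshold is eventually agreed. The 100 below is just a placeholder for the numbers being debated above, not a decided value, and the function is illustrative rather than an existing tool.)
<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"

def transclusion_count(template_title, cap=1000):
    """Count pages embedding a template, stopping early once `cap` is reached."""
    session = requests.Session()
    params = {
        "action": "query",
        "list": "embeddedin",
        "eititle": template_title,
        "eilimit": "500",
        "format": "json",
    }
    count = 0
    while True:
        data = session.get(API, params=params).json()
        count += len(data["query"]["embeddedin"])
        if count >= cap or "continue" not in data:
            return count
        params.update(data["continue"])

THRESHOLD = 100  # placeholder threshold from the discussion above

if transclusion_count("Template:Sidebar person", cap=THRESHOLD) >= THRESHOLD:
    print("Would qualify for automatic semi-protection under a 100+ rule")
</syntaxhighlight>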

RfC: Cross-wiki redirects to Wiktionary

Should cross-wiki redirects to Wiktionary be deleted, all or in part? Huon (talk) 00:46, 26 December 2017 (UTC)[reply]

Survey

  • Delete all Huon (talk) 00:46, 26 December 2017 (UTC)[reply]
  • Save prefix/suffixes. For instance: -ic, -izzle, Petro-, -ous, it seems appropriate for these. Brightgalrs (/braɪtˈɡæl.ərˌɛs/)[1] 13:56, 27 December 2017 (UTC)[reply]
  • Delete them all. I've always hated these cross-project redirects, and interwiki search results make them even less useful. FACE WITH TEARS OF JOY [u+1F602] 18:34, 27 December 2017 (UTC)[reply]
  • Delete all per NOTDICTIONARY, which they basically turn us into. TonyBallioni (talk) 19:13, 27 December 2017 (UTC)[reply]
  • Delete all yeah, don't see the value in them, NOTDICTIONARY and all that. Galobtter (pingó mió) 19:24, 27 December 2017 (UTC)[reply]
  • Delete all, unless the search filters out the results for a term (e.g. " ", though that page isn't a Wiktionary redirect). Jc86035 (talk) 08:54, 28 December 2017 (UTC)[reply]
  • Keep. {{wiktionary redirect}} is not really a cross-wiki redirect; it's little more than a replacement for the standard "Wikipedia does not have an article with this exact name" message, and doesn't over-emphasize Wiktionary, but is infinitely more useful for readers that get there via a wikilink or Google search. --Ahecht (TALK PAGE) 17:48, 5 January 2018 (UTC)[reply]
  • Keep. I don't understand the rationale behind this proposal. {{wiktionary redirect}} serves a valuable function in helping to prevent the creation of dict-def articles. olderwiser 20:56, 5 January 2018 (UTC)[reply]

Threaded discussion

There are some 1,300+ cross-wiki redirects to Wiktionary, many of them making use of Template:Wiktionary redirect. Since the default search has been adapted to show results from sister projects (including Wiktionary) when there's no WP page of that title, those redirects don't serve much of a purpose any more. The template was recently nominated for deletion; the discussion resulted in "speedy keep" and a finding that the fate of the redirects should be discussed at the village pump instead. So I'm bringing it here. The only redirects that arguably still serve a purpose are those where the Wiktionary page redirected to doesn't share a name with the WP page with the redirect. So keeping those and only deleting those where target name and origin name are identical is an option. Personally I don't think that's worthwhile. Huon (talk) 00:46, 26 December 2017 (UTC)[reply]

  • No comments on the merits of this proposal. As the closer of the original TfD, this is the correct venue for such discussions. Regards :) Winged Blades Godric 08:29, 26 December 2017 (UTC)[reply]
  • Support this deletion. And yes, a discussion here is the right way to handle these, not a TFD discussion (although a case could be made for a mass-RFD, a discussion here is probably better). עוד מישהו Od Mishehu 10:35, 26 December 2017 (UTC)[reply]
  • If there is a mass deletion, would links to the deleted pages be changed to wikt links? Or failing that, notification sent to authors/watchers of pages linking to the redirects so that editors can update pages? Nessie (talk) 13:31, 28 December 2017 (UTC)[reply]
  • What do you all think should be done with a page such as Floccinaucinihilipilification (one of the soft redirects affected by this proposal)? Redlinked, so that people will keep trying to create it? Hard redirect directly to the Wiktionary entry? "Just not have these pages" is an unlikely outcome, since many of these soft redirects are already protected due to repeated re-creation. WhatamIdoing (talk) 19:52, 28 December 2017 (UTC)[reply]
  • Are we sure that all people likely to end up on such soft redirects come to the soft redirects via the site search? Direct links to Wikipedia pages do exist, as does Google search. Jo-Jo Eumerus (talk, contributions) 10:53, 29 December 2017 (UTC)[reply]

Twinkle's "unlink backlinks" feature and meatbot edits

The popular automated editing tool Twinkle has an "unlink backlinks" feature, which is intended to allow removing links to an article that has been deleted. This seems generally helpful (although it has previously been claimed to be counterproductive). However, a major issue is that it has occasionally been abused for large-scale unlinking of articles: as vandalism in 2008 (threads at the village pump and ANI), and in good-faith incidents in August 2011, September 2011, October 2016, and in apparently several "unlinking of the sandbox" fails, the last of which was in 2015. The latest incident occurred today, when an editor used it to remove over 460 links to English language (ANI thread, and drama at the editor's talk page).

There have been vague proposals to place limits on the kinds of pages that can be unlinked (in 2015) or to restrict its use to either sysops, or extended confirmed users. Instead, I think it will be more productive to put in place a restriction on the number of links that can be removed. How many are normally removed as part of the tool's intended use? If Twinkle allows single-click unlinking of more than a few dozen pages, then it's within WP:MEATBOT territory and it should be regulated. – Uanfala (talk) 20:45, 27 December 2017 (UTC)[reply]

  • There was no drama. There was a small group of editors that lost their collective minds when they saw a big number and just started reacting rather than discussing. Niteshift36 (talk) 21:22, 27 December 2017 (UTC)[reply]
  • That's the sort of quote that will likely return to haunt you at your next ANI appearance.
Twinkle's docs justify its use for "a page on a non-notable, vandalism, or other problematic topic" - just which of these is English language?
There is no justification for this sort of delinking: no one else supports it, and you have given none. So how do we stop it? Restrict access to this feature by editor identity? Or by the ability of the feature itself, such as only removing redlinks? Andy Dingley (talk) 23:37, 27 December 2017 (UTC)[reply]
I think the best solution is to allow it only for users who are capable of deleting pages (admins) or for redlinked pages. The former allows removing the backlinks before deleting the page by the user who is about to delete the page, and that is desirable. עוד מישהו Od Mishehu 07:06, 28 December 2017 (UTC)[reply]
  • No, it won't "haunt" anything. There was a group of editors that lost their collective minds. Some even conceded that the edits themselves weren't actually controversial, just that the manner bothered them. And yes, if some people actually thought about it, rather than reacted to it, they'd agree that most of those edits weren't controversial. In the end, 462 edits got reverted because a small percentage of them were contested. And some were contested for no other reason than 'I don't like how you did it' or 'Well, someone might click it'. Niteshift36 (talk) 14:31, 28 December 2017 (UTC)[reply]
There's a reason we have a BOT policy, and that is to prevent this sort of unilateral mass change. Galobtter (pingó mió) 8:02 pm, Today (UTC+5.5)
  • And you've continually beat that drum, ignoring what I've actually said. Regardless, the "crisis" has been averted. Gotham is safe. Hundreds of pointless links have been restored. You can put the stick away. Niteshift36 (talk) 19:08, 28 December 2017 (UTC)[reply]
I'm not disagreeing, but why is it preferable to delete such links before deleting the page? Andy Dingley (talk) 12:35, 28 December 2017 (UTC)[reply]
  • Nobody is proposing to delete the English language page. I've clearly stated that article is important and serves a purpose. What is not needed is the link to it spammed into thousands of articles. Niteshift36 (talk) 14:31, 28 December 2017 (UTC)[reply]
He's replying to Od Mishehu, who emphasized "before". Killiondude (talk) 19:48, 28 December 2017 (UTC)[reply]
We probably should consider whether some of these heavily linked articles should be linked in quite so many places. Even after Niteshift's alleged "near-orphaning", there were still more than 70,000 links to that article in the mainspace. (He removed only 0.6% of them, and I think he was right to do so in at least some of those cases.) WhatamIdoing (talk) 21:03, 28 December 2017 (UTC)[reply]
  • Thank you. "Near-orphaning" was hyperbole from the start. Niteshift36 (talk) 21:44, 28 December 2017 (UTC)[reply]
I remember checking about a dozen of these edits (including a few that were made manually before the Twinkle episode) and found about two or three that were OK to be removed. The majority, however, were not. – Uanfala (talk) 21:20, 28 December 2017 (UTC)[reply]
  • And that's where we start to disagree. Even if I buy off on it in language-related articles, saying that because a sentence says 'Joe was fluent in Spanish, French, English and German' and the other languages are linked, English needs to be linked too, is absurd. (And yes, some of the objections were examples like that.) It's more about aesthetics than usefulness. Niteshift36 (talk) 21:44, 28 December 2017 (UTC)[reply]
Is there a reason to remove though? If it looks aesthetically better, but has no disadvantages, then why not keep the link? Galobtter (pingó mió) 11:01, 29 December 2017 (UTC)[reply]
  • The fact that you even consider aesthetics to be a valid reason makes me think you'll never understand removing them. One reason overlinking is a pain is that when scrolling on a mobile device, it's easy to start bumping unnecessary links. That's less user-friendly, not more. Wikilinks serve an important role: to make it easy for readers to find articles that help them understand the topic better. But we don't (according to the MOS) link terms that most people understand. Nobody has even tried to put forth the notion that most readers of the English Wikipedia don't know what the English language is. The article is there and it serves a role. People know how to find it. Linking it in some infobox to tell us that Top Gun was filmed in English serves no real purpose and is exactly what overlinking is. Niteshift36 (talk) 14:49, 29 December 2017 (UTC)[reply]
Policy is clear on what constitutes mass-editing and what permissions are required before running what is effectively a bot. Mass-removing hundreds of links without prior consensus to do so is clearly and obviously a no-no, regardless of the underlying merit (which has so far failed to persuade most editors who looked at it anyway). Realistically the only solution I can see is to alter Twinkle so that standard user accounts cannot make that many changes to links. Is it possible to restrict it so that, say, anything more than 50 requires a bot flag? Only in death does duty end (talk) 17:07, 29 December 2017 (UTC)[reply]
  • It has been undone, so all the "damage" is gone. Most of those who have disagreed have either taken issue with the fact that they weren't consulted, given aesthetic reasons, or not even looked at the majority of edits and just claimed it 'doesn't hurt to have them'. So no, they weren't "persuaded", because they didn't even discuss most of the actual edits. Let's be honest, this whole "near orphaning" claim is a gross exaggeration from the start. Niteshift36 (talk) 20:31, 29 December 2017 (UTC)[reply]
Near the end of User talk:J947/Archive 1 we have a similar newbie mistake (by me) and I felt pretty much the same as Niteshift36. That was all due to me misinterpreting WP:OVERLINK. J947 (c · m) 05:01, 30 December 2017 (UTC)[reply]

Remove "you can help" from maintenance template

I believe that the invitation "You can help" in {{Data missing}} is a violation of MOS:YOU, and should be removed. Any thoughts? Optimist on the run (talk) 00:16, 29 December 2017 (UTC)[reply]

  • MOS:YOU is about article prose. Maintenance templates aren't part of the article text. They're ruptures in this text, the points where the fourth wall momentarily breaks down and we can directly address the reader with something to the effect of "Pssst. The facade is not all there is to it. Come behind the scenes!". That's part of the wiki way and it's more important than all the style manuals in the world. – Uanfala (talk) 00:23, 29 December 2017 (UTC)[reply]
    I disagree. We have plenty of places to invite readers to become editors of the encyclopedia that anyone can edit without breaking up the flow of text. This is why we don't have "you can help" in {{cn}} or a dozen other maintenance tags. Optimist on the run (talk) 07:25, 29 December 2017 (UTC)[reply]
Yeah, there's no reason why this one needs "you can help" when other maintenance tags don't. Galobtter (pingó mió) 11:04, 29 December 2017 (UTC)[reply]
I agree - a header template can say "you can help" or something similar without being too disruptive to the text; an inline template shouldn't say this. עוד מישהו Od Mishehu 16:00, 1 January 2018 (UTC)[reply]

an AfD-style "find sources" template for Talk Pages

When submitting an article for deletion, the template inserts useful links to find sources on the deletion discussion page (example). Is there any way to add something similar to a talk page of any article? Such a tool is very useful to editors and it's a pity if it's not available until an article is considered for deletion. AadaamS (talk) 10:06, 30 December 2017 (UTC)[reply]

AadaamS
{{Find sources AFD|Water}} gives
Find sources: Google (books · news · scholar · free images · WP refs) · FENS · JSTOR · TWL
Galobtter (pingó mió) 10:09, 30 December 2017 (UTC)[reply]
@Galobtter: thanks! AadaamS (talk) 10:43, 30 December 2017 (UTC)[reply]

AWB proposal

I would like to propose an improvement to AWB so that it highlights places where there isn't a capital letter at the beginning of a sentence. I have observed that AWB is not fixing this problem, and this addition would make it easier to fix places where capital letters are missing at the beginning of sentences. Pkbwcgs (talk) 17:30, 1 January 2018 (UTC)[reply]

@Pkbwcgs: Editors interested in fixing this error could use the existing AWB software to do the job, as the software already has enough find+replace options. They would have to check every change by eye before saving each edit; this is the rule for all software-assisted fixing of typos (WP:SPELLBOT). It's not a simple job. There would be many cases where a . does not mark the end of a sentence; for example, where it marks an abbreviation or is a decimal point. There would be other cases where a sentence is required to start with a lowercase letter, for example when writing about eBay. -- John of Reading (talk) 20:20, 1 January 2018 (UTC)[reply]
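(To illustrate why human review is unavoidable here, a deliberately naive sketch of the kind of check being described; it only flags candidates for a person to look at, and the pattern and sample text are made up for the example.)
<syntaxhighlight lang="python">
import re

# Naive heuristic: sentence-ending punctuation, whitespace, then a lowercase word.
# It produces false positives after abbreviations ("e.g. something") and before
# words that are legitimately lowercase (eBay, iPhone), so every hit needs
# checking by eye, per WP:SPELLBOT.
CANDIDATE = re.compile(r"[.!?]\s+([a-z]\w*)")

def flag_lowercase_sentence_starts(text):
    """Yield (offset, word) pairs that might be missing an initial capital."""
    for match in CANDIDATE.finditer(text):
        yield match.start(1), match.group(1)

sample = "The trial ended. the verdict was read out. eBay later listed the transcript."
for offset, word in flag_lowercase_sentence_starts(sample):
    print(f"Check offset {offset}: '{word}'")  # flags "the" (real error) and "eBay" (false positive)
</syntaxhighlight>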
Pkbwcgs, if anyone is to do this, I strongly recommend it not be you. In case the string of warnings hasn't sunk in, you're one more mistake away not just from having the AWB permission stripped but from being blocked altogether. A task this complicated would need to be done by a person with a very keen eye for punctuation and an in-depth knowledge both of English grammar as it relates to punctuation and formatting, and of Wikipedia's style guidelines and when it's appropriate to disregard them, and given the number of errors and mistakes you make this person is definitely not you. ‑ Iridescent 20:26, 1 January 2018 (UTC)[reply]
I am clearly not making any more mistakes. I also have a very keen eye for grammar and mistakes, and I skip many articles on AWB if I think that there is even one thing that is wrong. Pkbwcgs (talk) 13:30, 2 January 2018 (UTC)[reply]
Withdrawn. I don't think it is a very good proposal to make. Pkbwcgs (talk) 16:52, 2 January 2018 (UTC)[reply]

Proposal: rename "References" section to "Citations" section

For Wikipedia it would be better to have a Citations section than a References section, because citations are used in academic works and because it makes more sense in terms of "citation needed" tags. What do you think? Brian Everlasting (talk) 21:02, 1 January 2018 (UTC)[reply]

There's no rule against calling the section "Citations" (or anything else appropriate) if there's no possibility of confusion. In general, we avoid using "Citations" as a section name because of the potential for confusion; "Citations" could refer to a list of awards received by the article subject or a list of legal convictions, whereas "Notes" or "References" is unambiguous. Likewise, we discourage "Bibliography" as a section name because of the potential for confusion with a list of works either by or about the subject, rather than a list of the works used to source the article. (You can read chapter-and-verse of our guidelines for naming this section at MOS:FNNR.) I'd vehemently oppose any attempt to standardise the name of this section to "References", "Citations" or anything else, and I suspect virtually every other person to comment would as well, but if you really want to try to get consensus for it Wikipedia talk:Manual of Style/Layout would be the place to hold the discussion. ‑ Iridescent 21:13, 1 January 2018 (UTC)[reply]
Oppose per Iridescent · · · Peter (Southwood) (talk): 05:01, 2 January 2018 (UTC)[reply]