Wikipedia:Bot requests


This is a page for requesting work to be done by bots per the bot policy. This is an appropriate place to simply put ideas for bots. If you need a piece of software written for a specific article, you may get a faster response at the computer help desk. You might also check {{Botcats}} to see if the bot you are looking for already exists, in which case you can contact the operator directly on his or her talk page.

If you have a question about one particular bot, it should be directed to the bot owner's talk page or to the Bot Owners' Noticeboard. If a bot is acting improperly, a note about that should be posted to the owner's talk page and to the Administrators' Noticeboard. A link to such a posting may be posted at the Bot Owners' Noticeboard.

If you are a bot operator and you complete a request, note what you did, and archive it. {{BOTREQ}} can be used to give common responses, and to make it easier to see at-a-glance what the response is.

There are a number of common requests which are regularly denied, either because they are too complicated to program, or do not have consensus from the Wikipedia community. Please see Wikipedia:Bots/Frequently denied bots for a list of such requests, and ensure that your idea is not among them.

If you are requesting that a bot be used to add a WikiProject banner to the talkpages of all articles in a particular category or its subcategories, please be very careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively. Compare the difference between a recursive list and a properly vetted one.

Please add your bot requests to the bottom of this page.


An article/reference bot[edit]

It would be helpful to have a bot that compiled a numerical result showing how many articles on any given Wikipedia edition lack references. While quality assessment is very difficult, such an analysis would give a rough 'verifiability index' of individual editions (and a possibility for comparisons between editions).

I assume a simple string search for <ref or reference tags in each article would suffice. If found, the article can be added to the number of referenced articles and the bot can skip to the next one. If it reaches the end of the article and no reference tag is found, the 'unreferenced' count is increased. The end result would just have to be the two resulting sums, which constitute the ratio of referenced vs. unreferenced articles.
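The per-article check described above might be sketched like this (an illustration only, not an existing bot; `article_texts` is a stand-in for page texts obtained from a dump or the API):

```python
# Illustrative sketch: tally referenced vs. unreferenced pages by a plain
# string search, as described in the request above.
def verifiability_index(article_texts):
    """Return (referenced, unreferenced) counts for an iterable of wikitexts."""
    referenced = unreferenced = 0
    for text in article_texts:
        lowered = text.lower()
        if "<ref" in lowered or "<references" in lowered:
            referenced += 1
        else:
            unreferenced += 1
    return referenced, unreferenced
```

The two sums give the referenced/unreferenced ratio directly; lower-casing first catches `<REF>`-style capitalization.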

I realize there is a certain error margin due to several factors, e.g. malformed references, but that would probably even out, as such errors would be equally distributed between editions.

There's no need for the bot to make any markup, it would just be for statistical QA.

If such a bot already exists or can easily be modified for the task, please advise. Thank you! Asav | Talk 18:57, 11 February 2015 (UTC)

Okay, the trick will be to get a transclusion count of the {{Reflist}} template. The current number is 3,410,088. Then, subtract it from the number of articles (currently 4,717,510). The downsides of this method are:
  • Jarry1250's tool counts all transclusions, AFAIK, even the non-mainspace ones.
  • Not all articles with {{Reflist}} necessarily have references.
  • Articles might have malformed references.
--QEDKTC 18:08, 12 February 2015 (UTC)
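As a worked example, the subtraction just described, using the figures quoted in the comment:

```python
# Worked example of the rough estimate described above.
articles_total = 4717510         # articles at the time of the comment
reflist_transclusions = 3410088  # {{Reflist}} transclusions at the time
articles_without_reflist = articles_total - reflist_transclusions
print(articles_without_reflist)  # 1307422
```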
Also, the article might use <references /> instead. Then there are the articles with neither, but which are still fully-referenced - such as Actuary. --Redrose64 (talk) 18:22, 12 February 2015 (UTC)
Actuary does have a {{Reflist}}. --QEDKTC 04:41, 14 February 2015 (UTC)
It shouldn't have done. It looks like it was added in error by PoeticVerse (talk · contribs) as the wrong fix for this edit, which had used <ref>...</ref> (contrary to WP:CITEVAR and WP:PAREN). Following this edit, the {{reflist}} should definitely have been removed; so I've now done that. --Redrose64 (talk) 16:50, 14 February 2015 (UTC)
Thanks for your responses so far, but the bot has to be edition-agnostic, so looking for '<references>', '{{Reflist}}' or '{{references}}' tags won't work, as the Norwegian edition uses '{{Referanser}}' and the French '{{références}}', for example. The bot needs to tackle localized editions as well, hence my suggestion that it count occurrences of articles containing '<ref'. (This probably won't work for non-Latin alphabets, but it's better than nothing.) Malformed references and related errors are not a major problem; they'll even out in statistical terms, given the huge numbers we're talking about. Asav | Talk 20:53, 12 February 2015 (UTC)
Oh, I didn't read the "any Wikipedia edition" part. We can run the script through a global bot on each wiki, changing it to the localized template each time on a new wiki. All sensibly referenced articles have the {{Reflist}} template, so I believe we'll get almost accurate numbers. In fact, most articles with inline citations will have the template. We can change it to transclusions in the article namespace, so the Python script should work fine. I'm fine with running the script, but someone has to help me migrate it to the Labs cluster. And I'm going away on the 21st, so I would rather do it before that. --QEDKTC 04:41, 14 February 2015 (UTC)
And some wikis still use the deprecated {{Ref}} which can be tackled by the script. --QEDKTC 04:46, 14 February 2015 (UTC)
Sorry, I may be a bit slow here, but when you say 'All sensibly referenced articles have the {{Reflist}} template,' do you mean the localized or the translated versions (such as {{Referanser}} and {{références}}) as well? Also, quite a few articles still use the deprecated <references> tag. Would that bot work on those too, or will it have to be adjusted for national/localized editions?
Would python -count {{Referanser}} <references> do the job on the Norwegian edition, for example? Asav | Talk 19:17, 14 February 2015 (UTC)
I'm bumping this, since QEDK is on a wikibreak. Asav | Talk 10:06, 23 February 2015 (UTC)
Yes, it should. The problem, however, is that -count doesn't count articles but transclusions, so if your mentioned "keywords" occur twice, thrice or more in an article, it counts each occurrence rather than the number of articles it has checked. --QEDKTC 12:22, 22 March 2015 (UTC)
Thanks! What about articles that use parameters within the {{Referanser}} tag, such as {{Referanser|2}} (for a two-column layout)? Would each possible parameter have to be listed, or is there some sort of wildcard function? Asav | Talk 19:10, 23 March 2015 (UTC)
Instead of -count, do not enter any arguments at all; just set it in the article namespace. By the way, this script would take days to output the number on a very large wiki. Anyway, the code for {{Referanser}} would be -namespace:0 referanser. I have no idea whether this script supports redirects or parameters. --QEDKTC 16:52, 24 March 2015 (UTC)
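Since -count tallies transclusions rather than pages, a post-processing step like the following sketch would be needed (the (title, namespace) tuples are a stand-in for whatever the tool actually outputs):

```python
# Sketch: reduce a raw transclusion listing to an article count by keeping
# only mainspace (namespace 0) pages and de-duplicating titles.
def count_transcluding_articles(transclusions):
    return len({title for title, ns in transclusions if ns == 0})

sample = [
    ("Actuary", 0),
    ("Actuary", 0),       # template transcluded twice on one page
    ("Talk:Actuary", 1),  # non-mainspace transclusion, ignored
    ("Insurance", 0),
]
print(count_transcluding_articles(sample))  # 2
```

This addresses both downsides noted above: non-mainspace transclusions and multiple occurrences per article.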

Needs wider discussion

I think this would be hugely controversial and difficult, but if it can happen it could change Wikipedia forever. I think we need to get consensus. -- (talk) 17:42, 30 May 2015 (UTC)

I have code for this, if I can find it. Which editions of Wikipedia would you be interested in? All the best: Rich Farmbrough, 23:19, 16 July 2015 (UTC).
@Asav: All the best: Rich Farmbrough, 23:21, 16 July 2015 (UTC).

UK railway station categories[edit]

Last year, a dft_category parameter was added to {{Infobox GB station}}. However, only a minority of railway station articles are using it so far, although there have been Wikipedia categories for them for much longer.

I am therefore requesting a bot to go through these articles (categories A–C2 have been done manually, and so only D, E, F1 and F2 still need to be done). The action to be performed on each is to add | dft_category = [category] to the infobox, and remove the manually-added category (since the infobox automagically adds the article to the relevant category, and having it there manually as well would create a risk of the two becoming out of sync).

I can see that there are cases that would need to be considered:

  • pages where the parameter has already been added (in which case the bot shouldn't do anything, except possibly remove the redundant manually-added category if there is one)
  • stations that are in more than one category (in which case the bot should flag them for human attention)
  • redirects and other similar templates (Infobox London station)
  • nested templates that may be present (though if the new parameter is added right at the beginning of the template transclusion this shouldn't be an issue).

Smjg (talk) 17:28, 22 February 2015 (UTC)
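A rough sketch of the per-article edit requested above (illustrative only, not a ready bot: the infobox-name pattern and the manual category title here are guesses that would need checking against the real templates and categories, and the edge cases listed in the request must be handled first):

```python
import re

# Hedged sketch: insert the dft_category parameter just after the infobox
# opens, and remove the manually added category line.
def add_dft_category(wikitext, letter, manual_category):
    if "dft_category" in wikitext:
        return wikitext  # parameter already present: leave for human review
    wikitext = re.sub(
        r"(\{\{\s*[Ii]nfobox (?:GB|UK|London) station)",
        r"\1\n| dft_category = " + letter,
        wikitext, count=1)
    # Drop the manually added category, since the infobox now supplies it.
    return wikitext.replace("[[%s]]\n" % manual_category, "")
```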

Unfortunately, I am not a fan of categories added via templates/infoboxes. This causes inconsistencies between pages that have an infobox and those that don't. -- Magioladitis (talk) 19:14, 12 March 2015 (UTC)
That would be WP:TEMPLATECAT. But there are no articles without an infobox for which one of these categories is applicable: every station that has been assigned by the Department for Transport to one of their categories (A, B, C1, C2, D, E, F1, F2) has a Wikipedia article; and every one of those articles has either {{infobox London station}} or {{infobox GB station}} (or its redirect {{infobox UK station}}). --Redrose64 (talk) 19:37, 12 March 2015 (UTC)
Note also that the comment in WP:TEMPLATECAT is merely a recommendation, not a policy. Moreover, the reasons for it don't seem to be applicable here - once this work is complete, these categories will be populated almost entirely through these templates, thereby making it easier to restructure. Maybe there are still drawbacks to this approach, but I think it is a lesser evil than having to maintain the DfT category in two places in parallel (the infobox and the article categories) and the consequential likelihood of somebody inadvertently updating one but not the other. — Smjg (talk) 14:04, 14 March 2015 (UTC)
It's been nearly 2 months now. Anybody? — Smjg (talk) 17:26, 17 April 2015 (UTC)
Smjg, I can see why you'd want to do this, and in the straightforward cases it wouldn't be hard. But there are some tricky edge cases that suggest it's not as simple as you suggest. For instance, United Kingdom railway station categories says that St Pancras railway station is in two categories. I see how the template handles this, but how would the bot know what to do? Relentlessly (talk) 18:25, 5 May 2015 (UTC)
Relentlessly, I already began to explain this. When the bot stumbles upon an unusual case such as this, it would not alter the article, but flag it for human attention in some way. — Smjg (talk) 22:18, 5 May 2015 (UTC)
This has been waiting 5 months now. The backlog is getting worse. I'd probably try and implement it myself if I had more time to look into the necessary processes (acquiring or developing suitable tools, implementing the bot, getting approval, whatever else) and carry them out. Can anybody hazard a guess at:
  • what has happened to all the long-time bot authors?
  • what we can do to recruit new bot authors to replace the ones that have disappeared?
Smjg (talk) 16:49, 29 July 2015 (UTC)

Removal of duplicated citations[edit]

I suggest a bot that can remove duplicated citations. If you look at the source code, you can see what I mean by "duplicated citations". Qwertyxp2000 (talk) 23:41, 6 April 2015 (UTC)

Markup:
====Without duplicated citations====
Lorem ipsum dolor sit amet, consectetuer adipiscing elit.<ref name="random thingy" group="example ref1">[ Random citation] Google. Retrieved at "random date".</ref> Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..<ref name="random thingy" group="example ref1" /> Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a.

====Dummy refs====
{{reflist|group="example ref1"}}

{{tick}} This is acceptable

====With duplicated citations====

Lorem ipsum dolor sit amet, consectetuer adipiscing elit.<ref group="example ref2">[ Random citation] Google. Retrieved at "random date".</ref> Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..<ref group="example ref2">[ Random citation] Google. Retrieved at "random date".</ref> Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a.

====Dummy refs====
{{reflist|group="example ref2"}}

{{cross}} This is not acceptable 

Renders as:

Without duplicated citations

Lorem ipsum dolor sit amet, consectetuer adipiscing elit.[example ref1 1] Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..[example ref1 1] Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a.

Dummy refs

  1. ^ a b Random citation Google. Retrieved at "random date".

This is acceptable

With duplicated citations

Lorem ipsum dolor sit amet, consectetuer adipiscing elit.[example ref2 1] Aenean commodo ligula eget dolor. Aenean massa. Cum sociis natoque penatibus et magnis dis parturient montes, nascetur ridiculus mus. Donec quam felis, ultricies nec, pellentesque eu, pretium quis, sem. Nulla consequat massa quis enim. Donec pede justo, fringilla vel, aliquet nec, vulputate eget, arcu. In enim justo, rhoncus ut, imperdiet a, venenatis vitae, justo. Nullam dictum felis eu pede mollis pretium. Integer tincidunt. Cras dapibus..[example ref2 2] Vivamus elementum semper nisi. Aenean vulputate eleifend tellus. Aenean leo ligula, porttitor eu, consequat vitae, eleifend ac, enim. Aliquam lorem ante, dapibus in, viverra quis, feugiat a.

Dummy refs

  1. ^ Random citation Google. Retrieved at "random date".
  2. ^ Random citation Google. Retrieved at "random date".
This is not acceptable
@Qwertyxp2000: AWB's general fixes will do this - see the page for more details. GoingBatty (talk) 01:10, 7 April 2015 (UTC)
GoingBatty, thank you for finding the right page. I will soon be changing the {{Duplicated citations}} tag. Qwertyxp2000 (talk) 01:15, 7 April 2015 (UTC)
@Qwertyxp2000: You might want to have the template link to WP:REFNAME instead of the AWB page. GoingBatty (talk) 01:20, 7 April 2015 (UTC)
@Qwertyxp2000: You might want to have a comment in the documentation saying that AWB may be used to fix the issue, and provide the link to the AWB page. GoingBatty (talk) 01:23, 7 April 2015 (UTC)
Why can't you do all this? Then I can see what you are thinking. Qwertyxp2000 (talk) 01:31, 7 April 2015 (UTC)
@Qwertyxp2000: Apparently some people think that duplicate citations are acceptable. GoingBatty (talk) 01:41, 7 April 2015 (UTC)
Looking at the second scenario, if I have referenced the first and last sentences of a paragraph to the same source but not the middle of it, or perhaps the middle is cited to another source, then if someone comes along and removes a "duplicate" cite, I would revert that as vandalism. We encourage people to use inline citations and multiple sources, but we don't limit people to only citing one statement from each source they use. ϢereSpielChequers 05:58, 7 April 2015 (UTC)
I think you are misunderstanding, WereSpielChequers. No one is saying a statement can only be referenced once. Qwertyxp2000 wants a bot to fix references which are duplicated (rather than used twice or more). Duplicated references produce two entries for the same thing in the list of references, whereas a reference used multiple times will have one entry with multiple uses (the little "^ a b" you see next to the example reference in the first example). I recently manually combined a bunch of duplicate references here (I also normalized the references so they could be referred to multiple times). Maybe that will help clarify things. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 18:07, 24 May 2015 (UTC)
So is this a good idea for a bot or is it already fulfilled by AWB? Because I am looking to start working on a bot (something I've been putting off for two years). :P Sn1per (talk)(edits) 13:51, 27 June 2015 (UTC)
@Sn1per: AWB's general fixes will also do this in some circumstances - see the page for more details. GoingBatty (talk) 22:57, 28 June 2015 (UTC)
@GoingBatty: I see this problem a lot though and perhaps it would be a good idea to have it actively fixed by a bot to take some work off of AWB users? Sn1per (talk)(edits) 15:07, 29 June 2015 (UTC)
@Sn1per: I don't object to a bot task to do this using AWB's rules. How would you create a list of articles to be fixed? GoingBatty (talk) 15:38, 29 June 2015 (UTC)
@Sn1per and GoingBatty: A database scan for <ref>([^\<]+)</ref>.+<ref>\1</ref> will find about 20,000 candidates. The first 1% or so are listed at User:John of Reading/Sandbox. But remember that the AWB general fixes will only combine duplicate citations if the article already has at least one named reference, to avoid changing the citation style (AWB documentation). -- John of Reading (talk) 16:09, 29 June 2015 (UTC)
@John of Reading: Thanks John! You might want to tweak your regex to also include named references. GoingBatty (talk) 16:28, 29 June 2015 (UTC)
@GoingBatty: That would take more than a "tweak", but I'll think about it. If two references have the same name, the software will use only the first definition whether or not the definitions are identical. So the search would have to be for references that have identical content but different names, or one named and one unnamed. -- John of Reading (talk) 16:40, 29 June 2015 (UTC)
@John of Reading: Thanks for the regex, I tend to be terrible at those :P I would assume that my bot should follow the same behavior as AWB to comply with the same policy? Sn1per (t)(c) 22:54, 29 June 2015 (UTC)
@Sn1per: Definitely, and even then you may run into objections. See this 2011 thread. Hint: it's surprisingly long. -- John of Reading (talk) 06:04, 30 June 2015 (UTC)
@John of Reading: I think I tweaked the regex, not sure if it works all the time but should be accurate enough to quickly select pages for further scrutiny. Are there any obvious errors? <ref(.|\n)*?>([^\<]+)<\/ref>.+<ref(.|\n)*?>\2<\/ref> Sn1per (t)(c) 18:18, 2 July 2015 (UTC)
(still working on improving the regex) Sn1per (t)(c) 18:33, 2 July 2015 (UTC)
@Sn1per: The ".+" in the middle will only work if the regex is run with the "singleline" option turned on - the dot/period needs to match newlines - so the "(.|\n)" can be simplified to just ".". That's <ref.*?>([^\<]+)<\/ref>.+<ref.*?>\1<\/ref>. -- John of Reading (talk) 18:47, 2 July 2015 (UTC)


@John of Reading: Thanks for the advice. Here is an improved regex: <ref([^\>]*)?>([^\<]*)</ref>.*?<ref(?!\1)[^\>]*?>\2</ref> It should be able to find two refs where at least one has no attributes (i.e. no name=""), or where both have different attributes. Sn1per (t)(c) 19:15, 2 July 2015 (UTC)
(note that I am using Python regexes, where a "." doesn't seem to match newlines) Sn1per (t)(c) 19:18, 2 July 2015 (UTC)
Never mind, I'm an idiot. Just saw your note about singleline mode. Sn1per (t)(c) 19:19, 2 July 2015 (UTC)
@Sn1per: Yes, that regex does the job - neat! Now, a quick reality check: on my laptop it will take 20 hours to run this against a database dump. My latest dump is from mid-May; I think it's not worth tying up the machine for so long to produce a list that is several weeks out of date. But I can produce a partial list of a hundred or so articles for testing purposes fairly easily. -- John of Reading (talk) 19:45, 2 July 2015 (UTC)
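For reference, the final regex from this thread can be exercised standalone like this (a quick check, not the bot itself; the sample strings are made up):

```python
import re

# Quick standalone check of the duplicate-citation regex from the thread,
# compiled with re.S so "." also spans newlines (the "singleline" option
# discussed above).
pattern = re.compile(
    r"<ref([^\>]*)?>([^\<]*)</ref>.*?<ref(?!\1)[^\>]*?>\2</ref>", re.S)

dup = 'A.<ref>Same source</ref> B.\n<ref name="x">Same source</ref>'
ok = 'A.<ref name="x">Same source</ref> B.<ref name="x" />'

print(bool(pattern.search(dup)))  # True: identical content, different attributes
print(bool(pattern.search(ok)))   # False: proper named re-use
```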
Coding... Well, it seems like a good idea to me, so I'll start working on it. But if you guys strongly disagree, leave a message on my talk page so I don't waste too much effort. The bot will probably take me a few days, given that I am new to the field. Sn1per (talk)(edits) 17:16, 28 June 2015 (UTC)
@Sn1per:, Why code it? Why not use the AWB library that already contains the code. - X201 (talk) 15:33, 29 June 2015 (UTC)
@X201: I was thinking of making a pywikibot-based robot based on the WMF Tool Labs servers so that the bot can run all the time, rather than relying on me to run it off my PC, given that the problem is pretty large. Sn1per (t)(c) 22:54, 29 June 2015 (UTC)

Redirects to lists, from the things they are lists of[edit]

Please could someone do this:

  1. For every article titled "List of foo"
  2. if the article called "Foo" exists; do nothing
  3. otherwise, create "Foo" as a redirect to "List of foo"

For example, I just created Birds of Tunisia as a redirect to List of birds of Tunisia.

This might usefully be added to a list of monthly cleanup tasks, for new "List of..." articles. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:06, 6 May 2015 (UTC)
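The title mapping in the request can be sketched as follows (illustrative only; capitalization handling is simplistic and real titles may need more care):

```python
# Sketch of the mapping described above: "List of foo" -> "Foo".
def redirect_target_title(list_title):
    prefix = "List of "
    if not list_title.startswith(prefix):
        return None  # not a "List of ..." title; nothing to do
    rest = list_title[len(prefix):]
    return rest[:1].upper() + rest[1:]

print(redirect_target_title("List of birds of Tunisia"))  # Birds of Tunisia
```

Per step 2, the bot would then create the target title as a redirect only when no page with that title already exists.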

Doing... - Though I have messaged WikiProject Lists to check consensus first. Jamesmcmahon0 (talk) 12:54, 7 May 2015 (UTC)
Thank you. Please see also #Century-item redirects, below. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:36, 9 May 2015 (UTC)
BRFA filed - Wikipedia:Bots/Requests for approval#MoohanBOT 8. It is just for this task, as I had already generated the list of pages needed and there seems to be no opposition to it. I will have a look at #Century-item redirects in a few days, but feel free to jump ahead, GoingBatty, as that one may be outside my regex expertise... Jamesmcmahon0 (talk) 11:05, 10 May 2015 (UTC)

It appears that User:Jamesmcmahon0 has dropped this. Can anyone else help, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:29, 5 July 2015 (UTC)

Replace links from a referenced site for WP:ANIME[edit]


Basically, a website changed their linking from to, and made some other tiny changes to the URL (mostly removing ".html" and replacing it with a "/", and adding a day in the URL).

For example (to use the example given in the discussion):

is now located at

You can read the details here. The list of links which need to be fixed is here.

Thanks for any help! ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 20:00, 6 May 2015 (UTC)

136 pages. -- Magioladitis (talk) 15:31, 8 May 2015 (UTC)

Do you need any other info? Please let me know. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 04:47, 14 May 2015 (UTC)
Any word on this? It's been over a week since anyone else commented here. Also, I moved the following section up and made it a subsection of this one as the tasks would be the same, just for different URLs. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 19:17, 20 May 2015 (UTC)
Idea is not well explained @Nihonjoe: OK, I'll ask a question: how do we derive the YMD structure from just the YM structure that was previously present, short of having the bot open a search on the new page looking for the "slug" that matches? My gut reaction is that this is a bad idea for a bot, as it will need some sort of human component to decide which one is right. Hasteur (talk) 19:58, 20 May 2015 (UTC)
I can see how that would make things more complicated. Seeing as it is only the day and the .html that are different between the URLs, wouldn't it be fairly easy to have the bot cycle through the (at most, 31 for each entry) possibilities and then replace it with the one which comes up as a valid URL? Just have it go through a process like this:
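A sketch of the day cycling just described (the URL pattern here is invented for illustration, since the real URLs are not quoted in this section; at most 31 candidates are generated per link):

```python
# Hypothetical sketch: generate the candidate URLs for a given year/month/slug.
def candidate_urls(base, year, month, slug):
    return ["%s/%04d/%02d/%02d/%s/" % (base, year, month, day, slug)
            for day in range(1, 32)]

urls = candidate_urls("https://example.com/news", 2015, 5, "some-article")
print(len(urls))  # 31
print(urls[0])    # https://example.com/news/2015/05/01/some-article/
```

Each candidate would then be fetched (e.g. with urllib) until one returns HTTP 200, and that URL substituted into the article.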
I imagine a bot could do that much more quickly than a human. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 22:04, 20 May 2015 (UTC)
Nihonjoe, have you contacted the site requesting permission for us to brute-force crawl their website like that? The method you describe can be done, but is extremely resource-intensive for the server being hit. Also, do you really want all of those pages updated? I think we're more looking at articlespace pages and possibly active talk pages (WT: and article talk). Hasteur (talk) 23:59, 20 May 2015 (UTC)
Yes (waiting for a reply), though if it was done in small groups over a week or so, I imagine it would be fine as it would spread the load over an extended period. The mainspace and mainspace talk pages are the most important ones, but it would be nice to fix all of them. There are only 136 links which need to be fixed, so if they were done 2-3 at a time, 2-3 times a day? ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 00:06, 23 May 2015 (UTC)
Got a tentative "Yes", but waiting to hear from the webmaster (the owner of the site replied). ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 02:23, 23 May 2015 (UTC)
@Nihonjoe: Ideally the task will run to completion, but I can insert a ~5 minute pause. As for the links in archives, I'm thinking of not changing them, because we don't want to change talk archives. Also, do we want to save the state of the page after every unique link replacement, or do we want to wait till we have all the links replaced before we move forward? Does this work for you? I'm going to start designing the bot task over the next few days. FYI: this does not look at the subsection below. Hasteur (talk) 00:15, 24 May 2015 (UTC)
@Hasteur: Okay, the owner of the site wrote back: "Hey, thanks so much for letting me know about this. Most of my traffic is from the USA, although I do get a fair amount of global traffice, but the upshot is, if you do the update during 12AM-8AM Eastern US time, it should be okay." The webmaster also wrote saying it was fine, as well. As for talk page archives, you can leave those alone. If there are multiple links on one page, it's fine to replace all of them and then save the page. If you can have the bot do the following task at the same time, that would be great. The one below is just a simple search and replace, not involving any web searches. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 17:54, 24 May 2015 (UTC)
@Hasteur: Just following up since it's been about 1.5 weeks. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 04:43, 3 June 2015 (UTC)
@Hasteur: Just following up since it's been about 2.5 weeks now. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 19:58, 9 June 2015 (UTC)
@Nihonjoe: Thank you, I heard you the first time. I've been slightly more than a little busy at work, and my discretionary time has been taken up. I've got it doing the replacements, but I can't get the page generator to work correctly so that it'll correctly identify the pages. Hasteur (talk) 22:53, 9 June 2015 (UTC)
@Hasteur: Sorry if you felt I was being impatient, but I reasoned I was being patient by waiting so long in between inquiries. I can understand being busy outside of Wikipedia as I've had plenty of that myself. I appreciate the update. That's what I was looking for. Thank you for tackling this project. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 07:16, 10 June 2015 (UTC)
BRFA filed. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 03:28, 12 June 2015 (UTC)

Done Hasteur (talk) 15:41, 14 June 2015 (UTC)

Thanks! ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 17:59, 14 June 2015 (UTC)

Links need update[edit]

See my example fix here. I need someone to help me spot and fix those links in case there are more of them. -- Magioladitis (talk) 11:39, 12 May 2015 (UTC)

So the only thing is that CGI/search/syousai_put.cgi?key=search&isbn= gets replaced by comics/4253? Nyttend (talk) 01:57, 14 May 2015 (UTC)
This could be combined with the request above as they are both for the same thing, just different URLs. For the same project, too. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 07:07, 16 May 2015 (UTC)
@Nyttend: did you do this? Hasteur (talk) 15:39, 14 June 2015 (UTC)
BRFA filed Hasteur (talk) 16:03, 14 June 2015 (UTC)
I didn't do anything. I just looked at the "example fix here" edit and asked if I understood rightly. Nyttend (talk) 17:00, 14 June 2015 (UTC)
Looks like the BRFA was approved. Thanks in advance for when this task is completed. WP:ANIME appreciates the help. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 04:02, 19 June 2015 (UTC)

Century-item redirects[edit]

Please could someone do this:

  1. For every page or category beginning with a cardinal number (e.g. 17th-, 21st-) century; or articles prefixed "List of..." matching that pattern:
  2. Create a redirect from the equivalent title, with no dash
  3. Create a redirect from the equivalent title, using words
  4. Create a redirect from the equivalent title, using words, with no dash

For example, for the existing Category:20th-century war artists, I just created:

Other examples matching the above pattern would include:

This might usefully be added to a list of monthly cleanup tasks, for new articles and categories matching the above pattern. Note the overlap with #Redirects to lists, from the things they are lists of, above. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:35, 9 May 2015 (UTC)
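The three variants in the request could be generated like this (illustrative sketch; the ordinal-to-words table is a made-up abbreviation that would need extending to cover all centuries):

```python
import re

# Sketch of the three redirect variants described above for a title such as
# "20th-century war artists". Capitalization follows page-title conventions.
ORDINAL_WORDS = {"17th": "seventeenth", "20th": "twentieth", "21st": "twenty-first"}

def redirect_variants(title):
    m = re.match(r"(\d+(?:st|nd|rd|th))-century (.+)", title)
    if not m:
        return []
    ordinal, rest = m.groups()
    variants = ["%s century %s" % (ordinal, rest)]      # no dash
    word = ORDINAL_WORDS.get(ordinal)
    if word:
        cap = word[0].upper() + word[1:]
        variants.append("%s-century %s" % (cap, rest))  # words
        variants.append("%s century %s" % (cap, rest))  # words, no dash
    return variants

print(redirect_variants("20th-century war artists"))
```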

@Pigsonthewing: What redirect template(s) do you want included on these new redirects? Thanks! GoingBatty (talk) 19:45, 9 May 2015 (UTC)
@GoingBatty: I suppose {{Redirect from alternative spelling}} would be best, but I have no strong feelings on the matter. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:59, 9 May 2015 (UTC)
This request doesn't seem to match with Wikipedia:Categories for discussion#Redirecting categories, unless they each "frequently have articles assigned to them accidentally, or are otherwise re-created over and over." But there's no evidence of that presented here. Anomie 11:58, 10 May 2015 (UTC)
As I read it, the CFD thing you quote is in a different context: it's saying "We delete most categories, but if it's a likely mistake, we redirect it instead of deleting it". I don't see it as being at all relevant to the idea of creating the likely mistakes in the first place. Nyttend (talk) 23:41, 13 May 2015 (UTC)
Except it doesn't say "unless it's a likely mistake" at all. It says "unless people keep actually making the mistake". Anomie 00:56, 14 May 2015 (UTC)
"Likely" is demonstrated by the fact that people make it, but again, the context is that of deletion versus redirecting, where the category already exists; it doesn't address the request being made here. Nyttend (talk) 01:55, 14 May 2015 (UTC)
I for one keep making this "mistake", when using HotCat. The redirects will help me and other editors to find the categories we need to apply. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:59, 25 June 2015 (UTC)
Can anyone help, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:59, 25 June 2015 (UTC)

GA Cup[edit]

On behalf of myself and Figureskatingfan: we are looking into the possibility of having a bot assist us with the GA Cup. We held the first competition at the end of 2014/beginning of 2015 and, after its success, we are currently planning a second competition, hoping for it to be an even bigger success. In the first competition, some of the participants expressed frustration that the submission process for their Good article reviews was not very efficient. For the upcoming competition, we were wondering if it would be possible to have a bot scan the Good article nominations page for reviews being conducted by participants and add the appropriate review links to a page.

More specifically: ideally, the bot would scan the nominations page and, if it found that (for example) BenLinus1214 was reviewing an article, add that review under the appropriate header.

If anyone is interested in helping us I would be glad to have you on board and will be more than happy to answer any questions!--Dom497 (talk) 23:25, 11 May 2015 (UTC)
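For illustration only, the matching step might look something like the sketch below. The {{GANentry}} markup, the assumption that the reviewer's user link sits on the same line as the entry, and the "Talk:Article/GAn" review-page naming are all guesses at the page format, not a description of a finished bot:

```python
import re

# Illustrative sketch only: the {{GANentry}} markup, the same-line reviewer
# link, and the "Talk:Article/GAn" naming are assumptions about WP:GAN.
REVIEW_RE = re.compile(
    r"\{\{GANentry\|1=(?P<article>[^|}]+)\|2=(?P<page>\d+)\}\}"
    r".*?\[\[User:(?P<reviewer>[^|\]]+)"
)

def reviews_by_participants(gan_wikitext, participants):
    """Map each participating reviewer to the review pages they have opened."""
    found = {}
    for m in REVIEW_RE.finditer(gan_wikitext):
        reviewer = m.group("reviewer").strip()
        if reviewer in participants:
            link = "Talk:%s/GA%s" % (m.group("article"), m.group("page"))
            found.setdefault(reviewer, []).append(link)
    return found
```

A real implementation would fetch the page text via the MediaWiki API and then append the collected links under each participant's header on the scoring page.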

@Dom497: Doesn't seem that hard or complicated. I'll test around a bit but I'm not going to promise anything yet. -24Talk 21:39, 28 May 2015 (UTC)
@Negative24: Thank you so much for trying! I would have done it myself but I don't know enough code to do it!--Dom497 (talk) 22:35, 28 May 2015 (UTC)
@Negative24: Hey, just curious to see if you've had any positive results. Thanks again! :) --Dom497 (talk) 02:30, 14 June 2015 (UTC)
@Dom497: Sorry for the delay. I've been exploring everything related to bots and this being my first time even trying to make a bot, things are going a bit slow. I'm going to be going on a month long Wikibreak (related to a project in life that needs my time) and so I may not be able to code anything up before the 2015 GA cup. I'm going to leave this open for anyone to pickup (feel free to do so if you're interested) but I'm not able to fully start on something at the moment. I will resume this project when I have the time and if someone hasn't picked it up by then. Sorry, -24Talk 03:08, 19 June 2015 (UTC)
@Negative24: No problem at all! Thanks for trying! :) --Dom497 (talk) 19:26, 19 June 2015 (UTC)

Replacing Trains project template with Trains in Japan project template[edit]

Per this section on the main page of the Trains in Japan WikiProject, we need to search the talk pages of all articles listed in Category:Rail transport in Japan (and its subcategories) for any of the following templates:

...and replace the above templates with Template:Trains in Japan. Additionally, a check should be made for "|Japan=yes" or "|japan=yes", which should be removed if present, as the replacement template automatically includes that. Here are a few examples of what the bot would do: 1, 2, 3, 4. The purpose of this is so the articles will automatically be placed into the appropriate articles-by-quality category. This will affect several hundred articles, though some have already been done, so those would be ignored by this bot process. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 18:44, 16 May 2015 (UTC)
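As a rough sketch of the per-page text substitution only: the names in OLD_BANNERS below are placeholders for the templates listed above (the actual list is not reproduced here), banners whose parameters contain nested templates are deliberately not matched, and a real bot would still need the category walk and edit handling:

```python
import re

# Illustrative sketch only. OLD_BANNERS stands in for the (elided) list of
# templates above; adjust to the real names before use.
OLD_BANNERS = ["WikiProject Trains", "Trains", "TWP"]  # assumed names

BANNER_RE = re.compile(
    r"\{\{(?:%s)\s*(\|[^{}]*)?\}\}" % "|".join(map(re.escape, OLD_BANNERS))
)

def replace_banner(talk_wikitext):
    """Swap an old Trains banner for {{Trains in Japan}}, dropping |Japan=yes."""
    def repl(match):
        params = match.group(1) or ""
        # The replacement template implies Japan membership on its own.
        params = re.sub(r"\|\s*[Jj]apan\s*=\s*yes\s*", "", params)
        return "{{Trains in Japan%s}}" % params
    return BANNER_RE.sub(repl, talk_wikitext)
```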

Why {{Trains in Japan}}? Why not {{WikiProject Trains in Japan}}? Have WT:RAIL been informed? --Redrose64 (talk) 19:14, 16 May 2015 (UTC)
{{WikiProject Trains in Japan}} would be fine, too; they both point to the same thing. WP:RAIL hasn't been informed, as I was just going by the referenced section on Wikipedia:WikiProject Trains in Japan (I'm a member of that project). This is simply implementing instructions that are already part of the project (the instructions specifically state to use {{Trains in Japan}}, which points to {{WikiProject Trains in Japan}}). As this doesn't affect WP:RAIL in the least (since it's a descendant project), there's no reason to notify them. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 22:38, 16 May 2015 (UTC)

As far as I recall, we should avoid the wrapper and move in the other direction. -- Magioladitis (talk) 21:53, 16 May 2015 (UTC)

Why? The wrapper exists for a reason, so why not use it? It reduces the amount needing to be typed when adding the template. Also, for some reason, the Template:WikiProject Train doesn't put things into the correct categories, even though it should when |Japan= is present. I haven't been able to figure out why not. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 22:38, 16 May 2015 (UTC)
Nihonjoe, the latter seems impossible. Maybe Frietje can check this. My guess is that the wrapper was created for a reason, but nowadays, with all these bots running and with the WikiProject standardisation, we may not need to use these wrappers anymore. -- Magioladitis (talk) 22:42, 16 May 2015 (UTC)
Hence, the request to replace all of them with the wrapper. If you (or someone else) can figure out how to get the Template:WikiProject Train with |Japan= to work, then nothing needs to be done. There are other descendant projects where it works fine, and I copied the syntax used in those parts of the template, but it still won't work. Using the wrapper does. Very weird. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 22:55, 16 May 2015 (UTC)
Please give examples of talk pages using {{WikiProject Trains in Japan}} where the categorisation is correct. --Redrose64 (talk) 23:32, 16 May 2015 (UTC)
Any of the pages showing up in the various subcategories here. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 23:19, 17 May 2015 (UTC)
Many of those pages don't use {{WikiProject Trains in Japan}} at all, but instead use {{WikiProject Trains|Japan=yes}} or equivalent, yet the categorisation is the same. This suggests to me that it does not matter whether the wrapper is used or the main template. One thing that I noticed when I first looked at this matter is that every single one of the pages in subcategories of Category:WikiProject Trains in Japan articles by quality has {{WikiProject Japan|trains=yes}} or similar, which was an indication that it is {{WikiProject Japan|trains=yes}} which produced the categorisation that you desired.
I see that a few hours before this thread was raised, you made these changes. This should have the effect of populating subcategories of Category:WikiProject Trains in Japan articles by quality, and again it will not make a difference whether {{WikiProject Trains in Japan}} or {{WikiProject Trains|Japan=yes}} is used. Is there any problem with the categorisation? --Redrose64 (talk) 12:26, 18 May 2015 (UTC)
I was doing things one step at a time, so the only changes were made to the {{WikiProject Trains}} template. I wanted to get it working correctly there before attempting to get it to work on {{WikiProject Japan}}. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 05:36, 19 May 2015 (UTC)
OK, I've looked into this a bit more. If a page has e.g. {{WikiProject Trains |class=c |Japan=yes }} (and no other banners) it is placed in Category:C-Class rail transport articles Category:Unknown-importance rail transport articles Category:C-Class WikiProject Trains in Japan articles, whereas if it has {{WikiProject Japan |class=c |trains=yes }} (and no other banners) it is placed in Category:C-Class Japan-related articles Category:Unknown-importance Japan-related articles Category:WikiProject Japan articles. There is thus nothing basically wrong with {{WikiProject Trains}}; however, code does seem to be lacking from {{WikiProject Japan}}. --Redrose64 (talk) 13:47, 18 May 2015 (UTC)
I've added the code for this to {{WikiProject Japan}} — Martin (MSGJ · talk) 14:14, 18 May 2015 (UTC)
Thanks. That was going to be my next step after getting it to work in {{WikiProject Trains}}. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 05:36, 19 May 2015 (UTC)
{{WikiProject Trains in Japan}} reduces the amount needing to be typed - yes, it saves one character compared to {{WikiProject Trains|Japan=yes}}. --Redrose64 (talk) 23:26, 16 May 2015 (UTC)
I generally just type {{Trains in Japan}} or {{TIJ}}, and the latter saves quite a lot. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 23:22, 17 May 2015 (UTC)
Seeing as there are bots going around bypassing those redirects, I considered it fairer to compare the full template names. I could have suggested {{TWP|Japan=yes}}. --Redrose64 (talk) 12:26, 18 May 2015 (UTC)
Maybe @MSGJ: can help? He really knows templates. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 23:19, 17 May 2015 (UTC)
I really know templates too, particularly the WikiProject banners and especially {{WikiProject Trains}}. --Redrose64 (talk) 12:26, 18 May 2015 (UTC)
Apparently so. In the past, I've always worked with MSGJ when I've run into WikiProject banner issues. Glad to know of a second person who is able to help. Face-smile.svg ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 05:36, 19 May 2015 (UTC)

Nihonjoe We can add |Japan= to the existing banners if you like. --- Magioladitis (talk) 22:08, 16 May 2015 (UTC)

See my comment just above. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 22:38, 16 May 2015 (UTC)
As noted above by several people, it's better practice to use the parent template {{WikiProject Trains}} with the parameter |Japan=yes. If there is a problem with categorisation, I will look into it. I've just replaced one and the categories seem to work just fine ... — Martin (MSGJ · talk) 08:20, 18 May 2015 (UTC)

Not done. ···日本穣? · 投稿 · Talk to Nihonjoe · Join WP Japan! 18:05, 14 June 2015 (UTC)

Language icon templates[edit]

One pattern I see a lot more often, and which IMHO may warrant a bot fix, is a citation template followed by a language icon template. Such a language template could easily and more efficiently be combined into the citation template using its |language= parameter. -- Ohc ¡digame! 04:56, 26 May 2015 (UTC)

@Ohconfucius: BattyBot's task 31 removes the language icon templates from citation templates. I run this about once a month. I have submitted a Reflinks bug report, but I'm not aware of any fixes yet. GoingBatty (talk) 23:22, 29 May 2015 (UTC)
GB: I'm glad you thought of that one. The other case, which I was referring to, is where the language icon is within the <ref></ref> tags and placed specifically after (not within) the {{citation}} template. Perhaps you could submit a bot task to catch those? -- Ohc ¡digame! 05:51, 30 May 2015 (UTC)
@Ohconfucius: Oops! You're referring to edits to change {{xx icon}} to |language=xx like this one. It's easy to change, but not easy to check to see if the template already has a |language= parameter, and then determine what to do (e.g. this edit). GoingBatty (talk) 16:16, 30 May 2015 (UTC)
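For illustration, such a task might do something like the sketch below to each ref. It assumes the icon directly follows the citation and that the citation contains no nested templates (both simplifications), and it skips citations that already carry a |language= parameter, per the concern above:

```python
import re

# Illustrative sketch only: assumes {{xx icon}} directly follows the citation
# and that the citation body contains no nested templates (hence [^{}]*).
ICON_RE = re.compile(r"(\{\{[Cc]ite [^{}]*)(\}\})\s*\{\{([a-z]{2,3}) icon\}\}")

def merge_language_icon(wikitext):
    def repl(m):
        body, close, code = m.groups()
        if re.search(r"\|\s*language\s*=", body):
            return m.group(0)  # existing |language=: leave for a human
        return "%s|language=%s%s" % (body, code, close)
    return ICON_RE.sub(repl, wikitext)
```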

Change MSDS to SDS[edit]

The Globally Harmonized System of Classification and Labelling of Chemicals (GHS) is a harmonization scheme that many countries have adopted, including nearly all major English-speaking countries. Part of the GHS is the Safety Data Sheet (SDS), which is a standardized version of what was formerly known as a "Material Safety Data Sheet" (MSDS) in many countries. The page "material safety data sheet" has been moved to safety data sheet and I have requested a change be made to Template:Chembox & Template:Chembox Hazards. If you need more details, see Talk:Safety data sheet#Move to Safety Data Sheet and Wikipedia talk:Chemical infobox#Change MSDS to SDS.

Many chemicals have a page titled "[name of chemical] (data page)" which has a section titled "Material safety data sheet" and a link to the old page title Material safety data sheet. For example: Methanol (data page)#Material Safety Data Sheet (I didn't change that page so it could be an example for this discussion). I am asking for the help of a bot to:

  • Change links to "Material safety data sheet" to "safety data sheet" (it's not the existence of a redirect link that is of concern, but the text displayed for the link)
  • Change the section titles on pages titled "[name of chemical] (data page)" from "Material safety data sheet" to "Safety data sheet"

I have started a discussion at Wikipedia talk:Chemical infobox#Change MSDS to SDS concerning changes in the chemical infoboxes. At the time of making this request, there was no response in that discussion about whether a parameter in the infoboxes should be changed from "ExternalMSDS" to "ExternalSDS". Another editor made these changes to one of the infobox templates. I'm not familiar with template coding, so someone else will need to determine and coordinate any changes to the template parameters.

Also note that there are some links to the old page title that appear in the "What links here" (with transclusions and redirects hidden), but I cannot find the link in the article. For example Carbon monoxide is on the "What links here" (with transclusions and redirects hidden) for material safety data sheet, but when I click the edit tab and use the find functionality in my browser, I cannot find the link to material safety data sheet. I'm not sure why. On a final note, I'll be around for a few hours after making this request, but won't be around for the next couple of days. AHeneen (talk) 01:32, 29 May 2015 (UTC)
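The two requested edits could be sketched as simple substitutions along the lines below. Piped links (where the displayed text is already custom) are deliberately left alone, since only unpiped links display the old title; heading spacing and capitalisation variants here are assumptions:

```python
import re

# Illustrative sketch only of the two edits requested above.
def msds_to_sds(wikitext):
    # 1. Retitle the section on "(data page)" articles.
    wikitext = re.sub(
        r"(==+ *)Material [Ss]afety [Dd]ata [Ss]heet( *==+)",
        r"\1Safety data sheet\2",
        wikitext,
    )
    # 2. Re-point unpiped links so the displayed text changes too.
    wikitext = re.sub(
        r"\[\[[Mm]aterial safety data sheet\]\]",
        "[[safety data sheet]]",
        wikitext,
    )
    return wikitext
```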

@AHeneen: Do you mean Carbon monoxide, or do you mean Carbon monoxide (data page)? Only the latter appears in Special:WhatLinksHere/Material safety data sheet; and the link is in the "Material safety data sheet" section. --Redrose64 (talk) 09:01, 29 May 2015 (UTC)
I'll rephrase. One request is: on the page Carbon monoxide (data page), change the section title "Material safety data sheet" (if it exists) to "Safety data sheet" (uc/lc this way). There are some 155 such (data page)s, listed (indirectly, by parent page) in Category:Chemical articles having a data page. -DePiep (talk) 10:42, 29 May 2015 (UTC)
This is a small number that can be rapidly done by hand, or with AWB. Since the header is also linked to from the template, I suggest the same person change both in the same session. All the best: Rich Farmbrough, 23:51, 16 July 2015 (UTC).

The Most Beautiful Women of All Time[edit]

Can we possibly take these rankings and automatically place them onto the respective Wikipedia pages? The Most Beautiful Women of All Time, e.g., the Audrey Hepburn page mentions that she is ranked #4? -- (talk) 17:38, 30 May 2015 (UTC)


Needs wider discussion

I'd say (a) is it really notable information if it's constantly changing like that, and (b) the easier way would be to create a template ({{Audrey Hepburn}}, say) that responds with "4" or "fourth" (although those tend to be deleted eventually). -- Ricky81682 (talk) 05:49, 9 July 2015 (UTC)

Remove persondata[edit]

Persondata has been deprecated by this RfC. A bot is needed, please, to remove it from all articles. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:31, 1 June 2015 (UTC)
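Mechanically, stripping the template is straightforward; whether a bot should do it at all is the subject of the discussion below. A sketch that removes a single {{Persondata}} block, counting brace pairs because the template body may itself contain nested templates:

```python
# Illustrative sketch only: removes one {{Persondata ...}} block by counting
# brace pairs, since field values may themselves contain templates.
def strip_persondata(wikitext):
    lower = wikitext.lower()
    start = lower.find("{{persondata")
    if start == -1:
        return wikitext  # nothing to do
    depth, i = 0, start
    while i < len(wikitext) - 1:
        pair = wikitext[i:i + 2]
        if pair == "{{":
            depth += 1
            i += 2
        elif pair == "}}":
            depth -= 1
            i += 2
            if depth == 0:
                break
        else:
            i += 1
    if depth != 0:
        return wikitext  # unbalanced braces: leave the page untouched
    return wikitext[:start] + wikitext[i:]
```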

Pigsonthewing, it's not easy and not that immediate. I supported the deprecation, but we first have to check that all info is transferred to Wikidata before we start removing. We can stop adding, but not start removing. -- Magioladitis (talk) 12:33, 1 June 2015 (UTC)
This has been done already; and the RfC closed with "Consensus is to deprecate and remove" (emphasis added). Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:50, 1 June 2015 (UTC)
Gerda has suggested on my talk page that an infobox ({{Infobox person}}, presumably) with the data from PD be compiled by the bot and deposited on each article's talk page to facilitate the migration of the (meta)data. Alakzi (talk) 15:06, 1 June 2015 (UTC)
That would work, and {{Infobox person}} would be the best box. Perhaps it could be made to pull in data from Wikidata, first? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:20, 1 June 2015 (UTC)
Yes, that was my suggestion too, considering that one of the reasons PD has been deprecated is its unreliability. Alakzi (talk) 15:30, 1 June 2015 (UTC)
More specific: for articles with no infobox. --Gerda Arendt (talk) 15:27, 1 June 2015 (UTC)
This discussion should not be done here. Here it's a place to discuss how to implement tasks that have consensus. Please continue this discussion in template's talk page, Wikidata's talk or the Village pump. Thanks, Magioladitis (talk) 15:45, 1 June 2015 (UTC)
Please don't attempt to dictate to other users where to have discussions. The tasks "deprecate and remove" both already have consensus. The bot request page is exactly the right page to discuss how to implement the removal of material by bot. However I must point out that we cannot systematically (e.g. by bot) add infoboxes to articles. That is contrary to an ArbCom decision and would require an amendment - which I very much doubt would pass. --RexxS (talk) 11:00, 2 June 2015 (UTC)
The suggestion was to place the infobox on the talk page. Alternatively, we could create a userspace or a WikiProject page with all of the replacement infoboxes. Alakzi (talk) 11:31, 2 June 2015 (UTC)
I would strongly recommend that such a list/table be compiled on a separate project page. Throwing an infobox onto the talk page of articles that have rejected infoboxes by consensus with militant opposition will not be viewed kindly. Resolute 20:32, 2 June 2015 (UTC)
It doesn't look like it's going to happen either way. Alakzi (talk) 20:34, 2 June 2015 (UTC)
@RexxS: I think your memory has let you down. Arbcom haven't made such a ruling; an RfC did, which can be overturned by the community. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:52, 2 June 2015 (UTC)
Well, in that ArbCom case, Gerda was sanctioned for adding "infoboxes to many articles systematically", and – daft as I certainly am – I wouldn't want to see any other good-faith editors treated in that way. --RexxS (talk) 14:16, 2 June 2015 (UTC)
(I would like to see evidence that I "systematically added infoboxes", which I don't remember having done, but never mind.) Let me know what is wrong in placing on the talk page of an article two things: a copy of the PD as it has been removed, and a suggested infobox with the data from Wikidata? Seeing eyes of people could decide what to do with it, with the option to ignore or archive, but also to check for differences and update. (Btw, we live in a different time now, - a former arbitrator added an infobox to Beethoven, by community consensus, and its parameters are amicably discussed on the talk.) --Gerda Arendt (talk) 08:27, 3 June 2015 (UTC)
I understand the issues that persondata has presented, and I have followed the deprecation/removal RfC with interest. Before we start removing the existing persondata, however, I have a request: we need an alternative for embedding the multiple forms of maiden and married names of our female article subjects. Persondata has become the default location to bury the multiple married names of many of our female bio subjects, and we should not just delete that information when we remove the persondata templates. When a subject has a single maiden name and a single married name, then it is easy enough to deal with that data in the first sentence of the lead. When a female subject has multiple married names -- sometimes as a result of multiple marriages, compound name forms, and nicknames in combination with husbands' names -- there is no elegant way to present that data. I suppose the next best alternative is to create redirects for each married name variation. In any event, care should be taken that some alternative is implemented before the persondata templates are removed and that married name data is lost. Thanks. Dirtlawyer1 (talk) 16:06, 1 June 2015 (UTC)
The alternative is Wikidata. An RfC has determined that there is, now, a clear consensus to remove Persondata from this project. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 08:14, 2 June 2015 (UTC)
Fair enough, Andy. How and when do you propose to migrate that married name data from the persondata templates to Wikidata? Dirtlawyer1 (talk) 08:39, 2 June 2015 (UTC)
I suggest you read the lengthy and detailed discussion of data import under the RfC (though you did say you'd followed it?); and on the pages linked from there, on Wikidata. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 08:51, 2 June 2015 (UTC)
Andy, I see a five-point plan that was included in the RfC introduction, to wit:
"1. Transfer |SHORT DESCRIPTION= across to Wikidata. Yes check.svg Done
"2. Stop bots from automatically creating new Persondata templates. Yes check.svg Done
"3. Notify users in relevant projects of deprecation.
"4. Transfer any new data to Wikidata, then remove methodically.
"5. lose the template and relevant pages when ready."
Only one item on that five-point checklist has actually been completed. I see that items No. 2 through No. 4 have not been done, and most notably, there is no plan presented here to "transfer any new data to Wikidata, then remove methodically." Unless I'm missing something this bot request to remove all persondata templates immediately is premature by the express terms of the RfC -- there has been no transfer of "any new data to Wikidata," which was an express condition to removal. Dirtlawyer1 (talk) 14:58, 2 June 2015 (UTC)
As of this commit, the helper script at Articles for Creation no longer adds Persondata. (And as soon as Theopolisme gets around to pushing the new version.) APerson (talk!) 16:26, 2 June 2015 (UTC)
You're misunderstanding how a WP:Request for Comment works. Somebody puts up a suggestion or question or plan, and other people discuss it, sometimes suggesting other options, other times giving reasons for supporting or opposing the suggestion. At the end of the process, somebody uninvolved will summarise the discussion and close the debate. It's not the proposal-vote scheme that you seem to be trying to cast it as. It became apparent during the debate that Wikidata had no interest in accepting any more data from Persondata methodically, because the data left in Persondata wasn't of an acceptable quality. So the draft plan is no longer feasible in the way it was written at the start of the RfC. That's ok; it happens in an RfC, and the discussion continued nevertheless. We're not going to be systematically transferring any more data to Wikidata from Persondata. Period. You mistook a suggestion on how we might want to proceed for a condition (as if you could have an "express condition" for an RfC). The bots are being stopped as I write; and if you're worried that any projects are unaware of the deprecation, then it's open to you to inform those projects if you wish. --RexxS (talk) 20:13, 2 June 2015 (UTC)
Rexx, I've been on Wikipedia for about 18 months less than you, and I have a pretty good idea how an RfC works. Unfortunately, RfCs are nothing like a legally defined document, and closing administrators overlook elements of the discussion all the time. I have asked the closing administrator to clarify whether his close contemplated the discarding of useful persondata information, e.g., variations on female married names before it was transferred to Wikidata. If so, I will defer to the closing admin since there's not a lot of recourse, but I will say that you and Andy sure seem determined to implement this deletion with no delays for any reason, regardless of what information is discarded in the process. Dirtlawyer1 (talk) 20:41, 2 June 2015 (UTC)
It's quite possible to have been on Wikipedia for thirteen years and still not understand many things - we just desysopped an editor who had been an admin for over ten years who didn't understand basic policies - but you shouldn't take it to heart when you get hold of the wrong end of the stick. Nevertheless, I think you've grasped the issue now - RfCs are indeed not legal contracts and we don't do process here in the way that I guess you're used to at work. We do make changes during the course of the discussion and I'm sure now that you've had a chance to read the discussion, you'll have spotted Periglio's contributions. I expect you've seen that the only person who commented and had been involved in transferring information from Persondata to Wikidata had become firmly convinced that there was nothing of value left in Persondata to transfer. Who do you expect to transfer those variations on female married names to Wikidata? That's a decision for the bot operators on Wikidata and as far as I can see, that's not going to happen because they don't believe any fields beyond the short 'description' are accurate enough. Finally, can I ask you to think again when you accuse me of being "determined to implement this deletion with no delays for any reason", when you can read below my earlier comment requesting that the removal waits until we've ensured that systematic addition has stopped? I'm pretty sure that qualifies as a delay (for a good reason), doesn't it? --RexxS (talk) 01:25, 3 June 2015 (UTC)
Sure, and it's quite possible to have been on Wikipedia for seven years and not understand the details of our content attribution policy, even while attempting to quote it to other editors in a dispute, and not be aware that there are multiple ways to satisfy that policy because the editor in question didn't bother to actually read it. One shouldn't take that to heart, either.
As for the problem at hand, I'll take personal responsibility for manually transferring the persondata name variants for about 2000 bios that I entered personally and which I have a very high degree of confidence in its accuracy. In fact, I've already started. As far as I can tell, it's not limited to women's married names, either; there has been no attempt to import full names, as opposed to article title COMMONNAMEs, of males or females. My cursory review of Wikidata profiles that I had not previously edited shows that very little data, if any, has been transferred from persondata to Wikidata, other than the sometimes eccentric "brief descriptions". There appears to be a lot of persondata still to be mined, but there is clearly very little interest in doing so. That's just sad; the whole rush to delete it ASAP, and the complete lack of interest in migrating anything else, smacks of a "not invented here" mentality. So be it. I can't personally fix the Wikidata for a million bios. It would, however, be a sensible and considerate gesture to provide several weeks of wide-ranging advance notice to the hundreds of editors who entered persondata in good faith over the last five or six years, so they could take personal responsibility for migrating it to Wikidata. As far as I can tell, there is no particular reason for rushing the deletion of persondata templates today, tomorrow, next week, or even next month. Dirtlawyer1 (talk) 02:25, 3 June 2015 (UTC)
As one editor who has worked on hundreds of biographical articles, I am not taking any personal responsibility for migrating anything to Wikidata whatsoever. I am not part of the Wikidata Project. The persondata templates have been deprecated, so I am no longer adding them to new articles. What the Wikidata people want to do is up to them. Hawkeye7 (talk) 22:01, 14 June 2015 (UTC)
I remove persondata when data is correct on Wikidata. SLBedit (talk) 22:58, 14 June 2015 (UTC)
I also suggest that we should wait until bots that add Persondata are changed (e.g. AWB general fixes) before we unleash bots to remove Persondata. GoingBatty (talk) 16:33, 1 June 2015 (UTC)
Approval can be granted, here, with that caveat. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 08:14, 2 June 2015 (UTC)
I believe it is important to ensure that no further systematic additions of Persondata are happening before removal begins. That would enable us to establish a date when all of the Persondata is still in place. Remembering that no content is actually being deleted, we would then have a date that a bot could use at any point in the future to revisit an article's history and extract the Persondata. I agree that the caveat suggested by GoingBatty and Andy is the right way to proceed. --RexxS (talk) 11:12, 2 June 2015 (UTC)
It is kind of too late: what made me think of this is that PD was already removed from an article without an infobox, and without the data being kept anywhere: Statz Friedrich von Fullen, on the Main page right now. --Gerda Arendt (talk) 11:49, 2 June 2015 (UTC)
Wikipedia:Bots/Requests for approval/Yobot 24. -- Magioladitis (talk) 15:01, 2 June 2015 (UTC)
AWB bots will stop adding Persondata after rev 11039. -- Magioladitis (talk) 15:18, 2 June 2015 (UTC)
@Magioladitis: Thank you for updating AWB. I have taken that update, and other bot operators have done so as well. However, since some AWB bot operators may never take an SVN update, I suggest that this bot task wait until a new version of AWB is released and the current version is obsoleted. Thanks! GoingBatty (talk) 00:02, 5 June 2015 (UTC)
  • Removal of persondata may be sanctioned by RfC; that doesn't necessarily mean it is a bot task. I contend it is not, per the WP:COSMETICBOT policy. Cosmetic changes that only make a difference in edit mode (and make no difference in read mode), like removeUselessSpaces, are specifically discouraged from being performed by bots unless there is "...a substantial change to make at the same time".
Also, the WP:CONTEXTBOT policy comes into play, the context "that would normally require human attention" being the need to check, on an article-by-article basis, which metadata are valid to be transferred elsewhere, and/or which metadata are incorrect or superfluous. That the RfC fairly unanimously cast doubt on the PD metadata doesn't mean that holds for each of the thousands of affected articles, or that it shouldn't be checked on an article-by-article basis (especially as the metadata as such aren't causing trouble in a cosmetic sense).
I explained in a bit more detail, as some of the participants in this discussion may be less accustomed to bot specifics. Otherwise a "No, per WP:COSMETICBOT and WP:CONTEXTBOT" should suffice. --Francis Schonken (talk) 06:46, 3 June 2015 (UTC)
    • As I have already said on the related Yobot page, if the addition of persondata by a bot was not a breach of COSMETICBOT, then neither should be its removal. CONTEXTBOT is a red herring; not relevant here. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 08:45, 3 June 2015 (UTC)
Re. "if the addition of persondata by a bot was not a breach of COSMETICBOT, then neither should be its removal" – I fail to see the relevance of that WP:OTHERSTUFFEXISTS remark. Not all persondata was added by bots, and even if it were, the remark has no relevance. Bots don't get automatic approval of tasks "extremely similar" to tasks they have already performed, let alone opposite tasks. --Francis Schonken (talk) 17:44, 3 June 2015 (UTC)
  • Strong oppose to the removal of persondata without some means to move that metadata to another medium, tag it as "unreviewed", and then instigate some process (on any timescale we can achieve, no matter how long) to move it from unreviewed to a reviewed, trusted [sic] form within the replacement storage (Wikidata or whatever).
    A simple blanket deletion with no conservation is ridiculous. Andy Dingley (talk) 20:03, 3 June 2015 (UTC)
    • We've already had an RfC. Its conclusion was to "Deprecate and remove" persondata. If you feel the closure was an inappropriate summary of the discussion, WP:AN is your venue, not here. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:08, 3 June 2015 (UTC)

DPL-bot like approach[edit]

  • Seeing this recent edit, I was thinking about the nice bot-generated remarks I get on my talk page every once in a while:
==Disambiguation link notification for <date>==
Hi. Thank you for your recent edits. Wikipedia appreciates your help. We noticed though that when you edited <article(s)>, you added a link pointing to the disambiguation page <dab page> ([<check to confirm>] | [<fix with Dab solver>]). Such links are almost always unintended, since a disambiguation page is merely a list of "Did you mean..." article titles. Read the <FAQ> • Join us at the <DPL WikiProject>.
It's OK to remove this message. Also, to stop receiving these messages, follow these <opt-out instructions>. Thanks, <bot sig>
I was thinking about a bot doing something similar:
==Persondata notification for <date>==
Hi. Thank you for your recent edits. Wikipedia appreciates your help. We noticed though that when you edited <article(s)>, you added metadata in the deprecated Persondata format instead of using the current Wikidata system ([<check to confirm>] | [<handy link for how to transfer these metadata to Wikidata>]). Read the <FAQ> • Join us at the <Wikidata Project>.
It's OK to remove this message. Also, to stop receiving these messages, follow these <opt-out instructions>. Thanks, <bot sig>
--Francis Schonken (talk) 17:44, 3 June 2015 (UTC)

How to transfer Persondata to Wikidata[edit]

  • @Francis Schonken: Click on the "edit" button in the upper right of the Wikidata profile page, and that opens all of the fields in the top box of the profile page, including name variants and brief descriptions. The defined fields below the top box are edited individually. Dirtlawyer1 (talk) 19:26, 3 June 2015 (UTC)
Tx. I was rather thinking about a how-to guide for transferring metadata (ending with something like "...after you have transferred the metadata delete the persondata template from the article"). Is there no how-to page that explains this? Note for example that for me the Wikidata page opens in Dutch, and there are various other differences from my usual wiki-editing. What is metadata? What goal does it serve? What is VIAF and numerous other abbreviations used on that page? Why is it a good idea to put time & effort in figuring all that out? In other words: current user-friendliness leaves a lot to be desired. Is there no tool that can help me pick the Persondata I want to see transferred to Wikidata, and transfer them with a few clicks (the equivalent of dabsolver for links to disambiguation pages, see what I proposed above)? --Francis Schonken (talk) 19:45, 3 June 2015 (UTC)
@Francis Schonken: Your best bet for particular instructions is User:Jared Preston. He is an English language editor who is an administrator on Wikidata. He has turned up in several related discussions today, and has been very helpful. I already have an open discussion thread on his talk page. Dirtlawyer1 (talk) 20:09, 3 June 2015 (UTC)
@Jared Preston: is it possible to provide a how to guide for how to transfer Persondata to Wikidata (probably a page in Help: namespace)? Tx. --Francis Schonken (talk) 20:27, 3 June 2015 (UTC)
P.S. If you want to organize a Wikipedia-wide bot message to be left on every appropriate discussion board, I would be happy to help prepare some "instructions," and I'm sure Jared Preston would help, too. If we're going to make a meaningful effort to transfer accurate persondata to Wikidata, we need ready, willing and able bodies to help. There are 1,230,630 current uses of Template:Persondata. I've manually transferred the persondata of about 75 articles in the last 24 hours, and I have another 2,000 or so on my personal watch lists to deal with. Dirtlawyer1 (talk) 20:14, 3 June 2015 (UTC)
@JaGa: See my proposal above #DPL-bot like approach - pinging the DPL bot operator to check whether something similar could be implemented for editors updating Persondata. Just to get some input on feasibility? Tx. --Francis Schonken (talk) 20:27, 3 June 2015 (UTC)

I'm afraid there isn't anything I can do to help here, and as per the RfC, it appears that some (or most?) of the information provided by persondata cannot be copied to Wikidata by bots due to the formatting. I can understand that to some extent too, seeing as persondata has an array of differences from article to article. All I can add to this conversation is that the majority of articles I edit regularly have all the information from the whole article on the corresponding Wikidata item, and not just name, dob/pob and a description. Over the years, I've always added a persondata box when creating a new article, but now that I use Wikidata, I feel the information is much better preserved there, and, for those who need to extract the data, far more useful (in any given language). I'm sorry I can't offer any more help, maybe Mr. Mabbett could, if he was interested. Jared Preston (talk) 21:08, 3 June 2015 (UTC)

I'm extremely interested, and intimately involved in the issue, and I endorse both the RfC closure and the bot approval request. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:14, 3 June 2015 (UTC)

@Jared Preston: So it is not "possible" to write a Help: namespace page that explains the steps for non-bot editors how to use Wikidata instead of Persondata? --Francis Schonken (talk) 21:23, 3 June 2015 (UTC)

  • "'… advised to manually transfer that data …' – That's the most pointless/impractical advice I've read in a long time." ([1]) – this is what I'm talking about: write a practical guide on how to manually transfer the data... --Francis Schonken (talk) 04:44, 4 June 2015 (UTC)

RfC: Remove persondata practical steps[edit]

Bots can't/won't move metadata included in the persondata system to wikidata (apart from a limited set already transferred), which led to the deprecation of the persondata system, and the agreement to remove all {{persondata}} templates and their contents from the English Wikipedia.

Here are the practical steps proposed for a persondata to wikidata migration:

  1. Write a practical guide in Help: namespace on how to manually transfer persondata to wikidata;
  2. Invite a (DPL-like) bot to produce user talk page messages in this vein:
==Persondata notification for <date>==
Hi. Thank you for your recent edits. Wikipedia appreciates your help. We noticed though that when you edited <article(s)>, you added metadata in the deprecated Persondata format instead of using the current Wikidata system ([<check to confirm>] | [<link to practical guide mentioned in previous point>]). Read the <FAQ> • Join us at the <Wikidata Project>.
It's OK to remove this message. Also, to stop receiving these messages, follow these <opt-out instructions>. Thanks, <bot sig>

Tx for considering this proposal. --Francis Schonken (talk) 05:25, 4 June 2015 (UTC)
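The notification logic in step 2 could be prototyped as a pure check on an edit's before/after wikitext, independent of any bot framework. A minimal sketch in Python; the function names, the regex, and the message wording are illustrative assumptions, not part of DPL bot or of any approved task:

```python
import re

# Matches an opening {{Persondata ...}} template (case-insensitive).
PERSONDATA_RE = re.compile(r"\{\{\s*persondata\b", re.IGNORECASE)

def added_persondata(old_text: str, new_text: str) -> bool:
    """True if this edit introduced a {{Persondata}} template that was
    not present in the previous revision of the page."""
    return (PERSONDATA_RE.search(new_text) is not None
            and PERSONDATA_RE.search(old_text) is None)

def notification(date: str, articles: str) -> str:
    """Render a talk-page message along the lines proposed above."""
    return ("==Persondata notification for " + date + "==\n"
            "Hi. Thank you for your recent edits. We noticed that when you "
            "edited " + articles + ", you added metadata in the deprecated "
            "Persondata format instead of using Wikidata.")
```

A real bot would run this check against each diff in the recent-changes feed and skip users who have opted out, as DPL bot does.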

  • Oppose We already have consensus to remove (thus not to maintain) persondata; the migration has been completed in as much as is practical and sensible, as was discussed at length during the RfC (and on Wikidata). With a change involving one and a quarter million articles, there are always going to be a few edge cases: they are statistically insignificant, and the large gains outweigh the trivial losses. Trying to nudge people into making manual changes for what must, due to volume, be an automated task is both futile and asinine. Besides, as has been pointed out already, the data is still in article histories should anyone need to retrieve it. This is also off-topic for this page. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 07:47, 5 June 2015 (UTC)
    • There is no consensus to remove persondata by bot. The RfC didn't request a bot operation, nor did the closing admin grant one. This is exactly the spot to discuss whether and how an RfC result could be translated into a bot task.
    So I proposed a bot task analogous to the task currently performed by DPL bot. The task proposal deals with several issues raised at the Yobot BRFA.
    Re. "...what must, due to volume, an automated task" – I'm unimpressed: the volume of non-bot editing provoked by DPL bot is in the same order of magnitude, if not many times larger.
    Overall judgements on the value of remaining untransferred persondata are not warranted by the RfC outcome. Indeed, I see bot operators refusing to transfer more persondata to wikidata for essentially WP:CONTEXTBOT reasons. In good Wikipedia tradition (analogy: DPL bot), when a task can't be performed by a bot for that reason, leave human editors the opportunity to perform such a task based on case-by-case judgement calls, and let an appropriate bot support them with invitations etc. That's what I see reflected in the "practicable workarounds" for "good faith objections" noted by the closer of the RfC. --Francis Schonken (talk) 11:30, 5 June 2015 (UTC)
  • Comment Using a bot to deliver a message of this sort is unlikely in itself to cause any harm and may do some good. However, making this an open-ended process without time limit essentially nullifies the result of the RfC that agreed Persondata be deprecated and removed. I support anything that encourages Wikipedia editors to contribute to Wikidata (it needs well-sourced data), as long as it's not just a cynical attempt to do an end-run around community consensus here. If this proposal had a time-limit within the next month or so, I'd cheerfully support, but as it stands, I can't. --RexxS (talk) 19:17, 5 June 2015 (UTC)
  • @RexxS: How about a hard deadline of sixty (60) days from the date the bot messages are delivered to editors? I would also suggest picking off the low-hanging fruit pursuant to GoingBatty's suggestion below, so that editors who don't have a full plate already may focus on those remaining uses of persondata that may have data that may be productively transferred to Wikidata. Dirtlawyer1 (talk) 02:27, 6 June 2015 (UTC)
  • Re. timeframe: whatever timeframe that has consensus would work for me. I see several options:
    1. The RfC didn't conclude a timeframe, so having no timeframe is not a transgression of the RfC outcome, so that's a first possibility. After all editors might not be aware of the persondata deprecation yet, and introduce persondata in articles tomorrow or the day after tomorrow: in that case keeping the message bot active ad infinitum seems like a good idea.
    2. As far as I'm concerned the bot-generated user talk page message could end on something like "... In three days any remaining persondata will be removed from the aforementioned <article(s)>"
    3. Then after half a year or so an evaluation can be made whether there's an acceptable rate of persondata removal, and if needed other measures to speed up could be applied (e.g. removing all persondata that were introduced by bot and weren't updated after that).
    4. Also combinations of systems are possible, e.g. combination with the system proposed in the next section, and/or the one- or two-month time limits proposed above; as said, any time frame that has consensus works for me. --Francis Schonken (talk) 08:17, 6 June 2015 (UTC)

Another proposal[edit]

What if the bot was to start deleting only those {{Persondata}} templates where the values within the template are duplicated in categories or infobox parameters? For those who want to get rid of every template now, this would get us started. For those who want to transfer valuable information to Wikidata, this would help you focus on those articles where the information may only exist within Persondata. Maybe logic like this (much of which is similar to how AWB populated the template):

  • If the template contains no parameters, delete the template. (I've done this manually, but more might pop up.)
  • If the template contains no values in any parameters, delete the template. (I've done this manually, but more might pop up.)
  • If the only parameter populated is NAME, delete the template. (There are about 1,000 of these.)
  • Otherwise, only delete the template if all of the following are true
    • ALTERNATIVE NAMES is blank or matches the birth name field in the infobox
    • SHORT DESCRIPTION is blank or matches a category (e.g. |SHORT DESCRIPTION=Italian painter and Category:Italian painters)
    • DATE OF BIRTH/DEATH are blank, or only contain a year and the article contains the birth/death year category with the same year, or matches the birth/death field from the infobox or a template such as {{birth date and age}}
    • PLACE OF BIRTH/DEATH are blank or matches the same field from the infobox

Comments/suggestions are welcome. GoingBatty (talk) 01:36, 6 June 2015 (UTC)
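The first three bullets above are mechanical and easy to encode; the fourth (checking for duplication against infobox fields and categories) needs real wikitext parsing, so the Python sketch below covers only the simple cases. The function and its dict-of-parameters input are assumptions for illustration, not AWB's actual logic:

```python
def safe_to_delete(params: dict) -> bool:
    """Apply the uncontroversial rules: delete {{Persondata}} when it is
    empty, has only blank values, or fills nothing beyond NAME.  The
    duplicate-checking rules (comparing against the infobox and the
    article's categories) are deliberately not sketched here."""
    # Rule 1: no parameters at all.
    if not params:
        return True
    # Rule 2: parameters present but every value blank.
    non_blank = {k: v for k, v in params.items() if v and v.strip()}
    if not non_blank:
        return True
    # Rule 3: the only populated parameter is NAME.
    if set(non_blank) == {"NAME"}:
        return True
    return False
```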

@GoingBatty: By and large, that's a very constructive suggestion. If the basic idea is to eliminate all of the persondata templates in which there is no usable, non-duplicate data, I would support that. It would permit us to eliminate the useless, and focus on the potentially useful data still to be transferred to Wikidata.
Having manually transferred non-duplicate data from persondata to Wikidata for approximately 240 articles in the last two days, I think I have a pretty good idea what data remains to be harvested/transferred, and what does not. I would make the following modifications of your proposal, only deleting the template if all of the following are true:
  1. ALTERNATIVE NAMES is blank or matches those listed in the "Also known as" field of the article's Wikidata profile (infobox is not really relevant for present purposes);
  2. SHORT DESCRIPTION is blank or matches that listed in the "Description" field of article's Wikidata profile (categories are a different system, and not really relevant for present purposes);
  3. DATE OF BIRTH and DATE OF DEATH are blank or match the "date of birth" and "date of death" parameters in the article's Wikidata profile (again, infobox data is not really relevant for present purposes); and
  4. PLACE OF BIRTH and PLACE OF DEATH are blank or match the "place of birth" and "place of death" parameters in the article's Wikidata profile (again, infobox data is not really relevant for present purposes).
I think we share the same goal of making sure as much non-duplicate and accurate information from persondata is transferred to Wikidata as possible. I think my ideas tighten yours, and would permit us to substantially narrow the field from the presently daunting 1,230,000 current uses of Template:Persondata. Dirtlawyer1 (talk) 02:21, 6 June 2015 (UTC)
@Dirtlawyer1: I'm not a Wikidatian (is that a word?), but I do want to respect your work. Could you please help me understand why copying information from Persondata is more valuable than copying the same information from Wikipedia categories or infoboxes? Thanks! GoingBatty (talk) 02:57, 6 June 2015 (UTC)
@GoingBatty: Nor am I, sir. I'm a Wikipedia editor whose primary concern is that we don't waste 5 to 6 years worth of efforts by our fellow editors. In answer to your question, for the simple reason that such information is, by and large, not being copied from categories to Wikidata. I can point to dozens of examples that I have touched in the last 24 hours where places of birth and death, short descriptions, full names and other name variants have simply not been transferred from the infobox or persondata, nor have basic Wikidata fields like "country of citizenship," etc., been filled with data from the categories. Dirtlawyer1 (talk) 03:05, 6 June 2015 (UTC)
@Dirtlawyer1: Is there any reason why editors could not copy data from the Wikipedia infobox and categories to the appropriate property value in Wikidata? GoingBatty (talk) 17:44, 6 June 2015 (UTC)
None whatsoever, GB. But the question at hand is what are we going to do with existing persondata, not whether some editor in the future might copy similar information from the article's infobox or categories. Dirtlawyer1 (talk) 19:10, 6 June 2015 (UTC)
@Dirtlawyer1: I agree that "the question at hand is what are we going to do with existing persondata". My thought is that it is safe to delete Persondata if each of its values is also contained elsewhere in the Wikipedia article. Thanks! GoingBatty (talk) 19:38, 6 June 2015 (UTC)
My goal is to get as much of the metadata into Wikidata as possible. A lot of editors invested a lot of time to create that metadata -- which often does not correspond directly to a Wikipedia category or infobox parameter -- and I believe we should be endeavoring to get the accurate data into Wikidata, rather than simply writing it off. We should be treating this as a golden opportunity to build out Wikidata, not simply draw a line under the old system, delete it, and move on. Personally, I've done more work on Wikidata in the last two days than I have in the last two years. We should be encouraging more of that buy-in, familiarization, and build-out. Dirtlawyer1 (talk) 19:48, 6 June 2015 (UTC)
@Dirtlawyer1: Although populating Wikidata is definitely a worthwhile activity, you reminded me that "the question at hand is what are we going to do with existing persondata". I believe that my proposal answers that question and does not prevent anyone from populating Wikidata. GoingBatty (talk) 20:16, 6 June 2015 (UTC)
The data hidden in persondata is not available to our readers, nor is it apparently being used by anyone elsewhere. Any "waste of 5 to 6 years worth of efforts by our fellow editors" won't come about because of the deletion of persondata from this Wikipedia. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:49, 6 June 2015 (UTC)
"I learned long ago, never to wrestle with a pig. You get dirty, and besides, the pig likes it." -- George Bernard Shaw

Temporarily exclude pre-1924 births from persondata delete warnings[edit]

Above, there are proposals to use bots to place notices in articles that the persondata template will soon be deleted, and encourage editors to move the data to Wikidata. Unfortunately, Wikidata is not prepared to receive pre-1924 dates of any kind, such as birth or death dates. This is because the last country to switch from the Julian calendar to the Gregorian calendar was Greece, and the first full year under the Gregorian calendar in Greece was 1924.

The Wikidata user interface is seriously broken with respect to Julian dates, and contains a large number of dates that are labeled as Gregorian but are really Julian. In addition, the format used to represent dates internally and in JSON is under discussion, with no final decision. An additional problem for dates before the year AD 1 is that some dates are stored with the convention year -1 = 1 BC; others are stored with the convention year 0 = 1 BC.

Until Wikidata is capable of storing older dates I suggest that nothing be done to encourage editors to move pre-1924 information to Wikidata. Jc3s5h (talk) 12:40, 6 June 2015 (UTC)

@Jc3s5h: That's kind of a serious flaw in the metadata system that we've been billing as the definitive replacement for persondata, don't you think? Is there a timeline for a fix? Should we be encouraging people to delete persondata from 1924 and before, if such data cannot be accurately transferred to Wikidata at this time? Is the Julian/Gregorian dates problem restricted to Greece? If not, what are the cut-off dates for other countries? I've manually transferred persondata to Wikidata for over 240 bios in the last three days, but all of them were 20th Century or late 19th Century Americans, Australians, Canadians and Britons -- presumably there is no problem with any of those dates and countries, and I may continue transferring such persondata to Wikidata without negative consequences, right? Dirtlawyer1 (talk) 13:30, 6 June 2015 (UTC)
I don't really understand Wikidata development schedules, but considering that some aspects of the Wikidata date problems have been known since the autumn of 2014, and discussion is still unresolved about the direction for a fix, I would guess a year or more. The cut-off date for Britain and British colonies was 14 September 1752. Other countries are described at Adoption of the Gregorian calendar. Some of today's European countries were, in the 14th to 18th centuries, divided into provinces, kingdoms, principalities, etc., some of which were Catholic and some of which were Protestant. So it may require a lot of historical research to determine if a European date was Julian or Gregorian. Jc3s5h (talk) 13:48, 6 June 2015 (UTC)
If I'm understanding this correctly, Wikidata has only got the capacity to store Gregorian dates, with "Julian" very much counter-intuitively being a display option. Therefore, any Julian dates would have to be converted to the Gregorian calendar before being transferred to Wikidata. Alakzi (talk) 13:59, 6 June 2015 (UTC)
Wikidata has the capacity to store pretty much anything, with dates being stored as objects, the format of which can change (as we well know). A typical date might look like:
Timestamp       +1770-12-17T00:00:00
Timezone        +00:00
Calendar        Gregorian
Precision       1 day
Which should be interpreted as "17 December 1770 (Gregorian) - with no more information about the time of the day". As you can see, the calendar type is stored with the timestamp as part of the date object. No conversion is needed to enter dates of any kind into Wikidata, as long as the editor remembers to specify Gregorian or Julian for the calendar. HTH. --RexxS (talk) 15:25, 6 June 2015 (UTC)
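Reading such a date object back can be illustrated with a toy formatter in Python. The field names and the "1 day" precision string mirror the example above; real Wikibase JSON uses lowercase keys and numeric precision codes, so treat this purely as an illustration of how the calendar model travels with the value:

```python
MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]

def render_date(obj: dict) -> str:
    """Render a date object of the shape shown above, keeping the
    calendar model attached to the displayed value."""
    # Timestamp looks like '+1770-12-17T00:00:00'; skip the sign,
    # then split the YYYY-MM-DD part.
    year, month, day = obj["Timestamp"][1:11].split("-")
    if obj["Precision"] == "1 day":
        text = "{} {} {}".format(int(day), MONTHS[int(month) - 1], int(year))
    else:  # e.g. year-only precision: show just the year
        text = str(int(year))
    return "{} ({})".format(text, obj["Calendar"])
```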

──────────────────────────────────────────────────────────────────────────────────────────────────── Okay -- is this or is it not a real problem? RexxS does not believe that it is. If it's not a real complication, let's hat this and focus on the major issue at hand: how may we transfer remaining accurate, non-duplicate data from persondata to Wikidata? Dirtlawyer1 (talk) 15:39, 6 June 2015 (UTC)

OK, it was what I gathered from reading user complaints about dates at Wikidata, so it might've been an issue in the past - or not. I queried their JSON API; dates do indeed carry a "calendarmodel". I don't know what form this information takes in Lua or wikitext, but the point is, it does exist. Alakzi (talk) 16:18, 6 June 2015 (UTC)
RexxS, the Wikidata user interface allowed users to choose Julian or Gregorian during input, and would convert Julian to Gregorian for storage. Then it changed, and large numbers of dates were pulled from persondata templates by bots and stuffed into Wikidata with no attempt at conversion. (Of course, persondata templates stated dates in whichever calendar was in force at the time of the event.) Since so many dates in Wikidata are actually Julian even though they were supposed to be Gregorian, it has been decided to support storage in Julian or Gregorian, but this has not been achieved yet. See mw:Wikibase/DataModel/JSON. Also read all the Wikidata project chat archives beginning September 2014. Jc3s5h (talk) 16:19, 6 June 2015 (UTC)
The page you linked to documents calendarmodel as "a URI of a calendar model, such as gregorian or julian. Typically given as the URI of a data item on the repository." Presumably, the URI is set to either Gregorian's or Julian's when you toggle between the two on the web interface. What is the issue here? Alakzi (talk) 16:25, 6 June 2015 (UTC)
Look at the history of the document. JSON used to always be Gregorian, and the calendar model was just a suggestion to the data consumer on how to present the data. Now it can be either, but the information in the database has not been scrubbed. The user interface does not support Julian dates; you can't enter February 29, 1900 for example. Part of the database code, Blazegraph, treats the year 0 as illegal and the year -1 as 1 BC. But the aforesaid data model regards 0 as a legal year and the year -1 as 2 BC. Like I said, read the project talk archives since September 2014, then read all the bugs mentioned in the calendar threads, then read all the bugs mentioned in the bugs. Jc3s5h (talk) 16:36, 6 June 2015 (UTC)
If I wanted to lose my mind, I'd have just picked up some Wittgenstein to read, thank you very much. What do you mean, "it can be either"? How are we supposed to distinguish between the two? Alakzi (talk) 16:50, 6 June 2015 (UTC)
The Wikidata user interface definitely does support Julian dates. The reason you can't enter 29 February 1900 is that the date doesn't exist in any calendar system (leap years are not century years except for millennium years). If you try, you'll find you can enter 29 February 1904 without problems because that date does exist. You can also enter 15 March 44 BCE and that is accepted, and it even allows you to mark it as Julian or Gregorian as you choose. That's Julius Caesar's (d:Q1048) date of death and is currently stored as ["time"] = "-0044-03-15T00:00:00Z" with a calendar model of "Proleptic Julian calendar" (d:Q1985786). It really doesn't matter whether some extension that reads RDF chooses to interpret -44 as 45 BCE because we're not using that extension. If Dirtlawyer1 wants to enter dates into Wikidata and supplies the relevant calendar for the date in each case, there will be nothing ambiguous about what is stored; nor is there any need for ambiguity in the value returned to Wikipedia via a Lua module that calls the mw API. I still don't see what the problem is for anyone wishing to manually add dates to Wikidata. --RexxS (talk) 17:43, 6 June 2015 (UTC)
Thanks, Rexx. I don't see any conflict with the manual transfer to Wikidata for the articles on which I've been working. All of them are post-1752 birth dates, and they do not include Greek, Russian or Bulgarian persons who were born before 1925. There may or may not be a hitch here, but it obviously does not impact the transfer of accurate persondata to Wikidata in the vast majority of cases. Dirtlawyer1 (talk) 18:00, 6 June 2015 (UTC)
(Reply to Alakzi after edit conflict) In the future, when mw:Wikibase/DataModel/JSON is implemented, when a "time" is put into, or extracted from the database, there must be a time field and a calendarmodel field (among others). You will use the calendarmodel field to determine if the time field is a Julian date/time or a Gregorian date/time. As for the user interface, it's broken; no word yet on what the repaired version will look like. Jc3s5h (talk) 17:48, 6 June 2015 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── That appears to be the case now; this is the value of "date of birth" of King George II, which includes both a Julian and a Gregorian date:

The implementation would suggest that calendarmodel is a representational, and not a presentational, value. [Note: I have since corrected his Julian dob to 30 October.] Alakzi (talk) 18:34, 6 June 2015 (UTC)

RexxS, your statement "The reason you can't enter 29 February 1900 is that the date doesn't exist in any calendar system (leap years are not century years except for millennium years)" is not correct. In the Julian calendar every AD year divisible by 4 is a leap year. In Russia and Greece, for example, 29 February 1900 was a valid date. I'll add more about Julius Caesar, but I want to save this post before I get another edit conflict. Jc3s5h (talk) 17:53, 6 June 2015 (UTC)
Ah, OK - I see what you're getting at now. I stand corrected: I should have said "apart from 29 February 1900 (Julian), 29 February 1800 (Julian), etc. I can't see any problem for anyone wishing to manually add dates to Wikidata." The interface warns you by saying "will display as 1 March 1900". I trust that Dirtlawyer and anybody else reading this thread will bear those century leap years in mind when working with Julian calendars. Thank you for the correction. --RexxS (talk) 18:23, 6 June 2015 (UTC)
The Wikidata death date for Caesar is <s>correct</s> incorrect, according to the current documentation for the JSON data model, and the birth date (12 July 100 BCE, Julian calendar) disagrees with the "Julius Caesar" article, which doesn't venture to give a day in July. Evidently the information was incorrectly copied from the persondata, which disagrees with the text of the article.
If the death date is viewed in the user interface, it appears correct. But if another valid way of obtaining the data, the URL, is used, the time value is given as "-0044-03-15T00:00:00Z" but the "-0044" should be "-0043". Jc3s5h (talk) 18:40, 6 June 2015 (UTC)
We need not concern ourselves on Wikipedia with the vagaries of JSON conventions. 15 March 44 BC is stored on Wikidata as "-0044-03-15T00:00:00Z" - that's all we need to know. If I call {{#Property:P570}} from Julius Caesar I get 15 March 44 BCE. When I'm writing Lua modules I'll remember that the snak value uses -1 for 1 BC - it's convenient because I can use mw.language:formatDate() with the negative value and just stick a 'BC' or 'BCE' on the end. That's all that is going to concern anybody wanting to use Wikidata in Wikipedia.
As for Caesar's birthdate, the Wikidata entry went from 'July 99 BC' to 'June 99 BC' to '1 July 100 BC' to 'July 100 BC' to '12 July 100 BC', without anyone offering a reference. Our article actually has a footnote to the opening sentence claim of July 100 BC which reads: There is some dispute over the date of Caesar's birth. The day is sometimes stated to be 12 July when his feast-day was celebrated after deification, but this was because his true birthday clashed with the Ludi Apollinares. Some scholars, based on the dates he held certain magistracies, have made a case for 101 or 102 BC as the year of his birth, but scholarly consensus favors 100 BC. Goldsworthy, 30. I think this is a good example of how sources can offer differing dates for an event in antiquity and the article acknowledges this. If we were good at supplying references for the data in Wikidata, then the question of whether Caesar's birthdate should be given as '12 July 100 BC', '13 July 100 BC', or 'July 100 BC' would be somewhat easier to resolve. Of course, all three could be stored in Wikidata, each with a different reference if we wanted to do that. --RexxS (talk) 19:08, 6 June 2015 (UTC)
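Under the convention described in this thread (a stored year of -44 meaning 44 BC, with 'BC' or 'BCE' appended on display, rather than ISO 8601 astronomical numbering where year 0 = 1 BC), the display conversion is small. A Python sketch; the function name is my own, and it deliberately ignores calendar model and precision:

```python
def display_bce(timestamp: str) -> str:
    """Turn a snak time value such as '-0044-03-15T00:00:00Z' into a
    reader-facing date, treating year -n as n BC (per the convention
    described in the discussion, not astronomical year numbering)."""
    months = ["January", "February", "March", "April", "May", "June",
              "July", "August", "September", "October", "November", "December"]
    # Skip the leading sign, then split the YYYY-MM-DD portion.
    year, month, day = timestamp[1:11].split("-")
    date = "{} {} {}".format(int(day), months[int(month) - 1], int(year))
    return date + " BCE" if timestamp.startswith("-") else date
```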
Wikidata lies about dates before AD 1. I will fight tooth and nail against any importation of dates before AD 1 from Wikidata to Wikipedia until the faults are fixed. Jc3s5h (talk) 19:25, 6 June 2015 (UTC)

This is all very interesting, but appears to bear little if any relevance to the request for a bot to enact the community's decision to remove persondata from this Wikipedia. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:45, 6 June 2015 (UTC)

I've changed the subtitle of this subsection, to clarify my concern. I don't mind if the persondata templates get deleted after a period of time to rescue worthwhile data. But perusing dates in Wikidata items about people makes it clear that most editors and most bot operators have no idea that it is not OK to just copy and paste a date from Wikipedia to Wikidata. The number of Julian calendar dates is ever-increasing as the year gets earlier than 1924. Editors and bot operators have been oblivious to the distinction between Julian and Gregorian dates. Thus, we should not put any warning in articles that would tend to encourage editors to move pre-1924 dates from Wikipedia to Wikidata; let the editors capable of doing this right self-identify and do it on their own initiative. Jc3s5h (talk) 17:57, 8 June 2015 (UTC)
The "practical guide" (see first step of RfC above) could (and, I think, should) contain a step-by-step portion on how to check the correctness of dates (1. whether the persondata date matches the article & refs; 2. whether it has been transferred correctly to wikidata, and if not, or not transferred at all, how to update wikidata so that the date is correct & unambiguous). Maybe that's a good idea as a starting point to get the guide written. If I knew how it works I'd already have started writing the guide, but thus far I was still unsuccessful in adding "novelist" ([2]) as an occupation: I tried a lot of things but I still have no clue how to update the wikidata record on this person (the only thing wikidata seems to allow me is to edit the Dutch-language short description and aliases, which I'm not interested in). --Francis Schonken (talk) 20:46, 9 June 2015 (UTC)
I have added novelist to Erwin Mortier (Q1620233)'s occupation. — Martin (MSGJ · talk) 11:48, 19 June 2015 (UTC)
@MSGJ: Tx, but that's not even close to what I meant: how about explaining how to do that to someone like me? --Francis Schonken (talk) 13:18, 19 June 2015 (UTC)
At the bottom of the "occupation" statement, press "Add" then type "novelist" and press "Save". — Martin (MSGJ · talk) 13:19, 19 June 2015 (UTC)
There is no such "Add" link when I go to that page, see what I wrote above. --Francis Schonken (talk) 13:25, 19 June 2015 (UTC)
It sounds like you might be blocking JavaScript. Alakzi (talk) 13:36, 19 June 2015 (UTC)
Can't find it – JavaScript seems to be working fine. --Francis Schonken (talk) 14:19, 19 June 2015 (UTC)
Try clicking on your Wikidata preferences page and see if you are an autoconfirmed user. That page might not allow edits by users who are not autoconfirmed (but I don't know how to inspect the page to see if that is the case). Jc3s5h (talk) 14:35, 19 June 2015 (UTC)
Says "Member of groups: Autoconfirmed users, Users" – tx for the help anyway. --Francis Schonken (talk) 14:46, 19 June 2015 (UTC)
Have you tried reading through d:Help:Statements and specifically d:Help:Statements#Adding statements? --Izno (talk) 15:37, 19 June 2015 (UTC)
That help page assumes [Add] links show up on the page; they don't on my screen (and the page doesn't explain what to do if they don't). So I sorta gave up on trying to figure this out. I suppose my time is better spent elsewhere than on trying to figure out a system that doesn't seem to work. Thanks again for the suggestions, but I fail to see a solution. I'd rather update persondata than lose any more time on this. --Francis Schonken (talk) 06:24, 24 June 2015 (UTC)
You could try the usual troubleshooting steps, like checking if it works in another browser. Alakzi (talk) 20:43, 1 July 2015 (UTC)


The above discussions have gone off at various tangents. The original RfC remains valid, and should be implemented, now. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:13, 30 June 2015 (UTC)

  • No, not "now", and not "by bot". Neither of these conditions resulted from the previous RfC, neither of them passed the current. Even if a more acceptable solution isn't yet put on track, that doesn't invalidate the objections to the original bot request. --Francis Schonken (talk) 14:23, 30 June 2015 (UTC)
    • @Francis Schonken and RexxS: At this point, Andy is the only editor still in support of an immediate deletion of all persondata by bot action; bless his heart, he is nothing if not consistent. What we need is a plan per Francis' comments and a deadline per RexxS's comments above. Personally, I have now manually completed the transfer of persondata to Wikidata for over 500 articles in the past month, and I can say the following with some certainty:
1. there is a great deal of perfectly accurate and usable persondata that has not been transferred to Wikidata by previous bot actions, including, but not limited to (a) common name variations of bio subjects, (b) most persondata datapoints entered since November 2014, and (c) to a lesser extent, places of birth and death;
2. most of the persondata requires a manual review and comparison of the persondata and Wikidata for a given article to (a) confirm the validity and usefulness of datapoints, (b) ensure that duplicate information is not being entered into Wikidata, and (c) properly enter it into Wikidata;
3. based on my interactions with other editors, most editors don't understand and don't care, and many that actively participated in the persondata program are resentful/cynical that their persondata efforts are being chucked without a more serious effort to incorporate existing persondata into the Wikidata article profiles -- we're going to have a harder time getting folks to buy into Wikidata as a result.
My conclusions: in the absence of a more sophisticated bot action to transfer name variants, etc., most of the remaining usable persondata is not going to be transferred unless it's manually transferred by concerned editors. I suggest we set a reasonable deadline of 90 to 180 days, create a set of persondata-to-Wikidata manual transfer instructions, then provide bot notices to every active editor and WikiProject which include and/or link to the instructions, and let the chips fall where they may. Andy is right about several things: Wikidata is a better system for embedding metadata, and persondata has been deprecated in recognition of that. It makes no sense whatsoever to drag this out. Let's settle on a plan and a deadline and get on with it. Reactions, comments, suggestions? Dirtlawyer1 (talk) 20:07, 1 July 2015 (UTC)
  1. What were your experiences with pre-1924 dates?
  2. I still can't edit wikidata from my machine, it sort of incapacitates me to see how it practically works (and from putting together a manual). And makes me wonder whether I'm the only one with this problem?
  3. When there's little enthusiasm to go ahead with this (as you describe), I suppose imposing a deadline before we have some more experience with how editors are likely to cooperate with these tasks is not really feasible. Try to get it running, check conversion rates, and only then decide on a deadline.
  4. What was your point 2.c? It seems to have fallen off the cart. --Francis Schonken (talk) 20:34, 1 July 2015 (UTC)
@Francis Schonken: In answer to your questions: (1) I have experienced no problems with pre-1924 dates because fewer than 10% of the articles whose persondata I have transferred have such dates, and all of them were for Americans, Brits or other English-speaking nationalities for whom such dates are not a problem from 1733 to the present. (2) Can you not access Wikidata by clicking on the "Wikidata item" link on the lefthand tools menu for articles? If not, you need to get help at VP or elsewhere. (3) I suggest we prepare instructions and create a talk subpage for questions, etc., at Template talk:Persondata, and get a bot operator on board to provide notices to explain and attract more worker bees. A deadline, however, should be tied to the date of the bot notices. If it doesn't get done in six months, it's never going to get done. (4) I have finished my thought re 2(c) -- pesky client calls interrupted my thought. Dirtlawyer1 (talk) 22:34, 1 July 2015 (UTC)
Re. 2: as it happens I tried and clicked a lot of things (see also the advice I got from others in the previous section), but lost interest (the one persondata datapoint I remember entering, is now safely stored in Wikidata by someone else). My only point is: if I'm not the only one experiencing this we'd need to be prepared to answer such questions when we invite people to help out, it's not as if technical impediments are going to be helpful when cooperation interest is low...
Re. 3: then I'd propose a test run with one or two wikiprojects. I'd be happy with Wikipedia:WikiProject Classical music, one I'm familiar with.
Re. 2 (continued): compare this invitation we got on that project: I clicked around a bit on that proposal to help out, but as it wasn't very clear what exactly one was supposed to do to be of any help, I clicked away from it again very soon. In order to get maximum response, the threshold for understanding what to do and how it works (including overcoming possible technical impediments) needs to be very low. --Francis Schonken (talk) 22:50, 1 July 2015 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── No, I am not "the only editor still in support of an immediate deletion of all persondata by bot"; we had an RfC which agreed - with no caveats - that persondata should be removed. However, there are only a couple of editors who are trying to wikilawyer around that consensus. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:38, 2 July 2015 (UTC)

Bot to check the Internet Archive for dead link solutions.[edit]

Pursuant to the discussion at Wikipedia:Village pump (proposals)#Can we get a bot to check the Internet Archive for dead link solutions?, we need a bot that will address dead links on Wikipedia by checking to see whether the URL has been archived at the Internet Archive, ideally finding whatever archiving is the closest in time to the addition of the link to the Wikipedia page. The bot would then replace the dead link with the Internet Archive link, with a note indicating that the content still needs to be confirmed by a human editor. Given the proliferation of dead links on Wikipedia, such a bot can massively improve the sourcing of articles. This bot program, once created, will be as important to the quality of Wikipedia as Cluebot is now. bd2412 T 23:31, 3 June 2015 (UTC)

Definitely. I haven't taken up a bot project in a while. I'll attempt this one. Officially, Doing...—cyberpowerChat:Online 15:06, 5 June 2015 (UTC)
Great to see this happening. User:Ladsgroup was also thinking of working on this as we were discussing it. User:Ocaasi is also meeting with the Internet Archive regarding API access.
A next step would be to have the bot trigger archiving of URLs by the Internet Archive, which will require collaboration with them. This would be a further improvement, as we would then have the correct version and it would occur before the URL goes dead. Doc James (talk · contribs · email) 19:09, 10 June 2015 (UTC)
I sincerely hope that I'm the only one working on this. I've spent a considerable amount of time and effort carefully coding up this bot; I'd hate to let it go to waste. As for archiving URLs that have no archive, I'm way ahead of you on that one, and it's already implemented.—cyberpowerChat:Offline 04:26, 15 June 2015 (UTC)
Alternatively (unless it's already done) one could think of creating a bot to archive/link stuff to WebCite; having more than one archive (as a backup and for links for which Wayback doesn't have a valid page) might be good. Jo-Jo Eumerus (talk) 05:45, 15 June 2015 (UTC)
The person who runs WebCite cannot afford to have tons of links added. It is currently hosted on Amazon and he is paying for it personally. He was interested in us taking it over, but I think there are concerns with us hosting content that is non-free. Doc James (talk · contribs · email) 10:37, 15 June 2015 (UTC)
Again, to reiterate, the bot is able to instantaneously archive active sites into the Wayback Machine. The bot will essentially make sure that there will be an archived copy available before the link ever goes dead.—cyberpowerChat:Online 11:59, 15 June 2015 (UTC)
Wonderful :-) Doc James (talk · contribs · email) 12:22, 15 June 2015 (UTC)
Brilliant. Please. If it also does past links, it could just pick the last archive copy made before the edit linking to the page (often the later archived versions are 404 redirects or the like; it might also be useful to look for dramatic changes between archived versions). But these are added decorations. HLHJ (talk) 12:12, 8 July 2015 (UTC)
I'm working on the UK Government Web Archive. We have a far smaller set of archived sites than, but what we do have should be exhaustive – very few missing pages. That means that even if the same page is on, links from that page to nearby ones are much less likely to be broken in our archive. Is there any help I could give to use the bot with our archives too? (I imagine the same will go for various national web archives – e.g. the Australian Government archive, or the Portuguese one – all likely to have deeper coverage of our specialised areas than, especially for recent material.) New marinheiro (talk) 12:52, 28 July 2015 (UTC)
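The "closest snapshot to when the link was added" rule discussed in this thread could be sketched as below. The Wayback Machine's availability endpoint (`/wayback/available`, taking `url` and `timestamp` parameters) is real and documented; the helper names and the candidate-timestamp list are illustrative assumptions — a live bot would parse the timestamps out of the API's JSON reply.

```python
from datetime import datetime
from urllib.parse import urlencode

# Real, documented Wayback Machine endpoint; returns JSON with an
# "archived_snapshots.closest" entry when a snapshot exists.
WAYBACK_API = ""

def availability_query(url, when):
    """Build an availability-API query asking for the snapshot
    closest in time to `when` (a datetime)."""
    return WAYBACK_API + "?" + urlencode(
        {"url": url, "timestamp": when.strftime("%Y%m%d%H%M%S")})

def closest_snapshot(candidates, when):
    """Pick the snapshot timestamp (a YYYYMMDDhhmmss string) closest
    to `when` -- the 'closest to the link's addition' rule above."""
    def distance(ts):
        return abs(datetime.strptime(ts, "%Y%m%d%H%M%S") - when)
    return min(candidates, key=distance)
```

Other archives (the UK Government Web Archive mentioned above, WebCite) would need their own query builders, but the closest-snapshot selection is archive-agnostic.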

Eleutherodactylus articles family errors[edit]

Many of the 276 species articles of genus Eleutherodactylus and linked from that article display the family as Leptodactylidae in the taxobox and the body of the article. But Eleutherodactylus is now considered to be in the Eleutherodactylidae family. Some articles that need this fix are Eleutherodactylus amplinympha, South Island telegraph frog (redirected from Eleutherodactylus audanti), Eleutherodactylus auriculatoides.

This task would take a long time to do by hand. Could a bot be generated to edit each of the Eleutherodactylus species articles and set the family to Eleutherodactylidae if it is not already?

I imagine the bot might operate something like this pseudocode:

edit article
   within {{Taxobox}} template
      if familia parameter != "Eleutherodactylidae" 
         then replace familia parameter with "Eleutherodactylidae"
   after {{Taxobox}} template until first period character
      find the first use of character string "[[...]] family"
          where ... is any string of alphabetic characters.
      replace [[...]] with [[Eleutherodactylidae]]

This would work for at least the three articles listed here. I haven't verified it would work for all affected articles. It would mess up if the family were mentioned after the first sentence, or if the taxobox-style display were generated by a different template such as speciesbox. —Anomalocaris (talk) 04:17, 5 June 2015 (UTC)

  • Done the pages displaying Leptodactylidae as the family (manually). Thine Antique Pen (talk) 22:10, 21 July 2015 (UTC)
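For reference, the pseudocode in the request above could be realised roughly as follows. The function name and regexes are illustrative; as the requester notes, articles using {{Speciesbox}} or mentioning the family outside the first sentence would still need manual review.

```python
import re

def fix_family(wikitext):
    """Force the taxobox's |familia= to Eleutherodactylidae and update
    the first '[[...]] family' link after it, per the pseudocode above."""
    # 1. normalise the familia parameter inside the {{Taxobox}}
    wikitext = re.sub(r"(\|\s*familia\s*=\s*)\[?\[?[^|\]\n]+\]?\]?",
                      r"\1[[Eleutherodactylidae]]", wikitext, count=1)
    # 2. fix the first "[[Family]] family" wikilink in the lead sentence
    wikitext = re.sub(r"\[\[[A-Za-z ]+\]\](?= family)",
                      "[[Eleutherodactylidae]]", wikitext, count=1)
    return wikitext
```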

Add templates using Wikidata to category[edit]

There is Category:Templates using data from Wikidata, for templates using wikidata, which currently contains 29 pages... but searching in template namespace for insource:/\#property/ matches 88 pages. Adding missing templates to the category (or at least generating a worklist), and listing potentially miscategorised templates for human checking, seems like a useful task for a bot to run every so often, so that we have a more accurate idea of how many templates are actually using Wikidata. - Evad37 [talk] 06:44, 5 June 2015 (UTC)

Actually, this isn't so simple – some templates are just checking wikidata to add tracking categories, and not actually using data directly - Evad37 [talk] 07:04, 5 June 2015 (UTC)
I suspect there are a number of templates not using the #property but instead using Lua via module-space to get the data. --Izno (talk) 15:22, 7 June 2015 (UTC)

Help with Category:Historic England citations needing attention[edit]

Would it be possible to have a bot help out clearing the above category? I've been doing this with AWB but it's now becoming too big of a job.

What I need is the following changes to transclusions of {{National Heritage List for England entry}}, including redirects {{English Heritage List entry}}, {{NHLE}} and {{National Heritage List for England}}:

  • |separator=, AND |ps= replaced with |mode=cs2

Also the following non-essential changes to be made at the same time:

  • |accessdate= to |access-date=
  • |fewer-links=x to |fewer-links=yes

Probably best to do this in main space only and leave the handful of non-main space uses for manual review.


Reason: This template defaults to Citation Style 1; previously it used the |separator= and |ps= parameters in tandem to switch to Citation Style 2, however |separator= is now deprecated and no longer works.

I'll keep going with AWB in the meantime, but would be grateful for any help. PC78 (talk) 15:28, 5 June 2015 (UTC)

I have modified an AutoEd script to clean these and can work on some of them today. Meanwhile, carry on with AWB. It's only a few hundred articles.
Anyone who wants a regex for these changes can look at User:Jonesey95/AutoEd/coauthors.js. – Jonesey95 (talk) 17:17, 5 June 2015 (UTC)
Thanks for the help! As far as I can tell the category is still filling up and there are about 5000 transclusions in total. PC78 (talk) 17:24, 5 June 2015 (UTC)
I fixed somewhere over 300. There are currently 200 exactly. You are correct that the category is still being populated by the job queue. Category population based on template changes can take one to three months. – Jonesey95 (talk) 02:52, 6 June 2015 (UTC)
Done. This category is empty for now. – Jonesey95 (talk) 14:22, 6 June 2015 (UTC)
Cheers, I cleared most of the others earlier today. I'll keep an eye on the category and make sure that it doesn't get out of hand again. PC78 (talk) 14:26, 6 June 2015 (UTC)

Linking identically-named categories on Wikipedia and Wikimedia Commons[edit]

I noticed that there are many categories on Wikipedia whose names are identical to categories on Wikimedia Commons, such as Category:Political parties by continent. Nonetheless, these pages haven't been automatically linked. Can we obtain a list of identically-named categories without these links? Jarble (talk) 04:48, 8 June 2015 (UTC)

Perhaps this isn't the right place to request this feature. Can we send this request to the developers of Wikidata so that they can implement it? Jarble (talk)

Wikidata would be the place, yes. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:15, 10 June 2015 (UTC)

Signing user: Another Believer "Wiki Love Pride" announcements[edit]

Can a bot be set up to go through Another Believer's (talk · contribs) postings to talk pages about "Wiki Loves Pride" (on June 3 and June 2)? All the announcements (such as this one [3]) are missing dated signatures, and are thus going to break the archival bots on all these WikiProject talk pages that automatically archive discussions based on a date timestamp. Missing the timestamp, these posts will not be archived, and so will continue to advertise the event well after it is over, becoming useless congestion on the talk pages. -- (talk) 04:19, 11 June 2015 (UTC)

A bot that compiles a list of all DYK pages under the jurisdiction of a given wikiproject[edit]

Can someone program a bot that compiles a chronological list of all articles that have been exhibited on the main page as DYK hooks under the jurisdiction of a given wiki project and posts the list as a subpage of the wikiproject's page? All the bot has to do to source the list is see which article talk pages contain both the DYK template and the template denoting it as under the purview of the wikiproject the list is being compiled for. Abyssal (talk) 01:02, 15 June 2015 (UTC)

User:JL-Bot/Project content already does that, and more - Evad37 [talk] 01:11, 15 June 2015 (UTC)

Accidental template protection[edit]

Occasionally, I've noticed that an article has been mistakenly template-protected. Perhaps a bot could monitor the protection log, and if a page in a namespace other than Template, Module, User, or Wikipedia is template-protected, deliver a "did you mean to do this" message to the protecting admin, i.e.

Hello administrator name. On date you template-protected [[page]] ([ (url-encoded) log]). As template protection is only meant to be used for templates, or other highly transcluded pages, did you perhaps mean to select a different level? Thanks, bot signature

or similar. - Evad37 [talk] 03:54, 17 June 2015 (UTC)

Unsure if necessary - couldn't you just add something to said template? (Just lurking WP:BOTREQ to see what sort of things people want bots to do). E. Lee (talk) 04:59, 17 June 2015 (UTC)
@Elee: Which template are you talking about? And how does adding something to a template fix a wrongly applied protection level? - Evad37 [talk] 05:50, 17 June 2015 (UTC)
For an example of the problem and proposed solution (letting admins know that they may have made a mistake so they can fix it), see User talk:Ponyo#Maithali protection level, or User talk:Black Kite#Farshad Fotouhi‎ protection level - Evad37 [talk] 05:54, 17 June 2015 (UTC)
There are padlock templates added to protected pages. These "sense" if they are incompatible with the protection actually used, I believe, and put the page in a category to be fixed.
Arguably there is something that could be done along these lines.
A list of template protected articles can be found here (currently empty). A bot could check this, and act upon it. All the best: Rich Farmbrough, 21:43, 27 July 2015 (UTC).
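The "did you mean to do this" check described in the request could be reduced to a filter over protection-log entries, along these lines. The namespace numbers are MediaWiki's standard ones and `templateeditor` is the enwiki protection-level name; the flat-dict entry layout here is an assumption for illustration.

```python
# MediaWiki namespace numbers where template protection is expected:
# User (2), Wikipedia (4), Template (10), Module (828).
EXPECTED_NS = {2, 4, 10, 828}

def needs_notice(entry):
    """Return True when a protection-log entry (here a plain dict with
    the page's namespace number and the applied protection level) looks
    like accidental template protection, so the bot should deliver the
    'did you mean to do this?' message to the protecting admin."""
    return entry["level"] == "templateeditor" and entry["ns"] not in EXPECTED_NS
```

Alternatively, as suggested above, the bot could simply poll the tracking category of wrongly template-protected pages rather than the log.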

AFC submission cleanup bot[edit]

There should be a bot that adds {{subst:AFC submission/draftnew}} on pages on the draftspace when there is no AFC tag on the draft and that removes duplicate AFC tags. --TL22 (talk) 18:32, 17 June 2015 (UTC)

Not a good task for a bot. There are two types of pages that live in the Draft namespace: those that are enrolled in AFC and those that are not. It has been expressed several times that anybody can create a page in the namespace and they are not required to be enrolled in AfC. Secure a consensus at the Draft namespace talk page (and with the AFC wikiproject) before making this proposal. Hasteur (talk)
@Hasteur: But can the "remove duplicate tags" task at least be done? --TL22 (talk) 18:48, 17 June 2015 (UTC)
@ToonLucas22: Needs more definition as to what you mean by duplicate AFC tags. Also, you need to secure a consensus from the AFC wikiproject to make this kind of change, as I know that the potential duplicate AFC tags may be useful (i.e. previous AFC declines). Hasteur (talk) 18:51, 17 June 2015 (UTC)
@Hasteur: What I mean by duplicate AfC tags is the markup {{AFC submission/pending}}, {{AFC submission/reviewing}}, {{AFC submission/draft}} or {{AFC submission/draftnew}} written twice. --TL22 (talk) 21:02, 17 June 2015 (UTC)
@ToonLucas22: which tag would then stay? E. Lee (talk) 13:46, 5 July 2015 (UTC)

Articles with {{Infobox Journal}} seek bot to ensure redirects are in place[edit]

As discussed at Wikipedia_talk:WikiProject_Academic_Journals#Bot_task?, there are several fairly standard redirects needed for each article in this project using that infobox. The box has parameters for the journal title and its ISO abbreviation. Citations routinely vary the capitalization, abbreviations, and punctuation of these abbreviations, creating a need for redirects from each common variation to the actual article title (usually the same as the journal title, in sentence case). Is there a bot that might be suited to the task? LeadSongDog come howl! 01:23, 18 June 2015 (UTC)

  • I've obtained the ISO 4 vocabulary to convert, e.g., "European Physical Journal" to "Eur. Phys. J."; it's a spreadsheet-format version of the published PDF. Could you bot-wizards please tell us if such a conversion would be simply too complicated? Thanks! Fgnievinski (talk) 02:30, 30 July 2015 (UTC)
  • Maybe an easier and useful thing to do instead would be to start from Infobox journal's title field (e.g., "European Physical Journal") and its manually-entered abbreviation field ("Eur. Phys. J."), and create the desired redirects: e.g., "European physical journal", "Eur. Phys. J.", "Eur Phys J", "eur phys j", "E. P. J.", "E.P.J.", "E P J", "EPJ". Fgnievinski (talk) 02:45, 30 July 2015 (UTC)
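The variant generation suggested in the bullet above could be sketched like so. The helper name is made up; the variant list simply mirrors the examples given for "Eur. Phys. J.".

```python
def redirect_variants(title, abbrev):
    """Generate the redirect titles suggested above from an article title
    and its ISO 4 abbreviation, e.g. 'Eur. Phys. J.' -> 'Eur Phys J',
    'eur phys j', 'E. P. J.', 'E.P.J.', 'E P J', 'EPJ'."""
    initials = [word[0] for word in abbrev.replace(".", "").split()]
    variants = {
        title.capitalize(),               # sentence-case title
        abbrev,                           # "Eur. Phys. J."
        abbrev.replace(".", ""),          # "Eur Phys J"
        abbrev.replace(".", "").lower(),  # "eur phys j"
        ". ".join(initials) + ".",        # "E. P. J."
        ".".join(initials) + ".",         # "E.P.J."
        " ".join(initials),               # "E P J"
        "".join(initials),                # "EPJ"
    }
    variants.discard(title)  # never propose a self-redirect
    return sorted(variants)
```

A bot would still need to check each proposed title for an existing page (the initialism forms in particular are prone to collisions) before creating any redirect.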

Repair or replace Wikipedia:Changing username/Simple clerkbot[edit]

The CHU/S clerkbot has stopped marking completed requests as done, and its operator has signalled that they are unable to create a patch at this time, but would be fine with someone else updating the code or taking over the task.

The source code is here:

I believe the relevant code begins around line 276. Please let me know if you require any further information. –xenotalk 12:41, 18 June 2015 (UTC)

Xeno has asked me to take a look. It shouldn't be too difficult to take over, but I'd like to first get Cyberbot II moving which should be happening really soon.—cyberpowerChat:Online 20:16, 25 June 2015 (UTC)
Doing...cyberpowerChat:Online 01:09, 22 July 2015 (UTC)
Question: Can you please give me a description of the bug? It would make it a lot easier to find if I knew what to look for.
BRFA filed with fixed bugs.—cyberpowerChat:Online 23:15, 23 July 2015 (UTC)

Replacement of Template:Infobox Country World Championships in Athletics[edit]

Hello. Could I hire a bot to substitute all transclusions of {{Infobox Country World Championships in Athletics}}, per the outcome of this TfD? Alakzi (talk) 13:12, 20 June 2015 (UTC)

Same with {{Infobox China station}} and {{Infobox Japan station}}, but using the sandbox version. Alakzi (talk) 17:34, 20 June 2015 (UTC)
{{Infobox Country World Championships in Athletics}} done - thanks Plastikspork. Alakzi (talk) 16:00, 25 June 2015 (UTC)

Updating US Census Estimates[edit]

Is there a bot available that could add the current United States Census Bureau population estimates? (Unfortunately, I wouldn't trust OCR for a lot of the older Census files, because I often have to look carefully/zoom myself to tell 3 from 8 or 6 from 0.) It should be a fairly straightforward task; the Census updates can be found on the Census Bureau's website. I am in the process of adding data manually (mostly, I am using an AWK script on my computer to format data from a spreadsheet for copy/paste into Wikipedia), and for that I'm okay, since it gives me a chance to do spot edits on those pages as well and allows me to try to make sure that adding the USCensusPop widget doesn't completely screw up the formatting of the page, but it's not something I could do every year.

Specifically, it could check to see if a page for a place has a Template:USCensusPop, and if so, just update it. Very simple. I'd write it myself, but it would be nicer if somebody either has code I can reuse or could do it all themselves. Thanks. DemocraticLuntz (talk) 23:30, 21 June 2015 (UTC)


I want a simple UI bot that runs on desktop or pc for the clash of clans game that is played in android and IOS device. — Preceding unsigned comment added by (talk) 06:11, 23 June 2015 (UTC)

Sorry, this page is only for requests that improve the Wikipedia encyclopedia or its sister projects. -- John of Reading (talk) 06:21, 23 June 2015 (UTC)

Denied.-- (talk) 07:47, 24 June 2015 (UTC)

The computing reference desk would give you pointers. All the best: Rich Farmbrough, 21:33, 27 July 2015 (UTC).

Template:Aviation lists[edit]

Following the RFC consensus that this template should not be used on articles that aren't linked directly to the template, please can a bot go through and remove this from approx. 17,000 articles that include the template. Thanks. Lugnuts Dick Laurent is dead 07:18, 23 June 2015 (UTC)

I've got this. Cheers! bd2412 T 18:45, 23 June 2015 (UTC)
Thank you. Lugnuts Dick Laurent is dead 19:57, 23 June 2015 (UTC)
Done. bd2412 T 03:58, 24 June 2015 (UTC)

Convert (certain) external links to HTTPS[edit]

Doing... For reasons explained here (among others), internet traffic should be encrypted. Recently, Wikimedia decided to use HTTPS by default, which raises the question of why we don't also convert external links to HTTPS (wherever this is an option). For instance, one of the most-linked websites on Wikipedia, the Internet Archive, has actually encouraged HTTPS inbound links since 2013, yet most of the external links on Wikipedia to them still use insecure HTTP. Also, all Google services offer HTTPS access, and Google encourages one to use it, but there are still thousands of links to Google Books, Google News, and YouTube with HTTP. Long story short, what I am asking for is a simple search-and-replace bot, to convert:

youtube ... you get the idea.

Is it possible to have this done by a bot? --bender235 (talk) 17:43, 27 June 2015 (UTC)

bender235: does the http to https conversion for external links have consensus? I recall some reactions in the past but I can't find any link to a discussion about it. -- Magioladitis (talk) 17:50, 27 June 2015 (UTC)
We had a discussion on VPP that concluded that we should use protocol-relative links for whichever sites support both HTTP and HTTPS equally. However, since Wikipedia moved to HTTPS by default permanently, protocol-relative links make little sense. --bender235 (talk) 18:40, 27 June 2015 (UTC)
JFYI: this is why we do this: “Since the Internet Archive site uses HTTPS by default for its connections, Russian ISPs are unable to identify which page is being requested by their users, and thus whether it is the one subject to the new ban.” ISPs can no longer interfere with a site's traffic. All they can do is block the entire domain, which sooner or later will cause public protest. --bender235 (talk) 05:53, 29 June 2015 (UTC)
I think for YouTube it is better to convert to {{Youtube}}? -- Magioladitis (talk) 17:50, 27 June 2015 (UTC)
Yes, raw Youtube links should be converted to {{Youtube}}, but those inside {{cite web}} or similar templates should just be converted to https. --bender235 (talk) 18:40, 27 June 2015 (UTC)
@Bender235: In early 2014 we discussed using protocol-relative links instead. GoingBatty (talk) 18:35, 27 June 2015 (UTC)
It appears that {{Google books}} uses protocol-relative links while {{YouTube}} and {{Wayback}} use https. GoingBatty (talk) 18:39, 27 June 2015 (UTC)

I pointed out to Bender235 that, over and above altering links from "http:" to "https:", changing from a country-specific address (such as to .com may deny access to some people, as sometimes there appears to be a restriction on access to text in one country but not another. Bender235 wants proof of this, but as I have not kept records of it and I make a lot of edits, I will provide one when I come across it; in the meantime I see no need to change the country domain along with the connection type.

It has been pointed out that this sort of edit can easily mask vandalism (see User talk:Bender235#https), so as it is not a change that needs expediting, that must be weighed in deciding whether this is a suitable candidate for automation (rather than, for example, adding it to a process like AWB to be done when other more specific changes are made). See also User talk:Bender235#AWB; Bender235's AWB access was removed by user:Materialscientist on 2 July 2015 (it has not been restored). When discussing this on Bender235's talk page it was suggested by Bender235 that the discussion Wikipedia:Village pump (technical)/Archive 138#HTTPS by default was relevant to this and so should probably be included in this conversation.

-- PBS (talk) 09:50, 13 July 2015 (UTC)

Bot was requested to tag pages related to this WikiProject[edit]

Please read Wikipedia:Village_pump_(technical)#Can_we_add_WikiProject_Poland_template_to_all_articles_that_are_missing_it_but_have_the_milhist-Poland_taskforce_template.3F. Piotrus requested that Yobot add banners of WikiProject Poland to pages that already have the milhist-Poland taskforce template. Any comments are welcome. If there are no disagreements the bot will start the task in the next few days.

Just for your information: Piotrus is a member of WikiProject Poland and both WikiProject Poland and WikiProject Military history were notified. -- Magioladitis (talk) 14:16, 1 July 2015 (UTC)

Removal of {{Start date}} from {{Singles}} template[edit]

It has become common practice in album articles to use {{Start date}} in the {{Singles}} add-on to {{Infobox album}}. Per Template:Start date/doc: "This purpose of the {{start date}} template is to return the date (or date-time) that an event or entity started or was created. It also includes duplicate, machine-readable date (or date-time) in the ISO date format (which is hidden by CSS), for use inside other templates (or table rows) which emit microformats. It should only be used once in each such template and should not be used outside such templates." i.e. {{Start date}} should only be used in album articles for the album release date, not single release dates. It would be nice to have a bot to clean this up, as this error is currently in who knows how many articles. Chase (talk | contributions) 16:44, 5 July 2015 (UTC)
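The cleanup described above could be sketched as follows for the common {{Start date|Y|M|D}} form (helper name assumed; a real run would restrict the replacement to the {{Singles}} block so the album's own release date keeps its one allowed {{Start date}}, and would need extra cases for |df=yes and other options):

```python
import re
import calendar

START_DATE = re.compile(
    r"\{\{\s*[Ss]tart date\s*\|\s*(\d{4})\s*\|\s*(\d{1,2})\s*\|\s*(\d{1,2})\s*\}\}")

def strip_start_date(wikitext):
    """Replace {{Start date|Y|M|D}} calls with a plain 'Month D, YYYY'
    date, for use on the {{Singles}} date parameters discussed above."""
    def repl(match):
        year, month, day = match.groups()
        return "%s %d, %s" % (calendar.month_name[int(month)], int(day), year)
    return START_DATE.sub(repl, wikitext)
```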

DOI bot[edit]

Given a reference in the forms

"<ref>doi:10.[four digits]/*</ref>" "<ref>[four digits]/*</ref>" or "<ref>[four digits]/*</ref>",

the bot should insert the full reference into the article page and into Wikidata. It might be extended to add data to existing references that are, say, missing the date of publication.


HLHJ (talk) 12:28, 8 July 2015 (UTC)

See User talk:Citation bot#Replacement citation bot? and the immediately preceding section. --Izno (talk) 13:09, 8 July 2015 (UTC)
That discussion does not appear to be leading to getting a bot to start working on the Cite Doi templates. Abductive (reasoning) 19:13, 8 July 2015 (UTC)
The bot in question preempts the need for doing so (were it turned on). Inserting {{cite journal|doi=value}} and then the bot fills in the other data is what the bot does (or did with {{cite doi}}). --Izno (talk) 19:48, 8 July 2015 (UTC)
As for Wikidata, I'm not sure of your intentions, so you will need to clarify. Regardless, that bot would need to be approved at Wikidata, not here. --Izno (talk) 19:49, 8 July 2015 (UTC)
That sounds good, and would do half my request. I hope it's back soon.
Apologies for the lack of clarity. Wikidata has a data format for journal sources, but there is currently no way to create items from citation templates. See this discussion. There are tools for doing it from a DOI; see the tools section here. It seemed to me that co-ordination between bots working on both might be helpful at avoiding duplicates, etc., but I take your point that separate bots might be easier. HLHJ (talk) 14:40, 16 July 2015 (UTC)
There is consensus at WPMED to replace cite DOI with cite journal on medical articles. Doc James (talk · contribs · email) 15:35, 16 July 2015 (UTC)

Add extracts to pages[edit]

Dear Wikipedia,

I'm making a small script in Python to be used with an alias on Linux machines. It would be a super easy way to view content from Wikipedia.

It simply grabs the extract from a page of your choice. In checking its functionality, I've noticed not all pages have an extract. Is there a tool, existing or to be made, to make sure all pages have an extract?

-Ben — Preceding unsigned comment added by 2601:47:4102:1FEC:2C11:67D6:6917:BB7E (talk) 01:07, 16 July 2015 (UTC)
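A script like the one described can be sketched against the MediaWiki API's TextExtracts query (`prop=extracts`); the helper names below are illustrative, and the network fetch itself is left out.

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def extract_url(title):
    """Build a TextExtracts API query URL for a page's intro as plain text."""
    params = {
        "action": "query",
        "prop": "extracts",
        "exintro": 1,        # intro section only
        "explaintext": 1,    # plain text, no HTML
        "format": "json",
        "titles": title,
    }
    return API + "?" + urlencode(params)

def first_extract(response):
    """Pull the extract string out of a decoded API response (a dict)."""
    pages = response["query"]["pages"]
    page = next(iter(pages.values()))
    return page.get("extract")  # None when the page has no extract

print(extract_url("Python (programming language)"))
```

Pages with no extract (redirects, some non-article pages) return no `extract` key, which matches what Ben observed; whether a bot should "make sure all pages have an extract" is really a question about those pages' content, not the API.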

Academic journals may lack WPJournals template in their talk pages[edit]

Articles in Category:Academic journals may lack Template:WikiProject Academic Journals in their talk pages; could a bot check and list those, please? I guess asking a bot to fix automatically might be risky. Thanks. Fgnievinski (talk) 04:16, 18 July 2015 (UTC)

@Fgnievinski: Yes check.svg Done - see User:Fgnievinski/Academic journals. The category tree looks good for requesting the template be added. Note that some of the articles do not have talk pages. GoingBatty (talk) 14:00, 18 July 2015 (UTC)
@GoingBatty: Many thanks! We're discussing a few special cases at Wikipedia_talk:WikiProject_Academic_Journals#1500_untagged_academic-journal_articles and will be back with a decision later. Fgnievinski (talk) 20:52, 18 July 2015 (UTC)

Redirects to academic journals may lack WPJournals template (class=redirect) in their talk pages[edit]

Would it be possible to check, for each page with Template:WikiProject Academic Journals in its talk page, if its redirects also have Template:WikiProject Academic Journals (class=redirect) in their respective talk pages? Thanks! Fgnievinski (talk) 20:52, 18 July 2015 (UTC)

Bot for linking pages[edit]

Hi, I'm looking for a bot that will add links to pages that already exist. So, for instance, in "The rain in Spain falls mainly on the plains", it would link Spain. Does anyone know of one? --Stuartbman (talk) 09:45, 19 July 2015 (UTC)

Stuartbman, I don't think any bot does that. Also, a question: in your example sentence, how would the bot avoid linking rain, falls, and plains? This proposal in general somewhat opposes WP:BUILD, and it might confuse readers. APerson (talk!) 13:45, 19 July 2015 (UTC)
@Stuartbman: Have you tried Find link? GoingBatty (talk) 15:08, 19 July 2015 (UTC)
Thanks @GoingBatty:@APerson:. This is actually for a separate wiki with a smaller number of pages, I will check Find link though which I might be able to modify, thanks! --Stuartbman (talk) 17:20, 20 July 2015 (UTC)
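A toy sketch of such a linker, which also illustrates APerson's point: naive string matching has no way to decide which words actually deserve links, and real wiki markup (templates, existing links) needs much more careful handling than this.

```python
import re

def link_first_occurrences(text, titles):
    """Wrap the first unlinked occurrence of each known page title in [[...]].

    Naive sketch: whole-word, case-sensitive matching only, and it does not
    try to skip templates or other wiki markup beyond existing [[...]] links.
    """
    for title in sorted(titles, key=len, reverse=True):  # longest titles first
        pattern = r"(?<!\[)\b" + re.escape(title) + r"\b(?!\])"
        text = re.sub(pattern, "[[" + title + "]]", text, count=1)
    return text

print(link_first_occurrences("The rain in Spain falls mainly on the plains",
                             {"Spain"}))
# The rain in [[Spain]] falls mainly on the plains
```

On a small wiki with a curated title list this may be good enough; on Wikipedia it would overlink badly, which is why WP:BUILD comes up.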

Multiple Shared Ip notices on Talk pages[edit]

There can be many shared IP notices on talk pages, as seen here:
This can be confusing and looks bad. Only one is needed, at the bottom. TheMagikCow (talk) 14:36, 20 July 2015 (UTC)

I've set up automated archiving for that page. Perhaps a bot could do so for others? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:13, 20 July 2015 (UTC)

Companies with coordinates[edit]

Please could somebody either count, or better, provide a list (with a count) of articles using {{Infobox company}} and {{Coord}}? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:04, 20 July 2015 (UTC)

Here is a Quarry query and a wikitable of the results. --Bamyers99 (talk) 19:58, 20 July 2015 (UTC)
@Bamyers99: Thank you. I'd not seen Quarry before, so that's a bonus. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:47, 22 July 2015 (UTC)


The bot would monitor recent changes and look for people adding username mentions to existing posts without changing the timestamp. It would then ping the newly mentioned user, giving them a diff of where their username was added in an ineffective ping attempt. Opt-out capable for both the mentioner and the mentioned. –xenotalk 10:07, 22 July 2015 (UTC)

the end of .an[edit]

On 31 July (that's next Friday!) the top-level domain .an (used for the Netherlands Antilles until their dissolution in 2010) will be terminated, which will result in a significant number of dead links. Most (but not all) domain owners moved to the corresponding Curaçao domain (.cw) while keeping the .an page intact. I'd like a bot to:

  • convert all .an links to .cw
  • if possible, check whether these .cw domains give a result (any result) and, if not, make a list for manual checking
  • if possible, check whether the .an page is an exact copy of the corresponding .cw page (and make a list of non-conforming links)

Would that be doable or am I making a very complicated request? L.tak (talk) 22:04, 22 July 2015 (UTC)
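The first bullet, at least, is a simple rewrite rule; a sketch of it is below. The result-checking passes in the other bullets would need live HTTP requests, which are left out here.

```python
import re

# Rewrites the TLD of .an hostnames inside http(s) URLs, leaving the rest of
# the URL (path, query) untouched: http://example.an/x -> http://example.cw/x
AN_HOST_RE = re.compile(r"(https?://[^\s/\]]+)\.an\b")

def an_to_cw(wikitext):
    """Convert .an external links in a page's wikitext to .cw."""
    return AN_HOST_RE.sub(r"\1.cw", wikitext)

print(an_to_cw("[http://www.example.an/page.html Example]"))
# [http://www.example.cw/page.html Example]
```

Note this assumes the path part of each URL is unchanged between the .an and .cw sites, which is exactly what the third bullet would verify.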

@L.tak: There are 157 (well 156, I fixed one) links listed here. Easy enough to fix the project ones manually? (note there are many dead links which will need an archival copy search.)
All the best: Rich Farmbrough, 21:25, 27 July 2015 (UTC).

Crashed links[edit]

Is there any way to automatically revive the links for all the Athletic Bilbao football players? The club's website keeps changing its configuration, so the links frequently break. Right now, counting current players (whose links have already been revived by a fellow WP user) and past players combined, we have 236 articles; it would be a pity to lose that link because it's quite comprehensive and in English (or rather, it has an English version available).

Attentively -- (talk) 20:29, 23 July 2015 (UTC)

Is there a simple mapping between the old URL and the new one? Maybe you could give an example that could be followed? If this happens regularly then a template may be the answer, so that the URL can be modified centrally and all articles using it changed appropriately (assuming that the player part remains the same between changes). Keith D (talk) 21:16, 23 July 2015 (UTC)

Yes sir, please see Eneko Bóveda before and after (here). -- (talk) 23:53, 23 July 2015 (UTC)

I'm not sure what you mean by "which now have already been revived by a fellow WP user". Please clarify whether any links currently need to be fixed. All links I examined were already working correctly before your post. Are you requesting something to protect against future changes to the URLs? That is impossible in general without knowing how the URLs might change, and different websites do all sorts of different things to their URLs. In the example change, "842" was in the old URL but not "eboveda", so there is no simple mapping like Keith D asked about. Creating a template like {{Athletic Bilbao profile}} with content like [{{{1|}}}/{{{2|}}}.html {{{title|Athletic Bilbao profile}}}] and currently calling it with two parameters like {{Athletic Bilbao profile|842|eboveda}} will create protection if a potential future URL can be derived from "842" and "eboveda". Then a single template edit could immediately fix all uses of the template. If only "842" had been in the URL before, like the example change, then a bot could have been coded to scan the 26 index pages at and discover which name goes with which number. PrimeHunter (talk) 15:17, 27 July 2015 (UTC)
I now see on searching at Special:LinkSearch that around 260 URLs need an update. I'm not a bot coder, but a bot could do it by scanning as suggested. I have created {{Athletic Bilbao profile}}. It's not documented yet, and I may add or modify named parameters later today, but for unnamed parameters, {{Athletic Bilbao profile|842|eboveda}} produces:
PrimeHunter (talk) 15:53, 27 July 2015 (UTC)
Pictogram voting wait.svg Doing... I will try to do this with a large set of AWB find and replace rules instead of a bot. PrimeHunter (talk) 14:32, 28 July 2015 (UTC)
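The index-page scan described above could look roughly like this. The href pattern is purely illustrative; the real markup of the club's index pages would dictate the actual regex.

```python
import re

# Purely illustrative href pattern: assume profile links on an index page
# look like href=".../842/eboveda.html" (numeric id, then player slug).
PROFILE_RE = re.compile(r'href="[^"]*/(\d+)/([a-z]+)\.html"')

def profile_map(html):
    """Map numeric profile ids to player slugs found on one index page."""
    return {num: slug for num, slug in PROFILE_RE.findall(html)}

sample = '<a href="/players/842/eboveda.html">Eneko Bóveda</a>'
print(profile_map(sample))  # {'842': 'eboveda'}
```

With such a map built from all the index pages, a bot (or an AWB rule set generated from it) could rewrite each old URL to its new form.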

Many thanks for your help, and sorry for the delay; I have only now read the follow-up to my query. By "which now have already been revived by a fellow WP user", I meant that the link had been fixed only for the current players of this club (please see the Athletic Bilbao WP article, then "Current squad", to see who they are), but for the old players (dead, retired, or with other teams) it was not working. My intention was never to seek protection against future changes to the URLs; that's not feasible of course, and this link in particular has changed configuration several times. We'll just have to reconfigure as they do.

Cheers again -- (talk) 18:31, 29 July 2015 (UTC)

Thanks for the clarification. I have fixed all mainspace links starting with In a single case I couldn't find a new link and used the Internet Archive.[4] I have used {{Athletic Bilbao profile}} on around 200 articles (including some where the URL was already fixed) so hopefully the fixes will be easier next time. List of Athletic Bilbao players had a lot of formatted references so I only changed URLs there.[5] There are still many articles with broken URLs with other paths at I expect to look at that later today. PrimeHunter (talk) 19:55, 29 July 2015 (UTC)

Scan colors used by Infobox television season[edit]

Can someone have a bot scan all the (approx. 3,650) transclusions of Template:Infobox television season and provide a list of the values used with |bgcolour=, |bgcolor=, |headercolour= or |headercolor=? You can put the result in a subpage of my userspace, and I will process it to compute colour contrast ratios. Thank you. Frietjes (talk) 14:10, 27 July 2015 (UTC)
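A sketch of the per-article scan, assuming the bot already has each transclusion's wikitext in hand; the regex covers the four parameter spellings listed above, and the tally is the kind of list requested.

```python
import re
from collections import Counter

# The four colour parameter spellings used by {{Infobox television season}}.
COLOR_PARAMS = r"bgcolou?r|headercolou?r"
PARAM_RE = re.compile(r"\|\s*(?:%s)\s*=\s*([^|}\n]*)" % COLOR_PARAMS)

def colour_values(wikitext):
    """Return the non-empty colour values set in one transclusion's wikitext."""
    return [v.strip() for v in PARAM_RE.findall(wikitext) if v.strip()]

tally = Counter()
sample = "{{Infobox television season\n| bgcolour = #81A6E3\n| headercolor=red\n}}"
tally.update(colour_values(sample))
print(tally)  # Counter({'#81A6E3': 1, 'red': 1})
```

A real run would also have to resolve templates and magic words used as values, which a plain regex cannot evaluate.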

The following redirects exist:
  1. {{Infobox Smallville season}}
  2. {{Infobox television Project Runway}}
  3. {{Infobox reality dance competition}}
  4. {{Infobox Strictly Come Dancing Series}}
  5. {{Infobox Dancing with the Stars season}}
  6. {{Infobox Dancing on Ice series}}
  7. {{Infobox Television season}}
  8. {{Infobox tvseason}}
  9. {{Infobox Television Season}}
  10. {{Infobox dancingwiththestarsseason}}
  11. {{Infobox dancing with the stars}}
  12. {{Infobox television series}}
  13. {{Infobox Television Top Chef}}
  14. {{Infobox Television Project Runway}}
  15. {{Infobox TV series}}
All the best: Rich Farmbrough, 21:18, 27 July 2015 (UTC).
I'm curious as to the point of this, when we already have a tracking category for the ~1,000 instances with inaccessible colour combinations. Several threads on Template_talk:Infobox television season refer. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:26, 27 July 2015 (UTC)
Andy: Frietjes wants the actual values, not only that the values are not compliant. To that end, that means that we actually only need to scan the uses of infobox TV season and its redirects which are also in the category that you provided. --Izno (talk) 21:30, 27 July 2015 (UTC)
What are we going to use the values for? Alakzi (talk) 21:33, 27 July 2015 (UTC)
Besides the stated "compute a contrast ratio"? --Izno (talk) 21:36, 27 July 2015 (UTC)
I mean, to what end? It looks like this was requested by Dirtlawyer1, but the purpose is unclear to me. Alakzi (talk) 21:45, 27 July 2015 (UTC)
Thank you. My question was about the point of the request, not its content, which is clearly stated. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 09:47, 28 July 2015 (UTC)
Not being able to read either's mind, I would hazard a guess that the purpose is (one of the two):
  • to get a list so that there's an understanding of the offending color combinations
  • to modify each value in the list so that a bot can update the entire set of offending values at the same time to satisfy the consensus.
Supposition only, of course. --Izno (talk) 13:09, 28 July 2015 (UTC)
The "offending color combinations" are in the tracking category I mentioned above, so if, as you speculate, that is its purpose, this request is unnecessary. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 23:28, 29 July 2015 (UTC)
Let me rephrase bullet one, because bullet two should be obvious and is alone enough to sustain this request (where, by the way, your opinion on whether the request is necessary is irrelevant, given that consensus is not needed for a simple request for information): to get a list of the offending color combinations so that there's an understanding of the offending color combinations. That this was the intent of bullet one should have been obvious, but since it apparently was not, I hope this second go at it is. --Izno (talk) 23:44, 29 July 2015 (UTC)
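For reference, the colour contrast ratio mentioned at the top of this thread is defined in WCAG 2.0 in terms of relative luminance; a minimal implementation:

```python
def _linearize(c8):
    """sRGB channel (0-255) to its linear value, per the WCAG 2.0 formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (r, g, b) colour with 0-255 channels."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio between two colours, from 1:1 up to 21:1."""
    l1, l2 = sorted((relative_luminance(rgb1), relative_luminance(rgb2)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # 21.0
```

WCAG 2.0 AA requires a ratio of at least 4.5:1 for normal text, which is presumably the threshold behind the tracking category discussed above.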

Using Infobox journal's language field to populate Category:Academic journals by language sub-categories[edit]

Could a bot please inspect the values entered in the "language" field of {{Infobox journal}}, and then populate the individual sub-categories of Category:Academic journals by language (per WP:JWG)? Thanks! Fgnievinski (talk) 02:14, 30 July 2015 (UTC)