Wikipedia:Bot requests: Difference between revisions
→Help with BAFTA articles: new section
I don't quite know what bots are capable of, but I'm hoping it's possible to fix this. I imagine the best way would be for a bot to search for any articles with: <nowiki>[[BAFTA Award for Best Film|BAFTA Award for Best British Film]], [[BAFTA Award for Best Film|Best British Film]], [[BAFTA Award for Best Film|BAFTA Award for Best Film Not in the English Language]], [[BAFTA Award for Best Film|Best Film Not in the English Language]], [[BAFTA Award for Best Film|BAFTA Award for Best Foreign Film]], [[BAFTA Award for Best Film|Best Foreign Film]].</nowiki> And then hopefully it could fix them by removing the piping? If it's at all possible that would be '''great''' because doing it manually will take ages. --'''[[User:Loeba|<font color="#CC0066">Loeba</font>]] [[User talk:Loeba|(talk)]]''' 11:24, 10 December 2015 (UTC)
[[WP:AWB]] baby [[Special:Contributions/166.170.47.209|166.170.47.209]] ([[User talk:166.170.47.209|talk]]) 16:30, 10 December 2015 (UTC)
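For illustration, the unpiping requested above is a plain string substitution; a minimal Python sketch (the function name is invented for this example, and a real run would go through an approved framework such as AWB or pywikibot):

```python
# The six display texts listed in the request; each is assumed to be a
# valid title or redirect in its own right, so the pipe can simply go.
MISPIPED = [
    "BAFTA Award for Best British Film",
    "Best British Film",
    "BAFTA Award for Best Film Not in the English Language",
    "Best Film Not in the English Language",
    "BAFTA Award for Best Foreign Film",
    "Best Foreign Film",
]

def unpipe_bafta(text: str) -> str:
    """Turn [[BAFTA Award for Best Film|<variant>]] into [[<variant>]]."""
    for label in MISPIPED:
        text = text.replace(
            "[[BAFTA Award for Best Film|" + label + "]]",
            "[[" + label + "]]",
        )
    return text
```

Links piped to any other display text, and the unpiped [[BAFTA Award for Best Film]] itself, are left untouched.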
Revision as of 16:30, 10 December 2015
This page has a backlog that requires the attention of willing editors. Please remove this notice when the backlog is cleared.
Commonly Requested Bots
This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).
You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.
Before making a request, please see the list of frequently denied bots: requests that are either too complicated to program or that lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).
- Alternatives to bot requests
- WP:AWBREQ, for simple tasks that involve a handful of articles and/or only need to be done once (e.g. adding a category to a few articles).
- WP:URLREQ, for tasks involving changing or updating URLs to prevent link rot (specialized bots deal with this).
- WP:SQLREQ, for tasks which might be solved with an SQL query (e.g. compiling a list of articles according to certain criteria).
- WP:TEMPREQ, to request a new template written in wiki code or Lua.
- WP:SCRIPTREQ, to request a new user script. Many useful scripts already exist, see Wikipedia:User scripts/List.
- WP:CITEBOTREQ, to request a new feature for WP:Citation bot, a user-initiated bot that fixes citations.
Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did so with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).
Legend
Manual settings: when exceptions occur, please check the setting first.
Bot-related archives (v·t·e)
You may want to increment {{Archive basics}} to |counter= 67, as Wikipedia:Bot requests/Archive 66 is larger than the recommended 150 kB.
Redirects to lists, from the things they are lists of
Please could someone do this:
- For every article titled "List of foo"
- if the article called "Foo" exists; do nothing
- otherwise, create "Foo" as a redirect to "List of foo"
For example, I just created Birds of Tunisia as a redirect to List of birds of Tunisia.
This might usefully be added to a list of monthly cleanup tasks, for new "List of..." articles. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:06, 6 May 2015 (UTC)
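The title transformation behind this request is simple; here is a sketch of it in Python (a hypothetical helper assuming the usual first-letter capitalization of article titles; the existence check and the actual #REDIRECT save would be done through pywikibot or AWB):

```python
from typing import Optional

def redirect_source(list_title: str) -> Optional[str]:
    """For 'List of birds of Tunisia', return 'Birds of Tunisia' -- the
    title that should redirect to the list. None if not a 'List of' title."""
    prefix = "List of "
    if not list_title.startswith(prefix):
        return None
    rest = list_title[len(prefix):]
    # Article titles start with a capital letter.
    return rest[:1].upper() + rest[1:]
```

A bot would then create redirect_source(t) as #REDIRECT [[t]] only when that page does not already exist.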
- Doing... - Though I have messaged WikiProject Lists to check consensus first. Jamesmcmahon0 (talk) 12:54, 7 May 2015 (UTC)
- Thank you. Please see also #Century-item redirects, below. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:36, 9 May 2015 (UTC)
- BRFA filed - Wikipedia:Bots/Requests for approval/MoohanBOT 8 It is just for this task as I had already generated the list of pages needed and there seems to be no opposition to it. I will have a look at #Century-item redirects in a few days but feel free to jump ahead GoingBatty as that one may be outside of my regex expertise... Jamesmcmahon0 (talk) 11:05, 10 May 2015 (UTC)
It appears that User:Jamesmcmahon0 has dropped this. Can anyone else help, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:29, 5 July 2015 (UTC)
- I see User:Jamesmcmahon0 has been editing again... Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:30, 3 August 2015 (UTC)
- The BRFA expired, this task is now open for grabs again.—cyberpowerChat:Online 20:20, 7 September 2015 (UTC)
- This is a two minute job. All the best: Rich Farmbrough, 03:12, 16 October 2015 (UTC).
- Coding... Should get this done soon. PhilrocMy contribs 20:10, 20 October 2015 (UTC)
- BRFA filed All done! PhilrocMy contribs 23:01, 21 October 2015 (UTC)
The BRFA has been denied for reasons unrelated to the task itself. Another editor is welcome to take this on. @Jamesmcmahon0: Are you willing to reopen the old BRFA? — Earwig talk 23:15, 22 October 2015 (UTC)
- @Jamesmcmahon0: And I have refiled the BRFA. PhilrocMy contribs 11:30, 23 October 2015 (UTC)
- Denied again... — Earwig talk 01:43, 2 November 2015 (UTC)
- Wanna just make a list of the pages and let someone else do the redirects by hand? A search for red-links on a page and creating those as redirects is simple enough for AWB to handle, even if it's a lot of pages. -- Ricky81682 (talk) 02:36, 2 November 2015 (UTC)
- This is fine. We can start by working further on the list from Wikipedia:Bots/Requests for approval/MoohanBOT 8. — Earwig talk 02:39, 2 November 2015 (UTC)
Accidental template protection
Occasionally, I've noticed that an article has been mistakenly template-protected. Perhaps a bot could monitor the protection log, and if a page in a namespace other than Template, Module, User, or Wikipedia is template-protected, deliver a "did you mean to do this" message to the protecting admin, i.e.
- Hello <administrator name>. On <date> you template-protected [[<page>]] ([https://en.wikipedia.org/w/index.php?title=Special%3ALog&type=protect&page=<page (url-encoded)> log]). As template protection is only meant to be used for templates, or other highly transcluded pages, did you perhaps mean to select a different level? Thanks, <bot signature>
or similar. - Evad37 [talk] 03:54, 17 June 2015 (UTC)
- Unsure if necessary - couldn't you just add something to said template? (Just lurking WP:BOTREQ to see what sort of things people want bots to do). E. Lee (talk) 04:59, 17 June 2015 (UTC)
- @Elee: Which template are you talking about? And how does adding something to a template fix a wrongly applied protection level? - Evad37 [talk] 05:50, 17 June 2015 (UTC)
- For an example of the problem and proposed solution (letting admins know that they may have made a mistake so they can fix it), see User talk:Ponyo#Maithali protection level, or User talk:Black Kite#Farshad Fotouhi protection level - Evad37 [talk] 05:54, 17 June 2015 (UTC)
- There are padlock templates added to protected pages. These "sense" if they are incompatible with the protection actually used, I believe, and put the page in a category to be fixed.
- Arguably there is something that could be done along these lines.
- A list of template protected articles can be found here (currently empty). A bot could check this, and act upon it. All the best: Rich Farmbrough, 21:43, 27 July 2015 (UTC).
- Such a bot should also check the queue for move protection. ~ RobTalk 14:11, 9 September 2015 (UTC)
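The per-log-entry decision such a bot has to make is tiny; a sketch (the namespace list comes from the request above; "templateeditor" is assumed as the protection-level name recorded in the log):

```python
# MediaWiki namespace IDs for the namespaces where template protection
# is plausible, per the request above.
EXPECTED_NS = {2: "User", 4: "Wikipedia", 10: "Template", 828: "Module"}

def needs_query(namespace_id: int, protection_level: str) -> bool:
    """True if a protection-log entry looks like accidental template
    protection, so the admin should get a 'did you mean to?' note."""
    return protection_level == "templateeditor" and namespace_id not in EXPECTED_NS
```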
Articles with {{Infobox Journal}} seek bot to ensure redirects are in place
As discussed at Wikipedia_talk:WikiProject_Academic_Journals#Bot_task?, there are several fairly standard redirects needed for each article in this project using that infobox. The box has parameters for the journal title and its ISO abbreviation. Citations routinely vary the capitalization, abbreviations, and punctuation of these abbreviations, creating a need for redirects from each common variation to the actual article title (usually the same as the journal title, in sentence case). Is there a bot that might be suited to the task? LeadSongDog come howl! 01:23, 18 June 2015 (UTC)
- I've obtained the ISO 4 vocabulary to convert, e.g., "European Physical Journal" to "Eur. Phys. J."; it's a spreadsheet-format version of the PDF available at issn.org. Could you bot-wizards please tell us if such a conversion would be simply too complicated? Thanks! Fgnievinski (talk) 02:30, 30 July 2015 (UTC)
- Maybe an easier and useful thing to do instead would be to start from Infobox journal's title field (e.g., "European Physical Journal") and its manually-entered abbreviation field ("Eur. Phys. J."), and create the desired redirects: e.g., "European physical journal", "Eur. Phys. J.", "Eur Phys J", "eur phys j", "E. P. J.", "E.P.J.", "E P J", "EPJ". Fgnievinski (talk) 02:45, 30 July 2015 (UTC)
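Generating the punctuation and capitalization variants is straightforward; a sketch covering the dotless and lower-case forms (initialisms like "E.P.J." would need extra logic, so they are left out here):

```python
def abbrev_variants(abbrev: str) -> set:
    """Common variants of an ISO 4 abbreviation to create as redirects,
    e.g. 'Eur. Phys. J.' -> {'Eur. Phys. J.', 'Eur Phys J', 'eur phys j'}."""
    no_dots = abbrev.replace(".", "")
    return {abbrev, no_dots, no_dots.lower()}
```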
Replacement of Template:Infobox Country World Championships in Athletics
Hello. Could I hire a bot to substitute all transclusions of {{Infobox Country World Championships in Athletics}}, per the outcome of this TfD? Alakzi (talk) 13:12, 20 June 2015 (UTC)
- Same with {{Infobox China station}} and {{Infobox Japan station}}, but using the sandbox version. Alakzi (talk) 17:34, 20 June 2015 (UTC)
- {{Infobox Country World Championships in Athletics}} done - thanks Plastikspork. Alakzi (talk) 16:00, 25 June 2015 (UTC)
- @Alakzi: is this still pending or done? Mdann52 (talk) 18:18, 28 August 2015 (UTC)
- China and Japan station are pending. Alakzi (talk) 18:20, 28 August 2015 (UTC)
- Hi. I'm working on eliminating the backlog of requests, one request at a time. Unless, someone else takes this one, I will hopefully get to it soon.—cyberpowerChat:Online 00:37, 29 August 2015 (UTC)
- @Cyberpower678: Mind if I steal this one, if you haven't started on it yet? I'm working on clearing WP:TFD/H and there's a handful of templates that can be handled in one BRFA, including the two remaining here. ~ RobTalk 16:06, 3 September 2015 (UTC)
- Sure. As long as I haven't plastered a doing or coding template, I haven't taken it yet.—cyberpowerChat:Limited Access 16:10, 3 September 2015 (UTC)
- Doing... Thanks. ~ RobTalk 16:15, 3 September 2015 (UTC)
Updating US Census Estimates
Is there a bot available that could add the current United States Census Bureau population estimates? (Unfortunately, I wouldn't trust OCR for a lot of the older Census files, because I often have to look carefully/zoom in myself to tell 3 from 8 or 6 from 0.) It should be a fairly straightforward task. The Census updates can be found at census.gov/popest. I am in the process of adding data manually (mostly, I am using an AWK script on my computer to format data from a spreadsheet for copy/paste into Wikipedia), and for that I'm okay, since it gives me a chance to do spot edits on those pages as well and allows me to try to make sure that adding the USCensusPop widget doesn't completely screw up formatting of the page, but it's not something I could do every year.
Specifically, it could check to see if a page for a place has a Template:USCensusPop, and then if so, just update. Very simple. I'd write it myself, but it would be nicer if somebody either has code I can reuse or if they could do it all themselves. Thanks. DemocraticLuntz (talk) 23:30, 21 June 2015 (UTC)
- Sounds like a great idea.--BabbaQ (talk) 23:47, 26 September 2015 (UTC)
Removal of {{Start date}} from {{Singles}} template
It has become common practice in album articles to use {{Start date}} in the {{Singles}} add-on to {{Infobox album}}. Per Template:Start date/doc: "The purpose of the {{start date}} template is to return the date (or date-time) that an event or entity started or was created. It also includes a duplicate, machine-readable date (or date-time) in the ISO date format (which is hidden by CSS), for use inside other templates (or table rows) which emit microformats. It should only be used once in each such template and should not be used outside such templates." i.e. {{Start date}} should only be used in album articles for the album release date, not single release dates. It would be nice to have a bot to clean this up, as this error is currently in who knows how many articles. Chase (talk | contributions) 16:44, 5 July 2015 (UTC)
- While we're at it, the bot that would do this should also remove {{Start date}} from AltDate in {{Episode list}}. nyuszika7h (talk) 19:32, 5 September 2015 (UTC)
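The substitution itself is mechanical; a sketch (illustrative only: a real bot must limit this to the {{Singles}} and AltDate contexts, whereas this function rewrites every {{Start date}} it sees, and extra parameters such as |df=y are simply swallowed):

```python
import re
from datetime import date

START_DATE = re.compile(r"\{\{[Ss]tart date\|(\d{4})\|(\d{1,2})\|(\d{1,2})[^}]*\}\}")

def strip_start_date(text: str) -> str:
    """Replace {{Start date|Y|M|D}} with a plain 'Month D, YYYY' date."""
    def repl(m):
        y, mo, d = (int(g) for g in m.groups())
        return date(y, mo, d).strftime("%B") + " " + str(d) + ", " + str(y)
    return START_DATE.sub(repl, text)
```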
DOI bot
Given a reference in one of the forms
"<ref>doi:10.[four digits]/*</ref>", "<ref>http://www.doi.org/10.[four digits]/*</ref>", or "<ref>www.doi.org/10.[four digits]/*</ref>",
the bot should insert the full reference into the article page and into Wikidata. It might be extended to add data to existing references that are, say, missing the date of publication.
See:
HLHJ (talk) 12:28, 8 July 2015 (UTC)
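For what it's worth, the three quoted forms can be matched with one regular expression; a sketch (the \d{4,9} registrant range is an assumption generalizing the request's "four digits"):

```python
import re

# Matches the three bare-DOI reference forms quoted in the request.
BARE_DOI_REF = re.compile(
    r"<ref>\s*(?:doi:|(?:https?://)?(?:www\.)?doi\.org/)"
    r"(10\.\d{4,9}/\S+?)\s*</ref>",
    re.IGNORECASE,
)

def extract_bare_dois(text: str):
    """Return DOIs cited as bare <ref>doi:...</ref> references, for a
    bot to expand into full citations."""
    return BARE_DOI_REF.findall(text)
```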
- See User talk:Citation bot/Archive1#Replacement citation bot? and the immediately preceding section. --Izno (talk) 13:09, 8 July 2015 (UTC)
- That discussion does not appear to be leading to getting a bot to start working on the Cite Doi templates. Abductive (reasoning) 19:13, 8 July 2015 (UTC)
- The bot in question preempts the need for doing so (were it turned on). Inserting {{cite journal|doi=value}} and then the bot fills in the other data is what the bot does (or did with {{cite doi}}). --Izno (talk) 19:48, 8 July 2015 (UTC)
- As for Wikidata, I'm not sure of your intentions, so you will need to clarify. Regardless, that bot would need to be approved at Wikidata, not here. --Izno (talk) 19:49, 8 July 2015 (UTC)
- That sounds good, and would do half my request. I hope it's back soon.
- Apologies for the lack of clarity. Wikidata has a data format for journal sources, but there is currently no way to create items from citation templates. See this discussion. There are tools for doing it from a DOI; see the tools section here. It seemed to me that co-ordination between bots working on both might be helpful at avoiding duplicates, etc., but I take your point that separate bots might be easier. HLHJ (talk) 14:40, 16 July 2015 (UTC)
- There is consensus at WPMED to replace cite DOI with cite journal on medical articles. Doc James (talk · contribs · email) 15:35, 16 July 2015 (UTC)
- That discussion does not appear to be leading to getting a bot to start working on the Cite Doi templates. Abductive (reasoning) 19:13, 8 July 2015 (UTC)
For the record, we have a user tool that can be used to derive {{Cite journal}} from DOIs. Having a bot that can autoexpand DOIs to full citations would be useful. Maybe one could reuse the {{Cite doi}} template for it; the bot would convert it to a {{Cite journal}}. Jo-Jo Eumerus (talk, contributions) 19:01, 29 August 2015 (UTC)
- FYI, consensus was reached to deprecate the {{cite doi}} templates.[1] Citation bot will no longer be creating those templates or inserting references to them into articles. Thus I don't think there's anything blocking the original request here. Kaldari (talk) 21:21, 9 December 2015 (UTC)
Redirects to academic journals may lack WPJournals template (class=redirect) in their talk pages
Would it be possible to check, for each page with Template:WikiProject Academic Journals in its talk page, if its redirects also have Template:WikiProject Academic Journals (class=redirect) in their respective talk pages? Thanks! Fgnievinski (talk) 20:52, 18 July 2015 (UTC)
- Idea is not well explained.—cyberpowerChat:Online 20:14, 27 August 2015 (UTC)
- @C678: sorry, here's an example: Proceedings of the National Academy of Sciences of the United States of America has several redirects [2], some of which are correctly tagged with {{WPJournals|class=redirect}} in their respective talk pages (e.g., Talk:PNAS), while others are either blank or are a redirect to the target's talk page (e.g., Talk:Proc Nat Acad Sci). fgnievinski (talk) 22:47, 28 August 2015 (UTC)
- Maybe it's better to move the page to the redirect pages than just create a new page? --Kanashimi (talk) 13:05, 1 October 2015 (UTC)
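The per-redirect decision can be sketched like this (hypothetical helper; it treats a blank talk page and a talk page that is itself a redirect the same way, matching the examples above):

```python
import re

WPJ = re.compile(r"\{\{\s*(?:WPJournals|WikiProject Academic Journals)\b",
                 re.IGNORECASE)

def talk_page_fix(talk_text: str):
    """Return the text a journal redirect's talk page should get, or None
    if it is already tagged with the project banner."""
    if WPJ.search(talk_text):
        return None
    stripped = talk_text.strip()
    if not stripped or stripped.lower().startswith("#redirect"):
        return "{{WPJournals|class=redirect}}"
    return talk_text.rstrip() + "\n{{WPJournals|class=redirect}}"
```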
Using Infobox journal's language field to populate Category:Academic journals by language sub-categories
Could a bot please inspect values entered in field "language" of Infobox journal? Then possibly populate individual sub-categories of Category:Academic journals by language (as per WP:JWG). Thanks! Fgnievinski (talk) 02:14, 30 July 2015 (UTC)
- @Fgnievinski: do you mean check they are valid, or add individual articles about journals to the category? Mdann52 (talk) 16:19, 28 August 2015 (UTC)
- @Mdann52: the latter, please; thanks for looking into this. fgnievinski (talk) 19:03, 28 August 2015 (UTC)
- @Fgnievinski: in that case, this is beyond my capability, I will leave this for someone else to take a look at. Mdann52 (talk) 19:07, 28 August 2015 (UTC)
- @Mdann52: a listing of inconsistencies between category membership and infobox language field would be a great start; then one could manually fix as appropriate. fgnievinski (talk) 19:15, 28 August 2015 (UTC)
- Actually, even a reverse listing of transclusions by language field value would be helpful (e.g., English, French, etc.); it doesn't even need to break up sub-values (e.g., "English", "French", and eventual "English and French" would be fine). fgnievinski (talk) 00:12, 10 September 2015 (UTC)
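The "reverse listing" asked for above amounts to a tally of raw field values; a sketch (compound values such as "English and French" are deliberately kept intact, as requested):

```python
import re
from collections import Counter

LANG_FIELD = re.compile(r"\|\s*language\s*=\s*([^|\n}]+)")

def tally_languages(infobox_texts) -> Counter:
    """Count the raw |language= values across Infobox journal wikitexts."""
    counts = Counter()
    for text in infobox_texts:
        m = LANG_FIELD.search(text)
        if m:
            counts[m.group(1).strip()] += 1
    return counts
```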
Container category diffusion
Category:Container categories, by the current definition of the notice box, allows only subcategories, no other pages. If possible, I think it would help the maintenance process if a bot could check container categories for pages and, if any are found, check whether they are already categorized in a subcategory of the container category being checked. If they are, remove them from the container category, referencing the subcategory and WP:SUBCAT in the edit summary. --Slivicon (talk) 14:57, 3 August 2015 (UTC)
I also think such a bot is required -- Pankaj Jain Capankajsmilyo 13:27, 1 September 2015 (UTC)
Cleanup of "naked" Google Books?
Right now there are approximately 2500 pages with "naked" Google Books entries (defined as containing the string >http://books.google.com/ ). I asked on Wikipedia talk:AutoWikiBrowser whether there was any way to combine AWB and the Wikipedia citation tool for Google Books at http://reftag.appspot.com/ to help in cleaning these up, and got a response to ask here. Would this be appropriate for a bot? It *may* also be appropriate to include <ref>[http://books.google.com/... text]</ref> cleanup as well, but that would be a later request if the first makes sense. Naraht (talk) 17:05, 24 August 2015 (UTC)
- It would have to distinguish between reference and non-reference links, at any rate. IMO, changing the links in this way would be a net improvement to the wiki. Jo-Jo Eumerus (talk, contributions) 17:20, 24 August 2015 (UTC)
- OP: Absolutely. And given the small (but not non-existent) crossover between those users who would use named refs and those who would put a "naked" Google Book in a ref, I would be *quite* happy to limit this at the start to something like the regexp <ref>http://books.google.com/[^ <]</ref> Naraht (talk) 18:07, 24 August 2015 (UTC)
- @Naraht: If no one else pops up in the meantime, I'll take this on once my current BRFA is concluded, and I have sufficient time to sort this all out. Mdann52 (talk) 08:41, 25 August 2015 (UTC)
If you're going down this line (and it sounds worthwhile) I strongly recommend you set the bot to run slowly, so you give people a chance to notice and then feed back on any errors before they're reproduced on multiple pages that may not be actively Watchlisted. --Dweller (talk) 08:49, 25 August 2015 (UTC)
- @Naraht: Creating an appropriate cite tag requires human intelligence and eyeballs to correct mistakes. This is a big no-no for bots. I see a case for a bot (or report) that lists all pages that have at least one naked Books reference and allows people to work the report/backlog by sorting the cite tag out. Hasteur (talk) 12:34, 25 August 2015 (UTC)
- Just to note that if I do it, it will be a manual (or at best supervised) run, and won't be ongoing. If you want one that runs constantly, I'm not the person to write that. Mdann52 (talk) 15:02, 25 August 2015 (UTC)
- Support - I support a bot to be made that fixes these issues.--BabbaQ (talk) 23:41, 26 September 2015 (UTC)
Update the lists at WikiProject Fix common mistakes
We really need a bot that updates the lists at Wikipedia:WikiProject Fix common mistakes#Log. Right now they are being done manually, which is a very tedious process. Some of the entries have not been updated since November of 2014 and there are a bunch of errors that we haven't added because we can't keep up with the ones we list now.
Note: This was brought up before at Wikipedia:Bot requests/Archive 63#Bot to updated lists at WikiProject Fix common mistakes, where it was marked as resolved and archived, despite the fact that we are still doing this by hand. --Guy Macon (talk) 18:40, 29 August 2015 (UTC)
Monitor and circulate old unanswered questions
Talkpages have disadvantages, not least that many are unwatched so posting a query on them doesn't always get a response from someone who knows about the subject. But we could greatly reduce this with a bot.
If someone posts on an unwatched talkpage we risk having their query linger unnoticed. Would it be possible to have a bot run lists of open talkpage queries to relevant wikiprojects? With a special list for "talkpage queries on pages not tagged for any wikiproject". I'm sure we could get volunteers to go through the default list and either answer queries or tag those pages for relevant wikiProjects, so as hopefully to bring the query to the attention of someone who could answer it.
It would need to ignore threads marked {{done}}, and ideally the reports to each WikiProject should be colour-coded and date-sequenced, so you could differentiate between queries or discussions that more than one editor had participated in and talkpage sections where only one editor had posted. For talkpages tagged to multiple WikiProjects, it would probably help if we also had a template marking a section as of interest or otherwise to particular WikiProjects. Someone from WikiProject Mountaineering could then go through the relevant talkpage threads, answer those they could, and tag those that were about glaciation, vulcanology or botany, so that the bot would know that, while the article on that mountain was tagged to several WikiProjects including mountaineering, that particular thread with a question about the most recent eruption was for WikiProject Vulcanology. ϢereSpielChequers 11:12, 7 September 2015 (UTC)
- It's been my understanding that this is what the project tags accomplish on talk pages. You can click the links on those to find a more general audience for queries. ~ RobTalk 19:41, 7 September 2015 (UTC)
- Yes, but this would work the other way round, so anyone visiting a wikiProject page could easily see a list of open questions that are likely to be of interest to their WikiProject. The reason being that newbies post questions on talkpages and sometimes they linger till stale. ϢereSpielChequers 16:20, 8 September 2015 (UTC)
Replace stubs category with stub template
As of the September 2015 dump, there are over 2700 articles with a stub category (e.g. \[\[Category:[\w\s]+stubs\]\]). Could someone create a bot that would replace the stub category with the appropriate stub template? (If the stub template already exists on the article, just delete the stub category.)
For example, Antikristos contains Category:Folk dance stubs. The Category:Folk dance stubs page contains {{Stub Category|article=[[folk dance]]|newstub=folk-dance-stub|category=Folk dance}}. Therefore, the bot would:
- Delete the stub category
- Look at the value in |newstub=
- If the article does not contain {{folk-dance-stub}}, add {{folk-dance-stub}} at the bottom of the article
Thanks! GoingBatty (talk) 21:09, 7 September 2015 (UTC)
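The three steps reduce to a small text transformation once |newstub= and |category= have been read from the stub category page; a sketch (hypothetical function; appending the stub tag at the very bottom of the page is a simplification):

```python
import re

def replace_stub_category(article_text: str, category: str, newstub: str) -> str:
    """Remove [[Category:<category>]] and ensure {{<newstub>}} is present."""
    text = re.sub(r"\[\[Category:" + re.escape(category) + r"\]\]\n?",
                  "", article_text)
    stub = "{{" + newstub + "}}"
    if stub.lower() not in text.lower():
        text = text.rstrip() + "\n\n" + stub + "\n"
    return text
```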
- Support -I support this idea.--BabbaQ (talk) 23:42, 26 September 2015 (UTC)
- Anyone interested in taking this on? GoingBatty (talk) 17:36, 3 October 2015 (UTC)
- A BRFA has been filed here; users are invited to comment. — Earwig talk 19:36, 8 October 2015 (UTC)
Mexican digital television stations
On September 24, 2015, some Mexican TV stations will become all-digital. This implies a change in callsigns, but most Mexican TV station links are redirects, not articles.
As such, some redirects need to be moved to new locations and references to them changed out. For instance, XHBAB-TV must become XHBAB-TDT. A bot to make these moves would be very helpful, and the code will be vital when more than 600 stations do this on December 31. There are enough references that all of them can be changed and the old -TV suffixes can be removed.
Note that some stations have their own articles, and those will be manually moved and updated.
The stations that are redirects and to be moved are:
- XHAFC
- XHBAB
- XHBTB
- XHGWT
- XHMOY
- XHOPMT
- XHGNB
- XHSIB
- XHSRB
- XHVEL
I will likely need to make one or two more requests, and then on December 31 we will need to have a massive blitz of some 600 of these, so having reusable code is a must for my sanity. Raymie (t • c) 21:37, 8 September 2015 (UTC)
- @Raymie: So just to make sure, you want all redirects with the prefixes listed above with the "TV" suffix to be changed to the same prefix with the "TDT" suffix? -24Talk 20:15, 17 September 2015 (UTC)
- @Negative24: That would be correct. The actual redirects need to be moved and the links to them need to be modified too. And I'll want to be able to do it again in December with 600+ of them. Raymie (t • c) 02:38, 18 September 2015 (UTC)
- @Raymie: Alright, I will see what I can do but this would be the first "real" task for User:Bot24 so it might be a tiny bit rough from the beginning. -24Talk 02:43, 18 September 2015 (UTC)
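Since the December blitz needs reusable code, the two mechanical pieces (retitling the redirect, and rewriting links to it) can be sketched as pure functions, with the actual page moves left to pywikibot or AWB:

```python
import re

# The ten redirect prefixes listed in the request above.
PREFIXES = ["XHAFC", "XHBAB", "XHBTB", "XHGWT", "XHMOY",
            "XHOPMT", "XHGNB", "XHSIB", "XHSRB", "XHVEL"]

def retitle(old_title: str):
    """Map 'XHBAB-TV' to 'XHBAB-TDT' for the listed stations, else None."""
    for p in PREFIXES:
        if old_title == p + "-TV":
            return p + "-TDT"
    return None

LINK = re.compile(r"\[\[(" + "|".join(PREFIXES) + r")-TV(\|[^\]]*)?\]\]")

def relink(text: str) -> str:
    """Rewrite wikilinks (piped or not) to the moved redirects."""
    return LINK.sub(
        lambda m: "[[" + m.group(1) + "-TDT" + (m.group(2) or "") + "]]",
        text)
```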
Texas Historical Commission atlas has changed information links
I'm not knowledgeable enough to know whether this is possible to correct with a bot, but there are a considerable number of articles affected by this. These atlas links have been used for NRHP citations, as well as other historical marker citations.
The home for the Texas Historical Commission atlas URL remains the same: http://atlas.thc.state.tx.us/
However, once you access information, those links have changed. Whatever is linked to the THC as a source in articles is now a dead link. I just made a recent change to an article. You can see by the diff how it's been changed. — Maile (talk) 22:25, 8 September 2015 (UTC)
- Copied from Wikipedia:Village pump (technical)/Archive 140#Texas Historical Commission atlas has changed information links:
- Special:LinkSearch finds 718 links to http://atlas.thc.state.tx.us. The count includes all namespaces and cases with multiple links on the same page. There are around 370 different articles. http://atlas.thc.state.tx.us currently says: "Welcome to the new Atlas! The original Atlas, now located at http://atlas1.thc.state.tx.us, will eventually be phased out in the coming weeks. Please begin transitioning your use to the new Atlas." The links I examined work if atlas is replaced by atlas1 but it sounds like this is temporary. It would be good to find and update to new atlas url's while the old content can be seen at atlas1 (not all url changes are of the same form). PrimeHunter (talk) 22:40, 8 September 2015 (UTC)
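As a stopgap while atlas1 is still up, the substitution is a single regex pass; a sketch (this only keeps citations alive temporarily; the durable fix is finding each record's new-Atlas URL, which is not a mechanical rewrite):

```python
import re

def stopgap_atlas_fix(text: str) -> str:
    """Point old Texas Historical Commission Atlas links at the temporary
    atlas1 host mentioned in the site notice."""
    return re.sub(r"http://atlas\.thc\.state\.tx\.us",
                  "http://atlas1.thc.state.tx.us", text)
```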
findarticles.com
Mark all links to findarticles.com as dead. The links are being redirected to another website. However, they are not marked as 404, or soft 404. That includes links to https://web.archive.org/web/$1/findarticles.com etc., which have been deleted retroactively from the archives. Examples:
- http://findarticles.com/p/articles/mi_m1058/is_1998_Nov_18/ai_53365282
- http://web.archive.org/web/20050922160253/http://www.findarticles.com/p/articles/mi_m1058/is_1998_Nov_18/ai_53365282
For more detailed reasoning, read this on my blog and FindArticles. (t) Josve05a (c) 08:25, 30 September 2015 (UTC)
- insource:findarticles.com: 16,609 mainspace matches. (t) Josve05a (c) 08:31, 30 September 2015 (UTC)
- Josve05a so that I understand the request correctly (and to help guide the answer), what you're looking for is: for every occurrence where the pattern findarticles.com appears inside a ref block (i.e. regex 'ref>.*?findarticles.com.*?</ref'), append a {{deadlink}} template (with the appropriate year/month for categorization) just inside the close of the reference tag. Is this correct? Hasteur (talk) 18:36, 30 September 2015 (UTC)
- Yes, unless one {{dead link}} already exists. (t) Josve05a (c) 19:12, 30 September 2015 (UTC)
- It doesn't just have to be in refs, but in all external links; if that's too complicated, then the refs are good enough. (t) Josve05a (c) 19:13, 30 September 2015 (UTC)
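Hasteur's proposed pass (append {{dead link}} just inside the closing ref tag, skipping refs already tagged) could be sketched as follows. This is an illustrative sketch, not the bot's actual code, and it covers only the ref case, not bare external links:

```python
import re
from datetime import date

# Opening ref tag (with optional attributes), body, closing tag.
REF = re.compile(r"(<ref[^>/]*>)(.*?)(</ref>)", re.DOTALL)

def tag_findarticles(wikitext: str) -> str:
    """Append {{dead link}} inside any <ref>...</ref> that links to
    findarticles.com and is not already tagged."""
    today = date.today()
    tag = "{{dead link|date=%s %d}}" % (today.strftime("%B"), today.year)

    def fix(m):
        body = m.group(2)
        if "findarticles.com" in body and "{{dead link" not in body.lower():
            return m.group(1) + body + tag + m.group(3)
        return m.group(0)

    return REF.sub(fix, wikitext)
```

The date parameter gives the categorization month Hasteur mentions, and the lowercase check honours Josve05a's "unless one already exists" condition.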
Transclusion of daily Copyright problems subpages onto the main WP:CP page
(Repeat of my posting of 27 February) Would some kind bot take this on? The subpage name is of the form Wikipedia:Copyright problems/2015 February 26; it needs to be added to Wikipedia:Copyright problems after seven days; i.e., the page for 19 February is added at midnight on 26 February. It's being done manually at the moment, would be good if it could be automated. Thanks, Justlettersandnumbers (talk) 20:06, 11 October 2015 (UTC)
- Also it'd be really good if a bot could be asked to create the daily listing subpages of the copyright problems page, such as Wikipedia:Copyright problems/2015 October 11. I'm doing them manually at the moment, but I'd rather do other things. This should be a no-brainer for a bot. Thanks, Justlettersandnumbers (talk) 20:06, 11 October 2015 (UTC)
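The two page-name calculations behind this request (create today's listing subpage, transclude the one from seven days ago) are simple date arithmetic; a sketch, with illustrative function names:

```python
from datetime import date, timedelta

BASE = "Wikipedia:Copyright problems"

def subpage_title(d: date) -> str:
    """Daily listing subpage, e.g. 'Wikipedia:Copyright problems/2015 October 11'."""
    return f"{BASE}/{d.year} {d.strftime('%B')} {d.day}"

def daily_tasks(today: date):
    """Return (page_to_create, page_to_transclude) for one midnight run:
    create today's blank listing and add the seven-day-old page to WP:CP."""
    return subpage_title(today), subpage_title(today - timedelta(days=7))
```

This matches the example in the request: at midnight on 26 February the bot would create the 26 February subpage and transclude the one for 19 February.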
BC births and deaths categorizations
RfC: BC births and deaths categorization scheme has just been closed on:
(option 5:) Return to earlier guideline-conforming scheme adding "rollup" categories by decade/century
Could we have bot-assistance on realising that? Pinging a few people that may be able to give some assistance:
- @Fayenatic london: may have some experience as to what can be handled (semi-)bot-wise at the end of categorisation discussions
- @Rick Block: seems to have some experience with the "roll-up" systems
- @Good Olfactory: commented in a prior discussion here
If I need to be more specific on possible tasks involved, please ask me. --Francis Schonken (talk) 17:18, 14 October 2015 (UTC)
- The "roll-up" on decade categories, as currently seen at Category:0s deaths, is simply done using <categorytree mode=pages>0s deaths</categorytree> on that page. The parameter in the middle of that string has to match the name of the page that it is on. There is a way to show an ordinary category tree using the PAGENAME parameter: {{#categorytree:{{PAGENAME}}}}. However, I do not know of a way to combine that with mode=pages. For more info see MW:Extension:CategoryTree. So AFAIK this "rollup" code will have to be added manually.
- The old categories will have to be undeleted by admins; I don't know a way to automate that. After undeletion, we would then list them at WP:CFDWR so that Cydebot would remove the CFD templates from them.
- I believe the member pages (biography articles) will also have to be reverted manually. The best that I can offer would be to provide links to the diffs made by Cydebot when emptying the old categories. – Fayenatic London 11:01, 15 October 2015 (UTC)
- @Armbrust:: I manually undeleted Category:1 BC deaths to Category:9 BC deaths. Would you be able to automate reversals of your bot's edits starting from [3]? See [4] for the instruction at CFDW for deaths from 1 to 599 BC. – Fayenatic London 21:56, 17 October 2015 (UTC)
- @Armbrust: I've manually reverted from the bottom of that page of contribs up to Curia (wife of Quintus Lucretius). Is it any trouble to you if we use rollback or undo on your bot's edits? – Fayenatic London 12:45, 19 October 2015 (UTC)
- I don't mind, although some articles were edited after the bot. Armbrust The Homunculus 19:51, 19 October 2015 (UTC)
- Thanks. I've now done up to Horace. – Fayenatic London 21:23, 19 October 2015 (UTC)
As the work cannot be processed by bot, I have listed the CFDs listing the births/deaths categories to be reinstated at WT:WikiProject Years#BC births and deaths categories. – Fayenatic London 13:50, 20 October 2015 (UTC)
- I subsequently moved the list and progress marker to Wikipedia talk:WikiProject Biography#BC births and deaths categories. – Fayenatic London 21:36, 26 October 2015 (UTC)
- Re. "As the work cannot be processed by bot" – says who? I think part of the tasks can be processed by bot. I'd prefer to keep the discussion here (various bot operators may pick up on tasks for which they see a possibility to automate it), with a possible exception to logging tasks performed at WT:WikiProject Years#BC births and deaths categories. --Francis Schonken (talk) 15:31, 20 October 2015 (UTC)
- @Fayenatic london: again, please discuss these issues here. --Francis Schonken (talk) 13:39, 26 October 2015 (UTC)
- Your confidence in bot-kind is touching. I agree that this task would be best handled by a bot, but I have never come across an existing bot written to do what is required here. Well, I suppose there is little harm in waiting longer; perhaps somebody may write a new bot for us. The main disadvantage of waiting is that subsequent edits to the biographies will mean that an increasing proportion of the bot edits cannot be reverted using Undo. – Fayenatic London 21:22, 26 October 2015 (UTC)
- Actually it could be done with AWB alone (replace year category with birthsyear cat and remove birthsdecade category), but compiling a list of affected articles is troublesome. Armbrust The Homunculus 08:29, 28 October 2015 (UTC)
- @Armbrust: I had thought about using Cat-a-lot to do that, but ruled that out, because a year category on a bio could be for births or for deaths. A human editor could tell which, by referring to the decade categories, but that would probably be too difficult to program into a bot. So yes, it could be done using AWB, but requiring manual intervention on each one before clicking Save. – Fayenatic London 13:45, 28 October 2015 (UTC)
- If you use the bot's contributions list compile the articles, than this shouldn't be a problem. Armbrust The Homunculus 19:12, 28 October 2015 (UTC)
- User:Francis Schonken: How long do you want to wait? Perhaps this bot request might be reactivated by posting separate requests under separate headings for the three tasks: posting "rollup" category trees on decade category pages; undeleting year category pages for births and deaths; and reverting selected contribs by ArmbrustBot on biography articles. – Fayenatic London 21:43, 15 November 2015 (UTC)
- "wait"? I didn't suggest waiting for anything. I'm only against splitting up the discussion, e.g. someone doing part of the reverts (bot-wise or not) and not logging them here, then someone else doing some reverts (bot-wise or not) and getting confused while not knowing what has been done etc... I'll make some subheaders to this thread (opposed as I am to separate threads not kept together). --Francis Schonken (talk) 03:36, 16 November 2015 (UTC)
Subthread 1 – undeletion of BC births and deaths categories
I'm not sure but from some comments I deduce this task has been done partially or completely – can someone give an overview whether this is done?
Have any BC births or deaths categories been undeleted that weren't populated before these categories were deleted? (I'd advise against that but have no clue where we are with that). Can someone give an update? --Francis Schonken (talk) 03:36, 16 November 2015 (UTC)
- I had undeleted deaths categories back to Category:89 BC deaths, and have just undeleted a lot of them again. I only undeleted those that were deleted in 2015; there are a few gaps which were not in use at the time of the 2015 CFDs.
- I have now added a temporary note to Template:DeathyrBC to discourage further re-deletions. The notice appears only on empty year-BC deaths categories.
- As the last batch of merges were on deaths categories, I have not systematically undeleted births categories yet, but only those which were repopulated by reverting two of the bot edits (death and birth year). – Fayenatic London 09:58, 16 November 2015 (UTC)
Subthread 2 – adding "rollup" to BC births and deaths categories
I've no clue where we are with this task? Have rollups been added to BC birth and death cats apart from the few examples that came up in the RfC? If not, to me this seems like an excellent job for a bot... any takers? --Francis Schonken (talk) 03:36, 16 November 2015 (UTC)
- No-one had started this. I have now done it on a few, Category:0s BC deaths back to Category:40s BC deaths. – Fayenatic London 22:45, 16 November 2015 (UTC)
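Since the categorytree parameter must literally match each page's own name, a bot would generate the wikitext per category rather than rely on a shared PAGENAME-based template. A sketch with illustrative helper names, showing one century's worth of decade categories:

```python
def rollup_wikitext(category_name: str) -> str:
    """Wikitext for the 'rollup' tree on a decade category page; per
    Fayenatic's note, the parameter must match the page's own name."""
    return f"<categorytree mode=pages>{category_name}</categorytree>"

def decade_categories(kind: str = "deaths"):
    """Yield BC decade category names, '0s BC deaths' .. '90s BC deaths'
    (one century shown as an illustration)."""
    for decade in range(0, 100, 10):
        yield f"{decade}s BC {kind}"
```

A bot run would simply append the generated snippet to each decade category page that lacks it, skipping the few already done by hand.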
Subthread 3 – repopulating BC births and deaths categories
(basically reverting ArmbrustBot's dual upmerge edits)
- I did three or four a long time ago;
- I understand Fayenatic london has been doing quite a few too, but am not clear how far this got?
I still think this is best handled by a bot: going through ArmbrustBot's edits on these BC biography articles one by one (that is: reverting them one by one, from the most recent to the oldest), and (this is the important part) giving a dump of the articles where such reverts are no longer possible (because they have already been done or some other intermediate edits prevented a revert). Then sort out the items on this dump manually. I'd be happy to help sort out manually when presented with such a dump list. --Francis Schonken (talk) 03:36, 16 November 2015 (UTC)
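The revert-or-log pass described above could start from logic like this sketch: it only treats a page as cleanly revertable when the bot's edit is still the top revision, and dumps everything else for manual sorting (Undo may still succeed on some of those, which is exactly what the manual list is for). Names and data shapes are illustrative:

```python
def plan_reverts(bot_edits, page_histories):
    """Split the bot's edits into those still cleanly revertable (the bot
    edit is the top revision of the page) and those needing manual review.

    bot_edits: list of (page, rev_id) pairs.
    page_histories: dict mapping page -> list of rev_ids, newest first.
    """
    revertable, manual = [], []
    for page, rev_id in bot_edits:
        history = page_histories.get(page, [])
        if history and history[0] == rev_id:
            revertable.append(page)
        else:
            manual.append(page)  # later edits block a clean rollback
    return revertable, manual
```

The manual list is the "dump" requested: every page where an intermediate edit (or an earlier hand-revert) got in the way.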
- Any editor can help with reverting the biography pages.
- The CFDs listing the births/deaths categories to be reinstated are:
- Wikipedia:Categories for discussion/Log/2015 May 25#1st to 5th century BC births
- Wikipedia:Categories for discussion/Log/2015 May 30#1st to 6th century BC deaths
- Wikipedia:Categories_for_discussion/Log/2015_May_22#6th-century_BC_births and 7th (below that)
- Wikipedia:Categories_for_discussion/Log/2015_May_16#8th_century_BC (just the births and deaths)
- Wikipedia:Categories_for_discussion/Log/2015_May_15#9th_century_BC and 10th (below that)
- Wikipedia:Categories_for_discussion/Log/2015_May_13#11th_century_BC
- Wikipedia:Categories_for_discussion/Log/2015_May_8#12th_century_BC
- Wikipedia:Categories_for_discussion/Log/2015_April_24#13th_century_BC
- Wikipedia:Categories_for_discussion/Log/2015_April_23#14th_century_BC to 16th
- Wikipedia:Categories_for_discussion/Log/2015_April_20#17th_century_BC
- The last list of categories deleted (instruction to bot at CFDW) was [6] for deaths from 1 to 599 BC.
- The contribs for the last set of bot edits (on BC deaths) ended here. Working up from the bottom, I have completed that page so the current page to be worked on is here.
- I have manually reverted from the bottom of that page of contribs up to: Iamblichus (phylarch). After completing the top one click "newer 50" and carry on from the bottom again.
- My workflow is:
- Mouse over the page history for next diff up the list. Review history using WP:POPUPS to see whether there have been subsequent edits after the category changes by ArmbrustBot.
- If no, use rollback.
- If yes, open the history, and Undo the one or two contribs by ArmbrustBot. For an edit summary, link to Wikipedia_talk:Categorization_of_people#RfC:_BC_births_and_deaths_categorization_scheme.
- @Nyttend: you also appear to have helped to diffuse Category:40s BC deaths back down to years; do you have any other recommendations? – Fayenatic London 10:06, 16 November 2015 (UTC)
- I discovered the situation because a few year categories were in CAT:CSD, and I figured that there surely would have been several notable Romans in each year; after moving several of them over, I just decided to move everything from the 40s into year categories, and I eventually discovered the bot's action. Are there a ton of edits that potentially need to be reverted? I'd just urge caution, because a lot of articles were wrongly categorised, so Armbrustbot's edit was helpful and shouldn't be reverted; for example, Antipater of Tyre died "shortly before 45 BC", so he shouldn't be in 45 BC deaths, and this edit was helpful, even though most of the bot's edits weren't. I did everything manually and would urge you to do likewise to avoid restoring overprecision like 45 BC for Antipater, although I'm not aware of how many articles are involved, so I understand that this might not be practical. PS, please don't have the bot do anything with the 40s BC deaths, since I've gone through them; none of them need work unless I messed up (e.g. I did Gaius Cassius Longinus just now, having overlooked him before), and the bot has no way to judge whether or not I messed up. Nyttend (talk) 13:39, 16 November 2015 (UTC)
- I've removed Antipater from the other new category 45 BC as it is not for biographies.
- This flags up a couple of points:
- Individuals like this, for whom we do not know the exact year of death, will appear in the categorytree ("rollup") listing below the sub-cats, if we leave them in the decade categories. See Category:40s BC deaths. The template ({{DeathyrBC}}) on Category:45 BC deaths does say "People who died c. 45 BC.", so it seems acceptable to me that he was categorised in 45 BC deaths, although 46 BC might have been a better choice. Alexander of Judaea is another case, "died 48 or 47 BC", categorised in 48 BC. I suggest that it is good enough to pick a date which might be one year out.
- Instead of working from ArmbrustBot's contribs, we could work from the decade/century categories as our starting point, diffusing the contents back down into the year categories where the date is stated. We could still do the actual edit by reverting ArmbrustBot's edits in most cases, but it would be a different method of working. However, it's probably quicker to work from the contribs.
- – Fayenatic London 22:41, 16 November 2015 (UTC)
Century-item redirects
My request for someone do this:
- For every page or category beginning with an ordinal number and "century" (e.g. 17th-century, 21st-century); or articles prefixed "List of..." matching that pattern:
- Create a redirect from the equivalent title, with no dash
- Create a redirect from the equivalent title, using words
- Create a redirect from the equivalent title, using words, with no dash
was marked as "not done - no wider discussion" and archived. What wider discussion is needed?
For example, for the existing Category:20th-century war artists, I just created:
- Category:20th century war artists
- Category:Twentieth-century war artists
- Category:Twentieth century war artists
Other examples matching the above pattern would include:
This might usefully be added to a list of monthly cleanup tasks, for new articles and categories matching the above pattern. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:27, 15 October 2015 (UTC)
- Anyone? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:14, 14 November 2015 (UTC)
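The title generation for this request is mechanical; a sketch, assuming the ordinal-word form is capitalised when the pattern leads the title (as in Andy's category examples) and lowercased mid-title:

```python
import re

ORDINAL_WORDS = {
    1: "First", 2: "Second", 3: "Third", 4: "Fourth", 5: "Fifth",
    6: "Sixth", 7: "Seventh", 8: "Eighth", 9: "Ninth", 10: "Tenth",
    11: "Eleventh", 12: "Twelfth", 13: "Thirteenth", 14: "Fourteenth",
    15: "Fifteenth", 16: "Sixteenth", 17: "Seventeenth", 18: "Eighteenth",
    19: "Nineteenth", 20: "Twentieth", 21: "Twenty-first",
}

CENTURY = re.compile(r"(\d+)(?:st|nd|rd|th)-century")

def redirect_variants(title: str):
    """The three redirect titles requested for an 'Nth-century' title:
    no dash, ordinal in words, and ordinal in words with no dash."""
    m = CENTURY.search(title)
    if not m or int(m.group(1)) not in ORDINAL_WORDS:
        return []
    cardinal = m.group(0)                     # e.g. "20th-century"
    leading = m.start() == 0 or title[:m.start()] == "Category:"
    word = ORDINAL_WORDS[int(m.group(1))]
    if not leading:
        word = word.lower()
    return [
        title.replace(cardinal, cardinal.replace("-", " ")),
        title.replace(cardinal, f"{word}-century"),
        title.replace(cardinal, f"{word} century"),
    ]
```

Running this monthly over new pages matching the pattern, and creating only the red-linked variants, would cover the cleanup-task suggestion.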
Deadlink Fixing
Maybe a bot that fixes a collection of dead links? The Ohio Historical Society once maintained a few thousand pages with a well-maintained naming convention, http://ohsweb.ohiohistory.org/ohpo/nr/details.aspx?refnum=XXXXXXXX (the Xs represent an eight-digit number), but they took down these pages a good while ago. Now that OHS has renamed itself to Ohio History Connection, it's put up a new website, and these pages are once again good, but with different URLs, http://nr.ohpo.org/Details.aspx?refnum=XXXXXXXX. Could a bot go around and perform replacements? The work should be easy, and manual fixes will take a lot of work for a human but should be easy for a bot, given the careful adherence to the naming convention. A few of these links have been correctly marked with {{dead link}}; it would also help if the bot were to remove that tag when it's present. Nyttend (talk) 21:45, 19 October 2015 (UTC)
- Then my bot should come by soon and replace the tagged ones with a wayback link.—cyberpowerTrick or Treat:Limited Access 02:54, 20 October 2015 (UTC)
- I'm confused: why would that help? Almost none of these are in archive.org (I've checked), and why would it be good in the first place for the bot to use an archive URL instead of the URL of a currently active page from the same source with the same content? Nyttend (talk) 04:42, 20 October 2015 (UTC)
- Sorry that was meant to be more of a general comment. Cyberbot II now attempts to attach wayback links to tagged links.—cyberpowerTrick or Treat:Online 15:56, 20 October 2015 (UTC)
- It would be better to set up an external link template (like e.g. Template:Ofsted) for this, and editing the pages to use the template. Then, any future similar change to the external website could be dealt with simply by changing the template. – Fayenatic London 09:34, 6 November 2015 (UTC)
- Here's a complete list of pages that need to be updated, somewhat more than five hundred in total:
- Fayenatic london or C678, would either of you be able to run a bot to replace the old URL with the template that Fayenatic recommends? Again, it sometimes appears within <ref name=> tags or in the external links, so you'd just want to do a find-replace, and it would help if you'd remove {{dead link}} when it's present. I've not yet created the template; I'll create it once someone's agreed to run the bot. Nyttend (talk) 13:55, 16 November 2015 (UTC)
- Update — I've created the template at {{OHC NRHP}}. Nyttend (talk) 20:12, 16 November 2015 (UTC)
- Not me, I don't run bots, I just suggested a way to approach the task. – Fayenatic London 23:30, 3 December 2015 (UTC)
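Once {{OHC NRHP}} exists, the find-replace Nyttend describes could be sketched as below. This assumes the template takes the eight-digit refnum as its first unnamed parameter, which should be verified against the template before any run:

```python
import re

# Assumption: {{OHC NRHP}} takes the eight-digit NRHP reference number
# as its first unnamed parameter; verify before a live run.
OLD_URL = re.compile(
    r"http://ohsweb\.ohiohistory\.org/ohpo/nr/details\.aspx\?refnum=(\d{8})",
    re.IGNORECASE,
)

def replace_ohs_links(wikitext: str) -> str:
    """Swap old Ohio Historical Society URLs for {{OHC NRHP|refnum}} and
    drop a {{dead link}} tag left right after the replaced link."""
    out = OLD_URL.sub(lambda m: "{{OHC NRHP|%s}}" % m.group(1), wikitext)
    return re.sub(
        r"(\{\{OHC NRHP\|\d{8}\}\})\s*\{\{dead link[^}]*\}\}",
        r"\1", out, flags=re.IGNORECASE,
    )
```

As Fayenatic suggests, routing the replacement through a template means any future URL change at Ohio History Connection needs only one template edit.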
Http->Https for Newspaper.com links
Hi all. If you didn't know, we have a substantial donation of accounts to WP:Newspapers.com as part of the Wikipedia Library partnership program. As part of the recent expansion of access to 300 accounts, our contact noted that they can no longer track referral traffic from Wikipedia, because of the change earlier in the year for all Wikipedia readers to be on Https (Https only communicates referrals to https, not http). We would like help changing http to https links for Newspapers.com, for several reasons:
- From the start, they have been one of our most-used partners among volunteers, and they are very much willing to expand our editor access to include more editors as demand grows; we want to keep currying this good will.
- In part, this demand from editors is in response to their Open Access "Clipping" function (read more), which allows our editors to pull their sources out from behind the paywall on Newspapers.com. This particular case study has been part of our business case for other partners creating more open access options (for example, WP:Newspaperarchive.com created the exact same feature as part of the development of our partnership, and we are using it to propose other reader-favorable access negotiations with other partners). Having good metrics for this case study from both the Wikipedia side and from the Newspapers.com analytics side, which includes referrer information, helps us make the argument to other publishers/databases.
- Https links are more secure for our readers who do click through to their site (even though Newspapers.com plans to redirect any traffic from Wikipedia to an https URL, that redirect loses the referral information, which affects 1 and 2, and temporarily routes readers through an insecure server).
Could someone run a bot that substitutes http with https when it precedes newspapers.com? Our contact assures me that none of the links should break. A tool/bot that can substitute http to https links like this might be useful for a number of different TWL and WP:GLAM partnerships in the future: part of the business case for most partnerships is increased traffic, and many of our historical allies will be converting to https in the near future, if they haven't already (for example, JSTOR plans to). Thanks much from the Wikipedia Library team, Astinson (WMF) (talk) 15:15, 20 October 2015 (UTC)
- (Thanks Izno for pinging.) This is somewhat related to my request above. I will include http://www.newspapers.com/ → https://www.newspapers.com/ in my AWB settings, but I think this could be done more efficiently by a bot. Because with Google Books links, I also remove the link clutter on the fly, but this doesn't seem necessary for newspapers.com, or does it? --bender235 (talk) 16:16, 20 October 2015 (UTC)
- @Izno: Thanks! @Bender235: I had considered doing it semi-automatically with AWB, but it is such a simple conversion that a bot should do it. All the URLs have one of two structures, so there shouldn't be any clutter, since we have been giving very clear recommendations with the Newspapers.com donation, so they shouldn't be inconsistent. It would be great to have a bot (or a bot activated by a tool) that can help with these kinds of conversions, because I am sure there will be a myriad of requests in the next year or so from GLAMs, etc. Astinson (WMF) (talk) 18:48, 20 October 2015 (UTC)
- @Bender235: Saw the first 4000 links changed, thanks for doing it with AWB! Is there any chance we can update the rest of them in the next week or two, if no-one picks it up with a bot? We would love to be able to keep capturing accurate metrics data to our Newspapers.com partner, sooner rather than later. Astinson (WMF) (talk) 16:06, 22 October 2015 (UTC)
- Hm, I could try. --bender235 (talk) 16:12, 22 October 2015 (UTC)
- Related is m:Research:Wikimedia referrer policy. Legoktm (talk) 18:44, 23 October 2015 (UTC)
- @Bender235: Per Legoktm's link, I am going to ask for a pause in targeting Newspapers.com links (for a month or two, keep updating them when you come across them in other updates, but don't target the http:// links with AWB). We are going to look at Newspapers.com in the referrer metadata pilot.Astinson (WMF) (talk) 21:22, 3 December 2015 (UTC)
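For whenever the pause is lifted, the substitution itself is a short, domain-restricted replacement; a sketch:

```python
import re

# Only upgrade the newspapers.com host; everything else on the page
# is left untouched.
HTTP_NP = re.compile(r"http://(www\.)?newspapers\.com", re.IGNORECASE)

def secure_newspapers_links(wikitext: str) -> str:
    """Upgrade newspapers.com links from http to https."""
    return HTTP_NP.sub(
        lambda m: "https://%snewspapers.com" % (m.group(1) or ""), wikitext
    )
```

The same pattern, parameterised on the host, would serve the other TWL/GLAM partners mentioned (e.g. JSTOR) as they move to https.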
Give out Deletion to Quality Awards
Is there a way a bot could give out WP:Deletion to Quality Awards ?
Here's what it would have to do:
- Check Category:Deletion to Quality Award candidates
- Find out who the FA, FL, or GAN nominator was.
- Place the corresponding Banner Award from Wikipedia:Deletion_to_Quality_Award#Banner_awards on their user talk page = linking to the article and the AFD page as the two parameters in those Banner Awards.
You can say, on behalf of Cirt and WP:Deletion to Quality Awards.
And also, any way a bot could update the "Hall of Fame" table at Wikipedia:Deletion_to_Quality_Award#Deletion_to_Quality_Award_Hall_of_Fame ?
Thoughts ?
Any help would be most appreciated,
— Cirt (talk) 05:04, 21 October 2015 (UTC)
- Note: Please note that a one-time-run would be totally acceptable. :) — Cirt (talk) 09:20, 21 October 2015 (UTC)
- Can anyone help me out with above, please? — Cirt (talk) 07:59, 29 October 2015 (UTC)
- Any help would be appreciated, please? — Cirt (talk) 03:38, 4 November 2015 (UTC)
- Too much effort required. Most other awards (such as WP:1M or WP:FOUR) do not have dedicated bots. sst✈discuss 08:45, 4 November 2015 (UTC)
- @SSTflyer:Not asking for a dedicated bot. Just a one-time-run, please? — Cirt (talk) 08:47, 4 November 2015 (UTC)
- For a one-time run you could always use AWB. sst✈discuss 08:48, 4 November 2015 (UTC)
- @SSTflyer:I'm not personally that familiar with how to use AWB like that, perhaps you could help? — Cirt (talk) 08:52, 4 November 2015 (UTC)
- I suspect that the problem is figuring out who should get "credit". Once you have a list of usernames, then Special:MassMessage can do the delivery (if the message is identical for everyone) or a simple script (if the message should say "Thanks User:Example for saving This Named Article"). WhatamIdoing (talk) 19:31, 4 November 2015 (UTC)
- Using Special:MassMessage to give out awards seems too impersonal. Similar argument as for why we don't have a welcome bot. Compiling a list of the (user, article, award_type) tuples is fine, updating the WP:DQUAL list with it is fine, and manually giving out awards from that list is fine, but I'm wary of an automated thing. On the higher-level merits of the task, I note many of these AfDs were closed with strong, even speedy, keep rationales—I wonder if those should be exempt. — Earwig talk 19:49, 4 November 2015 (UTC)
- @WhatamIdoing and The Earwig: Thank you very much for your helpful input! I can try to give out the awards, myself, with AWB, if there's an easier way to do it. Is there a way to generate these lists you speak of? — Cirt (talk) 22:08, 4 November 2015 (UTC)
- I don't know how to do that, other than manually copying and pasting everything into a spreadsheet. WhatamIdoing (talk) 00:16, 7 November 2015 (UTC)
- No time at the moment, Cirt, but if you still need a list by this coming weekend, let me know. — Earwig talk 10:38, 8 November 2015 (UTC)
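If someone does compile the (user, article, award_type) list Earwig describes, turning it into talk-page postings is straightforward. In this sketch, "Banner award" is a stand-in for the real template names listed at WP:DQUAL, which per the request take the article and its AfD page as the two parameters:

```python
def award_messages(candidates):
    """Turn (user, article, afd) tuples into personalised talk-page
    postings. 'Banner award' is a placeholder template name; substitute
    the actual banner templates from WP:DQUAL."""
    for user, article, afd in candidates:
        text = "{{subst:Banner award|%s|%s}} ~~~~" % (article, afd)
        yield "User talk:%s" % user, text
```

Delivering the generated postings by hand, rather than by bot, keeps the awards personal, as Earwig suggests.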
- @WhatamIdoing and The Earwig:Thank you very much for your helpful input! I can try to give out the awards, myself, with AWB, if there's an easier way to do it. Is there a way to generate these lists you speak of ? — Cirt (talk) 22:08, 4 November 2015 (UTC)
- Using Special:MassMessage to give out awards seems too impersonal. Similar argument as for why we don't have a welcome bot. Compiling a list of the
This report was last updated only in March 2014, but most (if not all) of the IP talk pages listed here are still blank. A bot should be used to apply the template {{OW}} to all the pages which were blanked to remove stale warnings. An easy criterion for identifying such pages is this: the last editor should have been User:BD2412 and the edit summary should have a link to WP:AWB. In all the pages that I checked at random, the page was blanked to remove the stale warnings and this was done by BD2412 using AWB. 103.6.159.89 (talk) 17:50, 23 October 2015 (UTC)
- That was the practice at the time. How many of these are there now? bd2412 T 18:13, 23 October 2015 (UTC)
- Well, I did not accuse you of doing anything wrong, of course. I don't know how many there are. Nevertheless, I think this task should be done by a bot rather than through AWB because bot edits, if marked as minor, would not trigger the unnecessary "You have new messages" note at the IPs' end. 103.6.159.89 (talk) 19:07, 23 October 2015 (UTC)
- BD2412 Is there a reasonable consensus to do this? I could look at coding up a bot to do this. Hasteur (talk) 19:24, 23 October 2015 (UTC)
- Yes, but I'll have to find the discussions. I'm actually headed out right now, but will get back to the question tonight. Cheers! bd2412 T 20:10, 23 October 2015 (UTC)
- There have been lots of small discussions, e.g., Wikipedia_talk:Criteria_for_speedy_deletion/Archive_9#IP_talk_pages, Wikipedia:Bot_requests/Archive_50#Bot_to_remove_patently_stale_warnings_from_IP_talk_pages, Wikipedia:Village_pump_(proposals)/Archive_110#Bot_blank_and_template_really.2C_really.2C_really_old_IP_talk_pages. bd2412 T 00:55, 24 October 2015 (UTC)
- Template:OW is pretty terrible, in my opinion. Why not just delete the pages? User talk:67.173.42.13 is an easy example: that IP hasn't edited in over a decade. There's no good reason to indefinitely keep a templated warning from 10 years ago. --MZMcBride (talk) 18:22, 24 October 2015 (UTC)
- @BD2412 and Hasteur: BTW, there are 68 782 blank talk pages for IP users. You can see the list here. Sorry for non-wiki list, but Mediawiki didn't allow me to save the page :D --Edgars2007 (talk/contribs) 10:59, 29 October 2015 (UTC)
- @MZMcBride: My primary objection to just deleting the pages (which was also my initial practice) is that some unpredictable portion of them will become blue links again at some point, but with unexplained deleted history. bd2412 T 14:33, 30 October 2015 (UTC)
- BD2412: And? Are the odds really that good that it's the same person from 2004 that it is in 2015? And even if so, who cares about some templated message from so long ago? Is there any value to us keeping thousands of duplicative old templated warnings indefinitely in database dumps, in Special:Search, tracked in pagelinks tables, etc.? I think there's real value in de-cluttering the user talk namespace and Wikipedia generally, especially as the site gets older. --MZMcBride (talk) 05:28, 4 November 2015 (UTC)
- Even though I was fine with the bot from a little while ago to undelete a couple thousand talk pages, the more I think about it, the more unsure I am. There's definitely something to be said for declutter, but there's also an argument for keeping as much harmless stuff public as reasonably possible, in the event that people want to do research on old user warnings, or just as a general principle of openness. Also, deletion would add more entries to certain tables, so I'm not sure if that's a great argument. I may have mentioned this before, but in practice I doubt it adds much to database dumps, and people who want smaller dumps can download the ones that only include articles or current revisions. I don't know. — Earwig talk 06:03, 4 November 2015 (UTC)
- We already have fairly disparate treatment of IP talk pages (including thousands of even more useless "IP talk page/archive" pages that were created by a bot at some point, containing old warnings with no associated edit history). I think that we should be 1) consistent; and 2) transparent. Deletion does not achieve transparency. bd2412 T 13:23, 4 November 2015 (UTC)
- Radical openness requires constant maintenance. Deletion exists for a reason. Transparency is great, but we also must consider whether these ancient templated messages are providing any value to the encyclopedia. I agree that we currently have disparate treatment of IP talk pages, but I don't think that should prevent us from cleaning house. Regular housekeeping should allow us to eventually have more consistent treatment. --MZMcBride (talk) 05:37, 5 November 2015 (UTC)
- There's also an argument to be made about the "right to vanish," as IPs get reassigned to various organizations and people. That's why we don't indefinitely block IPs. Plus, a vandalism warning from literally 10 years ago, for example, means literally nothing to anyone. It's not useful for any purpose other than, maybe, nostalgia at what user warnings looked like 10 years ago. If a page is deleted with, say, hypothetical "CSD U99," it can always be speedily undeleted via WP:UNDELETE if someone wants to investigate it. In fact, if we really wanted, we could make an adminbot for the task. However, on the flip side, if warnings exist from 10 years ago and suddenly someone gets a warning about their current conduct, they'll see the 10-year-old warnings and think, realistically, nothing's going to happen to them if they fail to heed it, since nothing ever happened before. --slakr\ talk / 06:59, 5 November 2015 (UTC)
If you copy this page to en:WP you will have a two-click method of putting {{OW}} on each of the first 995 pages, almost as efficient as AWB.
Using the same technique when Bernstein bot updates the list will rapidly clear the list down if anyone cares enough to put the time in.
All the best: Rich Farmbrough, 02:23, 7 November 2015 (UTC).
Section headers
A bot should change all h1s (surrounded by just one equal sign on each side) to h2s, and all h2s, h3s, h4s, and h5s under an h1 to h3s, h4s, h5s, and h6s respectively. H6s cannot be changed into h7s because h7s do not work: a heading with seven equal signs is rendered as an h6 whose title begins and ends with an equal sign. GeoffreyT2000 (talk) 16:02, 24 October 2015 (UTC)
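For what it's worth, the requested transformation can be sketched with a regex pass (a rough sketch only; it demotes every heading whenever any h1 is present, so it shares the weakness discussed in the replies about erroneously inserted h1s):

```python
import re

def demote_headings(wikitext: str) -> str:
    """If the page contains any level-1 heading, demote every heading by
    one level, capping at level 6 (h6s stay h6s)."""
    lines = wikitext.splitlines()
    heading = re.compile(r"^(=+)\s*(.*?)\s*\1\s*$")
    # Only act when an h1 actually exists on the page.
    if not any(m and len(m.group(1)) == 1 for m in map(heading.match, lines)):
        return wikitext
    out = []
    for line in lines:
        m = heading.match(line)
        if m:
            level = min(len(m.group(1)) + 1, 6)  # cap: h6 stays h6
            out.append("=" * level + " " + m.group(2) + " " + "=" * level)
        else:
            out.append(line)
    return "\n".join(out)
```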
- I don't think this can be done with a bot. What if you had a header structure like this:
- h2
- h2
- h1
- h2
- h2
- h2
- in which someone has erroneously inserted an h1 into an otherwise well-formatted set of h2s? You would want to change that h1 to an h2 without changing the h2 below it.
- If h1 is not allowed in articles, a bot might be able to tag or make a list of articles that are afflicted with h1s so that editors could work from a list of those malformed articles. – Jonesey95 (talk) 17:48, 24 October 2015 (UTC)
- Could a bot at least look for that and post a list of articles with that issue? Frankly, it may be something for a cleanup bot or as part of an AWB check or something. -- Ricky81682 (talk) 19:55, 26 October 2015 (UTC)
- Are you seeing articles with this problem right now? Checkwiki task 19 already looks for lines that start with a single "=" character, and its most recent report shows no articles with this error. – Jonesey95 (talk) 21:50, 26 October 2015 (UTC)
- AWB will correct some of these errors as part of its general fixes, I believe. All the best: Rich Farmbrough, 02:19, 7 November 2015 (UTC).
Remove Persondata
Persondata was deprecated by this RfC, which closed on 26 May this year and included consensus to remove Persondata from Wikipedia.
An earlier request for a bot to undertake this task was closed on 7 September, with the comment:
a discussion about a bot operation of this magnitude needs to be held in a broader forum, with more participants and a more focused discussion
This has now taken place, and the second RfC has just been closed with the comment:
There is consensus to have a bot remove all the persondata from all the articles.
Please can we now have a bot to do this? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:24, 27 October 2015 (UTC)
- Andy Mabbett does this mean that Wikipedia:Bots/Requests for approval/Yobot 24 can be resumed? -- Magioladitis (talk) 05:18, 28 October 2015 (UTC)
- I assume you meant Wikipedia:Bots/Requests for approval/Yobot 24? — Earwig talk 05:25, 28 October 2015 (UTC)
- Yes. Fixed. -- Magioladitis (talk) 11:55, 28 October 2015 (UTC)
- @Magioladitis: Yes. Please do. We now have two RfCs that have found consensus to remove Persondata. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:51, 28 October 2015 (UTC)
- What Andy fails to mention is the rest of the close, which I think is rather important:
As a side note there is common sense coming from the minority and even some supports. That the removal be done in steps and that moving what has not been moved to wikidata to some other place so that it can be done at a later date is a good plan that may save time of less informed editors. But there can not be said to be consensus for this, though I see no opposes.
Yobot 24's denial reason is probably just as applicable to this recent RFC, which established no time period for removal beyond the quoted material. I'm happy to help iron out myself how a bot or set of bot tasks should take care of this, but Andy's pointy effort to remove this template immediately is not doing him any favors, and I would echo Guy's previous comment to him about this template were it not that I think that Dirtlawyer1 had already done so enough times. --Izno (talk) 11:36, 28 October 2015 (UTC)
- Oh FFS. How much longer are people going to Wikilawyer this? The text you cite concludes "there can not be said to be consensus for this...". It is important only in that it gives a green light to proceed immediately with the removal of Persondata. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:48, 28 October 2015 (UTC)
- You didn't answer the question you thought you did in the RFC. Please review--the question you asked and the majority if not entirety of the discussion centered on "should it be bot-removed" and not "should it be bot-removed now". Wikilawyer? No, certainly not. --Izno (talk) 14:03, 28 October 2015 (UTC)
- @GoingBatty: because I think you probably did the most in the most-recent discussion to move this forward sensibly. --Izno (talk) 11:38, 28 October 2015 (UTC)
- @Izno: On one end of the spectrum, there are people who think that all {{Persondata}} templates should be deleted immediately. On the other end, there are those who think there is more work to be done to copy the data to Wikidata. As an attempt to find a compromise, I submitted a bot request to remove Persondata where ALL of the values are found elsewhere in the article, such as an infobox, the lead, and/or categories. However, the bot request was only approved to remove Persondata where it only contained |NAME=. I would be happy to resubmit my original bot request if there's a reasonable chance of it being approved. GoingBatty (talk) 16:53, 28 October 2015 (UTC)
- @GoingBatty: as far as I understand, the Wikidata guys do not need the Persondata info, so any concerns about transferring data to Wikidata are moot. Am I wrong? -- Magioladitis (talk) 16:59, 28 October 2015 (UTC)
- @Magioladitis: In previous conversations, there were those who said the Wikidata editors don't want the remaining Persondata, and others who thought there was still opportunity for manual (and possibly automated) copying. GoingBatty (talk) 00:57, 29 October 2015 (UTC)
- @T.seppelt: because I think [7] might be a useful and extensible method to preserving/moving the data that isn't checkable by a bot, which would help us remove the template. --Izno (talk) 11:46, 28 October 2015 (UTC)
- There is no need to "preserve" any data (it's already in article histories) and the recent RfC found no consensus to do so. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:48, 28 October 2015 (UTC)
- Your comments are false. See below. I'm happy to take this to ANI given your behavior if you don't back off on the plainly-sensible suggestions. --Izno (talk) 14:03, 28 October 2015 (UTC)
- Ooh goody, more dramah. Off you go... Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:26, 28 October 2015 (UTC)
- I can extend my tool. Please let me know when you reach consensus on this topic. Regards, -- T.seppelt (talk) 22:29, 28 October 2015 (UTC)
- @T.seppelt: Thank you. Consensus has been reached, in not one, but two RfCs. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 23:36, 28 October 2015 (UTC)
- @Pigsonthewing, Izno, and Magioladitis: Now I have read the whole discussion. What I can offer is the following: KasparBot goes through all pages which transclude {{Persondata}} and compares the information with the statements, labels, descriptions and aliases of the connected Wikidata item. Missing information is added to Wikidata. Afterwards, all the data that is equally stored in Wikidata is removed from the article. Problems will be tracked in a special database which can be accessed using a tool I am going to develop. If no data remains in the article, the whole template will be removed. This procedure is exactly the same as the one I am using for {{Authority control}}. What do you think? -- T.seppelt (talk) 20:13, 29 October 2015 (UTC)
- Thank you. I think you've overlooked all the issues about unreliable and unparsable data, discussed in the first RfC, and the Wikidata discussion to which it linked and the outcome of both that and the more recent RfC. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 20:25, 29 October 2015 (UTC)
- I think there were complaints that the info in Persondata is not reliable. This is a lose-lose situation unless we really act. -- Magioladitis (talk) 21:54, 29 October 2015 (UTC)
- It will take time until all data can be parsed and moved. But I will improve the code continuously based on the experiences during the migration. -- T.seppelt (talk) 20:37, 29 October 2015 (UTC)
- So what do we want? Deleting everything (which is very easy: writing the code takes 10 minutes; running the program, approximately one week), or copying each piece of information and making sure that nothing is lost (which will take months: days for the code, a week for running the program, and hundreds of hours of manual work to decide whether the Wikidata or the Wikipedia value is more appropriate)? We should make this decision before somebody starts to develop a program, and set up a suitable frame to make the work of the user as easy as possible. -- T.seppelt (talk) 11:17, 30 October 2015 (UTC)
- The decision was made in the first RfC: Persondata is deprecated, and is to be removed. We want a bot to do this, as per consensus in the second RfC. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 18:13, 31 October 2015 (UTC)
- @Magnus Manske: because you have experience with setting up neat tools for Wikidata--I'm not sure this would be exactly up your alley to up this but figured I'd ping you. --Izno (talk) 11:46, 28 October 2015 (UTC)
- All the data that can sensibly be transferred to Wikidata by a bot has already been transferred; that was discussed at length in the first RfC. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:48, 28 October 2015 (UTC)
- And as multiple persons have stated (myself included), your statement is false (or perhaps misleading). You don't need to repeat yourself on this point. --Izno (talk) 14:03, 28 October 2015 (UTC)
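The per-article procedure T.seppelt outlines (push missing values to Wikidata, track conflicts in a problem database, drop the template once nothing is left) reduces to simple bookkeeping. A sketch, with the field names and the flat-dict model of a Wikidata item as illustrative assumptions:

```python
def plan_migration(persondata: dict, wikidata: dict):
    """Decide, per article, what to push to Wikidata, what to log as a
    problem for manual review, and whether {{Persondata}} can be removed."""
    to_add, problems = {}, {}
    for field, value in persondata.items():
        if not value:
            continue                  # empty parameter: nothing to migrate
        if field not in wikidata:
            to_add[field] = value     # missing on Wikidata: copy it over
        elif wikidata[field] != value:
            problems[field] = value   # conflict: track in the problem DB
    return to_add, problems, not problems  # removable only if conflict-free
```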
Izno should then AWB be altered to remove Persondata as part of general fixes? -- Magioladitis (talk) 14:30, 28 October 2015 (UTC)
Can you guys get moving on this? Once the template was deprecated in the first RfC, editors stopped adding it to new pages. Following the more recent RfC decision to remove, editors started routinely removing the template. My watch list shows that this is progressing. The WikiData editors said back in May that the data contained therein was unreliable and of no interest to them, but if they want it, then they need to get cracking before much of it disappears bot or no bot. Hawkeye7 (talk) 20:47, 29 October 2015 (UTC)
BRFA filed -- T.seppelt (talk) 18:13, 4 November 2015 (UTC)
I'm not really sure why we have this category in the first place, but in an ongoing discussion at WP:AN it has been mentioned that there are large numbers of users listed here who are not in fact indefinitely blocked for violations of the username policy. We're talking about tens of thousands of pages in total, so it's pretty much never going to be fixed by humans. I use a script that automatically strikes out the usernames of blocked users; I assume the same type of coding could be used to scan this cat and remove anyone who shouldn't be in it? Beeblebrox (talk) 01:33, 29 October 2015 (UTC)
- User:Betacommand has a tool that lists all accounts that are not blocked locally in that category. Legoktm (talk) 01:41, 29 October 2015 (UTC)
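The cleanup pass suggested here amounts to a set difference once you have the category members and the current local block list (both are obtainable via the API, e.g. list=categorymembers and list=blocks); a minimal sketch with the lists passed in:

```python
def misfiled_members(category_members, blocked_users):
    """Users listed in the category who are not actually blocked locally,
    i.e. the ones whose pages should have the category removed."""
    blocked = set(blocked_users)
    return [user for user in category_members if user not in blocked]
```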
Stale userspace drafts
A simpler request. Could someone break out Category:Stale userspace drafts into subpages at Wikipedia:WikiProject Abandoned Drafts/Stale drafts? There are over 46k pages. I realized that AWB only outputs the first 25,000 pages, and in the time I've spent manually breaking these down there have been hundreds of changes, so it's too much. It would mean breaking these into pages of 1000 each, titled Wikipedia:WikiProject Abandoned Drafts/Stale drafts (01), Wikipedia:WikiProject Abandoned Drafts/Stale drafts (02), etc., so about 47 pages. If possible, maybe create "Section 1", etc. headings for each hundred listings. I've already asked at Wikipedia talk:WikiProject Abandoned Drafts but there's zero activity there. Feel free to re-write the pages already created. -- Ricky81682 (talk) 22:26, 29 October 2015 (UTC)
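The split itself is mechanical once a title list exists; a sketch of the layout described in the request (subpage naming and wikitext formatting assumed):

```python
def build_subpages(titles, page_size=1000, section_size=100):
    """Return {subpage_title: wikitext}: one numbered subpage per
    page_size drafts, with a 'Section N' heading before every
    section_size entries."""
    pages = {}
    for p in range(0, len(titles), page_size):
        chunk = titles[p:p + page_size]
        lines = []
        for s in range(0, len(chunk), section_size):
            lines.append("== Section %d ==" % (s // section_size + 1))
            lines.extend("# [[%s]]" % t for t in chunk[s:s + section_size])
        name = ("Wikipedia:WikiProject Abandoned Drafts/Stale drafts (%02d)"
                % (p // page_size + 1))
        pages[name] = "\n".join(lines)
    return pages
```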
- @Ricky81682: if it helps, created list here (first part) and here (second part). There may be some redlinks, where qoutes are in title. If those lists are useless, then feel free to nominate them to deletion. --Edgars2007 (talk/contribs) 08:14, 30 October 2015 (UTC)
- That's a total of 32590 pages. What is that exactly? -- Ricky81682 (talk) 09:10, 30 October 2015 (UTC)
- Sorry, my mistake :) Will correct later. It is (will be) the stale userspace drafts split into two parts in list format. --Edgars2007 (talk/contribs) 15:57, 30 October 2015 (UTC)
- @Edgars2007: Did you fix it? I'd like to have at least one static version to work off rather than duplicate the work again and again. -- Ricky81682 (talk) 22:24, 31 October 2015 (UTC)
- @Ricky81682: Sorry, had some extra work at my homewiki. Thanks for patience. Now it should be fixed. For some reason, Quarry returned only 46 607 pages, but currently in the category there are 46 610 pages; hope those 3 pages aren't a problem. Actually, the biggest problem in this task was selecting 20k+ rows in Excel (if you have 40k+ rows in total), it was quite boring. Now I found out that a simple A1:A20000 (typing in the box) works :) So thanks also to you. --Edgars2007 (talk/contribs) 10:52, 2 November 2015 (UTC)
Please take part in the ongoing discussion at: Wikipedia:Village pump (technical)#Reducing the load of WP:TAFI unofficial-manager Northamerica1000 to make our lives over at WP:TAFI that little bit easier. :)--Coin945 (talk) 15:27, 30 October 2015 (UTC)
- Started a thread at WT:TAFI#Bot automation — MusikAnimal talk 01:39, 9 November 2015 (UTC)
Arbitration
A bot should automatically remove statements from subpages of Wikipedia:Arbitration/Requests/Case that are longer than 500 words. GeoffreyT2000 (talk) 01:29, 3 November 2015 (UTC)
- No, it shouldn't. That's the job of clerks, and sometimes statements are allowed that go over the limit. Besides, shortening statements is better than removing them outright, and bots can't do that. — Earwig talk 01:34, 3 November 2015 (UTC)
List of articles using a particular parameter of an infobox
Could a kind soul create a list of articles that use the parameter thermal_capacity
from {{Infobox power station}} please? You may dump the list on my sandbox. Feel free to overwrite the sandbox contents. Also, is there a faster way to do this myself next time? Thanks in advance! Rehman 13:13, 4 November 2015 (UTC)
- I've added a tracking category for this; see: Category:Infobox power station with thermal_capacity parameter. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:25, 4 November 2015 (UTC)
- Thanks Andy! Rehman 12:31, 5 November 2015 (UTC)
Month articles
Per Wikipedia talk:WikiProject Years#Using archives of Portal:Current events for month articles, a bot should move all month articles that transclude Portal:Current events pages to subpages of Portal:Current events/Archive without leaving redirects (bots can do this; no need to be an adminbot) and recreate the moved pages as redirects to the corresponding year article. Also, the same bot should automatically create an archive page at the beginning of each month. GeoffreyT2000 (talk) 02:47, 5 November 2015 (UTC)
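Purely as a sketch of the page plumbing involved (the archive title scheme and the "Month YYYY" title format are assumptions, not checked against the actual portal layout):

```python
def plan_month_archive(month_title):
    """For a month article like 'October 2015', compute the archive
    target it would be moved to and the redirect text that replaces it
    (pointing at the corresponding year article)."""
    target = "Portal:Current events/Archive/%s" % month_title
    year = month_title.split()[-1]          # "October 2015" -> "2015"
    redirect = "#REDIRECT [[%s]]" % year
    return target, redirect
```

The actual moves would go through the API's action=move with its noredirect option, which a bot flag permits.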
- @SugarRat: This might interest you. GeoffreyT2000 (talk) 03:01, 5 November 2015 (UTC)
Detect copy-paste plagiarism from published papers via unique phrases
I've seen a number of articles recently that include standard academic-paper phrases like "This paper demonstrates" or "In our research".
In at least one case, a researcher has asked me for help resolving the plagiarism.
I know there are some bots that automatically flag pages for review based on content; it'd be great if this could be added to one of the existing patrol bots. -- Metahacker (talk) 15:35, 6 November 2015 (UTC)
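A very rough sketch of the phrase check (the phrase list is illustrative; quoted text and <ref> footnotes are stripped first, a crude nod to the false-positive problem with quotations and references):

```python
import re

SUSPECT_PHRASES = ("this paper demonstrates", "in our research",
                   "we present a novel")  # illustrative, not exhaustive

def flag_academic_boilerplate(wikitext: str):
    """Return the suspect phrases found outside quotation marks and refs."""
    text = re.sub(r'"[^"]*"', " ", wikitext)                      # drop quotes
    text = re.sub(r"<ref[^>]*>.*?</ref>", " ", text, flags=re.S)  # drop refs
    low = text.lower()
    return [p for p in SUSPECT_PHRASES if p in low]
```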
- Some of the instances of this kind of thing occur within quotations and references. It would have to be made sure that the match comes from the actual article text. Rcsprinter123 (interface) 21:42, 6 November 2015 (UTC)
- This sounds like something EranBot should be doing (since I would expect Turnitin to catch papers not detected by CorenSearchBot or toollabs:copyvios). — Earwig talk 22:18, 6 November 2015 (UTC)
Template:YouTube
Since I've been the only one active on Template talk:YouTube for the past two months, I am going to claim consensus for my proposed changes to the template. But before I rewrite the template, I need a bot to go to every page using it, and replace the channel parameter with user. Thanks, 117Avenue (talk) 00:57, 9 November 2015 (UTC)
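Whether done with AWB or a script, the rename is a single regex pass; a sketch that handles the common single-line case only, and assumes the parameter is written exactly |channel= with no extra spacing:

```python
import re

def rename_channel_param(wikitext: str) -> str:
    """Rewrite |channel= to |user= inside {{YouTube|...}} transclusions
    only, leaving the rest of the page untouched."""
    def fix(match):
        return match.group(0).replace("|channel=", "|user=")
    # One-line transclusions only; nested templates would need a real parser.
    return re.sub(r"\{\{\s*YouTube\b[^{}]*\}\}", fix, wikitext, flags=re.I)
```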
- This could probably be done with AWB. It looks like there are only a few hundred articles in Category:Articles using YouTube with deprecated parameters. – Jonesey95 (talk) 06:29, 9 November 2015 (UTC)
- I've been trying to avoid learning AWB. 117Avenue (talk) 02:55, 11 November 2015 (UTC)
- @117Avenue: Doing... with AWB. Kharkiv07 (T) 02:45, 12 November 2015 (UTC)
- @117Avenue: Done Kharkiv07 (T) 03:27, 12 November 2015 (UTC)
- Thanks, 117Avenue (talk) 02:37, 14 November 2015 (UTC)
de la Fuente Marcos
Would it be possible to scan the astronomical articles for occurrences of "C." and "R." "de la Fuente Marcos" and expand those to "Carlos" and "Raúl", respectively? There seem to be over 1,000 such pages, at first glance. Urhixidur (talk) 15:44, 11 November 2015 (UTC)
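A sketch of the expansion (word-boundary regex; whether citation-style author initials in reference lists should be left untouched would still need deciding before a bot run):

```python
import re

def expand_initials(text: str) -> str:
    """Expand the two abbreviated forms of the surname to full names."""
    text = re.sub(r"\bC\.\s*de la Fuente Marcos",
                  "Carlos de la Fuente Marcos", text)
    text = re.sub(r"\bR\.\s*de la Fuente Marcos",
                  "Raúl de la Fuente Marcos", text)
    return text
```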
- @Urhixidur: Is there a specific category that you want to be checked? Kharkiv07 (T) 20:48, 15 November 2015 (UTC)
- Category:Astronomy and its subcategories should be enough. Urhixidur (talk) 00:47, 16 November 2015 (UTC)
Template:Non English
I recently changed the template {{Not English}}, which now requires that it be substituted. The task that would need to be done by a bot is to replace all instances of {{Not English}} with {{subst:Not English}}. Would that be possible? Thanks! --rayukk | talk 12:30, 14 November 2015 (UTC)
- Your header was causing it to act screwy here. Why did you make the change? AWB could be used to help out. -- Ricky81682 (talk) 12:33, 14 November 2015 (UTC)
- I made the change, so that the date is automatically added to the template and articles that have not been translated in a two-week period will be listed for deletion. (also see here). How would one use AWB? Thanks! --rayukk | talk
- @Rayukk: One could make a list of all the articles that transcluded Template:Not English and replace {{Not English}} with {{subst:Not English}}. Or you could follow the directions at Category:Wikipedia templates to be automatically substituted so AnomieBOT would substitute them for you. GoingBatty (talk) 13:59, 14 November 2015 (UTC)
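If the AnomieBOT auto-subst route isn't taken, the find-and-replace itself is straightforward; a sketch covering simple transclusions only (redirects to the template are not handled):

```python
import re

def subst_not_english(wikitext: str) -> str:
    """Rewrite {{Not English}} (with or without parameters) so that the
    next parse substitutes the template."""
    return re.sub(r"\{\{\s*[Nn]ot English\s*([|}])",
                  r"{{subst:Not English\1", wikitext)
```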
Move protection
An adminbot should automatically remove move protection from all pages that are protected with "move=autoconfirmed". GeoffreyT2000 (talk) 19:15, 14 November 2015 (UTC)
- Why? Where is consensus that this needs to be done? Anomie⚔ 00:10, 16 November 2015 (UTC)
- It's a housekeeping matter. There's no point in using this protection setting, because it has absolutely no effect: autoconfirmed status is required to move pages anyway! I've always wondered why it existed in the first place, unless perhaps there was at one time a possibility of moving pages without an account or with a new account. Nyttend (talk) 14:00, 16 November 2015 (UTC)
- Doesn't it automatically make "move=autoconfirmed" whenever you make "edit=autoconfirmed"? Or have I been protecting pages incorrectly? I would disagree with having a bot change this mainly because it would change the protection log, so instead of an editor being unable to edit the page and seeing the reason as vandalism/edit warring/etc, they would see a housekeeping message by a bot. Maybe it's something that could fixed on the dev end so that it's no longer an option. Jenks24 (talk) 14:31, 16 November 2015 (UTC)
- If the protection log is the problem, we could always have the bot use a modified version of the edit summary. For example, if the previous message were "Edit warring/content dispute", the new message could be "Edit warring/content dispute; slight modification for housekeeping purposes" or something of the sort, and the linked page would be a short paragraph explaining the reason for removing move=autoconfirmed from the entry. Nyttend (talk) 20:10, 16 November 2015 (UTC)
- If we really want this (and I don't see the purpose), it should be a software change. Otherwise the bot's gonna be re-protecting nearly every semi-protected page, since this is the default (just head over to Special:Log/protect and search for "move=allow only autoconfirmed"). It's there because moving doesn't have to be restricted to autoconfirmed users; if at some point we chose to give that ability to any registered user, suddenly users would be able to move pages they couldn't edit, which doesn't make sense. — Earwig talk 07:40, 21 November 2015 (UTC)
- WP:COSMETICBOT. The protection setting has no effect, so what's the point in removing it? 103.6.159.68 (talk) 04:57, 17 November 2015 (UTC)
Convert deprecated parameter "or" for template:s-rel
Would someone be ever-so-kind as to set up a bot to convert a deprecated parameter? The total number of articles would be about 340, with one edit in each article. The lists are at Template talk:S-rel/oc lists, with the new parameter for each. For example "Change these {{s-rel|oc}} to {{s-rel|chal}}". The discussion was/is at Template talk:S-rel#Introduce two new parameters. tahc chat 03:59, 15 November 2015 (UTC)
- Done (manually). Thine Antique Pen (talk) 18:57, 16 November 2015 (UTC)
- Thank you. tahc chat 20:31, 16 November 2015 (UTC)
many links of accesstoinsight.org Buddhism section
There are many links referring to certain pages on accesstoinsight.org which are 1. redirected to a different page than before and now give a link to a PDF file, which is 2. different from the original and 3. not direct HTML.
They could be changed (here a sample) like this to make the original page content of the reference available on a living "mirror page" or ATI:
Original link: http://accesstoinsight.org/lib/authors/thanissaro/bmc2/bmc2.ch20.html (= redirect)
Addressing the original content: http://accesstoinsight.eu/lib/authors/thanissaro/bmc2/bmc2.ch20_old_en.html or http://zugangzureinsicht.org/html/lib/authors/thanissaro/bmc2/bmc2.ch20_old_en.html
The part that would need to be changed is the host and the file name. So it would require something like this: find "accesstoinsight.org\/lib\/authors\/thanissaro\/bmc(.*?).html" and replace with "accesstoinsight.eu/lib/authors/thanissaro/bmc\1_old_en.html" or "zugangzureinsicht.org/html/lib/authors/thanissaro/bmc\1_old_en.html" (the natural link). May it be useful and understandable. Samana Johann --203.144.93.201 (talk) 16:44, 16 November 2015 (UTC)
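The requested find-and-replace could look roughly like this in Python (a sketch only; the pattern is the one given in the request, and the mirror host is passed in as a parameter so either mirror can be produced):

```python
import re

# Pattern taken from the request above: capture everything between
# ".../bmc" and ".html" so it can be re-used in the replacement.
PATTERN = re.compile(r"accesstoinsight\.org/lib/authors/thanissaro/bmc(.*?)\.html")

def retarget(url, mirror="accesstoinsight.eu"):
    # Rewrite a redirected accesstoinsight.org link to the mirror's
    # "_old_en" copy of the original HTML page.
    return PATTERN.sub(
        lambda m: "%s/lib/authors/thanissaro/bmc%s_old_en.html" % (mirror, m.group(1)),
        url,
    )
```

Passing `mirror="zugangzureinsicht.org/html"` yields the second replacement form mentioned above.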
WikiProject Mountains banner update
After the recent update of the Wikipedia:WikiProject Mountains banner (Template:WikiProject Mountains) to include two new parameters for mountains in the Alps (see discussion here), I would like to update the talk page of every article concerned (all in Category:Mountains of the Alps, no subcategories) by adding:
|alps=yes | alps-importance=
to:
{{WikiProject Mountains | class= | importance= }}
result:
{{WikiProject Mountains | class= | importance= | alps=yes | alps-importance=[same as "importance"] }}
ZachG (Talk) 18:50, 16 November 2015 (UTC)
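A minimal sketch of the requested edit (my own illustration, not the bot's actual code; it assumes banners shaped like the examples above, and a production bot would use a wikitext parser such as mwparserfromhell rather than a regex):

```python
import re

def add_alps_params(banner):
    # Append |alps=yes and |alps-importance= to a {{WikiProject Mountains}}
    # banner, copying the existing importance value as requested above.
    if "alps=" in banner:
        return banner  # already tagged; leave untouched
    m = re.search(r"\|\s*importance\s*=\s*([^|}]*?)\s*[|}]", banner)
    importance = m.group(1) if m else ""
    addition = " | alps=yes | alps-importance=%s " % importance
    return banner[: banner.rindex("}}")] + addition + "}}"
```

Copying `importance` as a value (rather than the literal text `importance=`) sidesteps the `alps-importance=importance=(value)` glitch discussed later in this thread.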
- @Zacharie Grossen: What about cases where {{WikiProject Mountains}} is not on the talk pages of articles in Category:Mountains of the Alps? Would you want that added (and if so, how), or just skipped? Talk:Acherkogel is an example. Hazard SJ 07:34, 21 November 2015 (UTC)
- You made a very good point. I guess the best option is to add a blank template like this one:
{{WikiProject Mountains | class= | importance= | alps=yes | alps-importance= }}
- or, if the article is a stub:
{{WikiProject Mountains | class=stub | importance= | alps=yes | alps-importance= }}
- Would it be possible? ZachG (Talk) 16:45, 21 November 2015 (UTC)
- @Zacharie Grossen: Definitely! I've coded and trialed this (my bot is already approved for this sort of task). As you can see in the trial, there was an issue where alps-importance=importance=(value) was being set, rather than just alps-importance=(value). That's been fixed. My second noticed issue was where the only edit was the addition of an empty alps-importance= parameter (see this and this), which would be undesirable (I'm gonna prevent adding that parameter unless there's actually a value, or unless I'm adding alps=yes). Otherwise, I hope everything's okay with everything else in the sample run I made? Hazard SJ 02:40, 22 November 2015 (UTC)
- Glad to hear that! I'm definitely not an expert on the matter but if the issues you mentioned are fixed then I think it's ok to run the bot. Thank you for your help. ZachG (Talk) 13:23, 22 November 2015 (UTC)
Hazard-SJ I can help with the task. For instance, in this one the wikiproject banner should have been below the other template. This can be done if you enable general fixes in AWB. You should also enable this module to normalise all wikiproject banners and avoid placement problems. -- Magioladitis (talk) 16:23, 24 November 2015 (UTC)
- @Magioladitis: I just made changes to better determine where to place the template (see example from above). Additionally, I'm not opposed to your offer to help, did you have anything specific in mind (so our bots don't spill oil in each others' way ;) )? P.S. If you haven't noticed, I'm not using AWB, I'm using Python. Hazard SJ 09:43, 25 November 2015 (UTC)
Hazard-SJ my mistake. I thought you were using AWB. What I can do is to ensure the correct placement of the banners etc. You can do the rest. -- Magioladitis (talk) 12:32, 28 November 2015 (UTC)
All done here. -- Magioladitis (talk) 09:43, 29 November 2015 (UTC)
List of old TWA user pages
Can someone come up with a list of userspace pages generated using TWA, that are over 6 months old? These have names of the form "User:Example/TWA", "User:Example/TWA/Earth" and "User:Example/TWA/Teahouse", etc. 103.6.159.71 (talk) 19:19, 16 November 2015 (UTC)
- How about quarry:query/6047? – Giftpflanze 22:16, 20 November 2015 (UTC)
Peer review bot down - please help!
From village pump (Wikipedia:Village_pump_(technical)#Peer_review_bot_down_-_please_help.21):
We have a crisis brewing... the bot that closes old reviews (PeerReviewBot) has stopped working; its last edit was June 19. This is a very time-consuming and laborious task to do manually that was previously easily automated. The bot is owned by CBM, who is mostly retired.
Is it possible to either get the bot started again, or create a similar bot that does the same thing? Yours very gratefully, --Tom (LT) (talk) 23:09, 20 November 2015 (UTC)
Yours very gratefully! Per This, that and the other asking here. CBM states he is happy to give the code to someone via email if they request it. --Tom (LT) (talk) 21:47, 21 November 2015 (UTC)
Add linebreaks
Perhaps someone's running a bot that already does this, but I thought I'd bring it up anyway, in case nobody was.
When text precedes a header, the header doesn't work, and the coding appears as normal text; run a Ctrl+F search for the equals sign at [8]. Fixing it is easy, because you just have to add a couple of new lines. If this isn't already being done, could someone's wikisyntax-fixing bot be given this as an additional task? Nyttend (talk) 15:31, 22 November 2015 (UTC)
- I would try asking at WT:CHECKWIKI, they love that sort of thing, and would probably even generate monthly reports with offending articles. Frietjes (talk) 00:43, 24 November 2015 (UTC)
Fix library-specific proquest links
this external link search shows proquest external links which are tied to a specific dclibrary. these can be made into a generic search request by replacing search.proquest.com.dclibrary.idm.oclc.org with search.proquest.com and removing the trailing ?accountid=46320. the resulting links still may not return any useful information for people without a proquest account, but at least it will provide something other than a DC Library login screen (e.g., the citation information). I imagine there are more than the dclibrary ones. can someone help with this? thank you. Frietjes (talk) 00:42, 24 November 2015 (UTC)
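The two-step fix described above (swap the proxy host, drop the account parameter) could be sketched like this — `fix_proquest` is a hypothetical helper and the example URL path is invented for illustration:

```python
import re

def fix_proquest(url):
    # Strip the DC Library proxy host and the trailing account parameter
    # from a ProQuest link, leaving a generic search.proquest.com URL.
    url = url.replace("search.proquest.com.dclibrary.idm.oclc.org",
                      "search.proquest.com")
    return re.sub(r"\?accountid=46320$", "", url)
```

Other library proxies would need their own host/accountid pairs added in the same way.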
Offsite: Linkrot avoiding bot
This is a request for an off-Wikipedia bot; if this is offtopic, feel free to remove it, or tell me where is more appropriate.
On the ArchiveTeam wiki, we have a template (called, simply enough, url) that can be applied to external links to add links to various web archiving services (currently just the Wayback Machine, WebCite, and archive.today). We don't currently have any automatic way to ensure that all the external links added to the wiki use this template, or that all such links are archived in one (or better, all) of the linked services. If a public-spirited bot owner wanted to contribute such a thing, I, for one, would be delighted. JesseW, the juggling janitor 07:29, 24 November 2015 (UTC)
Idea: sectional redirect appender bot
This would turn redirects into sectional redirects whenever the title being redirected matches a heading in the article being redirected to.
For example, a plain redirect from Text cloud to Tag cloud would be changed to Tag cloud#Text cloud because the article "Tag cloud" has a section titled "Text cloud". So, rather than just going to the top of the "Tag cloud" article, the redirect would go straight to the section titled "Text cloud".
The bot would need to look at all redirects, checking the target article in each for a heading that matches the topic title being redirected.
Note that if the redirect is already a sectional redirect that indicates a section other than the title of the redirect, then the section in the redirect would be changed to that title, but only if a section by that title was found in the article. The Transhumanist 19:40, 24 November 2015 (UTC)
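The core check the bot would run per redirect can be sketched as follows (an illustration under stated assumptions: `retarget_redirect` is a hypothetical helper, and a real bot would fetch the target article's section headings via the MediaWiki API rather than receive them as a list):

```python
import re

def retarget_redirect(redirect_title, redirect_wikitext, target_sections):
    # If the redirect's own title matches a section heading in the target
    # article, append "#Section" so the redirect lands on that section.
    m = re.match(r"#REDIRECT\s*\[\[([^#\]]+)\]\]", redirect_wikitext, re.I)
    if not m:
        return redirect_wikitext  # already sectional, or not a redirect
    target = m.group(1).strip()
    for heading in target_sections:
        if heading.lower() == redirect_title.lower():
            return "#REDIRECT [[%s#%s]]" % (target, heading)
    return redirect_wikitext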
Idea: sectional redirect updater bot
When a section title (heading) gets changed, sectional redirects leading to it are broken. Those redirects then go to the top of the article rather than to the intended section.
What this bot would do is check redirects, and for those that are sectional, check the target article for the section title indicated in the redirect. If it doesn't exist, then it checks the historical diffs of the article to see what the section was changed to (and whether that section title was later changed, and so on), and then it updates the redirect so that it points to the current title of that section.
If the section title doesn't exist at all in the article or its previous versions, the section is removed from the redirect. The Transhumanist 19:36, 24 November 2015 (UTC)
- Brought up here. @Ladsgroup: It looks like Dexbot is not working on these anymore. --Bamyers99 (talk) 20:19, 24 November 2015 (UTC)
- @Bamyers99 and Ladsgroup: What's the next step? The Transhumanist 20:50, 24 November 2015 (UTC)
- This process is far more complicated than it seems. I built a system to handle that. I'll improve it and then re-run it. :) Ladsgroupoverleg 08:16, 25 November 2015 (UTC)
- @Bamyers99 and Ladsgroup: What's the next step? The Transhumanist 20:50, 24 November 2015 (UTC)
Idea: WikiProject stale participant member remover bot
Many WikiProjects have participant lists. Many of the editors on those lists haven't edited in months, or even years—rendering those lists out-of-date.
This bot would find and update participant lists. Once it found a list, it would remove users who haven't edited Wikipedia for more than three months. The Transhumanist 20:26, 24 November 2015 (UTC)
- I imagine that this bot would work on an opt-in basis. Each Wikiproject would determine if there is consensus to subscribe their participant list to this bot's service, in a manner similar to Cluebot's talk page archiving service. Are participant lists standardized enough to allow this to happen? – Jonesey95 (talk) 21:10, 24 November 2015 (UTC)
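The inactivity test itself is straightforward once last-edit timestamps are in hand (a sketch; `stale_participants` is a hypothetical helper, and a real bot would pull each user's last contribution via the MediaWiki API's list=usercontribs):

```python
import datetime

def stale_participants(last_edit_by_user, now=None, max_idle_days=90):
    # Given a mapping of username -> datetime of last edit, return the
    # users who have been inactive for more than roughly three months.
    now = now or datetime.datetime.utcnow()
    cutoff = now - datetime.timedelta(days=max_idle_days)
    return sorted(u for u, last in last_edit_by_user.items() if last < cutoff)
```

The opt-in question above still matters more than the code: participant-list formats vary between projects, so the removal step would need per-project handling.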
Dead links in ELs
Dead links in external-links sections are useless; we provide the links for additional reading, not for citations, so if you can't access them, they're pointless — they always need to be fixed or removed. Could a bot go through Category:All articles with dead external links and record ones with dead links in the EL sections, either adding by a new category (e.g. Category:Articles with dead links in External Links sections, or something of the sort) or listing them on a tracking page? I'm imagining that it opens each page, finds each occurrence of {{dead link}} or redirects thereto, and records the ones in which one or more of these templates appears below ==External links== (or == External links ==) and above the next set of equals signs. I'm asking that the bot only record these pages, without doing anything else, because fixing or removing these links is a CONTEXTBOT situation. Nyttend (talk) 01:14, 27 November 2015 (UTC)
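The record-only scan described above could be sketched like this (an illustration; `has_dead_el_link` is a hypothetical helper, and it only detects — the fixing is left to humans per CONTEXTBOT, exactly as requested):

```python
import re

def has_dead_el_link(wikitext):
    # True when a {{dead link}} tag occurs inside the ==External links==
    # section, i.e. after that heading and before the next heading (or EOF).
    section = re.search(r"^==\s*External links\s*==\s*$(.*?)(?=^==|\Z)",
                        wikitext, re.I | re.M | re.S)
    if not section:
        return False
    return bool(re.search(r"\{\{\s*dead link", section.group(1), re.I))
```

A real bot would also have to resolve redirects to {{dead link}} (e.g. {{dl}}) before matching, as the request notes.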
New French regions on 1 January
On 1 January 2016, the number of administrative "regions" in Metropolitan France (the part of France in continental Western Europe, excluding Corsica and overseas regions) will be reduced from 22 to 13. Six current regions will remain unchanged, while the remaining 16 will be consolidated into 7 new regions (new articles have already been created for the new regions). In France "regions" are divided into departments, which are divided into arrondissements, which are divided into cantons (this level of government generally doesn't have articles on en-WP), which are divided into communes (towns). With one exception (Lower Normandy & Upper Normandy will be merged to form "Normandy"), all of the new regions will be known by a provisional name until their legislatures meet after 1 January and decide on a new name (must be selected by 1 July). That name must then be approved by France's Conseil d'État (which has until October 2016 to approve the new permanent names).
A bot will be needed to change the name of the "region" in infoboxes of tens of thousands of articles for subunits below the region level (departments, arrondissements, communes, & the few canton articles that may exist). For example, there are 10 departments (infobox template for regions & departments), 44 arrondissements (infobox template), and 5189 communes (infobox template) in the new region Alsace-Champagne-Ardenne-Lorraine. That's a total of 5243 infoboxes (I don't know if any cantons in this region have articles) in this new region alone that will need their infoboxes changed from the current region (Alsace, Champagne-Ardenne, or Lorraine) to Alsace-Champagne-Ardenne-Lorraine. The new regions are based on combinations of existing regions (none of the present regions are divided between 2+ new regions), so it is simply a matter of replacing the name of the present region with the name of the new region. In the future, a bot will be needed to change the name to the permanent name for the region. Because of the multiple changes, I thought this could be a good test case for Wikidata integration into infoboxes and made a suggestion here (no support at the time of making the bot request here).
Regions to be changed
The following comes from Regions of France#Reform and mergers of regions. Italics are temporary names that will eventually be changed to a permanent name. The other regions of France will be unaffected.
In addition to just infoboxes on articles about political subdivisions (departments, arrondissements, & communes), which seems like a simple find-and-replace task, the infoboxes of many other types of articles ought to be changed, but discerning when to change seems like a more challenging task (should locations be changed in articles for historical events?). In those cases, the bot should change the region when it is mentioned in the "location" parameter of an infobox, eg. on the article Saverne Tunnel "Alsace" will need to be replaced with "Alsace-Champagne-Ardenne-Lorraine" on 1 January. To be clear, this request is only for a bot to make the necessary changes on 1 January; I've mentioned the fact that they will need to be changed again to the permanent name in case that is relevant to this request. AHeneen (talk) 01:32, 29 November 2015 (UTC)
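Because no old region is split between new ones, the subdivision-infobox pass reduces to a lookup table (a sketch; it assumes the infobox takes a plain |region= parameter with an unlinked value, and only the merges named in this thread are listed):

```python
import re

# Pre-2016 region -> new (provisional) region, per the request above.
NEW_REGION = {
    "Alsace": "Alsace-Champagne-Ardenne-Lorraine",
    "Champagne-Ardenne": "Alsace-Champagne-Ardenne-Lorraine",
    "Lorraine": "Alsace-Champagne-Ardenne-Lorraine",
    "Lower Normandy": "Normandy",
    "Upper Normandy": "Normandy",
}

def update_region_param(infobox_wikitext):
    # Rewrite the value of a |region= parameter when it names a merged region.
    return re.sub(
        r"(\|\s*region\s*=[ \t]*)([^|{}\n]+)",
        lambda m: m.group(1) + NEW_REGION.get(m.group(2).strip(), m.group(2)),
        infobox_wikitext,
    )
```

The "location" parameters on other article types are the hard part, as noted above, since a bare string match can't tell a current place from a historical one.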
Pinging when a "task" section is edited
First: sections of a page can't be added to a watchlist, and this whole subject has seen some pushback over the years; see for instance this Phabricator page, which I learned about yesterday at WP:VPT#Section-specific notifications. Wikipedia is hurt every day, significantly, when people check their watchlists less frequently because they're forced to check hundreds or thousands of edits just so they can monitor the particular sections that represent the task requests they're interested in. This happened to me just yesterday; I don't pull up my watchlist as frequently nowadays because I have to keep WP:ERRORS watchlisted in case anything shows up in the Today's Featured Article section, which represents a small fraction of the edits to that page. There are many editors who struggle with the same problem daily. I'm asking for a bot that runs frequently, takes a diff of WP:ERRORS if there have been any edits, discards everything from the diff above and below specific text markers, and notifies me in some way (a ping would be fine) if the relevant part of the diff has changed. (It would also be nice if it didn't keep pinging me with each edit ... once per 24 hours would be fine ... but that's optional.) If anyone is willing to code this simple bot to run at ERRORS and a few other high-traffic pages, a lot of people will love you for it. (A red herring sometimes gets thrown into these discussions that searching for sections is hard to do ... that's both dubious and irrelevant. All we need is a bot that can search for specific, perhaps hidden, text, and can discard the parts of a diff above and below that text.) - Dank (push to talk) 15:33, 29 November 2015 (UTC)
- Dank, simulating a watchlist via pings and other (ab)uses of Echo would be pretty kludgy. One of the solutions mentioned in that Phabricator task you linked to was breaking up the page in question into a bunch of transcluded subpages. In your WP:ERRORS example, all TFA errors would go into WP:ERRORS/TFA (or something), which would be transcluded onto WP:ERRORS and which you could watchlist directly. This would be a pretty good way to solve the problem you're talking about, and we'd only need consensus on WT:ERRORS. APerson (talk!) 23:33, 6 December 2015 (UTC)
Adding Template:Research help to batches of WP:WPMED and WP:MILHIST articles
Hi all, I wanted to put in a request for adding the template Template:Research help to batches of articles in WP:MILHIST articles and WP:WPMED articles with clear messaging. There is a consensus from many of the core community members at WikiProject Medicine and WikiProject Military History and I notified the village pump.
I need a bot to insert {{Research help|MIL}} and {{Research help|MED}} into batches of articles under the ==References==, ==Footnotes== or ==Works cited== headings. In both projects, we will do this in batches: starting with 100 articles, then 500, then 2000, then 5000, then more. Moreover, in Military history, the consensus is to pilot on WWI and WWII task force articles first. The edit summary needs to point towards WP:Research help/Proposal, asking for feedback/discussion on the talk page.
Also, cc-ing bot operators that have helped The Wikipedia Library in the past @Cyberpower678:. They would probably be able to implement this with AWB – it's an insert-after activity. Astinson (WMF) (talk) 17:06, 30 November 2015 (UTC)
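The insert-after step could be sketched like this (my own illustration of the spec that emerges later in this thread — insert right after the first {{reflist}} or <references/>, and skip pages that have neither):

```python
import re

def insert_research_help(wikitext, project="MED"):
    # Place {{Research help|<project>}} right after the first {{reflist}}
    # or <references/> tag; return None (skip the page) if neither exists.
    m = re.search(r"\{\{\s*reflist[^{}]*\}\}|<references\s*/>", wikitext, re.I)
    if not m:
        return None
    tag = "\n{{Research help|%s}}" % project
    return wikitext[:m.end()] + tag + wikitext[m.end():]
```

A real run would also need the "|group=" filter discussed below, so grouped footnote lists aren't tagged by mistake.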
- Pinging a few more people who show interest/activity with similar projects here on Bot Requests: @Bender235, Harej, Fayenatic london, BD2412, Magioladitis, Kharkiv07, and Hazard-SJ: Anyone interested? Astinson (WMF) (talk) 21:52, 3 December 2015 (UTC)
- Sure, I can do this. I'll get the articles via {{WikiProject Military history}} (well, since this has no category that tracks {{{WWI}}} and {{{WWII}}} usages, either I check each page for the params or I check for Category:World War I and Category:World War II) and {{WikiProject Medicine}} transclusions, and do half the number of article edits for each WikiProject. Let me know if that sounds okay. Hazard SJ 07:08, 4 December 2015 (UTC)
- @Hazard-SJ: Brilliant! You are amazing!
- For the milhist ones, couldn't you use: Category:World War I task force articles and Category:World War II task force articles? AWB allows conversion of talk pages to article pages.
- Otherwise sounds good! Make sure that the link to the proposal is clear in the edit summaries. Also, as you update the different batches of articles, can you make sure you add a timestamped {{done}} in the pilot stages marked at: Wikipedia:Research_help/Proposal#Project_steps. This will help us measure pageviews in the experimental conditions, etc. to figure out the if/when of the changes. Astinson (WMF) (talk) 16:57, 4 December 2015 (UTC)
- @Astinson (WMF): Definitely, I was unaware of those categories. I'll also go head and use Category:All WikiProject Medicine articles, while I'm at it.
- Questions:
- Should I simply skip if none of the three sections (references, footnotes, and works cited) are on the page?
- Then there's Rivadavia-class battleship, and possibly others, with the template in an endnotes section, even though both a footnotes and a references section exists.
- If more than one of the sections exist, is there any specific way I should handle that?
- Where in the section should the template be placed (e.g. at the very top, at the very bottom, just before/after reflists if any, etc.)?
- Once I get these sorted out, I could proceed with the implementation of this task (P.S. I'm using Python, not AWB). Hazard SJ 08:23, 5 December 2015 (UTC)
- @Hazard-SJ: Thanks for the questions: the template should go right after the {{reflist}} or the <references/> (these might be more consistent than the section headers). You might use those as the insertion criteria, but you are going to need a filter that skips plain "Notes" or named reference sections (i.e. reflists that use "|group="). Once inserted, the templates should look like: Wikipedia:Research_help/Proposal#Proposed_design_for_links_on_article_pages. As for the multiple sections: it should be the main referenced footnote section used throughout the article. In the first couple of small batch insertions, if you add it to all the articles that have only one possible section and/or one version of {{reflist}} and/or <references/>, and keep a log of the articles that don't get inserted, we can find where there are machine-implementable rules for exceptions in the larger batches. However, this is a pilot: so as long as we know the number of articles added to, it doesn't matter if we skip a few (as long as we have a count/log of those as well).
- For section titles, I did some research a few years back and the most frequent section headers were: "Footnotes", "References", and "Works cited". If you add "Endnotes" to that list: it should cover something like 80%+ of the articles. Thank you so much for the thorough examination, Astinson (WMF) (talk) 14:45, 7 December 2015 (UTC)
- @Astinson (WMF): Pings don't work unless you sign in the same edit. — HELLKNOWZ ▎TALK 20:37, 8 December 2015 (UTC)
- @Hellknowz and Hazard-SJ: That I didn't remember (I am sure I read that at some point) Astinson (WMF) (talk) 21:57, 8 December 2015 (UTC)
Citation style
The citation style should be consistent in an article. A bot should automatically fix citations in some articles in Category:All articles needing references cleanup so that they will be consistent. GeoffreyT2000 (talk) 02:30, 3 December 2015 (UTC)
- How does the bot determine the right style for each article? – Jonesey95 (talk) 03:54, 3 December 2015 (UTC)
BOT request
I had liked to have a bot named 'KNOWLEDGEBOT'. I want a bot so that I could edit pages more speedily than I can and to help everyone here. I hereby accept the bot policy and take all responsibilities of bot I won't allow him to violate anything and see over his way of commenting or communication. It won't do any harm or go on editing too speedily I will supervise the bot and I hereby I accept the bot policy. I request you to create this bot and I as its Bot operator. I am responsible for all of its acts, repairs, communication language etc. I will supervise my bot and it will be in my control. RegardsBOTFIGHTER (talk) 13:57, 3 December 2015 (UTC)
- Not done. 1. this is the place to request that a bot do a task, not to request a bot. 2. I want a bot so that I could edit pages more speedily than I can and to help everyone here. Urm... Edit faster? Unless you have some precognition in your pocket, you can't do that. 3. Your username is exceedingly concerning given you want to fight bots and also want to have a bot. No. Hasteur (talk) 14:13, 3 December 2015 (UTC)
- I have kept this name because this was the first high-rated game I programmed. I would not make the bots fight; I searched for many usernames and didn't find any other username available. I really won't make the bots fight. BOTFIGHTER (talk) 14:24, 3 December 2015 (UTC)
Per the discussion (and background) at Wikipedia:Administrators'_noticeboard#Category:AfD_debates_relisted_3_or_more_times, can we get a bot set up to check Category:AfD debates relisted 3 or more times, and remove the category from closed discussions. I had been doing this every few days using AWB, but would prefer to have something automated do it. There was talk of getting an AfD closing script to do it, however not everyone uses the same script, or a script at all. Much obliged. --kelapstick(bainuu) 21:20, 3 December 2015 (UTC)
- Could another bot do that to Category:Relisted AfD debates as well? The 3-or-more-times category is a subcategory of that one. -- Ricky81682 (talk) 15:14, 4 December 2015 (UTC)
Could a bot sort out the drafts at Category:Userspace drafts by adding a "date=" parameter with the date of creation? If not, at the least, could someone make a table of those pages with their date of creation? I could do it faster if I had all those with the same month together. -- Ricky81682 (talk) 15:17, 4 December 2015 (UTC)
- Perhaps the bot could also drop users a note asking them if they still need the drafts (especially those that are more than a year old) and encourage them to tag them with {{db-u1}} if not? I suspect many of these have been forgotten about and that the users concerned would be content for them to be deleted without wasting time on MfD nominations... WJBscribe (talk) 15:35, 4 December 2015 (UTC)
- WJBscribe, umm, there's 46,000 drafts over a year old, including I think over 100 by at least 4 different editors. There's no bot who should handle that workload but feel free to suggest it. I'm just trying to keep it stable as I got a net 300 down in November (about 2000 more pages go stale a month). -- Ricky81682 (talk) 00:09, 5 December 2015 (UTC)
- BRFA filed I didn't attempt to drop any notes, that would definitely need wider discussion first. Hazard SJ 06:00, 5 December 2015 (UTC)
- WJBscribe, umm, there's 46,000 drafts over a year old, including I think over 100 by at least 4 different editors. There's no bot who should handle that workload but feel free to suggest it. I'm just trying to keep it stable as I got a net 300 down in November (about 2000 more pages go stale a month). -- Ricky81682 (talk) 00:09, 5 December 2015 (UTC)
Move protect DYK subpages
An adminbot should move protect all DYK subpages per Template talk:Did you know#How to move a nomination subpage to a new name. GeoffreyT2000 (talk) 23:07, 4 December 2015 (UTC)
- Can't imagine that's worth the effort. Is this ever a problem? I'm sure there are occasional situations where moving is necessary; a blanket ban would be counterproductive. — Earwig talk 23:19, 4 December 2015 (UTC)
- Note that there's a WP:VPP discussion about whether this should happen going on right now. APerson (talk!) 23:01, 6 December 2015 (UTC)
Neelix redirects
An adminbot should delete all redirects created by Neelix, many of which are currently at RfD. GeoffreyT2000 (talk) 17:43, 5 December 2015 (UTC)
- We've had seemingly endless discussions about this. The consensus arrived at was that admins, human admins, can use their judgement and delete any that seem silly under G6 and speedily close any RFDs. As tempting as I find this idea, consensus was already established that some of these redirects are not utter garbage. If we could just blindly delete them en masse it would have already happened. Beeblebrox (talk) 18:58, 5 December 2015 (UTC)
Substitute all cite doi and cite pmid templates
Per the RfCs at [9] and [10], the use of {{cite doi}} and {{cite pmid}} templates has been deprecated. We need a bot to go through all the articles that currently transclude those templates and substitute them instead. Kaldari (talk) 21:33, 9 December 2015 (UTC)
- It looks like Dexbot may have already done this. Kaldari (talk) 22:02, 9 December 2015 (UTC)
- @Ladsgroup: What's the status of Dexbot's clean-up of these templates? Can they all be deleted now? Kaldari (talk) 22:03, 9 December 2015 (UTC)
- Hey, Yes. I did what I could. All removed now :) Ladsgroupoverleg 01:33, 10 December 2015 (UTC)
- I don't think the following is controversial: All instances of {{cite doi/*}} and {{cite pmid/*}} with no incoming links, transclusions, or incoming redirects can be deleted. It will take at least two passes to delete as many as possible, since some {{cite pmid/*}} templates are redirects to {{cite doi/*}} templates. – Jonesey95 (talk) 02:29, 10 December 2015 (UTC)
Help with BAFTA articles
Hello. I recently split content from BAFTA Award for Best Film (which previously listed nominees for three different categories) to make BAFTA Award for Best British Film and BAFTA Award for Best Film Not in the English Language. But I've noticed that lots of articles are fixed to pipe straight to the "Best Film" article so they are now directed to the wrong place. See for example Ida (film) and the BAFTA link right at the end of the lead. It's happening like this on most relevant articles I've looked at (which is annoying because the redirects would have worked anyway).
I don't quite know what bots are capable of, but I'm hoping it's possible to fix this. I imagine the best way would be for a bot to search for any articles with: [[BAFTA Award for Best Film|BAFTA Award for Best British Film]], [[BAFTA Award for Best Film|Best British Film]], [[BAFTA Award for Best Film|BAFTA Award for Best Film Not in the English Language]], [[BAFTA Award for Best Film|Best Film Not in the English Language]], [[BAFTA Award for Best Film|BAFTA Award for Best Foreign Film]], [[BAFTA Award for Best Film|Best Foreign Film]]. And then hopefully it could fix them by removing the piping? If it's at all possible that would be great because doing it manually will take ages. --Loeba (talk) 11:24, 10 December 2015 (UTC)
- WP:AWB baby 166.170.47.209 (talk) 16:30, 10 December 2015 (UTC)
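The unpiping described above is mechanical once the six labels are known (a sketch; `unpipe_bafta` is a hypothetical helper, and it relies on the redirects from those labels working, as the request says they do):

```python
import re

# Piped labels from the request; each is a valid title (or redirect),
# so the pipe can simply be dropped.
BAFTA_LABELS = (
    "BAFTA Award for Best British Film",
    "Best British Film",
    "BAFTA Award for Best Film Not in the English Language",
    "Best Film Not in the English Language",
    "BAFTA Award for Best Foreign Film",
    "Best Foreign Film",
)

def unpipe_bafta(wikitext):
    # Turn [[BAFTA Award for Best Film|<label>]] into [[<label>]] for the
    # labels above, letting the redirects point to the split articles.
    pattern = r"\[\[BAFTA Award for Best Film\|(%s)\]\]" % "|".join(
        re.escape(label) for label in BAFTA_LABELS)
    return re.sub(pattern, r"[[\1]]", wikitext)
```

Links piped to other labels (e.g. a bare year) are left alone, since only these six were mis-targeted by the split.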