Wikipedia:Bot requests
This page has a backlog that requires the attention of willing editors. Please remove this notice when the backlog is cleared.
Commonly Requested Bots
This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).
You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.
Before making a request, please see the list of frequently denied bots — tasks that are either too complicated to program or lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).
- Alternatives to bot requests
- WP:AWBREQ, for simple tasks that involve a handful of articles and/or only need to be done once (e.g. adding a category to a few articles).
- WP:URLREQ, for tasks involving changing or updating URLs to prevent link rot (specialized bots deal with this).
- WP:USURPREQ, for reporting that a domain has been usurped, e.g. |url-status=usurped
- WP:SQLREQ, for tasks which might be solved with an SQL query (e.g. compiling a list of articles according to certain criteria).
- WP:TEMPREQ, to request a new template written in wiki code or Lua.
- WP:SCRIPTREQ, to request a new user script. Many useful scripts already exist, see Wikipedia:User scripts/List.
- WP:CITEBOTREQ, to request a new feature for WP:Citation bot, a user-initiated bot that fixes citations.
Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).
Add protection templates to recently protected articles
We have bots that remove protection templates from pages (DumbBOT and MusikBot), but we don't currently have a bot that adds protection templates to recently protected articles. Lowercase sigmabot used to do this until it stopped working about two years ago. I generally think it's a good idea to add protection templates to protected articles so people know about the protection (especially logged-in autoconfirmed users, who would otherwise have no idea a page was semi-protected). —MRD2014 (talk • contribs) 13:06, 18 October 2016 (UTC)
- We need those bots because the expiration of protection is usually an automatic process. However, placing the protection has to be done by an admin - and in this process, as part of the instructions, a template that they use places that little padlock. Thus, any protected page will have the little padlock; I don't think many admins forget to do this. For it to be worth a bot to do this, there would have to be a substantial problem - can you show us any? If you can, then I will code and take this on. TheMagikCow (talk) 18:01, 15 December 2016 (UTC)
- @TheMagikCow: Sorry for the late reply, but it's not really a problem, it's just that some administrators don't add protection templates when protecting the page (example), so a logged-in autoconfirmed user would have no idea it's semi-protected or extended-protected unless they clicked "edit" and saw the notice about the page being semi-protected or extended-protected. I ended up adding {{pp-30-500}} to seven articles ([1]). This has nothing to do with removing protection templates (something DumbBOT and MusikBot already do). The adding of {{pp}} templates was previously performed by lowercase sigmabot. —MRD2014 (talk • contribs) 00:34, 28 December 2016 (UTC)
- @MRD2014: Ok, those examples make me feel that a bot is needed for this - and it would relieve the admins of the task of manually adding them. I think I will get Coding... and try to take this one on! TheMagikCow (talk) 10:39, 28 December 2016 (UTC)
- @TheMagikCow: Thanks! —MRD2014 (talk • contribs) 14:41, 28 December 2016 (UTC)
- Or possibly a bot who sends the admin a notice that "It looks like during your protection action on X you may have forgotten to add the lock icon. Please check and add the appropriate lock icon. Thank you" Hasteur (talk) 02:07, 2 January 2017 (UTC)
- Hasteur's suggestion should probably be incorporated into the bot since it has the clear benefit of diminishing future instances of mismatched protection levels and protection templates by reminding admins for the future. Enterprisey (talk!) 03:05, 2 January 2017 (UTC)
- OK - Will try to add that - would it be easier if that was a template? TheMagikCow (talk) 11:53, 2 January 2017 (UTC)
- Some admins have {{nobots}} on their talk pages (Materialscientist for example) so the bot couldn't message those users. Also, lowercase sigmabot (the last bot to add protection templates) would correct protection templates too. —MRD2014 (talk • contribs) 17:20, 2 January 2017 (UTC)
- In some cases, there is no need to add a prot padlock, such as when the page already bears either {{collapsible option}} or {{documentation}}; mostly these are pages in Template: space. Also, redirects should never be given a prot padlock - if done like this, for example, it breaks the redirection. Instead, redirects have a special set of templates which categorise the redir - they may be tagged with {{r fully protected}} or equivalent ({{r semi-protected}}, etc.), but it is often easier to ensure that either {{redirect category shell}} or the older {{this is a redirect}} is present, both of which determine the protection automatically, in a similar fashion to {{documentation}}. --Redrose64 🌹 (talk) 12:11, 3 January 2017 (UTC)
- About the notifying admins thing, MediaWiki:Protect-text says "Please update the protection templates on the page after changing the protection level." in the instructions section. Also, the bot should not tag redirects with pp templates per Redrose64. If it tags articles that aren't redirects, it shouldn't have any major issues. —MRD2014 (talk • contribs) 19:26, 3 January 2017 (UTC)
- This would be better as a mediawiki feature - see Wikipedia:Village_pump_(technical)#Use_CSS_for_lock_icons_on_protected_pages.3F, meta:2016_Community_Wishlist_Survey/Categories/Admins_and_stewards#Make_the_display_of_protection_templates_automatic, phab:T12347. Two main benefits: not depending on bots to run, and not spamming the edit history (protections are already displayed, no need to double up). As RedRose has pointed out, we already have working Lua code. Samsara 03:48, 4 January 2017 (UTC)
- TheMagikCow has filed a BRFA for this request (see Wikipedia:Bots/Requests for approval/TheMagikBOT 2). —MRD2014 (talk • contribs) 18:29, 5 January 2017 (UTC)
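The workflow discussed in this thread - poll the protection log, skip pages that already carry a padlock, tag the rest - can be sketched as follows. Treating "any template whose name starts with pp" as a padlock is a simplifying assumption, and the Pywikibot calls mentioned in the comments are an outline only:

```python
import re

# A protection template on enwiki is typically one of the {{pp-...}} family
# ({{pp-semi}}, {{pp-30-500}}, {{pp-move}}, ...). Treating any template whose
# name starts with "pp" as a padlock is a simplifying assumption; a real bot
# would keep an explicit list.
PADLOCK_RE = re.compile(r"\{\{\s*pp\b", re.IGNORECASE)

def has_padlock(wikitext: str) -> bool:
    """True if the page text already appears to carry a padlock template."""
    return bool(PADLOCK_RE.search(wikitext))

# Outline of the rest of the task (not run here): with Pywikibot, iterate
# site.logevents(logtype='protect') for recent protections, fetch each page's
# text, and if has_padlock() is False, prepend the template matching the
# protection level reported by page.protection(). Redirects are skipped per
# Redrose64's comment above.
```

The same check, inverted, could drive Hasteur's suggestion of notifying the protecting admin instead of editing the page.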
BSicons
Could we have a bot that

1. creates a daily-updated log of uploads, re-uploads, page moves and edits in BSicons (Commons files with prefix File:BSicon_);
2. makes a list of Commons redirects with prefix File:BSicon_;
3. uses the list (as well as a list of exceptions, probably this Commons category and its children) to edit RDT code (both {{Routemap}} and {{BSrow}}/{{BS-map}}/{{BS-table}}) which uses those redirects, replacing the redirect name with the newer name (for instance, replacing (HUB83) with (HUBe) and (STRl) with (STRfq));
4. goes through Category:Pages using BSsplit instead of BSsrws and replaces \{\{BSsplit\|([^\|]+)\|([^\|]+)\|\1 \2 ([^\|\{\}]+)\}\} with {{BSsrws|$1|$2|$3}}; and
5. creates a list of BSicons with file size over 1 KB.
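The point-4 replacement can be prototyped with Python's re module. This is a sketch under the assumption that the third parameter simply repeats the first two followed by a suffix; nested templates and other pipe tricks are deliberately not handled:

```python
import re

# Prototype of the {{BSsplit}} -> {{BSsrws}} rewrite from point 4 above.
# \1 and \2 are backreferences to the first two parameters; the third
# parameter must repeat them verbatim followed by a remainder, which
# becomes the third {{BSsrws}} parameter.
BSSPLIT_RE = re.compile(
    r"\{\{BSsplit\|([^|{}]+)\|([^|{}]+)\|\1 \2 ([^|{}]+)\}\}"
)

def rewrite_bssplit(wikitext: str) -> str:
    """Rewrite matching {{BSsplit|A|B|A B rest}} calls to {{BSsrws|A|B|rest}}."""
    return BSSPLIT_RE.sub(r"{{BSsrws|\1|\2|\3}}", wikitext)
```

Calls whose third parameter does not repeat the first two are left untouched, which matches the tracking category's purpose.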
[The example diagram.]
This request is primarily for #2 and #3, since there've been a lot of page moves from confusing icon names recently and CommonsDelinker doesn't work for BSicons because they don't use file syntax. The others would be nice extras, but they're not absolutely necessary if no one wants to work on them. For clarity, an example of #3 would be changing
{{Routemap |map= CONTg\CONTg BHF!~HUB84\BHF!~HUB82 CONTf\CONTf }}
to
{{Routemap |map= CONTg\CONTg BHF!~HUBaq\BHF!~HUBeq CONTf\CONTf }}
(Pinging Useddenim, Lost on Belmont, Sameboat, AlgaeGraphix, Newfraferz87, Redrose64 and YLSS.) Jc86035 (talk) Use {{re|Jc86035}} to reply to me 08:59, 25 October 2016 (UTC)
- Point 1. should be all BSicon files, regardless of filetype, so that those (occasionally uploaded) .png files also get listed. Useddenim (talk) 10:48, 25 October 2016 (UTC)
- Updated request. Thanks. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 11:42, 25 October 2016 (UTC)
To further clarify, the regex for #3 is \n\{\{BS[^\}]+[\|\=]\s*$icon\s*\| for BS-map. I have no idea what it'd be for Routemap, but to the left of the icon ID could be one of \n (newline), ! !, !~ and \\ (escaped backslash); and to the right could be one of \n, !~, ~~, !@, __, !_ and \\. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 06:21, 26 October 2016 (UTC)
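The separator sets above can be assembled into a pattern mechanically. A sketch (treating the two lists as complete is an assumption; the real bot would also need the template-level regex):

```python
import re

# Separator sets for {{Routemap}} |map= code, taken from the comment above.
# r"\n" and r"\\" are already regex syntax, so they are kept as-is; the
# remaining literals are escaped.
LEFT = [r"\n", "! !", "!~", r"\\"]
RIGHT = [r"\n", "!~", "~~", "!@", "__", "!_", r"\\"]

def icon_pattern(icon: str) -> "re.Pattern":
    """Regex matching a bare icon ID delimited by Routemap separators."""
    left = "|".join(s if s in (r"\n", r"\\") else re.escape(s) for s in LEFT)
    right = "|".join(s if s in (r"\n", r"\\") else re.escape(s) for s in RIGHT)
    # group(1) is the icon; the right-hand separator is a lookahead so it
    # is not consumed and adjacent icons can still match.
    return re.compile(rf"(?:^|{left})({re.escape(icon)})(?={right}|$)", re.M)
```

The lookahead on the right keeps the separator available for the next match, which matters when two replaced icons share a separator.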
I started to do some coding for this request, but I will not have time to continue working on it until January. (I have no objections to another botop handling this request before then.) I'm not familiar with the route diagram templates, so I will likely have questions. — JJMC89 (T·C) 18:16, 9 December 2016 (UTC)
- Update: I've done most of the coding for this, but I've run into a Pywikibot bug. The bug affects getting the list of redirects to exclude for #3. (If the bug is not resolved soon, I will try to work around it.) See below for example output for #1. (It will normally be replaced daily and only contain the previous day's changes.) #2 and #5 will be simple bulleted or numbered lists. Which would you prefer? Some clarification for #3: for {{routemap}}, replace in |map= based on the separators above, and for any template with a name starting with BS (or [Bb][Ss]?), replace entire parameter values that match, correct? Example for -BS to v-BSq on Minami-Urawa Station:

  @@ -61 +61 @@
  - {{BS6|dSTRq- orange|O1=dv-NULgq|STRq- orange|O2=-BS|STRq- orange|O3=-BS|STRq- orange|O4=-BS|STRq- orange|O5=-BS|dSTRq- orange|O6=dv-NULgq|5|← {{ja-stalink|Fuchūhommachi}}}}
  + {{BS6|dSTRq- orange|O1=dv-NULgq|STRq- orange|O2=v-BSq|STRq- orange|O3=v-BSq|STRq- orange|O4=v-BSq|STRq- orange|O5=v-BSq|dSTRq- orange|O6=dv-NULgq|5|← {{ja-stalink|Fuchūhommachi}}}}
Example for #1:

{| class="wikitable sortable"
|+ Updated: ~~~~~
! File !! Oldid !! Date/time !! User !! Edit summary
|-
| [[commons:File:BSicon -3BRIDGE.svg]] || 219463995 || 2016-11-25T18:24:57Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -3BRIDGEq.svg]] || 220150188 || 2016-11-26T09:13:44Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3KRZ.svg]] || 220150226 || 2016-11-26T09:13:47Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3KRZo.svg]] || 220150264 || 2016-11-26T09:13:50Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3KRZu.svg]] || 220150305 || 2016-11-26T09:13:52Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -3STRq.svg]] || 220150349 || 2016-11-26T09:13:55Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -BRIDGE.svg]] || 220150391 || 2016-11-26T09:13:58Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -BRIDGE.svg]] || 203687181 || 2016-08-11T08:35:01Z || Jc86035 || Jc86035 uploaded a new version of [[File:BSicon -BRIDGE.svg]]
|-
| [[commons:File:BSicon -BRIDGE.svg]] || 203686022 || 2016-08-11T08:21:46Z || Jc86035 || Jc86035 uploaded a new version of [[File:BSicon -BRIDGE.svg]]
|-
| [[commons:File:BSicon -BRIDGEl.svg]] || 219463892 || 2016-11-25T18:24:47Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BRIDGEq.svg]] || 219463860 || 2016-11-25T18:24:43Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BRIDGEq.svg]] || 204432312 || 2016-08-21T08:45:59Z || Jc86035 || Jc86035 uploaded a new version of [[File:BSicon -BRIDGEq.svg]]
|-
| [[commons:File:BSicon -BRIDGEr.svg]] || 219463904 || 2016-11-25T18:24:48Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BRIDGEvq.svg]] || 219463868 || 2016-11-25T18:24:44Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -BS.svg]] || 203149593 || 2016-08-05T01:20:39Z || Tuvalkin || Tuvalkin moved page [[File:BSicon -BS.svg]] to [[File:BSicon v-BSq.svg]] over redirect: Because that’s how it should be named.
|-
| [[commons:File:BSicon -DSTq.svg]] || 219463968 || 2016-11-25T18:24:54Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -GIPl.svg]] || 190769935 || 2016-03-20T19:57:19Z || Plutowiki || Lizenz
|-
| [[commons:File:BSicon -GIPl.svg]] || 190769739 || 2016-03-20T19:53:46Z || Plutowiki || User created page with UploadWizard
|-
| [[commons:File:BSicon -GRZq.svg]] || 219464007 || 2016-11-25T18:24:59Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -KBSTl.svg]] || 220150428 || 2016-11-26T09:14:01Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -KBSTr.svg]] || 220150461 || 2016-11-26T09:14:03Z || AkBot || Category:Uploaded with UploadWizard removed per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|community decision]]
|-
| [[commons:File:BSicon -L3STRq.svg]] || 219464058 || 2016-11-25T18:25:05Z || SchlurcherBot || Bot: Removing category 'Uploaded with UploadWizard' per [[Commons:Categories for discussion/2016/08/Category:Uploaded with UploadWizard|discussion]]
|-
| [[commons:File:BSicon -LSTR+l.svg]] || 228341148 || 2017-01-02T00:38:51Z || Zyxw59 ||
|-
| [[commons:File:BSicon -LSTR+l.svg]] || 228341113 || 2017-01-02T00:37:48Z || Zyxw59 || User created page with UploadWizard
|}
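For reference, a log table of this shape can be produced by a small formatting helper; the row data would come from the Commons API (e.g. log and recent-changes queries), and the file name in the test below is made up:

```python
# Sketch of the wikitable formatting for the daily BSicon change log (#1).
# Rows would normally be fetched from the Commons API; here they are plain
# tuples of (file, oldid, timestamp, user, summary).
def format_log(rows):
    """Render (file, oldid, timestamp, user, summary) tuples as a wikitable."""
    out = [
        '{| class="wikitable sortable"',
        "|+ Updated: ~~~~~",
        "! File !! Oldid !! Date/time !! User !! Edit summary",
    ]
    for file, oldid, ts, user, summary in rows:
        out.append("|-")
        out.append(f"| [[commons:{file}]] || {oldid} || {ts} || {user} || {summary}")
    out.append("|}")
    return "\n".join(out)
```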
- I haven't coded for #4 yet. Does it need to be done on a regular basis or only once? — JJMC89 (T·C) 02:14, 19 January 2017 (UTC)
- @JJMC89: (pinging Useddenim, Sameboat and AlgaeGraphix) Many thanks, looks good. Don't think #4 is necessary in retrospect, because it would catch some links to rail lines as well and changing those might be counterintuitive. Not sure about #3 but the whole icon name should be changed, if that's what you're saying. For #3 it might be a good idea to have a blacklist of redirects which shouldn't be changed (or a whitelist), because some icons, including (v-BSq), might have been moved to a bad/incorrect name. It might be better to use numbered lists so we could find the length of the lists easily, but I don't mind if they're bulleted. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 02:40, 19 January 2017 (UTC)
- My preference would be for a bulleted list, as it is easier to manipulate with a text editor (vs. stripping out all of the item numbers). If you need the total number of files, it should be trivial to add a count statement to the bot's code. AlgaeGraphix (talk) 18:40, 19 January 2017 (UTC)
- @AlgaeGraphix: Bullets vs numbered is * vs # as the character at the beginning of each line. @Jc86035: There will be an onwiki configuration that contains a blacklist of sorts. It will initially be excluding c:Category:Icons for railway descriptions/Exceptional permanent redirects (recursively including subcategories). Would using {{bsq}} for #1 and/or #2&5 instead of or in addition to the linked file name be beneficial? Also, if desired for #1 the table could include the last n days of changes. For the BRFA: Have there been any prior discussions for this? What pages would you like to use for #1, #2, and #5? (I'll put them in userspace if you don't have a place for them in projectspace.) Do you have an estimate of edits per day for #3? — JJMC89 (T·C) 05:50, 20 January 2017 (UTC)
- @JJMC89: I guess #1, #2 and #5 could go on your Commons userspace, but I don't really mind. For #1 maybe log pages could be sectioned into 24-hour periods (starting 00:00 UTC), like Chumwa's Commons new file logs but in tabular format. Using {{bsq}} would be great. I'm not aware if there have been any prior discussions (Useddenim, Tuvalkin, Sameboat?), although CommonsDelinker has never worked well with BSicons and I believe the only bot that previously did this was Chrisbot for a few months in 2009. For #3, there's probably going to be a very large number of edits on the first day (possibly as many as 5,000), but very few after that. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 09:29, 20 January 2017 (UTC)
- @Jc86035: The bot actions as requested and their suggested splitting between Commons and Wikipedias make sense in my opinion. There were several discussions concerning CommonsDelinker in the past, but the matter is as you presented it. Tuvalkin (talk) 11:09, 20 January 2017 (UTC)
Commons bot request filed. I will file a BRFA here after that has run its course. — JJMC89 (T·C) 00:35, 22 January 2017 (UTC)
- Initial redirect-changing blacklist should include all of these icons; convoluted and boring discussion under way on exactly what's wrong with them or if anything's wrong with them. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 14:03, 22 January 2017 (UTC)
- BRFA filed. — JJMC89 (T·C) 04:22, 30 January 2017 (UTC)
- @JJMC89: Just one thing – {{bsq|redirect}} (if it's going to be edited by the bot) should be replaced with {{bsq|new name|alt=redirect}}. Jc86035 (talk) Use {{re|Jc86035}} to reply to me 08:51, 31 January 2017 (UTC)
- @Jc86035: The bot is only editing {{Routemap}} and {{BS.*}} route diagram templates. {{BSicon quote}} ({{bsq}}) is a rail routemap template, so it will be ignored. — JJMC89 (T·C) 16:38, 31 January 2017 (UTC)
Bot to notify editors when they add a duplicate template parameter
Category:Pages using duplicate arguments in template calls has recently been emptied of the 100,000+ pages that were originally in there, but editors continue to modify articles and inadvertently add duplicate parameters to templates. It would be great to have a bot, similar to ReferenceBot, to notify editors that they have caused a page to be added to that category. ReferenceBot, which notifies editors when they create certain kinds of citation template errors, has been successful in keeping the categories in Category:CS1 errors from overflowing.
Pinging A930913, the operator of ReferenceBot, in case this looks like a fun task. – Jonesey95 (talk) 21:13, 3 November 2016 (UTC)
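The error the request targets is easy to detect for simple cases. A naive sketch that scans one template call for repeated named parameters (a real bot would use a proper wikitext parser such as mwparserfromhell, since plain splitting breaks on pipes inside nested templates and links):

```python
def duplicate_params(template_body: str):
    """Return named parameters that occur more than once in one template call.

    template_body is the text between '{{' and '}}' of a single template,
    e.g. 'Infobox person|name=A|name=B'. Nested templates and wikilinks are
    not handled; this is only an illustration of the check.
    """
    seen, dupes = set(), []
    for part in template_body.split("|")[1:]:  # drop the template name
        name, eq, _value = part.partition("=")
        if not eq:
            continue  # positional parameter; MediaWiki numbers these itself
        name = name.strip()
        if name in seen and name not in dupes:
            dupes.append(name)
        seen.add(name)
    return dupes
```

A notification bot would run this (or the parser equivalent) over the diff that put the page into the tracking category, then message the editor responsible.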
- This sounds like an interesting idea; I think I'd support the creation of such a bot to keep that category from refilling. Dustin (talk) 21:16, 5 November 2016 (UTC)
- I would support this if (and only if) an opt-out was included. This is mostly because of a tagging task my bot runs that occasionally results in duplicate parameters that I must review. This occurs when unexpected parameter values are included on certain talk page templates, and allowing the bug to persist is my (perhaps unorthodox) method of drawing my attention to fixing those erroneous parameter values. I'm quick about cleaning up behind my bot, and I just don't want to have my bot spammed with talk page messages while it's trying to run. Talk page messages shut off the bot until viewed, so that would be very annoying. ~ Rob13Talk 14:16, 8 November 2016 (UTC)
- That sounds reasonable. I believe that ReferenceBot runs once per day at 00:00 UTC, which gives people plenty of time to clean up unless they happen to be editing around that time. Other bots appear to wait for some time after the last edit to a page before tagging the article or notifying the editor who made the erroneous edit. – Jonesey95 (talk) 17:38, 8 November 2016 (UTC)
- We have had at least 500 articles added to this category in the last nine days. A notification bot would be very helpful. – Jonesey95 (talk) 07:48, 18 November 2016 (UTC)
- I would strongly support such a bot; cleaning out this category is like pulling out weeds - they just come back again. Opt-out as suggested by Rob is sensible. --NSH002 (talk) 21:11, 18 November 2016 (UTC)
How about generalising this?
Basically, the bot has a table of categories: each line of the table has (1) the name of a category, (2) the definition of a polite, friendly message to be posted on the perpetrator's talk page, and (3) how long to wait before notifying the editor.
The bot looks at the categories the page was in before and after the edit. If it wasn't in the category before the edit AND it is in the category immediately after the edit AND it's still in the category when the bot is run, then post the message.
So in future, all we need to do for similar cases is to get consensus that a particular cat warrants this treatment, and if so, agreement on the message. Job done.
--NSH002 (talk) 19:28, 21 November 2016 (UTC)
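The membership test described above reduces to set operations. A sketch, with sets standing in for the API category queries:

```python
# Sketch of the generalised check: notify only for monitored categories that
# the edit *added* the page to and that the page is still in when the bot
# runs (so self-corrected errors produce no message).
def should_notify(cats_before, cats_after, cats_now, monitored):
    """Return the monitored categories added by the edit that still persist.

    cats_before/cats_after: the page's categories around the edit.
    cats_now: the page's categories when the bot runs (after the wait).
    monitored: categories from the bot's configuration table.
    """
    added = (cats_after - cats_before) & monitored
    return added & cats_now
```

Each returned category would then be looked up in the configuration table for its message definition and wait time.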
- Sounds good to me. We need a bot operator. – Jonesey95 (talk) 19:47, 21 November 2016 (UTC)
- Just finished coding another bot task. I suppose I'll start {{BOTREQ|doing}} this. Any other botop can feel free to work on this however, as this may take some time. Dat GuyTalkContribs 15:48, 9 December 2016 (UTC)
- @NSH002 and Jonesey95: Could you create a page similar to User:DeltaQuad/UAA/Blacklist in your, DatBot's, or my userspace? Dat GuyTalkContribs 20:14, 13 December 2016 (UTC)
- DatGuy, thank you very much for offering to do this, very much appreciated. But I don't understand what relevance the "Blacklist" you link to has to this particular task. Could you explain, please? --NSH002 (talk) 20:24, 13 December 2016 (UTC)
- I was about to respond, but reread the proposal and found it very different than the original one (in the main section). Could you rephrase it in other words? Dat GuyTalkContribs 20:47, 13 December 2016 (UTC)
- The original proposal warns editors when, as the result of some oversight or mistake, they inadvertently add an article to an error-tracking category (the one specified at the top of the proposal). Note that I said the "definition" and not the "text" of the error message, since the definition would incorporate parameters that would be evaluated at run time. Apart from this subtlety, exactly the same code should be able to do the job whatever the category involved. There would simply be a separate, fully protected file that an admin could update whenever consensus and agreement has been reached that a particular category warrants this treatment. No need to write a separate bot each time we want to do this for another category (though sometimes a new parameter may be needed for the message definition, but that should be a fairly simple job). Note that this bot has the potential to drastically reduce the workload of fixing errors. Remember GIGO: "garbage in, garbage out" - much better to trap errors at the earliest possible stage. --NSH002 (talk) 22:19, 13 December 2016 (UTC)
- DatGuy: The original proposal is a specific case of the more general system described in this section. In the original case:
- An editor makes an edit to a page, inadvertently adding a duplicate template parameter. This adds the page to the error-tracking category for duplicate parameters.
- If the error persists after a specified period of time, the editor is notified that he/she created an error and is provided a link to the diff and to the error category, or to a page that explains how to fix the error.
- If you look at ReferenceBot's user page, you can see a list of error-tracking categories that are monitored by that bot. Here's a link to one of that bot's notifications. You might be able to start with that bot's source code and generalize it to a variety of categories. The "generalising" proposal would result in an admin-modifiable page that specified the error categories that should be checked and the messages (or links to messages) that should be delivered to editors. – Jonesey95 (talk) 23:48, 13 December 2016 (UTC)
- Asap, I'll start coding it for only the category referenced above. If we want to generalise it, I'm sure it will be easy. Dat GuyTalkContribs 15:56, 14 December 2016 (UTC)
I'm super sorry, but I won't be able to do it. The one main point in ReferenceBot is to look for a class error, which I can't understand for these errors. I don't have enough time to code a whole new bot task. Again, sorry. Dat GuyTalkContribs 20:17, 25 December 2016 (UTC)
VeblenBot
User:VeblenBot handles many of the routine chores associated with Peer Review. It was developed by User:CBM, and is currently in my care, but neither of us has the time or inclination to run it. Would someone be able to take it over? If so, please reply here. Thanks, Ruhrfisch ><>°° 19:06, 20 November 2016 (UTC)
- Much of this is just a task-specific archiving job. It would be possible for someone else to rewrite this in a bot framework of their choice without too much work, instead of taking over the existing code. It's an important task for the Peer Review system, but I can't manage it any longer. — Carl (CBM · talk) 13:05, 21 November 2016 (UTC)
- Are there details anywhere of what exactly the bot does? I'm not finding a relevant-looking BRFA for "many routine chores". Anomie⚔ 17:27, 26 November 2016 (UTC)
- @Anomie: I'm also approved for this task, and I also can't maintain it, sadly. My implementation was kind of shit anyway. See Wikipedia:Bots/Requests_for_approval/BU_RoBOT_9 for task details, though. ~ Rob13Talk 05:13, 21 January 2017 (UTC)
- One BRFA is Wikipedia:Bots/Requests_for_approval/VeblenBot_5. The PR system worked by having the bot make a page that tracks category contents. This is a relatively straightforward task: given a list of categories and templates, it generates wiki pages which list the category contents using the templates. By looking at subpages of User:VeblenBot/C/, it should be possible to recreate the list of categories that need to be tracked. There is more information at Template:CF and Wikipedia:Peer_review/Tools#Peer_review_process_-_technical_details. — Carl (CBM · talk) 01:41, 22 January 2017 (UTC)
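The generation step Carl describes is small in any framework; a minimal Python sketch (fetching the category members needs a MediaWiki client and is stubbed out here — only the page-building is shown):

```python
def listify(category, members, template):
    """Build wikitext for a tracking page such as User:VeblenBot/C/...:
    one {{template|Title}} line per category member, sorted so repeated
    runs produce stable diffs."""
    lines = ["<!-- Contents of [[:Category:%s]], updated by bot -->" % category]
    lines += ["{{%s|%s}}" % (template, title) for title in sorted(members)]
    return "\n".join(lines)
```

A real bot would then compare the result with the current page text and save only when something changed.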
- @BU Rob13: BRFA filed Anomie⚔ 01:21, 23 January 2017 (UTC)
- @Enterprisey: Didn't realize Anomie had taken this on. It wouldn't be a horrible thing to have multiple bot ops able to do this, but you may prefer to devote time elsewhere. Entirely up to you. Sorry if you've already started development. ~ Rob13Talk 02:29, 23 January 2017 (UTC)
- No problem, and thanks for letting me know. Enterprisey (talk!) 03:40, 23 January 2017 (UTC)
Replacement Peer review bot - VeblenBot - URGENT
Peer review generally has 30-50 active reviews. We rely on a single bot, VeblenBot, to process new reviews and archive old reviews, otherwise the whole system crumbles. Unfortunately for the last 3 or so years we have had a large number of problems because the bot is not well supported and frequently is inactive.
We and the thousands of Wikipedians who use peer reviews would be very grateful if a functional replacement bot could be created that works consistently. I can supply more technical details about the process later, it is documented at WP:PR. Many thanks if you can solve this!! --Tom (LT) (talk) 12:57, 30 December 2016 (UTC)
- Tom (LT), I'm trying to understand what it does. The bot takes as input Category:Arts peer reviews and produces as output User:VeblenBot/C/Arts peer reviews. Does it also remove entries? Does it retrieve data from other places? -- GreenC 19:06, 30 December 2016 (UTC)
- Green Cardamom, see WP:PR tab "technical details" --Tom (LT) (talk) 00:07, 31 December 2016 (UTC)
- I've seen that. It's called the "Tools" tab BTW. -- GreenC 00:54, 31 December 2016 (UTC)
- As it's hosted on Labs, I think the easiest thing would be for User:CBM (who I think is inactive as a bot op) or User:Ruhrfisch to add another keen Perl enthusiast to their Labs project to help out from time to time. Unfortunately I don't do Perl but I know plenty of people watching this page do! - Jarry1250 [Vacation needed] 20:13, 30 December 2016 (UTC)
- @LT910001: At one point, I had taken this on I believe, but I petered out on running the task. That was my fault. Unfortunately, I can't run the bot for the next two weeks because I'm out-of-town. I can run it when I get back, but I'm much busier these days than I used to be, so I probably can't do it long-term. ~ Rob13Talk 23:44, 30 December 2016 (UTC)
- Thanks for your offer, and if you could activate the bot infrequently that would be better than not at all, but we really need a longer term solution here. --Tom (LT) (talk) 00:07, 31 December 2016 (UTC)
- @LT910001: Totally fell off my radar over the two weeks I was out of town, but I eventually remembered this. The bot is running now. ~ Rob13Talk 05:11, 21 January 2017 (UTC)
- Thank you BU Rob13, much appreciated!--Tom (LT) (talk) 05:43, 21 January 2017 (UTC)
Can a bot operator have one of their bots fill in for what the bot above is supposed to do? I've noticed that the peer review nomination pages haven't been updated since November. -- 1989 (talk) 23:38, 21 January 2017 (UTC)
- Coding... Anomie⚔ 01:28, 22 January 2017 (UTC)
- The same is true for the two GAR-related pages handled by VeblenBot: one is at User:VeblenBot/C/Wikipedia good article reassessment and controls the community reassessments that are transcluded at WP:GAR, the main Good Article Reassessment page—we've been adding and subtracting these by hand as community GARs show up at Category:Good article reassessment nominees (which includes both community and individual reassessments), or the GARs are closed and vanish from the category. The other is at User:VeblenBot/C/Good articles in need of review, and is based on the {{GAR request}} templates and their associated category, Category:Good articles in need of review. This last hasn't been updated since November either. I'm not sure what it would take to get these two VeblenBot chores up and working again, but it would be greatly appreciated. Many thanks. BlueMoonset (talk) 01:57, 22 January 2017 (UTC)
- I've gotten code done to start populating those lists as subpages of User:AnomieBOT/C (since edits to the bot's own userspace don't need a BRFA). The categories to listify are configurable on-wiki if more are needed, see the instructions on that page. Going to look at the replacement for Wikipedia:Bots/Requests for approval/BU RoBOT 9 next. Anomie⚔ 03:06, 22 January 2017 (UTC)
- @Anomie: You forgot to make this one, User:AnomieBOT/C/List peer reviews, list articles for PR. -- 1989 (talk) 03:23, 22 January 2017 (UTC)
Anomie, thanks for helping them with this. It is not a hard task, but I needed to move on to other things. — Carl (CBM · talk) 14:55, 22 January 2017 (UTC)
- Anomie, my thanks, too. Do you or CBM know why the User:AnomieBOT/C/Wikipedia good article reassessment page display omits the last several entries (the ones from 2017)? The same thing was happening on the VeblenBot page, and while it doesn't prevent the page from working correctly with WP:GAR, something doesn't seem to be working as it should. Please let me know when you consider these pages ready to be used officially—when the bot runs on a regular schedule—and I'll adjust the GAR pages accordingly. BlueMoonset (talk) 16:09, 22 January 2017 (UTC)
- @BlueMoonset: All 12 articles named like "Wikipedia:Good article reassessment/" in Category:Wikipedia good article reassessment are showing up on User:AnomieBOT/C/Wikipedia good article reassessment; the answer is probably that some change made the last several entries not be in that category anymore. Anomie⚔ 19:06, 22 January 2017 (UTC)
- Also, AnomieBOT is running on a regular schedule already (it checks for updates to the lists hourly). Feel free to change things over. Anomie⚔ 19:10, 22 January 2017 (UTC)
- Anomie, thanks for letting me know that AnomieBOT is now handling these pages and checking on an hourly basis. I'll update the affected GAR pages in a few minutes. As for the 2017-dated entries not displaying on the User:AnomieBOT/C/Wikipedia good article reassessment page, they're still in the category; this has been an issue since they were first manually added to the User:VeblenBot/C/Wikipedia good article reassessment page starting back on January 7. It doesn't affect the transclusions on the WP:GAR page, but it's odd that the AnomieBOT page, like the VeblenBot page before it, doesn't display them. This may be something down in the weeds of the CF suite workings; I didn't see that any of the peer review pages use this name-only format, so there may be a parameter somewhere that prevents post-2016 entries from displaying in this one case. BlueMoonset (talk) 19:34, 22 January 2017 (UTC)
- @BlueMoonset: Oh, I see what you're referring to now, they're not showing up in the rendered page. Template:CF/GAR/Default (used by Template:CF/Wikipedia good article reassessment) has logic to only show reassessments that are more than 17 days old. The first 2017 assessment, from Jan 6, should start showing up around 16 hours from now. Anomie⚔ 20:39, 22 January 2017 (UTC)
- Anomie, thanks for looking in to it. Presumably there's a good historical reason for the 17 day delay; it's good to know what's causing the display to work as it does. BlueMoonset (talk) 20:57, 22 January 2017 (UTC)
- Adding: it looks like 1989 made the switchover at 03:30, so AnomieBOT has been on the job for over 16 hours at GAR. Thanks again. BlueMoonset (talk) 19:39, 22 January 2017 (UTC)
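For anyone else puzzled by the same behaviour: the age gate in Template:CF/GAR/Default amounts to the following (a Python paraphrase of the template logic; the 17-day figure comes from Anomie's comment above, and the exact boundary comparison in the template may differ):

```python
from datetime import datetime, timedelta

def visible(opened, now, min_age_days=17):
    """True once a reassessment opened at `opened` is old enough to be
    shown on the rendered GAR page."""
    return now - opened >= timedelta(days=min_age_days)
```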
Missing BLP template
We need a bot that will search for all articles in Category:Living people that lack {{BLP}} (or an alternative) on the article's talk page, and add the missing template to those pages. --XXN, 21:21, 20 November 2016 (UTC)
- Ideally, not {{BLP}} directly, but indirectly via {{WikiProject Biography|living=yes}}. But we once had a bot that did that, I don't know what happened to it. --Redrose64 (talk) 10:33, 21 November 2016 (UTC)
- {{WikiProject Biography|living=yes}} adds the biography to Category:Biography articles of living people. TheMagikCow (talk) 18:48, 16 January 2017 (UTC)
- Hi @Redrose64:, what was that bot's name? We faced such a need recently during the Wiki Loves Africa photo contest on Commons: hundreds of pictures in a parent category were missing a certain template. I am planning to build a bot, or adapt an existing one, for similar cases.--African Hope (talk) 17:08, 4 February 2017 (UTC)
- I'll code this. The condition would be: the article has a living people category, but the talk page has no {{BLP}}, {{WikiProject Banner Shell}} or {{WikiProject Biography}}. I might expand this to also catch pages that have a 'living people' category but no living parameter. Dat GuyTalkContribs 17:28, 4 February 2017 (UTC)
- I don't recall. Maybe Rich Farmbrough (talk · contribs) knows? --Redrose64 🌹 (talk) 00:20, 5 February 2017 (UTC)
- I have been fixing members of Category:Biography articles without living parameter along with User:Vami_IV for some time. Menobot ensures that most biographies get tagged. I also did a one-off to tag such biographies a couple of months ago. All the best: Rich Farmbrough, 00:32, 5 February 2017 (UTC).
IP-WHOIS bot
During vandal hunting I've noticed that IP vandals usually stop in their tracks the moment you add the 'Shared IP' template (with WHOIS info) to their Talk page. I assume they then realise they're not as anonymous as they thought. A bot that would automatically add that WHOIS template to an IP vandal's Talk page, let's say once they've reached warning level 2, would prevent further vandalism in a lot of cases. I don't know if this needs to be a new bot or if it could be added to ClueBot's tasks. I think ClueBot would be the best option since it already leaves warnings on those Talk pages, so adding the Shared/WHOIS template as well would probably be the fastest option. Any thoughts? Mind you, I'm not a programmer so there's no way I could code this thing myself. Yintan 20:27, 30 November 2016 (UTC)
- This would be fairly easy to do. Coding... Tom29739 [talk] 17:32, 8 December 2016 (UTC)
- Nice idea, Tom29739 what's the status on this? 103.6.159.67 (talk) 08:04, 16 January 2017 (UTC)
- This is still being coded, development has slowed unfortunately due to being very busy in real life. Tom29739 [talk] 22:40, 18 January 2017 (UTC)
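On the trigger condition ("reached warning level 2"): a rough sketch of detecting the warning level from talk-page wikitext, assuming the standard {{uw-...N}} naming of user warning templates. Note that in practice these warnings are substituted, so a production bot would need to match the substituted output (or use the warning history some other way) rather than literal template calls; this only illustrates the level comparison:

```python
import re

def max_warning_level(talk_wikitext):
    """Highest user-warning level (1-4) found in unsubstituted
    {{uw-...N}} calls on a talk page; 0 if none are present."""
    levels = re.findall(r"\{\{\s*uw-[a-z]+([1-4])", talk_wikitext, re.I)
    return max(map(int, levels), default=0)
```

The bot would add the Shared IP/WHOIS template once this returns 2 or more and the template is not already present.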
Update WikiWork factors
Hi, per what was discussed at Wikipedia talk:Version 1.0 Editorial Team/Index#Wikiwork factors, I'm asking that the WikiWork factors for WikiProjects be updated. I myself frequently reference them, they're pretty useful overall to scope out a project, so it would be nice to get them working again. Thanks, Icebob99 (talk) 16:16, 13 December 2016 (UTC)
- Hello? Is this request feasible? Icebob99 (talk) 16:17, 16 December 2016 (UTC)
Just to give more context: at Wikipedia talk:Version 1.0 Editorial Team/Index we are having a discussion regarding the WP 1.0 bot (talk · contribs). One of the functions of the bot is to update the WikiWork factors displayed in the WikiProject assessment tables. However, the bot stopped updating them in July 2015. Requests to the bot owner have not been answered, since they seem to have retired. Can someone here please see what the issue with the bot is and make it run again? The bot in question which updates the WikiWork numbers is called Theo's Little Bot (talk · contribs). It has been doing other jobs, as can be seen, just skipping the WikiWork updates. —IB [ Poke ] 14:35, 17 December 2016 (UTC)
- Just for reference: here's the manual calculator; you can divide the score you get from that tool by the total number of articles in the project to get the relative score. Getting the bot to do this, of course, would be the ideal scenario. Icebob99 (talk) 00:52, 18 December 2016 (UTC)
- Thanks for the URL @Icebob99:, now I can get the progression of each Wikiproject. Do I need to update the Wikiwork page to reflect this so that its assimilated in the project assessment table? —IB [ Poke ] 06:33, 19 December 2016 (UTC)
- @IndianBio: I went from User:WP 1.0 bot/Tables/Project/Microbiology to User:WP 1.0 bot/WikiWork and found in the documentation that there are four different pages that the User:Theo's Little Bot used to update: User:WP 1.0 bot/WikiWork/ww, the overall WikiWork score; User:WP 1.0 bot/WikiWork/ar, the total number of articles in the project; User:WP 1.0 bot/WikiWork/om, the relative WikiWork score; and User:WP 1.0 bot/WikiWork/ta, the table that contains the overall and relative scores. If you look at the history of each of those pages, the bot was updating them until 2 July 2015. You can update those pages manually by inputting the numbers by hand and then inputting the score from the calculator into the overall WikiWork score page, the number of articles page, or the relative WikiWork score page. The table generator page used values from those three pages. So to answer your question, yes you do need to update one of those WikiWork pages for it to show up in the project assessment table. (Anyone who is looking at reviving User:Theo's Little Bot could also use this info). Icebob99 (talk) 16:15, 19 December 2016 (UTC)
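For anyone reimplementing the task, the arithmetic behind those pages is simple; a sketch assuming the usual WikiWork weighting (each class counts the assessment steps short of FA — verify the exact weights against the WP 1.0 documentation before relying on them):

```python
# Assumed weights: assessment steps remaining to reach FA-Class.
WEIGHTS = {"FA": 0, "A": 1, "GA": 2, "B": 3, "C": 4, "Start": 5, "Stub": 6}

def wikiwork(counts):
    """Given {class: article count}, return (omega, articles, relative),
    i.e. the /ww, /ar and /om values the bot wrote to its subpages."""
    omega = sum(WEIGHTS[cls] * n for cls, n in counts.items())
    total = sum(counts.values())
    return omega, total, (omega / total if total else 0.0)
```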
- @Icebob99: I can't thank you enough for guiding me in generating the score and updating them. The project templates are finally reflecting the current status. —IB [ Poke ] 04:36, 20 December 2016 (UTC)
- It's my pleasure just as much as yours! Although, it would be nice to have the bot do it rather than individual WikiProject editors... Icebob99 (talk) 04:39, 20 December 2016 (UTC)
- @1989:, I see that you are active on this page, may I ask you to please look through this request once? —IB [ Poke ] 07:36, 20 December 2016 (UTC)
@IndianBio and Icebob99: The source code is available, and I might be able to take over, especially because of the recent m:Requests for comment/Abandoned Labs tools discussion. Would that be useful? Dat GuyTalkContribs 17:26, 20 December 2016 (UTC)
- @DatGuy: That would be great! I'm not too knowledgeable in the world of bots, but I think that this is one of those where it gets running and goes on for a long time. Thanks! Icebob99 (talk) 18:01, 20 December 2016 (UTC)
- @DatGuy: thanks for your response, did you have any progress with the bot's functionality? Sorry for asking. —IB [ Poke ] 16:11, 26 December 2016 (UTC)
- No problem at all. The "committee" that should oversee the take-overs isn't actually created yet. I'll try and test it on my own computer and fix any minor bugs related to new updates before I start a BRFA. However, the code will still be private to respect Theo's wishes. Dat GuyTalkContribs 10:45, 27 December 2016 (UTC)
- Hi, is there any progress? I'm not familiar with how long bots take to fix (you folks do some arcane sorcery), so I might be asking preemptively. Icebob99 (talk) 03:27, 19 January 2017 (UTC)
- Coding.... Dat GuyTalkContribs 11:41, 1 February 2017 (UTC)
BRFA filed. Dat GuyTalkContribs 17:09, 1 February 2017 (UTC)
- Thanks! WikiWork works again! Icebob99 (talk) 14:19, 13 February 2017 (UTC)
Bot proposal: convert archive.org djvu.txt links
This is for a bot I have mostly already written for other things, wanted to pass it by here before BRFA.
Regarding links that look like this:
It would be better pointed to the main work page:
There are multiple formats available at the Internet Archive, and linking to the main work page is better than to the djvu.txt file, which is raw OCR output with a considerable error rate. The djvu.txt is available from the main work page along with other formats such as PDF and a GUI interface to the scanned book ("flip book"). I think the reason most editors use the djvu.txt is that they found it in a Google search and then copy-pasted the URL into Wikipedia.
A database search shows about 8200 articles contain the "_djvu.txt" suffix (about 9500 links), and doing the conversion would be simple and mostly error free (other than unforeseen gigo). -- GreenC 18:22, 18 December 2016 (UTC)
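The conversion itself is one pattern replacement. A sketch, assuming the links follow the common archive.org shapes (/stream/ or /download/ paths ending in _djvu.txt, rewritten to the /details/ page, with the identifier as the first path segment — worth verifying against the actual link corpus before a run):

```python
import re

# e.g. https://archive.org/stream/<id>/<name>_djvu.txt -> https://archive.org/details/<id>
DJVU_TXT = re.compile(
    r"https?://(?:www\.)?archive\.org/(?:stream|download)/([^/\s]+)/\S*_djvu\.txt"
)

def convert(url):
    """Rewrite a _djvu.txt link to the main work page; leave others alone."""
    m = DJVU_TXT.match(url)
    return "https://archive.org/details/" + m.group(1) if m else url
```

(The identifier "worksample00" in any examples is hypothetical.)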
- Go for it, mate ProgrammingGeek (Page! • Talk! • Contribs!) 19:56, 18 December 2016 (UTC)
MarkAdmin.js
Hello.
I would like to transfer the following script to Wikipedia so users such as myself could identify which users are the following:
- Administrators (by default)
- Bureaucrats (by default)
- Checkusers (by default)
- Oversighters (by default)
- ARBCOM Members (optional)
- OTRS Members (optional)
- Edit Filter Managers (optional)
- Stewards (optional)
https://commons.wikimedia.org/wiki/MediaWiki:Gadget-markAdmins.js
I would like a bot to frequently update the list to make the information accurate.
https://commons.wikimedia.org/wiki/MediaWiki:Gadget-markAdmins-data.js 1989 (talk) 19:30, 18 December 2016 (UTC)
- On hold See Wikipedia:Village_pump_(technical)#MarkAdmin.js. 1989 (talk) 21:10, 18 December 2016 (UTC)
- Add importScript('User:Amalthea/userhighlighter.js'); to your "skin".js file to show admins Ronhjones (Talk) 21:27, 3 January 2017 (UTC)
add birthdate and age to infoboxes
Here's a thought... How about a bot to add {{birth date and age}}/{{death date and age}} templates to biography infoboxes that just have plain text dates? --Zackmann08 (Talk to me/What I been doing) 18:13, 20 December 2016 (UTC)
- These templates provide the dates in microformat, which follows ISO 8601. ISO 8601 only uses the Gregorian calendar, but many birth and death dates in Wikipedia use the Julian calendar. A bot can't distinguish which is which, unless the date is after approximately 1924, so this is not an ideal task to assign to a bot. (Another problem is that if the birth date is Julian and the death date is Gregorian the age computation could be wrong.) Jc3s5h (talk) 19:07, 20 December 2016 (UTC)
- @Jc3s5h: that is a very valid point... One thought, the bot could (at least initially) focus on only people born after 1924 (or whichever year is decided). --Zackmann08 (Talk to me/What I been doing) 19:13, 20 December 2016 (UTC)
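A conversion step with that cutoff could look like the following sketch (plain-text dates come in many formats; only the common "day Month year" form is handled here, and 1924 is used as the Gregorian-safe cutoff per the discussion above):

```python
from datetime import datetime

def to_birth_template(text, cutoff_year=1924):
    """Turn a plain-text date such as '15 May 1990' into
    {{birth date and age|1990|5|15|df=y}}. Returns None (skip the
    article) when the date can't be parsed or is not safely Gregorian."""
    try:
        d = datetime.strptime(text.strip(), "%d %B %Y")
    except ValueError:
        return None
    if d.year <= cutoff_year:
        return None  # possibly Julian; leave for a human editor
    return "{{birth date and age|%d|%d|%d|df=y}}" % (d.year, d.month, d.day)
```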
- Without comment on feasibility, I support this as useful for machine-browsing. The ISO 8601 format is useful even if the visual output of the page doesn't change. ~ Rob13Talk 08:22, 30 December 2016 (UTC)
I'll go for it. I am filing a BRFA after my wikibreak. -- Magioladitis (talk) 08:24, 30 December 2016 (UTC)
- When I open the edit window, I just see a bunch of template clutter, so I would like to understand what the template is used for, who on WP uses it, and specifically what the purpose of microformat dates is. It strikes me that the infoboxes are sufficiently well labelled for any party to pull date metadata off them without recourse to additional templates. -- Ohc ¡digame! 23:13, 14 February 2017 (UTC)
Can a useful bot be taken over and repaired?
(Was posted at WP:VPT; user:Fastily suggested posting here if there were no takers)
User:Theopolisme is fairly inactive (last edit May). He made User:Theo's Little Bot. Of late the bot has not been behaving very well on at least one of its tasks (Task 1 - reduction of non-free images in Category:Wikipedia non-free file size reduction requests). It typically starts at 06:00 and will drop out usually within a minute or two (although sometimes one is lucky and it runs for half an hour occasionally). Messages on talk pages and GitHub failed to reach the user. User:Diannaa and I both sent e-mails, and Diannaa did get a reply - he is very busy elsewhere, and hopes to maybe look over Xmas... In view of the important work it does, Diannaa suggested I ask at WP:VPT if there was someone who could possibly take the bot over? NB: See also Wikipedia:Bot requests#Update WikiWork factors Ronhjones (Talk) 19:44, 25 December 2016 (UTC)
- Now this should be a simple task. Doing... Dat GuyTalkContribs 12:39, 27 December 2016 (UTC)
- @DatGuy: FWIW, I'm very rusty on python, but I tried running the bot off my PC (with all saves disabled of course), and the only minor error I encountered was resizer_auto.py:49: DeprecationWarning: page.edit() was deprecated in mwclient 0.7.0 and will be removed in 0.9.0, please use page.text() instead.. I did note that the log file was filling up, maybe after so long unattended, the log file is too big. Ronhjones (Talk) 16:24, 28 December 2016 (UTC)
- Are you sure? See [2]. When it tries to upload it, the file is corrupted. However, the file is fine on my local machine. Can you test it on the file? Feel free to use your main account, I'll ask to make it possible for you to upload files. As a side note, could you join ##datguy connect so we can talk more easily (text, no voice). Thanks. Dat GuyTalkContribs 16:33, 28 December 2016 (UTC)
- Well just reading the files is one thing, writing them back is a whole new ball game! Commented out the "theobot.bot.checkpage" bit, changed en.wiki to test.wiki (2 places), managed to login OK, then it goes bad - see User:Ronhjones/Sandbox2 for screen grab. And every run adds two lines to my "resizer_auto.log" on the PC. Bit late now for any more. Ronhjones (Talk) 01:44, 29 December 2016 (UTC)
- Ah, just spotted the image files in the PC directory - 314x316 pixels, perfect sizing. Does that mean the bot's directory is filling up with thousands of old files? Just a thought. Ronhjones (Talk) 01:49, 29 December 2016 (UTC)
- See for yourself :). Weird thing for me is, I can upload it manually from the API sandbox on testwiki just fine. When the bot tries to do it via coding? CORRUPT! Dat GuyTalkContribs 10:28, 30 December 2016 (UTC)
- 25 GB of temp image files!! - is there a size limit per user on that server? Somewhere (in the back of my mind - I know not where - trouble with getting old..., and I could be very wrong) I read that he was using a modified mwclient... My PC fails when it hits the line site.upload(open(file), theimage, "Reduce size of non-free image... and drops to the error routine. I tried to look up the syntax of that command (not a lot of documentation) and it does not seem to fully agree with his format. Ronhjones (Talk) 23:29, 30 December 2016 (UTC)
- OTOH, I just looked at the test image, have you cracked it? Ronhjones (Talk) 23:31, 30 December 2016 (UTC)
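Two notes for whoever ends up maintaining this: binary files must be opened with open(path, 'rb') before being handed to an upload call (text mode is a classic cause of "corrupt" uploads), and the resize target itself is simple arithmetic — a sketch, using 0.1 megapixels as the commonly cited ceiling for non-free images (confirm the exact figure against current guidance):

```python
import math

def target_dimensions(width, height, max_pixels=100_000):
    """Scale (width, height) down so the pixel area is at most
    max_pixels while preserving aspect ratio; images already under
    the ceiling are returned unchanged."""
    if width * height <= max_pixels:
        return width, height
    scale = math.sqrt(max_pixels / (width * height))
    return max(1, int(width * scale)), max(1, int(height * scale))
```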
BRFA filed. Dat GuyTalkContribs 09:19, 1 January 2017 (UTC)
- @DatGuy: And approved I see - Is it now running? I'll stop the original running. I see it was that "open" statement that was the issue I had! Ronhjones (Talk) 00:34, 3 January 2017 (UTC)
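For reference, the classic cause of exactly this kind of upload corruption is handing mwclient a file opened in text mode; its upload call expects a binary file object. Below is a minimal sketch of the point (the mwclient call itself appears only as a comment; the file name and summary are placeholders, not the bot's actual code):

```python
import os
import tempfile

# Simulated image payload containing the bytes that text-mode handling
# mangles on Windows (CRLF translation) or chokes on (decoding errors).
# A binary round-trip must preserve them exactly.
payload = b'\x89PNG\r\n\x1a\n\x00' + bytes(range(256))

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, 'wb') as f:
    f.write(payload)

# Always hand the upload a file opened with 'rb', never 'r':
with open(path, 'rb') as f:
    roundtrip = f.read()
assert roundtrip == payload

# Sketch only, not run here:
# site.upload(open(path, 'rb'), theimage, "Reduce size of non-free image")
os.remove(path)
```

The same round-trip check is a quick way to rule the local filesystem in or out when diagnosing a corrupt upload.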
Autoassess redirects
A bot that patrols articles and reassesses WikiProject banners when an article has been redirected. (As far as I can tell, this doesn't exist.) It's an easy place to save editor patrol time. I would suggest that such a bot remove the class/importance parameters altogether (rather than assessing as |class=Redirect
) because then the template itself will (1) autoassess to redirect as necessary, and (2) autoassess to "unassessed" (needing editor attention) if/when the redirect is undone. But bot assistance in that first step should be uncontroversial maintenance. Alternatively, the bot could remove WP banners when the project doesn't assess redirects, though I think the better case would be to leave them (let the project banners autoassess as "N/A") rather than not having the page tracked. czar 14:32, 6 January 2017 (UTC)
- This seems like a good idea, and I would agree with removing class/importance as the best implementation. --Izno (talk) 14:49, 6 January 2017 (UTC)
- Yeah, good idea. No coders for this yet? 103.6.159.72 (talk) 19:14, 18 January 2017 (UTC)
- I have some code for this. Czar and Izno, how do you think the bot should react when the talk page has banners with class set to redirect and importance set to some value? I think there's some value to keeping the importance parameter, but it may be unimportant in the long run. Enterprisey (talk!) 21:01, 19 January 2017 (UTC)
- If the page has been redirected for a week, I'd consider it uncontroversial to wipe both quality and importance parameters, which both should be reassessed by a human if/when the article is restored. (The WikiProject template automatically recategorizes when the redirect is removed for a human to do this.) I see two cases, though, (1) updating the WikiProject templates when the redirecting editor does not, and (2) removing manual assessments as "Redirects" to let the template autoassess on its own. There could be issues with the latter, so I'd focus on the former case, which is the most urgent. I would think some kind of widespread input would be needed for the latter, considering how some project may desire manual assessments across the board and/or keeping their importance params on redirects, for whatever reason. Thanks for your work! Looking forward to the bot. czar 21:06, 19 January 2017 (UTC)
- Perhaps the bot could leave a comment; something like
<!-- EnterpriseyBot reassessed this from High Start to no parameters -->
. Articles which are un-redirected would then give the editor an opportunity to review the old classification. --Izno (talk) 13:11, 20 January 2017 (UTC)
- I'm strongly, strongly opposed to wiping importance. Some projects may use that to identify targets needing creation which are currently redirects. This should be a project-level decision. As for removing classes, that's uncontroversial. Not sure how you're implementing this, but I'd suggest you want articles that have been redirects for at least 48 hours to avoid wiping classes from articles which are blanked and redirected by vandals, etc. ~ Rob13Talk 05:01, 21 January 2017 (UTC)
- At the moment, the bot skips anything that hasn't been a redirect for a week. I agree that the importance parameter shouldn't be affected by whatever the page contains, so at the moment it isn't touched. The BRFA might also be a good place to discuss this. Enterprisey (talk!) 01:57, 22 January 2017 (UTC)
- BRFA filed Enterprisey (talk!) 02:45, 22 January 2017 (UTC)
- And marking this as Done, as I'm continuing to run the task. Enterprisey (talk!) 20:16, 10 February 2017 (UTC)
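For anyone curious about the mechanics, the wikitext edit itself is simple; here is a rough sketch of the parameter-blanking step (a bare regex for illustration only; an actual bot would use a proper template parser and apply the one-week redirect check discussed above):

```python
import re

def strip_assessment(banner_wikitext):
    """Blank |class= and |importance= so the WikiProject banner
    autoassesses the page (e.g. as a redirect) on its own."""
    return re.sub(r'\|\s*(?:class|importance)\s*=\s*[^|}]*', '',
                  banner_wikitext)

print(strip_assessment('{{WikiProject Birds|class=Start|importance=Low}}'))
# {{WikiProject Birds}}
```

Note that per the thread above, the deployed task left importance untouched; the regex shows both parameters only to illustrate the transformation.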
Copy coordinates from lists to articles
Virtually every one of the 3000-ish places listed in the 132 sub-lists of National Register of Historic Places listings in Virginia has an article, and with very few exceptions, both lists and articles have coordinates for every place, but the source database has lots of errors, so I've gradually been going through all the lists and manually correcting the coords. As a result, the lists are a lot more accurate, but because I haven't had time to fix the articles, tons of them (probably over 2000) now have coordinates that differ between article and list. For example, the article about the John Miley Maphis House says that its location is 38°50′20″N 78°35′55″W / 38.83889°N 78.59861°W, but the manually corrected coords on the list are 38°50′21″N 78°35′52″W / 38.83917°N 78.59778°W. Like most of the affected places, the Maphis House has coords that differ only a small bit, but (1) ideally there should be no difference at all, and (2) some places have big differences, and either we should fix everything, or we'll have to have a rather pointless discussion of which errors are too little to fix.
Therefore, I'm looking for someone to write a bot to copy coords from each place's NRHP list to the coordinates section of {{infobox NRHP}} in each place's article. A few points to consider:
- Some places span county lines (e.g. bridges over border streams), and in many of these cases, each list has separate coordinates to ensure that the marked location is in that list's county. For an extreme example, Skyline Drive, a scenic 105-mile-long road, is in eight counties, and all eight lists have different coordinates. The bot should ignore anything on the duplicates list; this is included in citation #4 of National Register of Historic Places listings in Virginia, but I can supply a raw list to save you the effort of distilling a list of sites to ignore.
- Some places have no coordinates in either the list or the article (mostly archaeological sites for which location information is restricted), and the bot should ignore those articles.
- Some places have coordinates only in the list or only in the article's {{Infobox NRHP}} (for a variety of reasons), but not in both. Instead of replacing information with blanks or blanks with information, the bot should log these articles for human review.
- Some places might not have {{infobox NRHP}}, or in some cases (e.g. Newport News Middle Ground Light) it's embedded in another infobox, and the other infobox has the coordinates. If {{infobox NRHP}} is missing, the bot should log these articles for human review, while embedded-and-coordinates-elsewhere is covered by the previous bullet.
- I don't know if this is the case in Virginia, but in some states we have a few pages that cover more than one NRHP-listed place (e.g. Zaleski Mound Group in Ohio, which covers three articles); if the bot produced a list of all the pages it edits, a human could go through the list, find any entries with multiple appearances, and check them for fixes.
- Finally, if a list entry has no article at all, don't bother logging it. We can use WP:NRHPPROGRESS to find what lists have redlinked entries.
No discussion has yet been conducted for this idea; it's just something I've thought of. I've come here basically just to see if someone's willing to try this route, and if someone says "I think I can help", I'll start the discussion at WT:NRHP and be able to say that someone's happy to help us. Of course, I wouldn't ask you actually to do any coding or other work until after consensus is reached at WT:NRHP. Nyttend (talk) 00:55, 16 January 2017 (UTC)
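A sketch of the comparison step such a bot would need: converting the degrees/minutes/seconds values from a {{coord}} template to signed decimals so list and infobox coordinates can be diffed. The figures below are the corrected Maphis House coordinates from the request; the function name is ours:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert {{coord}} degrees/minutes/seconds to a signed decimal
    degree value (negative for south/west hemispheres)."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ('S', 'W') else value

# Corrected list coordinates for the John Miley Maphis House:
lat = dms_to_decimal(38, 50, 21, 'N')   # ~ 38.83917
lon = dms_to_decimal(78, 35, 52, 'W')   # ~ -78.59778
```

Comparing these decimals against the article's 38.83889, -78.59861 shows the small discrepancy described above.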
Off-topic discussion
Bot to help with FA/GA nomination process
The process is as follows: (Pasted from FA nomination page):
Before nominating an article, ensure that it meets all of the FA criteria and that peer reviews are closed and archived. The featured article toolbox (at right) can help you check some of the criteria. Place
{{FAC}}
at the top of the talk page of the nominated article and save the page. From the FAC template, click on the red "initiate the nomination" link or the blue "leave comments" link. You will see pre-loaded information; leave that text. If you are unsure how to complete a nomination, please post to the FAC talk page for assistance. Below the preloaded title, complete the nomination page, sign with ~~~~
and save the page. Copy this text: Wikipedia:Featured article candidates/name of nominated article/archiveNumber (substituting Number), and edit this page (i.e., the page you are reading at the moment), pasting the template at the top of the list of candidates. Replace "name of ..." with the name of your nomination. This will transclude the nomination into this page. In the event that the title of the nomination page differs from this format, use the page's title instead.
Maybe a bot could automate that process? Thanks. 47.17.27.96 (talk) 13:08, 16 January 2017 (UTC)
- This was apparently copied here from WP:VPT; the original is here. --Redrose64 🌹 (talk) 21:34, 16 January 2017 (UTC)
- I think that at WP:VPT, the IP was directed here - see here. TheMagikCow (talk) 19:40, 17 January 2017 (UTC)
- There is some information that is required from the user, both with the FAC and GAN templates, that can't be inferred by a bot but requires human decision making. I don't think this would be that useful or feasible. BlueMoonset (talk) 21:35, 22 January 2017 (UTC)
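As noted above, the substantive checks against the FA criteria need a human; the only trivially automatable piece of the quoted instructions is the page-title bookkeeping. A toy sketch of that part (the function name is ours):

```python
def fac_nomination_page(article_title, archive_number=1):
    """Title of the FAC nomination subpage in the format quoted above:
    Wikipedia:Featured article candidates/<name>/archive<Number>."""
    return ('Wikipedia:Featured article candidates/'
            f'{article_title}/archive{archive_number}')

print(fac_nomination_page('Example article'))
# Wikipedia:Featured article candidates/Example article/archive1
```

Everything else in the process (verifying the criteria, writing the nomination statement) resists automation, which is the point made above.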
Bot for category history merges
Back in the days when the facility to move category pages wasn't available, Cydebot made thousands of cut-and-paste moves to rename categories per CFD discussions. In the process of the renames, a new category would be created under the new name by the bot with the edit summary indicating that it was "Moved from CATEGORYOLDNAME" and identifying the editors of the old category to account for the attribution. An example is here.
This method of preserving attribution is rather crude and so it is desirable that the complete editing history of the category page be available for attribution. The process of recovering the deleted page histories has since been taken on by Od Mishehu who has performed thousands of history merges.
I suggest that an adminbot be optimised to go through Cydebot's contribs log, identify the categories that were created by it (i.e., the first edit on the page should be by Cydebot) and
- undelete the category mentioned in Cydebot's edit summary
- history-merge it into the new category using Special:MergeHistory.
- Delete the left-over redirect under CSD G6.
This bot task is not at all controversial. This is just an effort to fill in missing page histories. Obviously, there would be no cases of any parallel histories encountered - and even if there were, it wouldn't be an issue since Special:MergeHistory cannot be used for merging parallel histories - which is to say that there is no chance of any unintended history mess-up. This should be an easy task for a bot. 103.6.159.72 (talk) 10:52, 18 January 2017 (UTC)
- There's one thing that I have overlooked above, though it is again not a problem. In some rare cases, it may occur that after the source page has been moved to the destination page, the source page may later have been recreated - either as a category redirect or as a real category. In such cases, just skip step #3 in the procedure described above. There will be edits at the source page that postdate the creation of the destination page, and hence by its design, Special:MergeHistory will not move these edits over - only the old edits that the bot has undeleted would be merged. (It may be noted that the MergeHistory extension turns the source page into a redirect only when all edits at the source are merged into the destination page, which won't be the case in such cases - this means that the source page that some guy recreated will remain intact.) All this is that simple. 103.6.159.72 (talk) 19:37, 18 January 2017 (UTC)
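The Special:MergeHistory behaviour relied on above (only source revisions predating the destination page's creation are moved) can be expressed as a simple selection rule. The timestamps below are illustrative, not taken from any real category:

```python
def mergeable_revisions(source_rev_timestamps, dest_created):
    """Mimic Special:MergeHistory's rule: only source revisions dated
    before the destination page's first edit are moved across."""
    return [ts for ts in source_rev_timestamps if ts < dest_created]

# A source category edited both before and after the destination existed;
# ISO date strings compare correctly as plain strings.
revs = ['2006-05-01', '2007-03-12', '2017-01-01']
print(mergeable_revisions(revs, '2010-06-08'))
# ['2006-05-01', '2007-03-12']
```

This is why a recreated source page survives the merge: its post-move edits fall outside the mergeable window.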
- Is this even needed? I would think most if not all edits to category pages do not pass the threshold of originality to get copyright in the first place. Our own guidelines on where attribution is not needed reinforce this notion under US law, stating that "duplicating material by other contributors that is sufficiently creative to be copyrightable under US law (as the governing law for Wikipedia)" requires attribution. That same guideline also mentions that a list of authors in the edit summary is sufficient for proper attribution, which is what Cydebot has been doing for years. Avicennasis @ 21:56, 20 Tevet 5777 / 21:56, 18 January 2017 (UTC)
- Cydebot doesn't do it any longer. Since ~~2011 or sometime~~ 2015, Cydebot renames cats by actually moving the page. So for the sake of consistency, we could do this for the older cats also. The on-wiki practice, for a very long time, has been to do a history merge wherever it is technically possible. The guideline that edit summary is sufficient attribution is quite dated and something that's hardly ever followed. It's usually left as a worst-case option where a histmerge is not possible. History merge is the preferred method of maintaining attribution. Some categories like Category:Members of the Early Birds of Aviation do have some descriptive creative content. 103.6.159.72 (talk) 02:21, 19 January 2017 (UTC)
- I'm not completely opposed to this, but I do think that we need to define which category pages are in scope for this. I suspect the vast majority of pages wouldn't need attribution, and we should be limiting the amount of pointless bot edits. Avicennasis @ 02:49, 21 Tevet 5777 / 02:49, 19 January 2017 (UTC)
- It wasn't 2011 (it can't have been, since the ability to move category pages wasn't available to anybody until 22 May 2014, possibly slightly later, but certainly no earlier). Certainly Cydebot was still making cutpaste moves when I raised this thread on 14 June 2014; raised this thread; and commented on this one. These requests took some months to be actioned: checking Cydebot's move log, I find that the earliest true moves of Category: pages that were made by that bot occurred on 26 March 2015. --Redrose64 🌹 (talk) 12:07, 19 January 2017 (UTC)
- Since we are already talking about using a bot, I think it makes sense to do them all (or lest none at all) since that would come at no extra costs. Selecting a cherry-pick for the bot to do is just a waste of human editors' time. The edits won't be completely "pointless" - it's good to be able to see full edit histories. Talking of pointless edits, I should remind people that there are bots around that perform hundreds of thousands of pointless edits. 103.6.159.84 (talk) 16:14, 19 January 2017 (UTC)
- As to when it became technically possible, I did it on May 26, 2014. עוד מישהו Od Mishehu 05:32, 20 January 2017 (UTC)
- This would be the beginning of the selection set of pages for this proposed bot to process, beginning from 00:03, 25 March 2015 – you can see that newer edits in this set were all page moves. Then process all the way back to the first such move of 15:31, 28 April 2006. This looks to be tens of thousands of pages. Correct? wbm1058 (talk) 22:19, 20 January 2017 (UTC)
- ~94,899 pages, by my count. Avicennasis @ 03:36, 23 Tevet 5777 / 03:36, 21 January 2017 (UTC)
- That should keep a bot busy for a week or more. The Usercontribs module pulls the processing queue. Here's the setup in the API sandbox. Click "make request" to see the results of a query to get the first three. Though I've never written an admin-bot before, I may take a stab at this within the next several days. – wbm1058 (talk) 04:28, 21 January 2017 (UTC)
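The sandbox query linked above boils down to a list=usercontribs request; a sketch of the parameters follows (continuation handling is elided, and the request itself is left as a comment):

```python
# Parameters for the processing queue: Cydebot's page creations in the
# Category namespace (14). uclimit=3 matches the three-result sandbox
# example; a real run would use a larger limit plus uccontinue paging.
params = {
    'action': 'query',
    'list': 'usercontribs',
    'ucuser': 'Cydebot',
    'ucnamespace': 14,   # Category:
    'ucshow': 'new',     # page creations only
    'uclimit': 3,
    'format': 'json',
}
# e.g. requests.get('https://en.wikipedia.org/w/api.php', params=params)
```

Each returned contribution's edit summary then yields the "Moved from ..." source category to undelete and merge.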
- The other major API modules to support this are Undelete, Mergehistory and Delete. This would be a logical second task for my Merge bot to take on. The PHP framework I use supports undelete and delete, but it looks like I'll need to add new functions for user-contribs and merge-history. In my RfA I promised to work the Wikipedia:WikiProject History Merge backlog, so it would be nice to take that off my back burner in a significant way. I'm hoping to leverage this into another bot task to clear some of the article-space backlog as well...
- Coding... wbm1058 (talk) 13:06, 21 January 2017 (UTC)
- My count is 89,894 pages. wbm1058 (talk) 00:58, 24 January 2017 (UTC)
- @Wbm1058: Did you exclude the pages that have already been histmerged (by Od Mishehu and probably a few by other admins also)?— Preceding unsigned comment added by 103.6.159.67 (talk • contribs) 12:39, 24 January 2017 (UTC)
- I was about to mention that. My next step is to check the deleted revisions for mergeable history. No point in undeleting if there is no mergeable history. Working on that now. – wbm1058 (talk) 14:40, 24 January 2017 (UTC)
- Note this example of a past histmerge by Od Mishehu: Category:People from Stockport
- 07:04, 26 August 2014 Od Mishehu changed visibility of a revision on page Category:People from Stockport: edit summary hidden (No longer needed for attribution, but prevents listed users from vanishing)
- Should this bot do that with its histmerges too? wbm1058 (talk) 21:51, 25 January 2017 (UTC)
- Yes, when there is a list of users present (there were periods when the bot didn't do it, but most of the time it did). עוד מישהו Od Mishehu 22:24, 25 January 2017 (UTC)
Another issue: sometimes a category was renamed multiple times. For example, Category:Georgian conductors->Category:Georgian conductors (music)->Category:Conductors (music) from Georgia (country); this must be supported also for categories where the second rename was recent, e.g. Category:Visitor attractions in Washington (U.S. state)->Category:Visitor attractions in Washington (state)->Category:Tourist attractions in Washington (state). Back-and-forth renames must also be considered, for example, Category:Tornadoes in Hawaii->Category:Hawaii tornadoes->Category:Tornadoes in Hawaii; this also must be handled in cases where the second rename was recent, e.g. Category:People from San Francisco->Category:People from San Francisco, California->Category:People from San Francisco. עוד מישהו Od Mishehu 05:35, 26 January 2017 (UTC)
- Od Mishehu, this is also something I noticed. I'm thinking the best way to approach this is to start with the oldest contributions, and then merge forward so the last merge would be into the newest, currently active, category. Is that the way you would manually do this? So I think I need to reverse the direction that I was processing this, and work forward from the oldest rather than backward from the newest. Category:Georgian conductors was created at 22:56, 23 June 2008 by a human editor; that's the first (oldest) set of history to merge. At 22:38, 7 June 2010 Cydebot moved Category:Conductors by nationality to Category:Conductors (music) by nationality per CFD at Wikipedia:Categories for discussion/Log/2010 May 24#Category:Conductors. At 00:12, 8 June 2010 Cydebot deleted page Category:Georgian conductors (Robot - Moving Category Georgian conductors to Category:Georgian conductors (music) per CFD at Wikipedia:Categories for discussion/Log/2010 May 24#Category:Conductors.) So we should restore both Category:Georgian conductors and Category:Georgian conductors (music) in order to merge the 5 deleted edits of the former into the history of the latter. The new category creation by Cydebot that would trigger this history restoration and merging is
- 00:11, 8 June 2010 . . Cydebot (187 bytes) (Robot: Moved from Category:Georgian conductors. Authors: K********, E***********, O************, G*********, Cydebot)
- However, if you look at the selection set I've been using, you won't find this new category creating edit: 8 June 2010 Cydebot contributions
- It should slot in between these:
- 00:12, 8 June 2010 . . (+1,499) . . Nm Category:German conductors (music) (Robot: Moved from Category:German conductors. Authors: ...
- 00:09, 8 June 2010 . . (+275) . . Nm Category:Populated places established in 1644 (Robot: Moved from Category:Settlements established in 1644. Authors: ...
- To find the relevant log item, I need to search the Deleted user contributions
- I'm looking for the API that gets deleted user contributions. This is getting more complicated. – wbm1058 (talk) 16:38, 26 January 2017 (UTC)
- OK, Deletedrevs can list deleted contributions for a certain user, sorted by timestamp. Not to be confused with Deletedrevisions. wbm1058 (talk) 17:18, 26 January 2017 (UTC)
- After analyzing these some more, I think my original algorithm is fine. I don't think it should be necessary for the bot to get involved with the deleted user contributions. What this means is that only the most recent moves will be merged on the first pass, as my bot will only look at Cydebot's active contributions history. The first pass will undelete and merge the most recently deleted history, which will expose additional moves that my bot will see on its second pass through the contributions. I'll just re-run until my bot sees no more mergeable items. The first bot run will merge Category:Georgian conductors (music) into Category:Conductors (music) from Georgia (country). The second bot run will merge Category:Georgian conductors into Category:Conductors (music) from Georgia (country). The first bot run will merge Category:Visitor attractions in Washington (U.S. state) into Category:Tourist attractions in Washington (state), and there's nothing to do on the second pass (there is no mergeable history in Category:Visitor attractions in Washington (state)). The first pass would merge Category:Hawaii tornadoes into Category:Tornadoes in Hawaii – I just did that for testing. The second pass will see that Category:Tornadoes in Hawaii should be history-merged into itself. I need to check for such "self-merge" cases and report them (a "self-merge" is actually a restore of some or all of a page's deleted history)... I suppose I should be able to restore the applicable history (only the history that predates the page move). Category:People from San Francisco just needs to have the "self-merge" procedure performed, as Category:People from San Francisco, California has no mergeable history. Thanks for giving me these use-cases, very helpful.
- I should mention some more analysis from a test run through the 89,893 pages in the selection set. 2369 of those had no deleted revisions, so I just skip them. HERE is a list of the first 98 of those. Of the remaining 87,524 pages, these 544 pages aren't mergeable, because the timestamp of the oldest edit isn't old enough, so I skip them too. Many of these have already been manually history-merged. That leaves 86,980 mergeable pages that my bot should history-merge on its first pass. An unknown number of additional merges to be done on the second pass, then hopefully a third pass will either confirm we're done or mop up any remaining – unless there are cats that have moved four times... wbm1058 (talk) 22:42, 26 January 2017 (UTC)
- Some of the pages with no deleted revisions are the result of a category rename where the source category was changed into something else (a category redirect or disambiguation), and a history merge in those cases should be done (I just did one such merge, the third on the list of 99). However, this may be too difficult for a bot to handle; I can deal with those over time if you give me a full list. The first 2 on the list you gave are different - the bot didn't delete them (it did usually, but not always), and they were removed without deletion by Jc37 and used as new categories. I believe, based on the link to the CFD discussion at the beginning, that the answer to that would be in Wikipedia:Categories for discussion/Log/2015 January 1#Australian politicians. עוד מישהו Od Mishehu 05:34, 27 January 2017 (UTC)
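The multi-pass scheme described above (each pass exposes the next-older name in a rename chain) can be sketched against a toy "Moved from" map; the names below are the Georgian conductors chain from the discussion:

```python
def multi_pass_merge(moved_from, final_target):
    """Follow 'Moved from X' links backwards; each pass exposes one more
    predecessor, which is merged into the final (live) category."""
    merged, current = [], final_target
    while current in moved_from:        # another pass exposes an older name
        current = moved_from[current]
        merged.append((current, final_target))
    return merged

chain = {
    'Category:Conductors (music) from Georgia (country)':
        'Category:Georgian conductors (music)',
    'Category:Georgian conductors (music)': 'Category:Georgian conductors',
}
final = 'Category:Conductors (music) from Georgia (country)'
for source, target in multi_pass_merge(chain, final):
    print(f'merge {source} -> {target}')
```

This reproduces the two bot runs described above: pass one merges the "(music)" name, pass two the original name, both into the current category.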
This whole thing seems a waste of time (why do we need to see old revisions of category pages that were deleted years ago), but if you want to spend your time writing and monitoring a bot that does this, I won't complain; it won't hurt anything. I'm just concerned by the comments up above that point out a lot of not-so-straightforward cases, like the tornadoes in Hawaii and the visitor attractions in Washington. How will the bot know what information is important to preserve and what isn't? Nyttend (talk) 05:28, 27 January 2017 (UTC)
- The reasons for it, in my opinion:
- While most categories have no copyrightable information, some do; on these, we legally need to maintain the history. While Cydebot did this well for categories which were renamed once, it didn't for categories which were renamed more than once. Do any of these have copyrightable information? It's impossible to know.
- If we nominate a category for deletion, we generally should inform its creator - even if the creation was over 10 years ago, as long as the creator is still active. With deleted history, it's difficult for a human admin to do this, and impossible for automated nomination tools (such as [[WP:TW|Twinkle]]) or non-admins.
- עוד מישהו Od Mishehu 05:37, 27 January 2017 (UTC)
- Because writing a bot is fun, isn't it? As only programmers know. And especially if the bot's gonna perform hundreds of thousands of admin actions.
- Because m:wikiarchaeologists will go to any lengths to make complete editing histories of pages visible, even if it's quite trivial. Using a bot shows a far more moderate level of eccentricity than doing it manually would. Why do you think Graham87 imported thousands of old page revisions from nostwiki?
- 103.6.159.76 (talk) 08:59, 27 January 2017 (UTC)
- Here is the list of 760 "self-mergeable" categories, which should simply have at least part of their deleted history restored
- And the list of 2371 pages with no deleted history
I think it may be best to defer any bot processing of these on the first iteration of this. Maybe after a first successful run, we can come back and focus on an automated solution for these as well. It's still a lot to be left for manual processing. I'll work on the piece that actually performs the merges later today. – wbm1058 (talk) 13:49, 27 January 2017 (UTC)
- @Wbm1058: For the pages that were copy-pasted without the source category being deleted, you can still merge them. Use of Special:MergeHistory ensures that only the edits that predate the creation of the destination category will be merged. 103.6.159.90 (talk) 08:32, 29 January 2017 (UTC)
BRFA filed I think this is ready for prime time. wbm1058 (talk) 01:17, 28 January 2017 (UTC)
Website suddenly took down a lot of its material, need archiving bot!
Per Wikipedia_talk:WikiProject_Academic_Journals#Urgent:_Beall.27s_list, several (if not most) links to https://scholarlyoa.com/ and subpages just went dead. Could a bot help with adding archive links to relevant citation templates (and possibly bare/manual links too)? Headbomb {talk / contribs / physics / books} 00:31, 19 January 2017 (UTC)
- Cyberpower678, could you mark this domain as dead in IABot's database so that it will handle adding archive urls? — JJMC89 (T·C) 01:13, 19 January 2017 (UTC)
- @Cyberpower678: ? Headbomb {talk / contribs / physics / books} 10:50, 2 February 2017 (UTC)
- Sorry, I never got the first ping. I'll mark it in a moment.—CYBERPOWER (Chat) 16:52, 2 February 2017 (UTC)
- Only 61 urls were found in the DB with the domain.—CYBERPOWER (Chat) 17:39, 2 February 2017 (UTC)
- @Cyberpower678: Well that's 61 urls that we needed! Would it be possible to have a list of those urls, or is that complicated? It would be really useful to project members to have those centralized in one place. Headbomb {talk / contribs / physics / books} 20:04, 13 February 2017 (UTC)
Apply editnotice to talk pages of pages in Category:Wikipedia information pages
Apply editnotice {{Wikipedia information pages talk page editnotice}} to talk pages of pages in Category:Wikipedia information pages, Per unopposed proposal here. (A change to common.js to produce this edit notice was suggested but rejected; a bot task was suggested as the best approach.) Thanks! —swpbT 16:15, 20 January 2017 (UTC)
- (The other route for placing the edit notice is a default-on gadget, which hasn't been put to a discussion yet; bot ops considering doing this task may want to wait until that route has been considered and rejected as well.) Enterprisey (talk!) 18:58, 20 January 2017 (UTC)
- I'm very hesitant to start applying edit notices en masse without some stronger consensus, especially with a bot. ~ Rob13Talk 04:58, 21 January 2017 (UTC)
- To editor BU Rob13: Let another bot operator do it then. The proposal has sat in the appropriate location for 19 days, and the edit notice has appeared on every page currently in the category for 10 days, without a single voice of opposition. "Insufficient consensus" is simply not a valid reason to withhold action in this case. —swpbT 13:18, 23 January 2017 (UTC)
- As these are project pages, and the common.js route has been practically killed, going forward with this as a one-off bot run shouldn't be too bad. If someone files a BRFA it should be able to go to trial quickly. NOTE: The OPERATOR will need to be an admin or template editor. — xaosflux Talk 03:00, 24 January 2017 (UTC)
- This is not meant to be a one-off run – the pages currently in the category are already tagged with the edit notice. The request is for a bot to continuously put the notice on pages that are newly added to the category. —swpbT 15:10, 24 January 2017 (UTC)
- @Xaosflux: Why does the operator need to be an admin or template editor? Everybody can edit Wikipedia information pages, right? PhilrocMy contribs 16:13, 24 January 2017 (UTC)
- To edit the info page itself, yes, but creating editnotices in any namespace besides User/User talk requires the template editor right. —swpbT 18:41, 24 January 2017 (UTC)
- Yes, it is just for the editnotices; the bot's edits are the responsibility of the operator, who will need this extra access to create these notices, so the operator will need to be trusted for that access as well. — xaosflux Talk 21:17, 24 January 2017 (UTC)
- @Xaosflux: @Swpb: @BU Rob13: If the bot gets approved, how often should it run? It can't run continuously because, to my knowledge, AWB can't check if new pages are added to a category. Yes, I want to use AWB. Anyway, how often should the bot run? PhilrocMy contribs 16:02, 26 January 2017 (UTC)
- AWB is not a good option for a bot that should be run automatically on a schedule. — JJMC89 (T·C) 18:41, 27 January 2017 (UTC)
- @JJMC89: I have now decided to use DotNetWikiBot instead of AWB. PhilrocMy contribs 20:41, 27 January 2017 (UTC)
- @Xaosflux: @Swpb: @BU Rob13: @JJMC89: I have now made code for the bot which is viewable here. PhilrocMy contribs 22:27, 28 January 2017 (UTC)
- That code is missing namespace restrictions (it should only cover Wikipedia and Help pages in the category) and is creating the notice for the subject page instead of the talk page. Does input.text add text to any existing text? The bot needs to do this, not replace anything that is currently there. Also, it should only add the editnotice if it is not already there. — JJMC89 (T·C) 22:50, 28 January 2017 (UTC)
- @JJMC89: I figured that since most of the pages in the category are Wikipedia and Help already, there was no need for any restrictions. Also, the code checks if each page has a corresponding {{Editnotices}} page. If that page exists, it checks if it is empty (which might happen). If it is, it adds the notice. If not, it goes to the next page. If the page doesn't exist in the first place, it creates the page by adding the editnotice. If you had looked at the code more thoroughly, I wouldn't have to explain this to you. Please look at the if statement inside the foreach if you want proof. PhilrocMy contribs 23:15, 28 January 2017 (UTC)
- Your attempted ping did not work. Assuming there is "no need for any restrictions" is not a good idea. It is looking at the wrong editnotice page. It should be the editnotice page for the talk page of each page in the category. It shouldn't just skip editnotices that exist. It should append the one in this task if it is not present. — JJMC89 (T·C) 23:58, 28 January 2017 (UTC)
- @JJMC89: I have made a new paste. PhilrocMy contribs 16:51, 30 January 2017 (UTC)
- Well, the TE request failed. Guess I can't do the bot. Another admin or TE can make this proposal a reality if they want to. PhilrocMy contribs 16:17, 1 February 2017 (UTC)
BRFA filed. — JJMC89 (T·C) 03:47, 2 February 2017 (UTC)
- Thanks! —swpbT 15:08, 2 February 2017 (UTC)
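For any bot op revisiting this task later: the title mapping and the append-don't-replace rule JJMC89 insisted on can be sketched as pure logic. This is a minimal Python sketch, not code from the filed BRFA; the function names are illustrative. It assumes the enwiki convention that per-page editnotices live at Template:Editnotices/Page/<full page name>.

```python
# Sketch of the editnotice-placement logic discussed above.
# The task targets the TALK page of each page in the category,
# restricted to the Wikipedia and Help namespaces.

EDITNOTICE_TEMPLATE = "{{Wikipedia information pages talk page editnotice}}"

def talk_page_of(title: str) -> str:
    """Map a subject-page title to its talk-page title (Wikipedia/Help only)."""
    if title.startswith("Wikipedia:"):
        return "Wikipedia talk:" + title[len("Wikipedia:"):]
    if title.startswith("Help:"):
        return "Help talk:" + title[len("Help:"):]
    raise ValueError("outside the namespaces this task covers: " + title)

def editnotice_page_for(title: str) -> str:
    """Title of the per-page editnotice page for a given page."""
    return "Template:Editnotices/Page/" + title

def updated_editnotice_text(existing: str):
    """Return the new editnotice-page text, or None if the notice is
    already present. Appends to existing text rather than replacing it,
    as required in the discussion above."""
    if EDITNOTICE_TEMPLATE in existing:
        return None
    return (existing + "\n" + EDITNOTICE_TEMPLATE).strip() + "\n"
```

A run would then iterate over the category members, skip non-Wikipedia/Help titles, and save `updated_editnotice_text(...)` only when it returns new text.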
Tidy taxonomy templates
Explanation
"Taxonomy templates" are those with titles like "Template:Taxonomy/..."; they are all listed in Category:Taxonomy templates and its subcategories. They form the "database" of the automated taxobox system.
Most of the taxonomy templates begin {{Don't edit this line {{machine code|}}|{{{1}}}. For a long time, parameter 1 has been redundant to the parameter machine code. Consider finding the rank of a taxon. What used to happen was that setting |1=rank caused a switch statement in {{Don't edit this line}} to return the rank of the taxon. Following an update around 2012/13, setting |machine code=rank causes {{Don't edit this line rank}} to return the rank of the taxon.
A few instances of the earlier kind of call lingered in the underlying code of the system until recently. I've now removed them all, and also the switch statement from {{Don't edit this line}}, so giving a value to parameter 1 has no effect whatsoever.
New taxonomy templates and those edited recently don't contain "|{{{1}}}", but the great majority still do. This makes the code processing taxonomy templates slightly more complicated, since there can be a following unnamed parameter (the link target) which will be parameter 2 if "|{{{1}}}" is present, or parameter 1 if not. Having to remember to allow for this makes maintenance more difficult and increases the likelihood of future errors.
Bot action requested
For all templates in Category:Taxonomy templates and its subcategories, replace {{Don't edit this line {{machine code|}}|{{{1}}} with {{Don't edit this line {{machine code|}}, leaving everything else (including line breaks) untouched.
Peter coxhead (talk) 08:05, 21 January 2017 (UTC)
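The requested edit is a pure text substitution, sketched below in Python. The search string is copied verbatim from the request as written above; the exact brace sequence should be checked against a live taxonomy template before any real run.

```python
# Sketch of the "tidy taxonomy templates" replacement requested above:
# drop the redundant |{{{1}}} argument, leave everything else untouched.

OLD = "{{Don't edit this line {{machine code|}}|{{{1}}}"  # as written in the request
NEW = "{{Don't edit this line {{machine code|}}"

def tidy_taxonomy_template(wikitext: str) -> str:
    """Remove the redundant unnamed parameter 1 from a taxonomy template,
    preserving all other text and line breaks."""
    return wikitext.replace(OLD, NEW)
```

The replacement is idempotent: a template already in the new form is returned unchanged, so re-running the bot over the category is harmless.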
- @Peter coxhead: I am willing to take this on. Is there consensus for the change from the relevant WikiProject(s) and/or other maintainers/users of the taxonomy templates? If not, please start a central discussion (at Wikipedia talk:Automated taxobox system?) with pointers to it from any relevant WikiProjects. — JJMC89 (T·C) 09:35, 21 January 2017 (UTC)
- No-one other than me seems to be maintaining the system at present (unfortunately, from my point of view, since it's taken up a huge amount of my time which I'd rather have spent on articles). See, e.g., these histories: Module:Autotaxobox – history, Template:Don't edit this line – history, or Template:Automatic taxobox – history. I have posted messages saying what I have been doing in various fora, e.g. Template talk:Automatic taxobox/Archive 13#Lua coding, Template talk:Automatic taxobox, Wikipedia talk:Automated taxobox system#Update and move, but without any response or input from anyone on technical matters. Wikid77 initiated changes to the automated taxobox system with fixes made in mid-2016, but these have been overtaken by my conversion to Lua (which took hundreds of pages out of Category:Pages where expansion depth is exceeded – fixing the expansion depth issue was the driver of the change to Lua).
- The key change from using parameter 1 to using |machine code= was noted in an HTML comment here on 1 January 2011 by Smith609 (a slightly inaccurate comment, because it was only redundant when used with parameter 1), but this change was never followed through to fix all now-redundant uses of parameter 1 and then fix the taxonomy templates. The only other editor who has worked extensively on the automated taxobox system is Bob the Wikipedian. Both are aware of what I have been doing – I have posted on their talk pages, and have had no objections, but neither is currently active in this area of work. Peter coxhead (talk) 10:09, 21 January 2017 (UTC)
- BRFA filed. — JJMC89 (T·C) 10:55, 21 January 2017 (UTC)
- Thanks! Peter coxhead (talk) 11:05, 21 January 2017 (UTC)
- Done. — JJMC89 (T·C) 02:24, 2 February 2017 (UTC)
Non-free images used excessively
I'd like a few reports, if anyone's able to generate them.
1) All images in Category:Fair use images, defined recursively, which are used outside of the mainspace.
2) All images in Category:Fair use images, defined recursively, which are used on more than 10 pages.
3) All images in Category:Fair use images, defined recursively, which are used on any page that is not anywhere in the text of the file description page. i.e. If "File:Image1.jpg" was used on page "Abraham Lincoln" but the text "Abraham Lincoln" appeared nowhere on the file page.
If anyone can handle all or some of these, it would be much appreciated. Feel free to write to a subpage in my userspace. ~ Rob13Talk 20:29, 21 January 2017 (UTC)
- No bot needed for tasks 1 and 2:
- https://tools.wmflabs.org/betacommand-dev/nfcc/NFCC9.html
- https://tools.wmflabs.org/betacommand-dev/nfcc/high_use_NFCC.html
- Task 3 was done by User:BetacommandBot (though the bot and its master have since been blocked), or rather User:FairuseBot, I think. I'd very much like to see this task being done by a bot. – Finnusertop (talk ⋅ contribs) 20:42, 21 January 2017 (UTC)
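For whoever picks up report 3: the core check is a plain membership test over the file description text, sketched here in Python. The function name is illustrative, and a production report would also need to handle redirects and piped titles.

```python
def unmentioned_usages(description_text: str, using_pages: list) -> list:
    """Report 3 above: given the text of a non-free file's description page
    and the titles of pages using the file, return the titles that do not
    appear anywhere in the description text (case-insensitive match here)."""
    haystack = description_text.lower()
    return [title for title in using_pages if title.lower() not in haystack]
```

Running this over every file in Category:Fair use images (recursively) and writing the non-empty results to a report page would produce the requested list.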
Replace Donald Trump image with presidential portrait
Per Talk:Donald_Trump/Archive_38#Trump_Photo_2_Rfc, the image File:Donald_Trump_August_19,_2015_(cropped).jpg and variants such as File:Donald_Trump_August_19,_2015_3_by_2.jpg should be replaced with an official presidential portrait, which is at File:Donald_Trump_official_portrait.jpg. Thanks. - CHAMPION (talk) (contributions) (logs) 00:12, 22 January 2017 (UTC)
- It looks like the old images linked at the top of the RFC are used in only about 200 pages, at the most, not including sandbox pages. Someone could probably do this with AWB. Make sure to wash your hands afterwards.
- I read the RFC closure, and it didn't say which types of pages the closure applies to. Should User pages have the image replaced? Archive pages? Talk pages? Pinging EvergreenFir in case this might need clarification. – Jonesey95 (talk) 02:45, 22 January 2017 (UTC)
- Another couple of notes: Some of these images are used on pages where multiple images are being discussed, like Talk:List of Presidents of the United States. Some of the images are used specifically on pages discussing the campaign for president, and illustrate Donald Trump during the campaign. Replacing those photos with a post-campaign official White House photo may not be editorially valid. – Jonesey95 (talk) 04:11, 22 January 2017 (UTC)
- This should be applied only to the mainspace. I'd do this with AWB, but AWB is technically part of the admin toolkit, and I'm technically involved with regard to American politics, so it's not clear that I should do so. ~ Rob13Talk 07:18, 22 January 2017 (UTC)
Doing... per discussion with Rob13 over IRC. --JustBerry (talk) 20:35, 22 January 2017 (UTC)
- Done within mainspace for files relating to File:Donald Trump August 19, 2015.jpg. Log available upon request. --JustBerry (talk) 21:07, 22 January 2017 (UTC)
- @JustBerry: Appears File:Donald Trump August 19, 2015 (cropped).jpg still needs doing. ~ Rob13Talk 05:23, 23 January 2017 (UTC)
That discussion was about the infobox portrait on that article, not the whole of Wikipedia! All the best: Rich Farmbrough, 00:38, 5 February 2017 (UTC).
Move GA reviews to the standard location
There are about 3000 Category:Good articles that do not have a GA review at the standard location of Talk:<article title>/GA1. This is standing in the way of creating a list of GAs that genuinely do not have a GA review. Many of these pages have a pointer to the actual review location in the article milestones on the talk page, and these are the ones that could potentially be moved by bot.
There are two cases. The easier one is pages that have a /GA1 page but where the substantive page has been renamed. An example is 108 St Georges Terrace, whose review is at Talk:BankWest Tower/GA1. This just requires a page move and the milestones template updated. Note that there may be more than one review for a page (sometimes there are several failed reviews before a pass). GA reviews are identified in the milestones template with the field actionn=GAN, and the corresponding review page is found at actionnlink=<review>. Multiple GA reviews are named /GA1, /GA2 etc., but note that there is no guarantee that the review number corresponds to the n number in actionn.
The other case (older reviews, example 100,000-year problem) is where the review took place on the article talk page rather than on a dedicated page. This needs a cut-and-paste to a /GA1 page and the review transcluded back on to the talk page. This probably needs to be semi-automatic, with some sanity checks by a human, at least for a test run (has the bot actually captured a review, is it a review of the target article, did it capture all of the review). SpinningSpark 08:30, 22 January 2017 (UTC)
- Discussion at Wikipedia talk:Good articles#Article incorrectly listed as GA here? and Wikipedia:Village pump (technical)/Archive 152#GA reviews SpinningSpark 08:37, 22 January 2017 (UTC)
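The milestones parsing Spinningspark describes (match each actionn=GAN to its actionnlink value) can be sketched like this. This is an illustrative Python sketch, not code from any existing bot; the regexes are simplified and ignore template nesting.

```python
import re

def ga_review_pages(milestones_wikitext: str) -> list:
    """Extract the review pages recorded for GA nominations in a talk-page
    milestones template: for every |actionN=GAN field, return the matching
    |actionNlink= value, in order of N. Field names follow the convention
    described in the request above."""
    actions = dict(re.findall(r"\|\s*action(\d+)\s*=\s*(\w+)", milestones_wikitext))
    links = dict(re.findall(r"\|\s*action(\d+)link\s*=\s*([^\n|]+)", milestones_wikitext))
    return [links[n].strip()
            for n, kind in sorted(actions.items(), key=lambda kv: int(kv[0]))
            if kind == "GAN" and n in links]
```

For the easier of the two cases, a bot would then move each returned review page to Talk:<current article title>/GAn and update the milestones link accordingly.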
Angiosperms
That is a redirect page, and the version Angiosperm in the singular exists. Could an automated process edit the links to the plural version to point to the singular one, and remove Angiosperms? --Jerome Potts (talk) 12:43, 22 January 2017 (UTC)
- @Jerome Charles Potts: Please review WP:NOTBROKEN. --Izno (talk) 14:49, 22 January 2017 (UTC)
- Actually, the standard practice is to create redirects like this; see WP:R#KEEP #2. — Carl (CBM · talk) 14:52, 22 January 2017 (UTC)
- I see. Thanks for the indications. --Jerome Potts (talk) 21:56, 22 January 2017 (UTC)
Klisf.info dead links
For some time, the Russian soccer stats website Klisf.info has been inactive and unavailable. There are many links to this website (either as inline references or simple external links like the one from this article) and they should be tagged as dead links (at least). --XXN, 12:54, 22 January 2017 (UTC)
- Is no bot operator interested in this task? This is an important thing: there are a lot of articles based on only one Klisf.info dead link, and WP:VER is problematic. I don't request (yet) to remove these links – just tag them as dead, and another bot will try to update them with a link to an archived version, if possible. The FOOTY WikiProject was notified some time ago, and there is nothing controversial. XXN, 13:55, 10 February 2017 (UTC)
Add https to ForaDeJogo.net
Please change foradejogo.net links to https. I have already updated its templates. SLBedit (talk) 19:24, 22 January 2017 (UTC)
- @SLBedit: see User:Bender the Bot and its contribs. You can probably contact the bot operator directly. XXN, 13:59, 10 February 2017 (UTC)
- I'll keep it in mind. --bender235 (talk) 15:40, 10 February 2017 (UTC)
User's recognized content list
A list like Wikipedia:WikiProject Physics/Recognized content, generated by User:JL-Bot/Project content, seems very neat. Is it possible to generate and maintain the same kind of list tied to a user instead of a WikiProject? For example, I could use it to have a list of the DYK/GA/FAs credited to me on my user page. HaEr48 (talk) 03:51, 23 January 2017 (UTC)
- Let's ping JLaTondre (talk · contribs) on this. Headbomb {talk / contribs / physics / books} 04:43, 23 January 2017 (UTC)
- He replied at User talk:JL-Bot#Generating User-centric recognized content and said that he doesn't have time to add this new feature right now, and that the way such a thing could be implemented is a bit different from JL-Bot's existing implementation. So we probably need a new bot. HaEr48 (talk) 06:50, 9 February 2017 (UTC)
Bot to delete emptied monthly maintenance categories
I notice that we have a bot, AnomieBOT, that automatically creates monthly maintenance categories (Femto Bot used to do it earlier). Going by the logs for a particular category, I find that it has been deleted and recreated about 10 times. While all recreations are by bots, the deletions are done by human administrators. Why so? Mundane, repetitive tasks like the deletion of such categories (under CSD G6) when they get emptied should be done by bots. This bot task is obviously non-controversial and absolutely non-contentious, since AnomieBOT will recreate the category if new pages appear in it. 103.6.159.93 (talk) 14:21, 23 January 2017 (UTC)
- Needs wider discussion. It should be easy enough for AnomieBOT III to do this, but I'd like to hear from the admins who actually do these deletions regularly whether the workload is enough that they'd want a bot to handle it. Anomie⚔ 04:54, 24 January 2017 (UTC)
- Are these already being tagged for CSD by a bot? I don't work CAT:CSD as much as I used to, but rarely see these in the backlog there now. — xaosflux Talk 14:08, 24 January 2017 (UTC)
- I think they are tagged manually by editors. Anyway, this discussion is now shifted to WP:AN#Bot to delete emptied monthly maintenance categories, for the establishment of consensus as demanded by Anomie. 103.6.159.67 (talk) 14:12, 24 January 2017 (UTC)
- Thanks for taking it there, 103.6.159.67. It looks like it's tending towards "support", if that keeps up I'll write the code once the discussion there is archived. I also see some good ideas in the comments, I had thought of the "only delete if there are no edits besides AnomieBOT" condition already but I hadn't thought of "... but ignore reverted vandalism" or "don't delete if the talk page exists". Anomie⚔ 03:21, 25 January 2017 (UTC)
- @Xaosflux: No, {{Monthly clean-up category}} (actually {{Monthly clean-up category/core}}) automatically applies {{Db-g6}} if the category contains zero pages. Anomie⚔ 03:12, 25 January 2017 (UTC)
- My experience shows these are safe to delete. They can even be recreated when needed (usually after a delayed reversion in a page edit history). -- Magioladitis (talk) 23:09, 24 January 2017 (UTC)
- The question isn't if they're safe to delete, that's obvious. The question is whether the admins who actually process these deletions think it's worth having a bot do it since there doesn't seem to be any backlog. Anomie⚔ 03:12, 25 January 2017 (UTC)
- Category:Candidates for uncontroversial speedy deletion is almost always empty when I drop by it. — xaosflux Talk 03:39, 25 January 2017 (UTC)
The AN discussion is archived now, no one opposed. I put together a task to log any deletions such a bot would make at User:AnomieBOT III/DatedCategoryDeleter test, to see if it'll actually catch anything. If it logs actual deletions it might make I'll make a BRFA for actually doing them. Anomie⚔ 14:45, 31 January 2017 (UTC)
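The eligibility rule sketched in the AN discussion (no members, bot-only edit history, no talk page; the "ignore reverted vandalism" refinement omitted here) reduces to a simple predicate. This is an illustrative Python sketch, not AnomieBOT III's actual code.

```python
def safe_to_delete(member_count: int, revision_authors: list,
                   talk_page_exists: bool) -> bool:
    """Decide whether an emptied monthly maintenance category may be
    deleted under G6, per the conditions discussed above: the category
    must be empty, every revision must have been made by the creating
    bot, and no talk page may exist."""
    return (member_count == 0
            and all(author == "AnomieBOT" for author in revision_authors)
            and not talk_page_exists)
```

A deleting bot would evaluate this for each category tagged by {{Monthly clean-up category/core}} and log (or perform) the deletion only when it returns True.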
Bot to remove old warnings from IP talk pages
There is consensus for removing old warnings from IP talk pages. See Wikipedia_talk:Criteria_for_speedy_deletion/Archive_9#IP_talk_pages and Wikipedia:Village_pump_(proposals)/Archive_110#Bot_blank_and_template_really.2C_really.2C_really_old_IP_talk_pages. This task has been done using AWB by BD2412 for several years now. Until around 2007, it was also being done by Tawkerbot.
I suggest that a bot should be coded up to remove all sections from IP talk pages that are older than 2 years, and add the {{OW}} template to the page if it doesn't already exist (placed at the top of the page, but below any WHOIS/sharedip templates). There are many reasons why this should be done by a bot. (i) Bot edits marked as minor do not cause the IPs to get a "You have new messages" notification when the IP talk page is edited. (ii) Blankings done using AWB also remove any WHOIS/sharedip templates, for which there is no consensus. (iii) This is a type of mundane task that should be done by bots. Human editors should not waste their time with this, but rather spend it on tasks that require some human intelligence. 103.6.159.93 (talk) 14:41, 23 January 2017 (UTC)
- Needs wider discussion. These are pretty old discussions to support this sort of mass blanking of talk pages. If I recall correctly, an admin deleted a bunch of IP user talk pages a while back and this proved controversial. This needs a modern village pump discussion. ~ Rob13Talk 20:21, 24 January 2017 (UTC)
- Here is one such discussion that I initiated. I think that two years is a bit too soon. Five years is reasonable. When I do these blankings with AWB, I typically go back seven, just because it is easy to skip any page with a date of 2010 or later on the page. I think some flexibility could be built in based on the circumstances. An IP address from which only one edit has ever been made, resulting in one comment or warning in response, is probably good for templating after no more than three years. I would add that I intentionally remove the WHOIS/sharedip templates, because, again, these are typically pages with nothing new happening in the past seven (and sometimes ten or eleven) years. We are not a permanent directory of IP addresses. bd2412 T 01:01, 25 January 2017 (UTC)
- @BU Rob13: don't be silly. There has been consensus for this since 2006. Tawkerbot did it till 2007, and BD2412 has been doing it for years without anyone disputing the need for doing it on his talk page. You correctly remember that MZMcBride used an unapproved bot to delete over 400,000 IP talk pages in 2010. That was obviously controversial, since there is consensus only for blankings, not for deletions. Any new discussion on this will only result in repetition of arguments. The only thing that needs discussion is the approach. 103.6.159.84 (talk) 04:29, 25 January 2017 (UTC)
- I am filing a BRFA as soon as this gets consensus. --- Magioladitis (talk) 20:27, 24 January 2017 (UTC)
- I wrote above "remove all sections from IP talk pages that are older than 2 years". I realise that this was misunderstood. What I meant was remove the sections in which the last comment is over 2 years old. This is a more moderate proposal. Do you agree with this, BD2412? 103.6.159.84 (talk) 04:29, 25 January 2017 (UTC)
- I have two thoughts on that. First, I think that going after individual sections, as opposed to 'everything but the top templates' is a much harder task to program. I suppose it would rely on the last date in a signature in the section, or on reading the page history. Secondly, I think that there are an enormous number of pages to deal with that would have all sections removed even under that criteria, so we may as well start with the easy task of identifying those pages and clearing everything off of them. If we were to go to a section-by-section approach, I would agree with a two year window. bd2412 T 04:35, 25 January 2017 (UTC)
- As mentioned, deletion should NOT be done (and is also not requested). Deletion results in hiding tracks that may be of interest: either discussions on the talkpage of an IP used by an editor years ago that have relevance to edits to mainspace pages (every now and then there are edits with a summary 'per discussion on my talk'), or evidence that certain IPs that behaved badly were actually warned (a company spams in 2010, gets several warnings, sits still for 7 years, then someone else spams again; we might consider blacklisting with the reasoning 'you were warned in 2010, and now you are at it again'. It may be a different person behind a different IP, and the current editor may not even be aware of the situation of 2 1/2 years ago, but it is the same organisation that is responsible). If the talkpage 'exists', and we find the old IP that showed the behaviour, it is easy to find the warnings back; if it involves 15 IPs of which 2 were heavily warned, and those two pages are now also redlinks, we need someone with the admin bit to check deleted revisions on 15 talkpages; in other cases, anyone can do it.
- Now, regarding blanking: what would be the arguments against archiving threads on talkpages where:
- the thread is more than X old (2 years?)
- the IP did not edit in the last Y days (1 year?)
- We would just insert a custom-template in the header like {{IPtalkpage-autoarchive}} which is pointing to the automatically created archives and which provides a lot of explanation, and we have a specified bot that archives these pages as long as the conditions are met. Downside is only that it would preserve utter useless warnings (though, some editors reply to warnings and go in discussion, and are sometimes right, upon which the perceived vandalism is re-performed), upside is that it preserves also constructive private discussions.
- (@BD2412: regarding your "I think that going after individual sections, as opposed to 'everything but the top templates' is a much harder task to program" – the former is exactly what our archiving bots do.) --Dirk Beetstra T C 05:51, 25 January 2017 (UTC)
- As far as I am aware, editors have previously opposed archiving of IP talk pages, and so this would require wider discussion at an appropriate forum first. Regarding removal of warnings by section, I don't think there is any need to bother about the time the IP last edited – the whole point of removing old warnings is to ensure that the current (or future) users of the IPs don't see messages that were intended for someone who used the IP years ago. Ideally, a person who visits their IP talk page should see only the messages intended for that person. 103.6.159.84 (talk) 06:19, 25 January 2017 (UTC)
- That consensus could have changed – it may indeed need a wider community consensus. As I read the above thread, however, removal is not restricted to warnings; it mentions removing "the sections in which the last comment is over 2 years old", which would also include discussions. Now, archiving is not a must; one is allowed to simply delete old threads on one's 'own' talkpage.
- Whether you archive or delete, in both cases the effect is the same: the thread that is irrelevant to the current user of the IP is no longer on the talkpage itself. And with highly fluxional IPs, or with IPs that are used by multiple editors at the same time, it is completely impossible to address the 'right' editor; you will address all of them. On the other hand, some IPs stay for years with the same physical editor, and the messages that are deleted will be relevant to the current user of the page, even if they did not edit for years. And that brings me to the point of whether the editor has been editing in the last year (or whichever time period one chooses) – if the IP is continuously editing, there is a higher chance that the editor is the same than when an IP has not been editing for a year (though in both cases the IP can be static or not, thwarting that analysis and making it necessary to check on a case-by-case basis, which would preclude bot use). --Dirk Beetstra T C 10:53, 25 January 2017 (UTC)
- I favour archiving using the {{wan}} template rather than blanking. It alerts future editors that there have been previous warnings. If the IP belongs to an organisation, they might just possibly look at the old warnings and discover that the things they are about to do were done before and were considered bad. SpinningSpark 12:07, 25 January 2017 (UTC)
- I think that archiving for old IP talk pages is very problematic. One of the main reasons I am interested in blanking these pages is to reduce link load - the amount of dross on a "What links here" page that obscures the links from that page to other namespaces, which is particularly annoying when a disambiguator is trying to see whether relevant namespaces (mainspace, templates, modules, files) are clear of links to a disambiguation page. All archiving does for IP talk pages is take a group of random conversations - link load and all - and disassociate them from their relevant edit history, which is what a person investigating the IP address is most likely to need. This is very different from archiving article talk pages or wikispace talk pages, where we may need to look back at the substance of old discussions. bd2412 T 14:08, 25 January 2017 (UTC)
- Agree with that. I also don't think archiving of IP talk pages is useful. In any case, it needs to be discussed elsewhere (though IMO it's unlikely to get consensus). There is no point in bringing it up within this bot request. 103.6.159.89 (talk) 15:59, 25 January 2017 (UTC)
- I see the point of that, but that is also the reason why some people want to see what links to a page – where the discussions were. The thread above is rather unspecific, and suggests blanking ALL discussions, not only warnings. And those are the things that are sometimes of interest: plain discussions regarding a subject, or even discussions following a warning. If the talkpage discussions obscure your view, then you can choose to select incoming links per namespace.
- @103.6.159.89: if there is no consensus to blank, but people are discussing whether it should be blanking or archiving or nothing, then there is no need for a discussion here - bots should simply not be doing this. I agree that the discussion about what should be done with it should be somewhere else. --Dirk Beetstra T C 03:29, 26 January 2017 (UTC)
- You can not choose to select incoming links per namespace if you need to see multiple namespaces at once to figure out the source of a problem. For example, sometimes a link return appears on a page that can not actually be found on that page, but is transcluding from another namespace (a template, a portal, a module, under strange circumstances possibly even a category or file), and you need to look at all the namespaces at once to determine the connection. It would be nice if the interface would allow that, but that would be a different technical request. bd2412 T 17:06, 28 January 2017 (UTC)
- I agree that that is a different technical request. But the way this request is now written (
to remove all sections from IP talk pages that are older than 2 years
) I am afraid that important information could be wiped. I know the problems with the Wikimedia Development team (regarding feature requests etc., I have my own frustrations about that), but alternatives should be implemented with extreme care. I would be fine with removal of warnings (but not if those warnings result in discussion), but not with any other discussions, and I would still implement timing restrictions (not having edited for x amount of time, etc.). --Dirk Beetstra T C 07:32, 29 January 2017 (UTC)
- If there is a really useful discussion on an IP talk page that has otherwise gone untouched for half a decade or more, then that discussion should be moved to a more visible location. We shouldn't be keeping important matters on obscure pages, and given the hundreds of thousands of existing IP talk pages, there isn't much that can be more obscure than the random set of numbers designating one of those. (Yes, I know they are not really random numbers, but for purposes of finding a particular one, they may as well be). bd2412 T 17:02, 5 February 2017 (UTC)
- @BD2412: and how are you going to see that (and what is the threshold of importance)? When you archive it is at least still there, with blanking any discussion is 'gone'. --Dirk Beetstra T C 03:54, 9 February 2017 (UTC)
- Then people will learn not to leave important discussions on IP talk pages with no apparent activity after half a decade or more. bd2412 T 04:13, 9 February 2017 (UTC)
- You're kidding, right? Are we here to collaboratively create an encyclopedia, or are we here to teach people a lesson? --Dirk Beetstra T C 05:49, 9 February 2017 (UTC)
- We are not here to create a permanent collection of random IP talk page comments. bd2412 T 00:33, 14 February 2017 (UTC)
- That consensus could have changed - it may indeed need a wider community consensus. As I read the above thread, however, removal is not only restricted to warnings, it mentions
- As far as I am aware, editors have previously opposed archiving of IP talk pages and so this would require wider discussion at an appropriate forum first. Regarding removal of warnings by section, I don't think there is any need to bother about the time the IP last edited -- the whole point of removing old warnings is to ensure that the current (or future) users of the IPs don't see messages that were intended for someone who used the IP years ago. Ideally, a person who visit their IPtalk should see only the messages intended for that person. 103.6.159.84 (talk) 06:19, 25 January 2017 (UTC)
Trump 2016 election image by state
Can you change the images in Category:United States presidential election, 2016 by state to the cropped version File:Donald Trump official portrait (cropped).jpg so they all match, as some are cropped and others aren't. It will also match the main United States presidential election, 2016 article. 80.235.147.186 (talk) 06:21, 25 January 2017 (UTC)
- Maybe JustBerry could also take this? --Edgars2007 (talk/contribs) 08:28, 25 January 2017 (UTC)
- See above. – Jonesey95 (talk) 14:56, 25 January 2017 (UTC)
- That doesn't address the issues mentioned. 80.235.147.186 (talk) 15:27, 25 January 2017 (UTC)
- @Jonesey95: This issue refers to change between official photos, rather than changing non-official photos to official photos. --JustBerry (talk) 18:43, 25 January 2017 (UTC)
- @80.235.147.186: @Edgars2007: Has consensus been established regarding which image should be used (cropped versus non-cropped)? If so, please link to the discussion. --JustBerry (talk) 18:43, 25 January 2017 (UTC)
On hold until the proposal has achieved WP:CONSENSUS. If the proposal demonstrates consensus, please link to the corresponding discussion. --JustBerry (talk) 05:21, 26 January 2017 (UTC)
- I propose to change the images in Category:United States presidential election, 2016 by state to a single campaign photo, not a post-election photo. Must illustrate Donald Trump during the campaign, not after. --Frodar (talk) 04:05, 27 January 2017 (UTC)
- This is not the right place for a discussion. I suggest Wikipedia talk:WikiProject Donald Trump. Once consensus has been reached there, post here with a link to the discussion's outcome. Thanks. – Jonesey95 (talk) 04:11, 27 January 2017 (UTC)
- The proposal doesn't need a consensus and no one has disputed the changes to cropped on the other articles. It should be cropped to match the main United States presidential election, 2016 article and an admin has said here you can be WP:BOLD. 80.235.147.186 (talk) 18:20, 27 January 2017 (UTC)
Migrate from deprecated WikiProject Central America country task forces
All seven of the task forces for WikiProject Central America have graduated to full-fledged WikiProjects (for example, WikiProject Costa Rica) and the task force parameters have been deprecated. We need a bot to go through all of the existing transclusions of {{WikiProject Central America}} and perform the following changes:
- If there are no country task forces assigned, leave the {{WikiProject Central America}} template.
- If there are 1 or 2 country task forces assigned, replace the task force entries with full WikiProject templates for those countries (replicating any relevant assessment data) and delete the {{WikiProject Central America}} template.
- If there are 3 or more country task forces assigned, leave the {{WikiProject Central America}} template and remove the task force entries (as the scope of the topic is unlikely to be country-specific).
The only parameters supported by the country-specific templates are class, importance, small, and listas, so you don't have to worry about replicating any other parameters (like attention or needs-infobox). Kaldari (talk) 00:25, 26 January 2017 (UTC)
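For anyone sizing up the bot work, the per-page decision rule above can be sketched in Python. This is a hypothetical helper, not the eventual bot: how the assigned task forces are actually read out of the template's parameters is left out, and only the three-way branch from the request is shown (the country-to-template mapping is the one given later in this thread).

```python
# Mapping from task force name to replacement WikiProject banner,
# per the list given later in this thread.
COUNTRY_TEMPLATES = {
    "Belize": "WikiProject Belize",
    "Costa Rica": "WikiProject Costa Rica",
    "El Salvador": "WikiProject El Salvador",
    "Guatemala": "WikiProject Guatemala",
    "Honduras": "WikiProject Honduras",
    "Nicaragua": "WikiProject Nicaragua",
    "Panama": "WikiProject Panama",
}

def migration_action(taskforces):
    """Decide what to do with one {{WikiProject Central America}} transclusion.

    `taskforces` is the list of country task forces assigned on the page.
    Returns one of:
      ("keep", [])                  - no task forces: leave the banner alone
      ("replace", [templates...])   - 1-2 task forces: swap in full banners
      ("strip_taskforces", [])      - 3+ task forces: keep banner, drop entries
    """
    if len(taskforces) == 0:
        return ("keep", [])
    if len(taskforces) <= 2:
        return ("replace", [COUNTRY_TEMPLATES[tf] for tf in taskforces])
    return ("strip_taskforces", [])
```

A real bot would additionally carry over the class/importance assessment data when building the replacement banners, as the request notes.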
Kaldari, why do you need listas? The namespace is automatically omitted. -- Magioladitis (talk) 00:32, 26 January 2017 (UTC)
- @Magioladitis: I was just thinking it might be good to copy it if it exists. If listas isn't migrated, that's fine with me though (same for small). Kaldari (talk) 00:54, 26 January 2017 (UTC)
- AFAIK all WikiProject banner templates support |listas= - it's one of the three "universal" parameters along with |category= and |small=. I should qualify that by modifying it to "... WikiProject banner templates built upon {{WPBannerMeta}} support ...", that is to say, I don't know of any that don't support these params, apart from Mathematics and Military history, which have their own peculiar banners that are not built upon {{WPBannerMeta}}.
- My recommendation would be that if {{WikiProject Central America}} has |listas= and it is non-blank, copy that to the replacement template - but if there are two or more replacement templates, copy it to just one of them, since it is not required if another WikiProject template on the same page has its own |listas= set: it not only affects categories used by the banner in which it is set, but it also affects the sortkey of all other banners and templates. --Redrose64 🌹 (talk) 19:08, 26 January 2017 (UTC)
- @Kaldari: Could you direct me toward a list of what taskforces get what templates, etc.? Might be able to do this. ~ Rob13Talk 15:53, 31 January 2017 (UTC)
- @BU Rob13:
- Belize -> {{WikiProject Belize}}
- Costa Rica -> {{WikiProject Costa Rica}}
- El Salvador -> {{WikiProject El Salvador}}
- Guatemala -> {{WikiProject Guatemala}}
- Honduras -> {{WikiProject Honduras}}
- Nicaragua -> {{WikiProject Nicaragua}}
- Panama -> {{WikiProject Panama}}
- Kaldari (talk) 18:16, 31 January 2017 (UTC)
- Doing... ~ Rob13Talk 21:58, 31 January 2017 (UTC)
- @BU Rob13: Are you still interested in doing this task? No rush. Just wanted to check in. Kaldari (talk) 03:25, 10 February 2017 (UTC)
- Yes, I've just been both very busy and very low motivation recently due to some things going on. I'll get to it definitely, but might need to be a tad patient with me. I'll try to get a BRFA submitted this weekend or next week. ~ Rob13Talk 04:36, 10 February 2017 (UTC)
- @Kaldari: Written and mostly tested at Module:WikiProject Central America/convert. This will need an AWB bot run to substitute the module (including some alterations to a couple parameters going into the substitution to correct an annoying quirk of using template args in Lua modules). Note that the output from my method of implementation would be something like this: [3] with each parameter on a new line. Are you fine with that? The output of the page isn't changed at all by doing it that way instead of on one line. ~ Rob13Talk 15:12, 10 February 2017 (UTC)
- @BU Rob13: That looks fine with me. Kaldari (talk) 16:09, 10 February 2017 (UTC)
- @Kaldari: BRFA filed as Task 33 of BU RoBOT. ~ Rob13Talk 11:08, 11 February 2017 (UTC)
- @Kaldari: Do you want the country projects to inherit the default importance, the country-specific importance, or the country if it's available and the default if not? ~ Rob13Talk 16:28, 12 February 2017 (UTC)
- @BU Rob13: The country-specific importance if it's available and the default if not. Kaldari (talk) 17:11, 12 February 2017 (UTC)
- @Kaldari: Thanks for the clarification. Trial results are available at Wikipedia:Bots/Requests for approval/BU RoBOT 33 if you want to take a look. ~ Rob13Talk 22:59, 12 February 2017 (UTC)
Fix duplicate references in mainspace
Hi. Apologies if this is malformed. I'd like to see a bot that can do this without us depending on a helpful human with AWB chancing across the article. --Dweller (talk) Become old fashioned! 19:11, 26 January 2017 (UTC)
- As a kind of clarification, if an article doesn't use named references because the editors of that article have decided not to, we don't want to require the use of named references to perform this kind of merging. In particular, AWB does not add named references if there are not already named references, in order to avoid changing the citation style. This is mentioned in the page linked above (which is an AWB subpage), but it is an important point for bot operators to keep in mind. — Carl (CBM · talk) 19:27, 26 January 2017 (UTC)
- Been here bonkers years and never come across that, thanks! Wikipedia:Citing_sources#Duplicate_citations suggests finding other ways to fix duplicates. I don't know what those other ways are, but if that makes it too difficult, maybe the bot could only patrol articles that already make use of the refname parameter. --Dweller (talk) Become old fashioned! 19:57, 26 January 2017 (UTC)
- It's easy enough for a bot to limit itself to articles with at least one named ref; a scan for that can be done at the same time as a scan for duplicated references, since both require scanning the article text. — Carl (CBM · talk) 20:26, 26 January 2017 (UTC)
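As a rough illustration of that single-pass scan, here is a hypothetical helper. It is a sketch, not the bot: real `<ref>` parsing has more variants (self-closing refs, `group=` attributes, attributes in either order) that a production implementation would need to handle.

```python
import re
from collections import Counter

def scan_refs(wikitext):
    """One pass over the article text: does the page use any named refs,
    and which unnamed ref bodies are duplicated?

    Returns (has_named_refs, sorted list of duplicated ref bodies).
    """
    # Any <ref name=...> counts as a named reference.
    has_named = re.search(r'<ref\s+name\s*=', wikitext, re.IGNORECASE) is not None
    # Bodies of plain unnamed refs: <ref>...</ref> only.
    bodies = re.findall(r'<ref>(.*?)</ref>', wikitext, re.IGNORECASE | re.DOTALL)
    counts = Counter(b.strip() for b in bodies)
    duplicates = sorted(b for b, n in counts.items() if n > 1)
    return has_named, duplicates
```

A bot following the approach above would only go on to merge the duplicates when `has_named` is true, so articles that avoid named references are left untouched.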
- Smashing! Thanks for the expertise. --Dweller (talk) Become old fashioned! 20:44, 26 January 2017 (UTC)
- Note: this is not what is meant by CITEVAR. It is perfectly fine to add names to references. All the best: Rich Farmbrough, 00:48, 5 February 2017 (UTC).
NB Your chart, above, is reading my signature as part of my username. Does that need a separate Bot request ;-) --Dweller (talk) Become old fashioned! 12:00, 1 February 2017 (UTC)
Corrections to usages of Template:Infobox television episode
Many occurrences of {{Infobox television episode}} use the uppercase form of parameters rather than lowercase, and spaces in parameters rather than underscores (for example, |Series no= instead of |series_no=). This should be updated to use the lowercase/underscore format to match the usages of {{Infobox television}} and {{Infobox television season}}, so that the uppercase/spaced formats can be deprecated in the episode infobox template. This can be done with AWB with two regex search-and-replaces:
- Find: \n(\s*\|\s*)([A-Z])([a-z\s_]*)(\s*=)
  Replace: \n$1{{subst:lc:$2}}$3$4
- Find: \n(\s*\|\s*)([^\s]+)\s([^\s=]+)(\s*=)
  Replace: \n$1$2_$3$4
Given that there are over 8,000 articles using {{Infobox television episode}}, I thought that this would be best for a bot. I attempted one article as an example here and the search-and-replace works as expected. Alex|The|Whovian? 09:09, 3 December 2016 (UTC)
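For anyone evaluating the request, the two replacements can be reproduced outside AWB. A Python sketch (an illustration only, with `str.lower()` standing in for MediaWiki's `{{subst:lc:...}}`):

```python
import re

def fix_params(wikitext):
    """Apply the two replacements from the request: lowercase the first
    letter of each parameter name, then turn one internal space into an
    underscore (e.g. |Series no= -> |series_no=)."""
    # 1. |Series no= -> |series no=
    wikitext = re.sub(
        r'\n(\s*\|\s*)([A-Z])([a-z\s_]*)(\s*=)',
        lambda m: '\n' + m.group(1) + m.group(2).lower()
                  + m.group(3) + m.group(4),
        wikitext)
    # 2. |series no= -> |series_no=
    wikitext = re.sub(
        r'\n(\s*\|\s*)([^\s]+)\s([^\s=]+)(\s*=)',
        r'\n\1\2_\3\4',
        wikitext)
    return wikitext
```

Note that the second pattern only converts a single internal space per match, which is enough for two-word parameter names like "Series no"; names with more spaces would need repeated application.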
- Since the appearance of the articles using this infobox won't be altered, this seems like a straight "not done" per WP:COSMETICBOT to me. --Redrose64 (talk) 10:21, 3 December 2016 (UTC)
- So, I'd need to update 8,000 articles manually? Great. Alex|The|Whovian? 10:29, 3 December 2016 (UTC)
- If a discussion at the template's talk page chooses to deprecate and replace the upper-case parameters and then remove them from the template's code, you should be able to get a bot run approved. – Jonesey95 (talk) 16:40, 3 December 2016 (UTC)
- @Jonesey95: I'll just get it done manually with an automatic clicker over the save button overnight. Saves the drama of approvals, and it'll get done quicker. Alex|The|Whovian? 01:58, 5 December 2016 (UTC)
- @Jonesey95: On what are you basing the fact that a deprecated parameter should be able to be removed with a bot? I'm trying to get multiple bots approved to do EXACTLY that and am getting hit with WP:COSMETICBOT... See Wikipedia:Bots/Requests for approval/ZackBot 4 & Wikipedia:Bots/Requests for approval/ZackBot 5. --Zackmann08 (Talk to me/What I been doing) 06:17, 5 December 2016 (UTC)
- WP:COSMETICBOT does not apply to this discussion or to those two discussions. Removing deprecated parameters from template transclusions so that the template can be modernized and updated by removing those parameters is a substantive change, and when you are doing it to thousands of articles, a bot is the best way to do it. User:Monkbot, for example, has replaced deprecated parameters in thousands of articles. If you want to apply cosmetic fixes like AWB's general fixes at the same time as that substantive change, that's up to you. – Jonesey95 (talk) 06:44, 5 December 2016 (UTC)
- @Jonesey95: if you'd be willing to chime in on those BRFAs I would greatly appreciate it. (This isn't WP:CANVASSING right?). I agree with you! --Zackmann08 (Talk to me/What I been doing) 07:26, 5 December 2016 (UTC)
- Cool, so, I'm confused. Is this a request that would be able to be approved and run by a bot, or not? Alex|The|Whovian? 07:42, 5 December 2016 (UTC)
- @Jonesey95: My point is that the parameters concerned don't seem to be deprecated: the only discussion that I can find was started yesterday, and has only two participants. Without deprecation, these are merely aliases, and changing one valid form of a parameter to another valid form is a cosmetic change. Using AWB (instead of a bot) to do this would fall foul of WP:AWB#Rules of use item 4 for the same reason. --Redrose64 (talk) 21:34, 5 December 2016 (UTC)
Redrose64 makes a good point that there does need to be firm consensus before the bot can be approved. That being said, I think it is good to talk about it before one writes the code and then finds out "no, this STILL would be a cosmetic change". So redrose (can I call you Rudolph? ), let us assume for a moment that Jonesey95 does get a good consensus over the next week. Multiple participants all agreeing that the uppercase params should be removed, not just aliased, but 100% removed. IFF that happens, would this be a worthwhile bot to pursue creating at that time? --Zackmann08 (Talk to me/What I been doing) 21:49, 5 December 2016 (UTC)
- Iff there is consensus on the templates's talk page to deprecate and remove, then go ahead. --Redrose64 (talk) 22:20, 5 December 2016 (UTC)
- @Redrose64: copy that! --Zackmann08 (Talk to me/What I been doing) 22:28, 5 December 2016 (UTC)
- Yes, that is what I was saying above: "If a discussion at the template's talk page ...." – Jonesey95 (talk) 22:53, 5 December 2016 (UTC)
- @Redrose64: copy that! --Zackmann08 (Talk to me/What I been doing) 22:28, 5 December 2016 (UTC)
Even if there is only one post agreeing, there seems to have been no opposition to it since I posted the discussion a week ago. Does this mean that approval for a bot could be gained now? Alex|The|Whovian? 08:42, 16 December 2016 (UTC)
- Over three weeks ago; any further comments? Alex|The|Whovian? 09:23, 27 December 2016 (UTC)
- Did you try asking for feedback at WP:TV? Headbomb {talk / contribs / physics / books} 13:39, 30 December 2016 (UTC)
- Posted at WP:TV, and there now seems to be a stronger support for this at Template talk:Infobox television episode § Deprecating uppercase parameters. Alex|The|Whovian? 01:52, 8 January 2017 (UTC)
- Another bump. This appears to be clear for a go-ahead. Almost two months later, and so many have been approved after this. I might as well do it manually with an auto-clicker. Alex|The|Whovian? 09:06, 22 January 2017 (UTC)
- I am not keen on part of this. In my opinion spaces are preferable to underscores and dashes. This is the sort of text that non-programmers are used to using, and which we use for article names, file names, headings, tables and text. It has been a long and painful road to reduce the amount of camel case in Wikipedia; replacing it with snake case is not a good step.
- All the best: Rich Farmbrough, 23:39, 26 January 2017 (UTC).
- Fair enough. However, this would match the usages of {{Infobox television}} and {{Infobox television season}}, neither of which seems to have had issues with the underscores rather than the spaces. Alex|The|Whovian? 23:54, 26 January 2017 (UTC)
- AlexTheWhovian, if you're looking for a bot to run this (once consensus and blah blah blah), I'm happy to run PrimeBOT for you after the BRFA is accepted. Primefac (talk) 00:57, 30 January 2017 (UTC)
- @Primefac: That would be greatly appreciated, thanks. It appears that there is consensus - no-one has disagreed with replacing the parameters (bar the occurrence of an editor disagreeing with underscores, but this is required for conformity, and they haven't replied since). Alex|The|Whovian? 01:15, 30 January 2017 (UTC)
BRFA filed Primefac (talk) 19:31, 30 January 2017 (UTC)
- Done Thank you, Primefac! Alex|The|Whovian? 01:32, 13 February 2017 (UTC)
- Always happy to help :) Primefac (talk) 01:39, 13 February 2017 (UTC)
Create a simple TemplateData for all Infoboxes
- Request
- Check if the /doc file in all templates contained in WP:List of infoboxes contains <templatedata>, <templatedata />, or <templatedata/>. If none exists, add to the bottom:
<templatedata> { "params": {}, "paramOrder": [], "format": "block" } </templatedata>
— አቤል ዳዊት?(Janweh64) (talk) 17:05, 27 January 2017 (UTC)
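A minimal sketch of the check-and-append step described above (a hypothetical helper; the real bot would fetch and save each /doc page through the MediaWiki API rather than operate on strings):

```python
import re

# Empty TemplateData skeleton from the request.
SKELETON = '<templatedata> { "params": {}, "paramOrder": [], "format": "block" } </templatedata>'

def ensure_templatedata(doc_wikitext):
    """Return the /doc text unchanged if it already has a <templatedata>
    tag in any of the three spellings (<templatedata>, <templatedata />,
    <templatedata/>), otherwise append the empty skeleton at the bottom."""
    if re.search(r'<templatedata(\s*/)?\s*>', doc_wikitext, re.IGNORECASE):
        return doc_wikitext
    return doc_wikitext.rstrip('\n') + '\n\n' + SKELETON + '\n'
```

The function is idempotent, so re-running the bot over pages it has already edited would make no further changes.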
- Why? How is the editor's or reader's experience improved if this is done? – Jonesey95 (talk) 16:15, 27 January 2017 (UTC)
- It is only step one of a series of bot operations I have in mind to systematically create a baseline TemplateData for all Biography Infoboxes by importing data from Infobox person. Maybe it is too small of a step. Sorry, this is my first bot request. The next step would be to check if the template contains a "honorific_prefix" parameter and if so add between {}:
"honorific_prefix": {"description": "To appear on the line above the person's name","label": "Honorific prefix","aliases": ["honorific prefix"]},
and "honorific_prefix", inside []. Step by step we could accomplish the goals set out by this daunting task. The same idea could be used to create TemplateData for other infoboxes or even many other templates by inheriting the data from their parents. — አቤል ዳዊት?(Janweh64) (talk) 17:05, 27 January 2017 (UTC)
- It sounds like this idea needs more development. I suggest having a discussion at that TemplateData talk page, coming up with a plan that could be executed by a bot, and then coming back here with that plan. – Jonesey95 (talk) 17:31, 27 January 2017 (UTC)
- This discussion is proof that discussing things accomplishes nothing. Nevermind, I will just learn how to build a bot myself and get it approved. — አቤል ዳዊት?(Janweh64) (talk) 23:10, 27 January 2017 (UTC)
- Sorry to disappoint you. So that you don't waste your time and get more frustrated, here's one more piece of information: you will find that when you make a bot request, you will also be asked for a link to the same sort of discussion. If you take a look at Wikipedia:Bots/Requests for approval, you will see these instructions:
If your task could be controversial (e.g. most bots making non-maintenance edits to articles and most bots posting messages on user talk pages), seek consensus for the task in the appropriate forums. Common places to start include WP:Village pump (proposals) and the talk pages of the relevant policies, guidelines, templates, and/or WikiProjects. Link to this discussion from your request for approval.
This is how things work. – Jonesey95 (talk) 00:51, 28 January 2017 (UTC)
- I was rude. You were kind. — አቤል ዳዊት?(Janweh64) (talk) 13:46, 28 January 2017 (UTC)
WP:UAAHP
Hi, is it possible for a bot, such as DeltaQuadBot, to remove stale reports at the UAA holding pen (those blocked and those with no action in seven days), like it does with blocked users and declined reports at WP:UAAB? If this is not possible I would be happy to create my own bot account and have it do this task instead. Thanks! Linguisttalk|contribs 22:23, 28 January 2017 (UTC)
- You should ask DeltaQuad if she would consider adding it to her bot. Also, is this something that the UAA admins want? — JJMC89 (T·C) 22:39, 28 January 2017 (UTC)
- I haven't asked the UAA regulars but I'm sure this would be helpful. In fact, I'm almost the only one who cleans up the HP and it would be helpful to me. Linguisttalk|contribs 22:41, 28 January 2017 (UTC)
Local Route □ (South Korea)
Please replace each of the link forms listed below with "[[Local Route □ (South Korea)|Local Route □]]". See Talk:Local highways of South Korea for the reason.
- [[Provincial Route □ (South Korea)|Provincial Route □]]
- [[Jibangdo □]]
- [[Gukjido □]]
- [[Korea Provincial Route □|Provincial Route □]]
- [[South Korea Provincial Route □|Provincial Route □]]
- [[South Korean Provincial Route □|Provincial Route □]]
- [[Korean Provincial Route □|Provincial Route □]]
- [[Provincial Route □ (Korea)|Provincial Route □]]
- [[Local Road □ (South Korea)|Local Road □]]
- [[National Support Provincial Route □|Provincial Route □]]
- [[National Support Provincial Route □]]
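For illustration, the one-pass rewrite being requested could look like this in Python. This is a hedged sketch, not a working bot: the pattern list mirrors the bullets above with `\d+` standing in for the boxed number, and a production bot would also need to handle whitespace variants and redirects case by case.

```python
import re

# Old link targets from the list above; the capture group is the route number.
OLD_TITLES = [
    r"Provincial Route (\d+) \(South Korea\)",
    r"Jibangdo (\d+)",
    r"Gukjido (\d+)",
    r"Korea Provincial Route (\d+)",
    r"South Korea Provincial Route (\d+)",
    r"South Korean Provincial Route (\d+)",
    r"Korean Provincial Route (\d+)",
    r"Provincial Route (\d+) \(Korea\)",
    r"Local Road (\d+) \(South Korea\)",
    r"National Support Provincial Route (\d+)",
]

def retarget(wikitext):
    """Rewrite both piped and bare forms of each old link to the new title."""
    for pat in OLD_TITLES:
        wikitext = re.sub(
            r"\[\[" + pat + r"(?:\|[^\]]*)?\]\]",
            r"[[Local Route \1 (South Korea)|Local Route \1]]",
            wikitext,
        )
    return wikitext
```

For example, `retarget` turns `[[Provincial Route 13 (South Korea)|Provincial Route 13]]` into `[[Local Route 13 (South Korea)|Local Route 13]]`.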
- @ㅂㄱㅇ: Can you provide an example edit to show the type of change you want made? ~ Rob13Talk 08:37, 29 January 2017 (UTC)
- @BU Rob13: This is an example. --ㅂㄱㅇ (talk) 03:22, 30 January 2017 (UTC)
- I don't see any consensus on the talk page you pointed me to requiring that these are presented in this way. Is there a discussion associated with this? ~ Rob13Talk 03:34, 30 January 2017 (UTC)
- @BU Rob13: The main page (Local highways of South Korea) was moved following the talk page discussion, so the subpages (Local Route □ (South Korea)) have to be moved, too. --ㅂㄱㅇ (talk) 01:13, 31 January 2017 (UTC)
- Yes, but that doesn't necessarily mean changing all wikilinks. The other terms could also be acceptable. ~ Rob13Talk 01:27, 31 January 2017 (UTC)
- What are all the little squares for? They're not explained anywhere here. Are they part of the route coding system? When I try removing a nowiki, as with Provincial Route □, all I get is a redlink - no actual valid page. --Redrose64 🌹 (talk) 12:02, 31 January 2017 (UTC)
- @Redrose64: That's a number. Example: [[Provincial Route 13 (South Korea)|Provincial Route 13]]. Those pages do not exist yet. I'm going to create some of them. --
ㅂㄱㅇ (talk) 12:31, 31 January 2017 (UTC)
- Needs wider discussion, as noted above. ~ Rob13Talk 16:14, 11 February 2017 (UTC)
This backlog is really crowded (it is near 200K). Legobot used to tag images that have rationales, but they have not edited in the File namespace since May 2014 [4] . Is there a way that someone could set up their bot to take over this bot's previous responsibility? Thanks. -- 1989 (talk) 23:32, 31 January 2017 (UTC)
- @1989: See Wikipedia:Bots/Requests_for_approval/BU_RoBOT_32. I filed a BRFA to help address this a few days ago. ~ Rob13Talk 01:55, 1 February 2017 (UTC)
- @Legoktm: May I ask why your bot has stopped performing this task? -- 1989 (talk) 13:48, 1 February 2017 (UTC)
- No idea why it stopped, but it didn't look trivial to restart when I glanced at the code just now - it relied on a hard coded list of template redirects and stuff. I could fix it up to run again but if someone else is willing to take the task over, I'd much prefer and encourage that. Legoktm (talk) 20:32, 1 February 2017 (UTC)
- A lot of bots stopped working around that time; Toolserver was becoming more and more flaky, and was taken down permanently on 1 July 2014. --Redrose64 🌹 (talk) 10:35, 2 February 2017 (UTC)
Require a list of articles
As per the discussion at Wikipedia talk:WikiProject Airports#Request for comments on the Airlines and destinations tables, there is a push to replace the "Terminals" column with a "References" column in {{Airport destination list}} where it is used in airport articles. I'd like a bot to make a list of airport articles that contain "3rdcoltitle" and a separate list of articles containing "4thcoltitle". Thanks. CambridgeBayWeather, Uqaqtuq (talk), Sunasuttuq 10:55, 1 February 2017 (UTC)
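The per-article scan such a list-building bot would run over each page's wikitext is simple. A sketch (this is not Philroc's actual code, just an illustration of the check):

```python
import re

def coltitle_params(wikitext):
    """Return which of the two column-title parameters a page sets anywhere."""
    return {p for p in ("3rdcoltitle", "4thcoltitle")
            if re.search(r"\|\s*" + p + r"\s*=", wikitext)}
```

Running this over every transclusion of {{Airport destination list}} and collecting page titles per returned parameter yields the two requested lists.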
- @CambridgeBayWeather: Doing... Should get this done by tonight. PhilrocMy contribs 14:18, 2 February 2017 (UTC)
- @CambridgeBayWeather: I have made code for the bot, which is visible here. PhilrocMy contribs 14:59, 2 February 2017 (UTC)
- Philroc. That's great. Thanks very much. CambridgeBayWeather, Uqaqtuq (talk), Sunasuttuq 17:09, 2 February 2017 (UTC)
- @CambridgeBayWeather: Before I can file a BRFA, I need your approval. Just want to make sure that the BRFA won't be rejected for too little consensus. PhilrocMy contribs 17:38, 2 February 2017 (UTC)
- Philroc. I think it's fine but then I'm a bit biased eh? What exactly is it you would like me to do? CambridgeBayWeather, Uqaqtuq (talk), Sunasuttuq 17:42, 2 February 2017 (UTC)
- @CambridgeBayWeather: Just say that you want a BRFA to happen or that you don't want one to happen. PhilrocMy contribs 17:47, 2 February 2017 (UTC)
- @CambridgeBayWeather: Actually, you already gave your approval. Will make a BRFA later today. PhilrocMy contribs 17:50, 2 February 2017 (UTC)
- @CambridgeBayWeather: Just found that the bot doesn't need a BRFA because the lists will be in my userpage. Will run the bot later today. PhilrocMy contribs 21:00, 2 February 2017 (UTC)
- @CambridgeBayWeather: Still doing... This is taking longer than I thought. I will continue working on getting the bot ready to run. I will tell you when I am finished. PhilrocMy contribs 01:30, 3 February 2017 (UTC)
- No worries. I'm not in any rush. Thanks. CambridgeBayWeather, Uqaqtuq (talk), Sunasuttuq 06:12, 3 February 2017 (UTC)
- @CambridgeBayWeather: list for 3rdcoltitle is here. I found no instances of 4thcoltitle. All the best: Rich Farmbrough, 22:22, 13 February 2017 (UTC).
- Rich Farmbrough. Thanks. That's a lot less than I expected. CambridgeBayWeather, Uqaqtuq (talk), Sunasuttuq 04:08, 14 February 2017 (UTC)
MOS Bot
A bot to perform certain simple edits to make articles comply with the MOS. One feature would be to italicize foreign words (the bot could have a list of words that are commonly used but need to be italicized), and also to remove the italics from words that don't need them but commonly receive them. It would also add {{nbsp}} between any integer and AD, BC, CE, or BCE, and would capitalize AD, BC, CE, or BCE when next to a number. Iazyges Consermonor Opus meum 17:30, 1 February 2017 (UTC)
- This isn't a suitable task for a bot. See WP:CONTEXTBOT for details. Headbomb {talk / contribs / physics / books} 17:39, 1 February 2017 (UTC)
- To give a concrete example, look at this article to see the phrase "2014 ad campaign", which should obviously not be changed by a bot to "2014{{nbsp}}AD campaign". The word "ad" in this context means "advertisement", not "anno domini". There is no way for a bot to tell the difference. – Jonesey95 (talk) 18:13, 1 February 2017 (UTC)
- However a good NLP system would realise this, because there are three glaring clues: firstly the lowercase, secondly the presence of the word "campaign", and thirdly AD is rarely used with years as late as 2014. The combination would disambiguate "ad" to "advertisement" rather than its alternative meanings. All the best: Rich Farmbrough, 00:57, 5 February 2017 (UTC).
- In the meantime, if anyone is interested, I have a formatting script that I use for making MOS-related changes. -- Ohc ¡digame! 23:20, 14 February 2017 (UTC)
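Jonesey95's false positive above is easy to reproduce. Assuming the simplest possible pattern a naive bot might use (a sketch, not anyone's proposed implementation), the corruption WP:CONTEXTBOT warns about falls right out:

```python
import re

# Naive pattern: any 1-4 digit number followed by an era abbreviation,
# matched case-insensitively so that lowercase "ad" is caught too.
ERA = re.compile(r"\b(\d{1,4}) (AD|BC|CE|BCE)\b", re.IGNORECASE)

def naive_fix(text):
    """Join number and era abbreviation with {{nbsp}} and uppercase the era."""
    return ERA.sub(lambda m: m.group(1) + "{{nbsp}}" + m.group(2).upper(), text)
```

Here `naive_fix("the 2014 ad campaign")` produces "the 2014{{nbsp}}AD campaign", exactly the bad edit described above.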
One-off bot to ease archiving at WP:RESTRICT
This isn't urgent, or even 100% sure to be needed, but it looks likely based on this discussion that we will be moving listings at WP:RESTRICT to an archive page if the user in question has been inactive for two years or more. Some of the restrictions involve more than one user and would require a human to review them, but it would be awesome if a bot could determine that if a user listed there singly had not edited at all in two or more years it could automatically transfer their listing to the archive. There are also some older restrictions that involved a whole list of users (I don't think arbcom does that anymore), and in several of those cases all of the users are either blocked or otherwise totally inactive. This would only be needed once, just to reduce the workload to get the archive started. (the list is extremely long, which is why this was proposed to begin with) Is there a bot that could manage this? Beeblebrox (talk) 18:46, 4 February 2017 (UTC)
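Once a listed user's latest contribution timestamp has been fetched (e.g. from the usercontribs API), the core eligibility test is a one-liner. A sketch, with "two years" approximated as 730 days:

```python
from datetime import datetime, timedelta

def is_archivable(last_edit, now, years=2):
    """True if the user has never edited, or their most recent edit is
    `years` or more years old. `last_edit` is a datetime or None."""
    return last_edit is None or (now - last_edit) >= timedelta(days=365 * years)
```

The bot would apply this only to single-user listings, leaving multi-user restrictions for human review as requested.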
- Ongoing would be better, and even bringing back "resurrected" users might be helpful too. All the best: Rich Farmbrough, 01:01, 5 February 2017 (UTC).
- Doing... All the best: Rich Farmbrough, 23:25, 13 February 2017 (UTC).
- Awesome, the discussion was archived without a formal close, but consensus to do this is pretty clear. Beeblebrox (talk) 20:57, 15 February 2017 (UTC)
Addition of {{Authority control}}
There are very many biographies - tens of thousands - for which Wikidata has authority control data - example Petscan, and where such data is not displayed in the article. {{Authority control}}, with no parameters, displays authority control data from Wikidata. It is conventionally placed immediately before the list of categories at the foot of an article. It is used on 510,000+ articles and appears to be the de facto standard for handling authority control data on Wikipedia.
Would this make a good bot task: use the petscan lists, like the above, to identify articles for which {{Authority control}} can be placed at the foot of the article? --Tagishsimon (talk) 11:51, 6 February 2017 (UTC)
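The placement rule described above ("immediately before the list of categories") is mechanical enough to sketch. Assuming the template and categories appear in the usual bottom-of-article order (an illustration only; KasparBot's actual logic may differ):

```python
import re

def add_authority_control(wikitext):
    """Insert {{Authority control}} just before the first category link,
    or at the very end if the article has no categories yet."""
    if "{{Authority control" in wikitext:
        return wikitext  # already present; nothing to do
    m = re.search(r"^\[\[Category:", wikitext, re.MULTILINE)
    if m:
        return wikitext[:m.start()] + "{{Authority control}}\n" + wikitext[m.start():]
    return wikitext.rstrip() + "\n{{Authority control}}\n"
```

The early return makes the edit idempotent, so rerunning the bot over the same Petscan list is harmless.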
I think there is already a bot doing this. -- Magioladitis (talk) 18:48, 11 February 2017 (UTC)
Tagishsimon User:KasparBot adds Authority Control. -- Magioladitis (talk) 09:24, 12 February 2017 (UTC)
- Thanks Magioladitis. I've added to User talk:T.seppelt. --Tagishsimon (talk) 17:34, 12 February 2017 (UTC)
Doing... I'm taking care of it. --19:01, 12 February 2017 (UTC) — Preceding unsigned comment added by T.seppelt (talk • contribs)
Requesting bot for wikisource
I'm not sure exactly what to say here, at least in part because I'm not sure exactly what functions we are necessarily seeking a bot to do. But there is currently a discussion at wikisource:Wikisource:Scriptorium#Possible bot about trying to get some sort of bot which would be able to generate an output page roughly similar to Wikipedia:WikiProject Christianity#Popular pages and similar for the portals, author, and categories over there at wikisource. I as an individual am not among the most knowledgeable editors there. On that basis, I think it might be useful to get input from some of the more experienced editors there regarding any major issues which might occur to either a bot developer or them but not me. Perhaps the best way to do this would be to respond at the first linked to section above and for the developer to announce himself, perhaps in a separate subsection of the linked to thread there, to iron out any difficulties. John Carter (talk) 14:31, 6 February 2017 (UTC)
How about a bot to update (broken) sectional redirects?
When a section heading is changed, it breaks all redirects targeting that heading. Those redirects then incorrectly lead to the top of the page rather than to the appropriate section.
Is this desirable and feasible? If so, how would such a script work? The Transhumanist 22:14, 6 February 2017 (UTC)
- This may turn out to be a WP:CONTEXTBOT. How often do people delete the section entirely, or split the section into two (then which should the bot pick?), or revise the section such that the redirect doesn't really apply anymore? Can the bot correctly differentiate these cases from cases where it can know what section to change the target to?
- Such a script would presumably work by watching RecentChanges for edits that change a section heading, and then would check all redirects to the article to see if they targeted that section. It would probably want to delay before actually making the update in case the edit gets reverted or further edited. Anomie⚔ 22:29, 6 February 2017 (UTC)
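The detection step Anomie describes can be sketched by diffing the headings of the revision before and after an edit. A hedged illustration (a real bot would act only when exactly one heading disappeared and exactly one appeared, to sidestep the split/delete ambiguity raised above):

```python
import re

# A wikitext heading: 2-6 '=' signs, matched symmetrically on both sides.
HEADING = re.compile(r"^(={2,6})\s*(.*?)\s*\1\s*$", re.MULTILINE)

def heading_diff(old_text, new_text):
    """Return (headings removed by the edit, headings added by the edit).
    Redirect retargeting is only safe when each list has exactly one entry."""
    old = set(m.group(2) for m in HEADING.finditer(old_text))
    new = set(m.group(2) for m in HEADING.finditer(new_text))
    return sorted(old - new), sorted(new - old)
```

With one removed and one added heading, redirects pointing at the old section anchor could be queued for (delayed) retargeting.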
Need a red link stripper
This would be for removing red links generated by template substitution, where the template includes every potential link for a given topic type. For example, a red link stripper would be useful on city outlines created by template, where many of the links will be red for a particular city because the generated links don't apply to that city. With a red link stripper, you could create a city outline via template, then strip out the red links. That would provide a good base to which to add more topics.
Years ago, the outline department used a template to create outlines on every country in the world, and then stripped out the red links by hand. It was very time consuming and tedious, dragging on for years. Some of those still need the red links stripped. So...
What is the easiest way for a script to identify red links?
How would a script go about removing red links?
To clarify, first a page is created and a template used to add the content via substitution, resulting in a lot of red links throughout. Then the red link stripper is used to strip them out (actually, it will delete some, and delink others, with the end result of removing the red links). I look forward to your answers to the questions above. The Transhumanist 00:40, 7 February 2017 (UTC)
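To answer the two questions in outline: a script identifies red links by asking the API which linked titles exist (one batched action=query&titles=... call flags missing pages), and it removes them by delinking in place or dropping list items that held nothing else. A sketch with the existence check abstracted into a callable, since the network half can't be shown self-contained:

```python
import re

# A wikilink, capturing the target and the optional piped label.
LINK = re.compile(r"\[\[([^|\]#]+)(?:\|([^\]]*))?\]\]")

def strip_redlinks(wikitext, exists):
    """Delink red links in place; drop bullet items left with no links at all.
    `exists` is a callable title -> bool, which a bot would back with a
    batched MediaWiki API existence query."""
    def repl(m):
        title, label = m.group(1).strip(), m.group(2)
        return m.group(0) if exists(title) else (label if label is not None else title)

    out = []
    for line in wikitext.splitlines():
        had_link = LINK.search(line) is not None
        delinked = LINK.sub(repl, line)
        # A bullet whose every link was red is deleted outright.
        if line.lstrip().startswith("*") and had_link and "[[" not in delinked:
            continue
        out.append(delinked)
    return "\n".join(out)
```

This mirrors the two behaviours described above: deleting some entries and delinking others.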
- @The Transhumanist: I highly doubt that it'll be useful for a bot, but I have a Javascript-based redlink remover script that I created last year, and it's mighty handy at keeping Category:Wikipedia red link cleanup empty (example). Might give some inspiration, what with the comments through the code. Alex|The|Whovian? 00:48, 7 February 2017 (UTC)
- Thank you! I'd be lost without your help. The Transhumanist 00:34, 9 February 2017 (UTC)
- The following discussion is closed. Please do not modify it. Subsequent comments should be made in a new section.
See discussion at Wikipedia talk:WikiProject Star Trek#Lt. Ayala. At some point, someone decided that this "named extra" who is in the background of almost every single episode of Star Trek:Voyager is actually a "guest appearance" despite the fact that they are never listed as such in the opening credits. As such they are listed as a guest appearance in the infobox for nearly every single episode of this program. Removing all these erroneous mentions would be extremely tedious, if a bot could do a sweep and remove them all that would be great. Beeblebrox (talk) 10:14, 8 February 2017 (UTC)
- Beeblebrox, just to check, you're talking about removing
* [[Tarik Ergin]] - Lt. [[Ayala (Star Trek)|Ayala]]
(and similar) from all pages listed in Category:Star Trek: Voyager episodes? If so, this should be manageable. Primefac (talk) 19:28, 8 February 2017 (UTC)
- Yes, that's the idea. thanks. Beeblebrox (talk) 20:41, 8 February 2017 (UTC)
- Cool. I'll work up some regex and hopefully have something to submit in the next day or three. Primefac (talk) 03:21, 10 February 2017 (UTC)
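The removal Primefac describes can be sketched as a single regex pass over each episode article (hypothetical, not the regex actually used; real edits would still need eyeballing for formatting variants):

```python
import re

# Matches a whole bullet line crediting Tarik Ergin, e.g.
#   * [[Tarik Ergin]] - Lt. [[Ayala (Star Trek)|Ayala]]
AYALA_LINE = re.compile(r"^\*\s*\[\[Tarik Ergin\]\].*\n?", re.MULTILINE)

def remove_ayala(wikitext):
    """Delete the erroneous guest-appearance line, leaving the rest intact."""
    return AYALA_LINE.sub("", wikitext)
```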
Doing... -- Magioladitis (talk) 18:40, 11 February 2017 (UTC)
Approx. 100 pages to fix. -- Magioladitis (talk) 18:45, 11 February 2017 (UTC)
Done 93 pages. -- Magioladitis (talk) 18:47, 11 February 2017 (UTC)
- [5][6][7][8]. You're basically running an unattended process on your main account while making other edits simultaneously. 100 pages isn't exactly a lot, but you're basically ignoring BOTPOL/MEATBOT. — HELLKNOWZ ▎TALK 19:40, 11 February 2017 (UTC)
Hellknowz you mean I should have changed the edit summaries for those... There were 160 pages. I skipped most of them. -- Magioladitis (talk) 22:42, 11 February 2017 (UTC)
- No, I mean that you clearly ran an automated task on your main account ignoring BOTPOL. — HELLKNOWZ ▎TALK 23:34, 11 February 2017 (UTC)
Hellknowz please read the BRFA where it was in fact rejected because the number of edits was too low for a bot. There was a clear suggestion to do the edits manually Wikipedia:Bots/Requests for approval/PrimeBOT 11. Note also that the request was for a max of 161 edits while I did 93 i.e. 43% fewer edits. -- Magioladitis (talk) 23:47, 11 February 2017 (UTC)
- It was clearly not a manual run, regardless of the actual task. — HELLKNOWZ ▎TALK 00:19, 12 February 2017 (UTC)
- I do find it mildly amusing that "I did less than required" is their defence. Primefac (talk) 03:24, 12 February 2017 (UTC)
Hellknowz I even missed one fix in my run. Perhaps I undid my accidentally hitting the mouse. [9]. See also that here I removed a deleted image but later I got more lazy. -- Magioladitis (talk) 00:24, 12 February 2017 (UTC)
- You've been berated and even taken to ArbCom for doing cosmetic/bot-like edits, and here you are clearly breaking protocol. I'm failing to understand how "skip if only genfixes performed" is so difficult for you to enable. Primefac (talk) 03:22, 12 February 2017 (UTC)
- The above discussion is closed. Please do not modify it. Subsequent comments should be made in a new section.
Script works intermittently
Hi guys, I'm stuck.
I forked the redlink remover script above, with an eye toward possibly developing a bot from it in the future, after I get it to do what I need on pages one-at-a-time. But first I need to get it to run. Sometimes it works and sometimes it doesn't (mostly doesn't).
For example, the original worked on Chrome for AlexTheWhovian but not for me. But later, it started working for no apparent reason. I also had the fork I made (from an earlier version) working on two machines with Firefox. But I turned one off for the night. And in the morning, it worked on one machine and not the other.
The script I'm trying to get to work is User:The Transhumanist/OLUtils.js.
I'm thinking the culprit is a missing resource module or something.
Is there an easy way to track down what resources the script needs in order to work? Keep in mind I'm a newb. The Transhumanist 01:41, 11 February 2017 (UTC)
After some trial and error, I learned the following: in Firefox, if I run the Feb 28 2016 version of User:AlexTheWhovian/script-redlinks.js and if I use it to strip redlinks from a page (I didn't save the page), then I can load the 15:05, December 26, 2016 version and it works.
Does anyone have any idea why using one script (not just loading it) will cause another script to work? I'm really confused. The Transhumanist 05:33, 11 February 2017 (UTC)
- Maybe one has dependencies that it doesn't load itself, instead relying on other scripts to load them. --Redrose64 🌹 (talk) 21:55, 11 February 2017 (UTC)
- The author said it was stand-alone. (They are both versions of the same script). I now have them both loaded, so I can more easily use the first one (User:The Transhumanist/redlinks Feb2016.js) to enable the other (User:The Transhumanist/OLUtils.js). Even the original author doesn't know why it isn't working.
- What's the next step in solving this? The Transhumanist 06:46, 12 February 2017 (UTC)
- You've changed the outer part: that's what I would suspect, maybe not loading the mw library properly. Possibly the best way is to make the changes step-by-step, with a browser restart between. (Or better still, binary chop.) All the best: Rich Farmbrough, 22:32, 13 February 2017 (UTC).
- What's the next step in solving this? The Transhumanist 06:46, 12 February 2017 (UTC)
Creating a list of red-linked entries at Recent deaths
I request a bot to create and maintain a list consisting of red-linked entries grabbed from the Deaths in 2017 page, as and when they get added there. These entries, as you may know, are removed from the "Deaths in ... " pages if an article about the subject isn't created in a month's time. It would be useful to maintain a list comprising just the red entries (from which they are not removed on any periodic basis) for editors to go through. This would increase the chances of new articles being created. Preferably at Wikipedia:WikiProject Biography/Recent deaths (red links), or in the bot's userspace to begin with. (In the latter case, the bot wouldn't need any BRFA approval.) 103.6.159.71 (talk) 12:54, 15 February 2017 (UTC)
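The maintenance step could be sketched as a merge that only ever adds names, so entries survive their monthly removal from the Deaths page. A hedged illustration (the existence check is abstracted; a bot would back it with an API query):

```python
import re

def update_redlist(existing, deaths_wikitext, exists):
    """Merge red-linked names from the Deaths page bullets into a
    persistent list. `exists` is a callable title -> bool."""
    names = set(existing)
    for m in re.finditer(r"^\*\s*\[\[([^|\]#]+)", deaths_wikitext, re.MULTILINE):
        title = m.group(1).strip()
        if not exists(title):
            names.add(title)
    return sorted(names)
```

Run on each update of Deaths in 2017, this keeps the red-link list growing even as blue or expired entries disappear upstream.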
Check book references for self-published titles
This is in response to this thread at VPT. So here's the problem. We have a list of vanity publishers whose works should be used with extreme caution, or never (some of these publishers exclusively publish bound collections of Wikipedia articles). But actually checking if a reference added to Wikipedia is on this list is time consuming. However, it occurs to me that in some cases it should be simple to automate. At any Amazon webpage for a book, there is a line for the publisher, marked "publisher". On any GoogleBooks webpage, there is a similar line to be found in the metadata hiding in the page source. If an ISBN is provided in the reference, it can be searched on WorldCat to identify the publisher.
So it seems to me like a bot should be able to do the following:
- 1) Watch recent changes for anything that looks like a link or reference to a book, such as a "cite book" template, a number that looks like an ISBN, or a link to a website like Amazon or GoogleBooks
- 2) Follow the link (if to Amazon or GoogleBooks), or search the ISBN (if provided), to identify the publisher
- 3) Check the publisher against the list of vanity publishers
- 4) Any positive hits could then be automatically reported somewhere on Wikipedia. There could even be blacklisted publishers (such as those paper mirrors of Wikipedia I mentioned) that the bot could automatically revert, after we're sure there are few/no false positives
What do people think? Doable? Someguy1221 (talk) 00:13, 16 February 2017 (UTC)
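Steps 1 and 3 of the list above are the easy, offline parts; step 2 (the publisher lookup) is the fragile one. A sketch of the offline halves, with an illustrative two-entry blacklist standing in for the real on-wiki list:

```python
import re

# Illustrative entries only; the real blacklist is the on-wiki list of
# vanity publishers mentioned above (these two reprint Wikipedia articles).
VANITY_PUBLISHERS = {"alphascript publishing", "betascript publishing"}

# Loose ISBN-10/13 matcher; hyphens and spaces allowed between digits.
ISBN = re.compile(r"\b(?:97[89][-\s]?)?(?:\d[-\s]?){9}[\dXx]\b")

def find_isbns(wikitext):
    """Step 1: pull candidate ISBNs out of an edit, separators stripped."""
    return [re.sub(r"[-\s]", "", m.group(0)) for m in ISBN.finditer(wikitext)]

def is_vanity(publisher):
    """Step 3: compare a looked-up publisher name against the blacklist."""
    return publisher.strip().lower() in VANITY_PUBLISHERS
```

Step 2 would sit between these: resolve each ISBN to a publisher via an external lookup, then report or revert on a positive `is_vanity` hit.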
Bot to move files to Commons
Certainly files that a trusted user like Sfan00 IMG has reviewed and marked as suitable for moving to Commons can be moved without any further review by a bot? All these files are tagged with {{Copy to Commons|human=Sfan00 IMG}}
and appear in Category:Copy to Wikimedia Commons reviewed by Sfan00 IMG. There are over 11,000 files. I have personally no experience in dealing with files and so can't talk about the details, but I reckon something like CommonsHelper would be useful? I have asked Sfan about this but they have been inactive for 3 days.
If the process is likely to be error-free, I suppose that instead of marking the transferred files as {{NowCommons}} (which creates more work for admins in deleting the local copy), the bot could outright delete them under CSD F8. 103.6.159.65 (talk) 05:14, 16 February 2017 (UTC)
- Technical details: such a bot would need to operate on a user-who-tagged-the-file basis; I'd envision using a parameter with the tagging user's username, combined with some setup comparable to {{db-u1}} to detect if the last user to edit the page was someone other than the user whose name appears in the parameter. On eligibility for specific user participation, I'm hesitant with Sfan00 IMG, basically because it's a semiautomated script account, and I'd like to ensure that every such file be checked manually first; of course, if ShakespeareFan00 is checking all these images beforehand and then tagging the checked images with the script, that's perfectly fine. Since you asked my perspective as a dual-site admin: on the en:wp side, the idea sounds workable, and bot-deleting the files sounds fine as long as we're programming it properly. On the Commons side, I hesitate. We already have several bots that do the Commons side of things, and they tend to do a rather poor-quality job; they can accurately copy the license and description, but they often mess up with the date and sometimes have problems with copying template transclusion properly, and they're horrendous with categories (which are critical for Commons images) — basically the only options with such bots are leaving the images entirely uncategorised, or ending up with absolute junk, e.g. "Companies of [place]" on portraits because the subject was associated with a company from that place; you can see one bad example in this revision of File:Blason Imbert Bourdillon de la Platiere.svg. If we deem it a good idea, adding another Commons bot would be fine; the issue is whether having a bot do this at all is a good idea on the Commons side. Nyttend (talk) 05:55, 16 February 2017 (UTC)
- phew, there are 200k files in Copy to Wikimedia Commons (bot-assessed) which need further review. But what is surprising is that there are over 12,000 in Category:Copy to Wikimedia Commons maincat, which must all have been tagged by humans (because the bot-tagged ones are in the former cat). I wonder whether it would be a good idea to have a bot identify the tagger from the page history and add the |human= parameter. I also note that there are some files like File:Ambyun official.jpg that were tagged by Sfan without the human parameter. 103.6.159.65 (talk) 15:17, 16 February 2017 (UTC)
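The |human= backfill suggested above is mostly a template edit once the tagger has been identified from the page history. A sketch of that edit (the history scan itself is omitted; `tagger` is assumed to have been found already):

```python
import re

def add_human_param(wikitext, tagger):
    """Add |human=<tagger> to a {{Copy to Commons}} tag that lacks it.
    `tagger` would come from the revision that introduced the template."""
    if re.search(r"\{\{\s*Copy to Commons[^}]*\bhuman\s*=", wikitext):
        return wikitext  # parameter already present; leave the tag alone
    return re.sub(r"\{\{\s*Copy to Commons\s*(?=[|}])",
                  "{{Copy to Commons|human=" + tagger, wikitext, count=1)
```

Files like the File:Ambyun official.jpg example, tagged by hand without the parameter, would then land in the per-user reviewed categories.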
Linkfix: www.www. to www.
We have 87 links in the form www.www.foo.bar which should really be www.foo.bar - the form www.www. is normally a fault. A simple link checker with text replacement would help.--Oneiros (talk) 13:49, 16 February 2017 (UTC)
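The replacement itself is a one-line substitution; with only 87 links, even a supervised AWB-style pass would do. A sketch:

```python
import re

def fix_doubled_www(text):
    """Collapse the doubled prefix; single "www." links are left alone."""
    return re.sub(r"\bwww\.www\.", "www.", text)
```

As always with URL edits, each change should be checked against the live site first, since a handful of hosts genuinely resolve odd prefixes.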