
Wikipedia:Bots/Requests for approval



If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming, it may be a good idea to ask someone else to run a bot for you rather than running your own.

 Instructions for bot operators

Current requests for approval

OmarGhridaBot

Operator: Omar Ghrida (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 21:39, Sunday, July 15, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python (Pywikibot)

Source code available: no

Function overview: Page maintenance

Links to relevant discussions (where appropriate):

Edit period(s):

Estimated number of pages affected:

Namespace(s): all namespaces

Exclusion compliant (Yes/No): yes

Function details:

  1. Fixes double redirects (a sketch of this task follows the list)
  2. Tags broken redirects for speedy deletion
  3. Adds portals to articles
  4. Adds a missing references tag and references section if needed
  5. Adds a missing {{reflist}} template to articles if needed
  6. Adds the {{orphan}} template to pages that are not linked from any other pages.
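As an illustration of task 1, a minimal pywikibot sketch (the operator's source is unavailable, so this is an editor's sketch rather than the actual code; double_redirects() and set_redirect_target() are current pywikibot methods):

  import pywikibot

  site = pywikibot.Site('en', 'wikipedia')
  for page in site.double_redirects(total=10):      # entries from Special:DoubleRedirects
      target = page.getRedirectTarget()
      seen = {page.title(), target.title()}
      while target.isRedirectPage():                # follow the chain to its end
          target = target.getRedirectTarget()
          if target.title() in seen:                # redirect loop: skip this page
              break
          seen.add(target.title())
      else:
          page.set_redirect_target(target, summary='Bot: fixing double redirect')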

Discussion

Overview

Just a few thoughts/questions regarding the tasks proposed:

  • Xqbot and AvicBot already deal with double redirects. Is a third bot necessary?
  • What is a "broken" redirect? How is a non-admin bot going to delete it?
  • How will it determine which portals to place?

I'm most curious/concerned about the second one, to be honest. Primefac (talk) 23:11, 15 July 2018 (UTC)

@Primefac: Thank you for your comment. First, on double redirects: I don't know if there is a policy against multiple bots working on the same task, but I may agree with you on this. Second, on broken redirects: the bot would tag them as candidates for speedy deletion. Finally, portals would be added via the equivalent portals on "frwiki" and "itwiki", and via categories with a specific algorithm. --Omar Ghrida (talk) 12:26, 16 July 2018 (UTC)
When you say 'with a specific algorithm' - can you give details of the algorithm you'll be using? ƒirefly ( t · c · who? ) 20:31, 16 July 2018 (UTC)
Indeed, that was sort of what I was hoping to find out. Also, re: my question about broken redirects - are you saying that your bot would simply tag "bad" redirects for {{db-g6}} deletion? Primefac (talk) 13:27, 17 July 2018 (UTC)
@Firefly: For adding portals: if the article has no portal link, the bot will look at the equivalent article on frwiki and itwiki, then add the corresponding portal link. Alternatively, if an article and a category have identical names, it will check whether a matching portal exists, for example here. @Primefac: Yes, for all pages listed at Special:BrokenRedirects. Do you see a problem with this task? Thanks to all --Omar Ghrida (talk) 16:56, 17 July 2018 (UTC)
Um... that's not how a portal works. Putting {{portal bar|france}} at Nice links to Portal:France, not fr:Nice. Primefac (talk) 17:21, 17 July 2018 (UTC)

@Primefac: I mean the category method here: Category:Nice contains a portal template, and the geography template "coord" exists in the article. But if this approach is error-prone, maybe I will not use it, and will use only the equivalent-portals method. Thank you --Omar Ghrida (talk) 18:16, 17 July 2018 (UTC)
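A rough sketch of the interwiki lookup described above (an editor's illustration, not the operator's code; pywikibot and frwiki's {{Portail}} template are assumed, and the mapping from French portal names to English ones is omitted):

  import pywikibot

  def candidate_portals(enpage):
      """Return portal names used by the equivalent frwiki article, if any."""
      item = pywikibot.ItemPage.fromPage(enpage)       # the shared Wikidata item
      fr_title = item.getSitelink('frwiki')            # raises if no frwiki sitelink
      frpage = pywikibot.Page(pywikibot.Site('fr', 'wikipedia'), fr_title)
      for tpl, params in frpage.templatesWithParams():
          if tpl.title(with_ns=False) == 'Portail':    # frwiki's portal template
              return params                            # still needs mapping to enwiki portal names
      return []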

This collection of items needs to be evaluated separately; I suggest numbering them as Tasks 1 through 4. For #2 (deleting broken redirects): this is denied, as it is not possible for non-admins to run admin bots. Do you mean to have it "tag for deletion" as suggested by @Primefac: above? — xaosflux Talk 19:41, 18 July 2018 (UTC)

@Xaosflux: Yes, OK, I did. For #2 specifically, you can see the last 10 changes here on Test Wikipedia; I can also undo this task if an error happens. --Omar Ghrida (talk) 20:23, 18 July 2018 (UTC)
OK, I made sections below for each of the tasks you would like to perform. — xaosflux Talk 21:10, 18 July 2018 (UTC)

Task 1 (Fix double redirects)

@Xaosflux: Exactly; I'll just work from Special:DoubleRedirects. --Omar Ghrida (talk) 14:24, 20 July 2018 (UTC)

Task 2 (tag broken redirects)

Exactly; I'll just work from Special:BrokenRedirects. --Omar Ghrida (talk) 14:24, 20 July 2018 (UTC)

Task 3 (add portals to articles)

Please make 5 of these edits with your own account and post the diffs below to better demonstrate what you will be doing here. — xaosflux Talk 21:15, 18 July 2018 (UTC)

OK -- Edit 1 - Edit 2 - Edit 3 - Edit 4 - Edit 5. However, I have cancelled this task for now, because there was a previous objection in this area about adding portals to all articles. I will open a new discussion for this task in the near future. --Omar Ghrida (talk) 14:24, 20 July 2018 (UTC)

Task 4 (Adds missing tag and references section if needed)

What do you mean by "tag" here? — xaosflux Talk 21:15, 18 July 2018 (UTC)

I mean the references tag used in the references section. It has been replaced by {{reflist}}, so this task has been cancelled, but I will work on something similar (see task 5). --Omar Ghrida (talk) 14:24, 20 July 2018 (UTC)

Task 5 (Adds missing {{reflist}} template to articles if needed)

Ex: Here. I will add the {{reflist}} template to articles where needed. --Omar Ghrida (talk) 14:24, 20 July 2018 (UTC)

Task 6 (Add the {{orphan}} template to pages not linked from other pages)

Ex: Here. I want to work on this task by adding the {{orphan}} template to pages that are currently not linked from any other pages; see Wikipedia:WikiProject Orphanage. --Omar Ghrida (talk) 14:24, 20 July 2018 (UTC)

LioneltBot 2

Operator: Lionelt (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 05:03, Friday, June 29, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): AutoWikiBrowser

Source code available: AWB

Function overview: Diffuse Category:Conservatism in the United States

Links to relevant discussions (where appropriate): Talk:List_of_American_conservatives#US_Conservatism_Category,_List,_Nav_template, Category_talk:Conservatism_in_the_United_States#US Conservatism Category discussion

Edit period(s): Run two times

Estimated number of pages affected: 425

Namespace(s): Category


Exclusion compliant (Yes/No): If supported by AWB

Function details: The Category:Conservatism in the United States has grown to 479 pages. I propose to use AWB to diffuse the parent category into two child cats:

  1. Pages in Category:Living people will be moved from Category:Conservatism in the United States to Category:American conservative people
  2. Pages with the text "[[Organizations in" will be moved from Category:Conservatism in the United States to Category:Conservative organizations in the United States (see the regex sketch below)


Essentially, this request is for automatic processing using AWB, no server will be used. I have 1458 AWB edits.– Lionel(talk) 05:03, 29 June 2018 (UTC)
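An editor's sketch of the two matching rules above as Python regexes, roughly what an AWB find & replace could use (the sortkey handling is an assumption):

  import re

  OLD_CAT = r"\[\[Category:Conservatism in the United States(\|[^\]\n]*)?\]\]"

  def diffuse(text, is_living_person):
      """Swap the parent category for the appropriate child category."""
      child = ("American conservative people" if is_living_person
               else "Conservative organizations in the United States")
      return re.sub(OLD_CAT,
                    lambda m: "[[Category:%s%s]]" % (child, m.group(1) or ""),
                    text)

The is_living_person flag would come from checking membership of Category:Living people, per rule 1; pages matching neither rule would be left for manual handling, as discussed below.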

Note: my first bot request was an April Fool's joke. – Lionel(talk) 02:25, 30 June 2018 (UTC)

Discussion

At the very least, the bot's user page should be updated to reflect the serious task, rather than the joke. Headbomb {t · c · p · b} 03:10, 30 June 2018 (UTC)

Good idea. – Lionel(talk) 04:21, 30 June 2018 (UTC)

The linked discussion doesn't have much in the way of input (just one other participant), but it seems relatively uncontroversial. My main concern is your sorting mechanism - pages like FireHollywood and Judicial Watch are not in a category starting with "Organizations in..." so they would not be properly sorted, and if you have dead conservatives like Henry Hazlitt they would not be in Category:Living people. How do you plan on accommodating those exceptions? Primefac (talk) 11:49, 30 June 2018 (UTC)

Good catch.
If any are missed I could do them by hand. Of course, * is a wildcard; I'd probably use a regex for that. – Lionel(talk) 11:56, 1 July 2018 (UTC)
@Lionelt: Could we see the regex you plan to use? ƒirefly ( t · c · who? ) 17:48, 1 July 2018 (UTC)

Bot0612 10

Operator: Firefly (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 00:14, Wednesday, June 6, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: https://github.com/rwjuk/g13bot_tools_new

Function overview: Take over HasteurBot's G13 nudging and tagging role as said bot is now inactive.

Links to relevant discussions (where appropriate): https://en.wikipedia.org/w/index.php?title=Wikipedia_talk:WikiProject_Articles_for_creation&diff=prev&oldid=844237355

Edit period(s): As before, triggered every 4 hours (may reduce to 12 now that backlog is clearing)

Estimated number of pages affected: ~100 pages per day (going by HasteurBot's edits)

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: The functionality will be identical to that described in HasteurBot's BRFA, save the following:

  • I have swapped the hard-coded notice that the bot posts on a creator's talk page for a template to achieve separation of concerns. Suggestions for improvements welcome!
  • I may reduce the trigger period to 12 hours to prevent the needless consumption of Toolforge resources once the existing backlog has been cleared

Discussion

@Xaosflux: Not really, for two reasons. (1) The actions of the bot haven't been brought up in the discussion, despite its original operator participating. Of course, it's entirely possible that some participants aren't aware of its existence. I can mention this BRFA there as an 'FYI'. (2) At present there appears to be a rough consensus opposing the use of {{promising draft}} to prevent G13 deletion, which would have no impact on bot operations. If a consensus develops (here or at any point in the future) that the template should in fact prevent G13, it would be trivial to check a page for this before nudging or nominating.
I see that discussion as being more about what happens before G13 (i.e. at what point a draft becomes eligible), rather than what happens when G13 kicks in. Once a draft is G13 eligible any editor can apply the tag - the bot simply automates this process, and is strictly limited to filling the G13 category to 50 items every 4 hours. ƒirefly ( t · c · who? ) 18:44, 6 June 2018 (UTC)
  • WP:G13 should not be applied to a draft that has survived its most recent deletion discussion; see WP:CSD. Will the bot respect this CSD requirement? Thincat (talk) 20:21, 6 June 2018 (UTC)
    • @Thincat: If the existing code doesn't handle this, I'll add this functionality in - thanks. ƒirefly ( t · c · who? ) 20:40, 6 June 2018 (UTC)
I think that's worth a typed-in thank you. Thank you! Thincat (talk) 21:05, 6 June 2018 (UTC)
  • {{rto|Thincat}} G13 is applicable regardless of surviving a recent deletion discussion. Once a page is MFDed or AFDed we don't permanently immunize it against CSDs. This should not be added @Firefly: as it is not in any policy. Hasteur (talk) 23:39, 6 June 2018 (UTC)
  • Hasteur is correct. There's a longstanding consensus that drafts have a significantly lower bar for being kept than articles do. As a result, drafts are usually only deleted at MfD if they're promotional, attack pages/verging on BLP issues, totally not suitable for WP, or being tendentiously submitted without improvement and wasting peoples' time (in other words, if they are so bad as to necessitate deletion before G13 kicks in).
  • Most other draft MfDs, even if the draft is crap, are kept or closed as no-consensus. Very often, the fact that G13 will cause the draft to "expire" eventually is taken into account when people !vote at MfD. You'll see keep comments that say something like, "keep and allow author to continue working, no reason to delete before G13 kicks in." That doesn't mean those drafts are necessarily good topics or promising drafts. It means that they're not bad enough to require deletion now instead of G13 later. And of course the MfD resets the G13 clock, so any draft that isn't deleted at MfD gets another 6 months to be worked on anyway.
  • If you suddenly begin to exclude those drafts from bot-performed G13 tagging, you remove that nuance from MfD and force discussions there to be more black and white. I think an undesired result would be that more drafts will wind up deleted at MfD outright, rather than closed as keep/no consensus pending improvement with their G13 timer reset, which of course is counter-intuitive if your intention is to preserve more draft content. ♠PMC(talk) 00:55, 7 June 2018 (UTC)
To the extent that some pages are kept at MfD because of the future availability of G13, these are the sorts of pages that could be tagged for G13 by hand. (There shouldn't be that many of them, in any event.) Calliopejen1 (talk) 16:36, 7 June 2018 (UTC)
With all due respect, Calliopejen1, until G13 changes your opinion doesn't matter. If you think a page should be kept, make a null edit every five months. There are already a half-dozen discussions about changing G13, and this BRFA is not (and should not be) one of those. The bot should run with the restrictions given by the current guidelines, amended if it becomes necessary down the line. Primefac (talk) 16:39, 7 June 2018 (UTC)
The state of policy as to G13 is not entirely settled. When G13 was extended to non-AfC drafts, a simultaneous discussion concluded that pages tagged with the promising-draft template "should go to WP:MFD instead of being deleted G13". Now that appears to be changing, as I freely admit. However, the community has not yet considered whether G13 should apply to a draft that has previously survived an MfD. This would be the next logical question given the change in policy as to the promising draft template. I don't think it is obvious that a page should be WP:CSD eligible where a community discussion (presumably regarding the same basis as the G13 tag--i.e. should we just let it sit around?) has concluded that it should not be deleted. I would like to raise this issue at WT:CSD, but I would like to wait until the current promising-draft discussion is settled to do so. In the meantime, it seems reasonable not to have a bot operating on the cutting edge of Wikipedia policy.
──────────────────────────────────────────────────────────────────────────────────────────────────── I'm entirely happy to put this on hold until the discussion around {{promising draft}} is closed. ƒirefly ( t · c · who? ) 16:59, 7 June 2018 (UTC)
@Firefly: I was hoping to have a separate discussion about MFD-keeps and G13 after the promising-draft discussion is closed. The discussion is already confusing enough, and I don't want to put forward a harder-to-understand conditional proposal (if the discussion above concludes in X way, should we Y?). I believe that the issue will be easier to consider for all involved once promising-draft is resolved. Could we wait for the conclusion of my discussion to come (as well) before the bot applies G13 to previously kept drafts? If I lose, of course, I have no objection to having the bot implement whatever policy consensus there is. Calliopejen1 (talk) 17:06, 7 June 2018 (UTC)
@Calliopejen1: I understand - could you link me to that discussion? I can't recall seeing it. I'm happy to postpone until that discussion runs its course. Just to confirm, you're happy for the bot to run, but to leave previously-kept-at-MfD drafts alone until the discussion concludes - correct? ƒirefly ( t · c · who? ) 17:52, 7 June 2018 (UTC)
@Firefly: No, I haven't started the discussion. But it's an issue that I believe is necessarily raised by the forthcoming policy change. I don't want to open the discussion yet because it will just be one more confusing thing when promising-draft isn't officially settled (even though it de facto is, of course). For now, I think the bot should steer clear of promising-draft drafts and previously-kept-at-MFD drafts. Once the promising-draft discussion officially closes, the bot should expand to cover promising-draft drafts and I'll open the second discussion. Once the second discussion concludes, the bot can expand as appropriate. I'm not sure I'd say I'm "happy" for the bot to run, but I won't stand in the way of consensus... Calliopejen1 (talk) 19:12, 7 June 2018 (UTC)
(Note that the discussion below branched from a separate discussion above. Calliopejen1 (talk) 06:36, 8 June 2018 (UTC))
(ec) @Calliopejen1: It will be tagged for deletion after six months of no edits to the page. If a draft is kept at MfD and then nobody does anything whatsoever to it (not even a dummy edit to reset the G13 timer) for half a year then it's likely that people aren't as interested in it as the MfD participants thought. ƒirefly ( t · c · who? ) 16:42, 7 June 2018 (UTC)
I don't think the people voting at MfD would change their minds based on six months of inactivity. (The view seems to be that the content is valuable, whether it is developed now or at some uncertain date in the future.) Calliopejen1 (talk) 16:53, 7 June 2018 (UTC)
Incorrect, often the view is, "this isn't controversial enough to require deletion right now, leave it alone to be either worked on or abandoned to G13." With all due respect, you haven't exactly been an MfD regular until fairly recently, so perhaps you're not the best judge of what the usual attitude there is. ♠PMC(talk) 21:15, 7 June 2018 (UTC)
I was only referring to the particular discussion about the history of Thailand, which is no longer clear because of intervening edits that divided the discussion. I don't know one way or another what the usual attitude is (and didn't mean to make any general statements), but it seems to me that the possibility of discussions like the one I cited mean that human tagging (or requiring a second deletion discussion) is a better outcome. Calliopejen1 (talk) 06:32, 8 June 2018 (UTC)

Given that G13 is now applicable to all Drafts and rejected/unsubmitted AfC submissions, are there any objections to rewriting the bot to use a database query rather than the AfC categories? It would still work from the oldest articles forward, and all other functionality would remain identical. I think that doing so would increase maintainability and ensure that all eligible drafts get picked up. ƒirefly ( t · c · who? ) 21:52, 6 June 2018 (UTC)
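A sketch of what such a replica query could look like (an editor's illustration, not the bot's code; column names follow the public MediaWiki schema, 118 is the Draft: namespace, and the LIMIT mirrors the bot's 50-item cap per run):

  import toolforge                      # Toolforge's database connection helper

  conn = toolforge.connect('enwiki')    # enwiki analytics replica
  with conn.cursor() as cur:
      cur.execute("""
          SELECT page_title
          FROM page JOIN revision ON rev_id = page_latest
          WHERE page_namespace = 118                -- Draft:
            AND page_is_redirect = 0
            AND rev_timestamp < DATE_FORMAT(NOW() - INTERVAL 6 MONTH, '%Y%m%d%H%i%s')
          ORDER BY rev_timestamp
          LIMIT 50
      """)
      stale_drafts = cur.fetchall()     # oldest untouched drafts first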

  • G13 applies regardless of a previous AfD or MfD of the same page. At AfD and MfD, this is overt.
G13 does not apply to {{promising draft}} tagged articles, per the argument that the speedy deletion is pre-contested. This applies to an extremely small fraction of drafts (all are listed at Category:Promising draft articles), and the situation is a mess. It is heavily contested at Wikipedia_talk:Criteria_for_speedy_deletion#Request_for_comment:_Promising_drafts; the controversy means that speedy deletion should not apply, because speedy deletion is for uncontestable cases. But worse, the mechanism is a mess. The {{promising draft}} template is an ad hoc creation, created without sufficient communication with the bot owner. How is the bot supposed to recognise the template or category? What if the template is modified, or the category modified? I recommend that the bot owner take responsibility for the template and the category. Or, subject to the direction of consensus at Wikipedia_talk:Criteria_for_speedy_deletion#Request_for_comment:_Promising_drafts, perhaps the "promising drafts" should be moved out of draftspace entirely, perhaps to my proposed WP:WikiProject Promising Drafts. I like my suggestion, because there is a beauty of simplicity in every page in draftspace, redirects excepted, being on a six-month inactivity limit. --SmokeyJoe (talk) 00:03, 7 June 2018 (UTC)
  • "Premature" is the wrong word. The fact of the statement is true right now based upon the RfC having been initiated. The RfC is underway, the application of G13 to {{promising draft}} tagged articles is currently controversial. I suggest that the bot should be restricted from tagging those articles until there is clarity. This sounds like a major headache for the bot owner, and I think this is what is the practice reason for the upset of User:Hasteur. I suggest that the bot operator, User:Firefly, take responsibility and even ownership of the troublesome tag. For the period until the close of the RfC, alternative practical solutions include: Move the promising drafts out of draftspace; or manually check the category intersection and remove tagged pages; delay re-implementation of the bot until the RfC is closed. I worry that the RfC is not on a path to a simple answer. --SmokeyJoe (talk) 06:23, 7 June 2018 (UTC)
  • No, it isn't. {{Promising draft}} isn't recognized by policy, hence the reason for this discussion! But I do agree with the suggestion that pages with transclusions of {{Promising draft}} ought to be skipped by the bot, so as to insulate OP from needless abuse. -FASTILY 07:00, 7 June 2018 (UTC)
  • I could argue CSD nuance, but we seem to agree on the main point. Get this going again, while helping the operator avoid wandering into an abusive room. --SmokeyJoe (talk) 07:11, 7 June 2018 (UTC)
@SmokeyJoe: @Fastily: Agreed, until a clear consensus develops on {{promising draft}}, pages tagged with it will be skipped. ƒirefly ( t · c · who? ) 07:26, 7 June 2018 (UTC)
@Firefly:, User:Primefac has close the RfC Wikipedia_talk:Criteria_for_speedy_deletion#Request_for_comment:_Promising_drafts. The bot shouldn’t have to worry about this template. —SmokeyJoe (talk) 10:07, 1 July 2018 (UTC)
  • Question: Why is the bot not exclusion compliant? Is this important? --SmokeyJoe (talk) 07:17, 7 June 2018 (UTC)
  • Question: Is this bot written to run on all draftspace pages? Or does it include AfC-tagged userpages? I suggest dropping AfC-tagged userpages from the scope, as they are supposed to have all been moved to draftspace. The inclusion of all AfC userspace pages was to deal with the initially existing thousands of such pages that predated draftspace, and this is no longer an issue. If AfC-tagged pages in userspace are a problem, that calls for a solution, but not a G13-bot solution. I note that AfC tagging in draftspace is now irrelevant, unlike when HasteurBot was written. --SmokeyJoe (talk) 07:17, 7 June 2018 (UTC)
    Technically speaking G13 covers userspace drafts that have some variant of the {{AFC submission}} template on it. There are a ton of drafts that never get moved to the draft space because they're blank or otherwise not worth the effort of moving them (i.e. they're the quickest of quick-fails). Primefac (talk) 12:38, 7 June 2018 (UTC)
  • Noted - for now the bot will cover Draft: space, but I will look into those and file another BRFA to expand the functionality once this one has shown to be acceptable technically and societally. ƒirefly ( t · c · who? ) 13:20, 7 June 2018 (UTC)
  • Question: What's the story behind the bot's catchy name? --SmokeyJoe (talk) 07:17, 7 June 2018 (UTC)
The bot will cover Draft: namespace pages only. The bot's name comes from my old username (see here). If the community feel that this task is better run under a specific bot account (e.g. User:G13Bot), then I can create such an account. The bot wasn't set as exclusion compliant because it is extremely unlikely that a user will apply {{nobots}} to their draft to prevent the bot from tagging it, and I'm not sure that's a desirable outcome anyway. That said, the bot should and will respect exclusions for talk pages when notifying. I'll add this in. ƒirefly ( t · c · who? ) 07:26, 7 June 2018 (UTC)

Arbitrary section split for an update

I have decided following some wise counsel to split the 'notify creators of impending G13 doom' task out into another BRFA. This way the (arguably more important) notification task can begin now, even as the nominating task is postponed awaiting policy clarity. ƒirefly ( t · c · who? ) 18:17, 8 June 2018 (UTC)

A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag. Now that the RfC around {{promising draft}} has been closed, with the outcome being that the template cannot be used to permanently immunise drafts against G13 (adding the template, like any other edit, will of course reset the six month timer), it'd be good to get this request moving again. The bot will not take {{promising draft}} into account, as per the consensus developed at the RfC. ƒirefly ( t · c · who? ) 17:45, 1 July 2018 (UTC)
  • On a completely unrelated note (and I don't know if this is the place for "feature requests"), I was wondering whether the bot could adopt a system of two categories. Drafts tagged for G13 deletion would continue to be placed in Category:Candidates for speedy deletion as abandoned drafts or AfC submissions, unless they have also been tagged as promising, in which case there would be a separate (sub)category. This should help alert deleting admins to the presence of the template, so that uncontroversial batch deletions could continue to be performed from the first category, while the drafts that require the exercise of some judgement (after the RfC, that's precisely what {{promising draft}} does) would be clearly separated. – Uanfala (talk) 18:12, 1 July 2018 (UTC)
    They are already placed in Category:Promising draft articles, so a petscan or dbase scan would easily allow a user to find ones that are in both. Additionally, the bot should not be adding categories to a page, so {{db-g13}} itself would have to be modified. Primefac (talk) 18:57, 1 July 2018 (UTC)
    I see. But my point was to separate the two for the benefit of deleting admins, and this can't be achieved with the current setup as it is. – Uanfala (talk) 19:01, 1 July 2018 (UTC)
    (ec) Agree with Primefac - we'd need to modify {{db-g13}} for this if needed. I was just thinking of ways you could do this, but (a) you'd need consensus, and (b) that's more a discussion for WT:CSD. ƒirefly ( t · c · who? ) 19:03, 1 July 2018 (UTC)
    (edit conflict) Right, and I see nothing wrong with wanting to do that. However, the G13 template cannot (at this time) distinguish between a page with {{promising draft}} and one without, which means either a) modifying {{db-g13}}, b) creating a new version of said template, c) having the bot add a category directly (which would then need to be removed manually or via AFCH). The first two aren't a matter for this BRFA, and the third would require AFCH to be updated first (and given how many other things are in the pipeline, not likely to happen soon). Primefac (talk) 19:07, 1 July 2018 (UTC) -- edit: I do suppose the promising draft template could be updated to include a time switch like {{AFC submission}} does, which would place it in the appropriate category if/when the six-month window passes. But again, that's something to change in the template and not really a bot task. Primefac (talk) 19:14, 1 July 2018 (UTC)

DeprecatedFixerBot 5

Operator: TheSandDoctor (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 05:15, Wednesday, May 16, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: https://github.com/TheSandDoctor/Click-deprecated-param-fixer/tree/master

Function overview: Goes through {{click}}'s transclusions, converting {{click}} to the File:Title.extension format, using click's |image= and |link= parameters (respectively) to populate the fields.

Links to relevant discussions (where appropriate): Category:Pages containing click using deprecated parameters, Template:Click

Edit period(s): Routine runs until category is cleared, possible maintenance runs in future

Estimated number of pages affected: 12,300 (approx)

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: If the click (or click-fixed) template is found on the page, the bot would fetch the values stored in the |image= and |link= parameters and store them internally. It would then replace that instance of the click/click-fixed template with the newer file format. If they were not found (i.e. the template was in a transcluded template), the bot would simply move on. I anticipate that some pages may be left behind in the category that will need to be dealt with by hand, but a bot could help reduce that number and make those cases more obvious (so they could be identified and addressed). Some pages may also be added to Category:Pages using deprecated image syntax if the click template happens to be within an infobox. I do not anticipate that the category would be flooded with new pages, however.
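For reference, a simplified sketch of the conversion using mwparserfromhell (the bot's real source is linked above; size and caption parameters are ignored here, and the exact output format is an assumption):

  import mwparserfromhell

  def convert_click(wikitext):
      """Replace {{click}}/{{click-fixed}} with plain [[File:...]] syntax."""
      code = mwparserfromhell.parse(wikitext)
      for tpl in code.filter_templates():
          if tpl.name.matches('click') or tpl.name.matches('click-fixed'):
              image = tpl.get('image').value.strip()
              link = tpl.get('link').value.strip() if tpl.has('link') else ''
              code.replace(tpl, '[[File:%s|link=%s]]' % (image, link))
      return str(code)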

Discussion

Given that the {{click}} template itself is deprecated, could/should it be removed altogether where found? Richard0612 07:57, 16 May 2018 (UTC)
Good point Richard0612, task updated. --TheSandDoctor Talk 13:51, 16 May 2018 (UTC)
I note that Category:Pages containing click using deprecated parameters seems to have pulled in some pages that do not have the {{click}} template. I assume it will leave those pages alone? I know because one of my subpages is flagged - User:Ronhjones/Gallery; it might have the word "click" inside a template, but it has never used the click template. Ronhjones  (Talk) 17:49, 16 May 2018 (UTC)
I have seen the same thing Ronhjones. In short: your assumption is correct. In detail: the bot would no longer go through that category,[a] but rather through the transclusions, looking for the click template and converting it. If the click template (or {{click-fixed}}) is not present, then the page in question would not be edited and the bot would simply move on to the next one. What it does is pull the wikicode of a page, filter out the templates in the page, and then loop through them looking for a match to (a) given template name(s) (hardcoded). If one is not found, then the content_changed flag is never true and, as such, the page is not resaved and the bot just moves on to the next one in its list.
Hopefully that helps to alleviate any concerns. If you have any questions, please feel free to let me know. --TheSandDoctor Talk 01:08, 17 May 2018 (UTC)
  1. ^ I should note that, while a good chunk of the transclusions are not within the category you linked, I might still run the bot through there first, doing the conversions, for the sake of simplicity. Ultimately, though, it would be running through the transclusions.
@TheSandDoctor: can you provide a couple of diffs (make them with your own account) below to demonstrate exactly the edit content that will be made? — xaosflux Talk 14:00, 21 May 2018 (UTC)
@Xaosflux: I will have to modify the bot's code a bit as this has taught me a couple things (namely that size needs to be preserved if specified). With that said, the diffs are what the bot would be expected to do once altered. Diff 1, Diff 2. I will happily do more (and might add to this) in the future, but at the moment I do not have time. There appears to be a transcluded template (by the looks) that is resulting in a lot of these pages showing up that do not have {{click}} or {{click-fixed}} anywhere on them, so I will probably do some data analysis/detective work later and see if I can figure out where these are coming from (and hopefully resolve). Will keep this BRFA posted. --TheSandDoctor Talk 03:49, 25 May 2018 (UTC)

Bots in a trial period

Pi's Bot

Operator: Pi (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 20:15, Sunday, May 13, 2018 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): VBA

Source code available: No

Function overview: To add WikiProject tags and set taskforce parameters for the Reality TV Taskforce

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Wikiproject_task_force_tagging_(Reality_TV). This is a replacement of the previous request Wikipedia:Bots/Requests_for_approval/The_Little_Pixie's_Friend

Edit period(s): one time run

Estimated number of pages affected: 8,493 (approx, the data will be re-validated due to the time elapsed since the last request)

Namespace(s): Talkspace

Exclusion compliant (Yes/No): Yes

Function details: A request was made on Wikipedia:Bot_requests for articles in certain categories to be tagged with the WikiProject Television template (if not already present) and to have the Reality TV Taskforce flag set to true. I have written my first bot in order to complete this task.

A list of the articles in the categories that I have compiled is here

I will limit edits to one every 10 seconds, or another limit that is recommended.
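An editor's sketch of the tagging loop with the 10-second throttle (pywikibot assumed; the |reality-tv=yes parameter name and the banner-update branch are assumptions based on the request):

  import pywikibot
  from pywikibot import config

  config.put_throttle = 10                      # at most one save every 10 seconds

  site = pywikibot.Site('en', 'wikipedia')
  with open('realitytv_articles.txt') as f:     # the compiled article list
      for line in f:
          talk = pywikibot.Page(site, line.strip()).toggleTalkPage()
          text = talk.text if talk.exists() else ''
          if text.lstrip().lower().startswith('#redirect'):
              continue                          # never tag talk-page redirects
          if '{{WikiProject Television' in text:
              continue                          # banner present: set the task force flag instead
          talk.text = '{{WikiProject Television|reality-tv=yes}}\n' + text
          talk.save(summary='Tagging for the Reality TV task force')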

Discussion

Approved for trial (60 edits or 15 days).xaosflux Talk 12:31, 19 May 2018 (UTC)
Please trial and report back results here. — xaosflux Talk 12:31, 19 May 2018 (UTC)
Hi, sorry, I only just saw this was approved; I'll be on it tomorrow Pi (Talk to me!) 22:58, 3 June 2018 (UTC)
Trial complete. I have completed the initial trial. The report on the edited pages is here: User:Pi's Bot/RealityTV Pi (Talk to me!) 21:54, 5 June 2018 (UTC)
@Pi: For Bill Murray & Brian Doyle-Murray's Extra Innings, the bot didn't edit its talk page; it edited Bill Murray's talk page instead[1]. I assume something went wrong with the "&" in a boolean operation or something. WikiVirusC(talk) 22:18, 5 June 2018 (UTC)
Also, a few edits added the tag but also added some error code from the script to the talk page. Talk:Zee Super Talents, Talk:Zee Dance League, and Talk:Cheer Squad are three that I found. It seems to happen on articles where a new page was created. Along with the error message on the talk page, on the report page it posted the diff from the previous article. Talk:The Voice – Magyarország hangja had a similar error in the table, where it used the diff from the article two rows above it (the one in between was SATISFIED), but no edit was made at all on Talk:The Voice – Magyarország hangja; instead a newly created Talk:The Voice – Magyarország hangja was made. An error with the "–", it seems. I'm sure you are looking through the report as well; I'm just listing the things I found. WikiVirusC(talk) 22:34, 5 June 2018 (UTC)
Your bot tagged Talk:The Voice (TV series), which was a redirect. This broke the redirect (I noticed this when the page landed in Category:Unsynchronized talk page redirects). Check for pages beginning #REDIRECT, and don't put project templates on them. – wbm1058 (talk) 14:52, 6 June 2018 (UTC)
Hi, I'm going through the list of edits and correcting anything that needs correcting. I have a few things to fix. There was a bug in how it deals with non-existent pages, and then I need to deal with redirects. Pi (Talk to me!) 21:07, 6 June 2018 (UTC)
The list of articles to run bot on was generated back in March, several pages probably have been moved/deleted since then, hence the issue with The Voice. WikiVirusC(talk) 10:23, 7 June 2018 (UTC)
OK, I have made changes to the code to solve the issues from the first test. I have also manually checked all of the edits. I would now like to do another test run. Pi (Talk to me!) 21:28, 8 June 2018 (UTC)
Approved for extended trial (50 edits or 20 days). @Pi: OK. — xaosflux Talk 18:58, 18 July 2018 (UTC)

Texvc2LaTeXBot

Operator:

Time filed: 19:45, Monday, June 18, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python (pywikibot)

Source code available: Yes

Function overview: The bot will exclusively edit mathematical and chemical formulas to manage upgrading of the LaTeX math engine and the eventual removal of the texvc backend.

Links to relevant discussions (where appropriate): phab:T195861

Edit period(s): one time runs

Estimated number of pages affected: on enwiki max. 1524 (User:Salix_alba/maths2018); initially 204 (User:Texvc2LaTeXBot/enwiki)

Namespace(s): all namespaces

Exclusion compliant (Yes/No): Yes

Function details:

  • As a first step, the bot will perform the replacements listed in the table mw:Extension:Math/Roadmap#Step_1_Part_A:_Remove_problematic_texvc_redefinitions on the 204 pages listed in User:Texvc2LaTeXBot/enwiki.
  • The first 204 pages will be checked for correct operation, then extended to a further 1320 pages which have maths syntax which needs updating.
  • After editing those 204 pages, we will apply for bot flags on the remaining 553 projects that have some mathematical equations, perform the same replacements and incorporate their ideas and concerns.
  • Subsequent steps will only be performed if a consensus is reached. The update process involves either breaking rendering of version histories or replacing all math and chem tags (around 65000 pages on the English Wikipedia).
  • If you have questions, suggestions or concerns regarding the update process, please post them on mw:Extension:Math/Roadmap or join our commission at phab:T195861.

@Physikerwelt and Salix alba: Feel free to improve/modify.--Debenben (talk) 19:45, 18 June 2018 (UTC)
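An editor's sketch of the replacement step described above; only the \and → \land and \or → \lor pairs seen in this BRFA's test edits are shown, and the full table lives on the roadmap page:

  import re

  # Replacement pairs from the roadmap table (abbreviated to two entries).
  REPLACEMENTS = [(r'\\and\b', r'\\land'), (r'\\or\b', r'\\lor')]

  def fix_formula(wikitext):
      def fix(match):
          body = match.group(1)
          for old, new in REPLACEMENTS:
              body = re.sub(old, new, body)
          return '<math>%s</math>' % body
      # Simplified: real <math> tags may carry attributes such as display=...
      return re.sub(r'<math>(.*?)</math>', fix, wikitext, flags=re.DOTALL)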

Discussion

Could you make a sample edit to see what exactly would be involved here? Headbomb {t · c · p · b} 20:44, 21 June 2018 (UTC)

  •  Note: This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT 21:11, 21 June 2018 (UTC)
I ran the bot on the first three pages in the list. They happened to be only \and and \or replacements, but the others are conceptually the same.--Debenben (talk) 21:17, 21 June 2018 (UTC)
@Debenben: diffs? Headbomb {t · c · p · b} 23:25, 21 June 2018 (UTC)
It was run in error on some main-space articles; here are three diffs.
I've copied some articles to my userspace to test.
--Salix alba (talk): 05:43, 22 June 2018 (UTC)
@Headbomb: I assumed the sample edit should be performed on a regular page and chose three pages, hoping it would also cover some of the replacements Salix alba did in the userspace. I am sorry about the misunderstanding in case you did not want any main space edits yet.--Debenben (talk) 10:39, 22 June 2018 (UTC)

@Debenben: Before proceeding with full trial, there should be (if there isn't one already) a noticed posted at WP:FORMULA, as well as WP:PHYS, WP:CHEM and WP:WPMATH since those are the projects most affected. Also, see WP:BOTMULTIOP, as I understand multiple people will be operating this bot. Headbomb {t · c · p · b} 12:32, 22 June 2018 (UTC)

@Salix alba: Do you want to take care of all English speaking projects? I could do the German and French ones and we could write a custom userpage "unless otherwise identified edits on English speaking projects are done by Salix alba".--Debenben (talk) 13:08, 22 June 2018 (UTC)
@Debenben: Yes, I'm quite happy to be sole operator of the bot on en-wikipedia (and other projects). I've got my head around how the bot runs now.
@Headbomb: Yes, the work of the bot and the associated migration project should be publicised in the places you mention. I'll get to it. --Salix alba (talk): 15:11, 22 June 2018 (UTC)
@Headbomb: I posted notices at the places mentioned above on Friday.--Salix alba (talk): 16:33, 25 June 2018 (UTC)

{{BotOnHold}} There's a security problem with the bot, I've blocked it until it's resolved. Debenben, you've been added to a private bug report. Max Semenik (talk) 05:00, 26 June 2018 (UTC)

Can you add me to the bug report. I think I'm now responsible for the bots use of the English wikipedia.--Salix alba (talk): 05:55, 26 June 2018 (UTC)

The issue is resolved and the bot is unblocked, we can continue. Max Semenik (talk) 21:07, 27 June 2018 (UTC)

Alright, well we can move on to trial once the operator has read WP:BOTPOL and specifically WP:BOTACCOUNT/WP:BOTREQUIRE. In particular, {{Bot}} should be added to the bot's user page. Headbomb {t · c · p · b} 20:12, 28 June 2018 (UTC)
Cool. I'll read up on the relevant docs. --Salix alba (talk): 21:02, 28 June 2018 (UTC)
I've now added {{Bot}} and done a couple of test edits. What's a good number of edits for the trial phase? --Salix alba (talk): 17:06, 30 June 2018 (UTC)
Approved for trial (10 edits for each fix). Link to this BRFA in the edit summary during the trial. Headbomb {t · c · p · b} 18:14, 30 June 2018 (UTC)

Bots that have completed the trial period

TokenzeroBot 5

Operator: Tokenzero (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:25, Saturday, July 21, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): python, pywikibot

Source code available: GitHub

Function overview: Handle OMICS predatory journals by creating redirects and hatnotes.

Links to relevant discussions (where appropriate): Wikipedia:Bot requests#Redirects of OMICS journals

Edit period(s): one time run

Estimated number of pages affected: 2001 created redirects and their talk pages, 24 added hatnotes.

Namespace(s): Mainspace (mostly redirects), Talk (just to tag talk pages of created redirects)

Exclusion compliant (Yes/No): Yes

Function details:

OMICS Publishing Group is a predatory open-access publisher which often names its journals deceptively. This bot shall create redirects or hatnotes to point from these names, using a fixed list of names at User:Headbomb/OMICS taken from the publisher's website.

More precisely, for each title Foobar on the list:

  • If Foobar exists, consider Foobar (journal) instead (unless the title already contained journal or already was a redirect, in which case skip it)
  • Consider also variants obtained by replacing "and" with "&" (and vice versa, if the title doesn't contain Latin "Acta")
  • Consider also variants obtained by taking the ISO 4 abbreviation (dotted and undotted, computed using the automatic tool, using multilanguage rules iff the title contains "Acta").
  • If any of the considered variants already exists, skip it, just to be safe.
  • Otherwise, create a redirect from each variant:
#REDIRECT[[OMICS Publishing Group]]
[[Category:OMICS Publishing Group academic journals]]

and create a talk page for that redirect, containing {{WPJournals}}. Then, for each title of the form Foobar: Open Access / Foobar-Open Access / Foobar: An Indian Journal / Foobar: Current Research, add the following hatnote to the existing page Foobar:

{{Confused|text=[[Foobar: Open Access]], published by the [[OMICS Publishing Group]]}}

Here's a log of all edits made by a simulated run: pastebin log. The 9 skipped titles, logged as 'Skip', should be handled by hand. Titles logged as 'Done' are existing redirects to OMICS Publishing Group, Allied Academies or Pulsus Group.
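A condensed editor's sketch of the redirect-creation step (the full source is on GitHub, linked above; pywikibot assumed, and the page/talk text follows the function details verbatim):

  import pywikibot

  site = pywikibot.Site('en', 'wikipedia')

  def create_omics_redirect(title):
      page = pywikibot.Page(site, title)
      if page.exists():
          return                                    # skip existing titles, just to be safe
      page.text = ('#REDIRECT[[OMICS Publishing Group]]\n'
                   '[[Category:OMICS Publishing Group academic journals]]')
      page.save(summary='Redirecting deceptively named OMICS journal to its publisher')
      talk = page.toggleTalkPage()
      talk.text = '{{WPJournals}}'
      talk.save(summary='Tagging talk page of OMICS journal redirect')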

Discussion

@Tokenzero: Approved for trial (25 edits). 10 "plain" redirects, 10 "(journal)" redirects, 5 hatnotes. Headbomb {t · c · p · b} 20:46, 21 July 2018 (UTC)

Note, I've updated User:Headbomb/OMICS to catch a few typos and capitalization mistakes. Headbomb {t · c · p · b} 20:57, 21 July 2018 (UTC)
Trial complete. See contribs and the new full simulated run log. The hatnotes have redlinks now, their redirects will be created in a full run. There are no more '(journal)' redirects to make, all those remaining cases are existing journals (which will be added hatnotes). Tokenzero (talk) 11:16, 22 July 2018 (UTC)
Oh, almost forgot: there was one bug where the bot would also add hatnotes to some redirects. This is now fixed. Tokenzero (talk) 11:26, 22 July 2018 (UTC)

CitationCleanerBot 3

Operator: Headbomb (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 15:38, Wednesday, July 18, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): WP:AWB

Source code available: In {{cite xxx}}, find \|(\s*)(asin)(\s*)=(\s*)([^\|\}]*) replace with nothing when the citation contains \|(\s*)(isbn)(\s*)=(\s*)([^\s\|\}]+)

Function overview: remove ASIN values when ISBN identifiers are present (per Help:Citation Style 1#Identifiers), plus WP:GENFIXES. E.g. [4].

Links to relevant discussions (where appropriate): Help talk:Citation Style 1#Clearing out ASIN when ISBN is present by bot

Edit period(s): One big run to start, plus occasional runs as amazon links/asins creep up again.

Estimated number of pages affected: ~2000

Namespace(s): Mainspace, Draft, occasionally Wikipedia (manually)

Exclusion compliant (Yes/No): Yes

Function details: remove ASIN values when ISBN identifiers are present. E.g. [5].
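The same two patterns from the source-code line above, demonstrated with Python's re module (the live task runs inside AWB; this is just an illustration):

  import re

  ASIN = re.compile(r'\|(\s*)(asin)(\s*)=(\s*)([^\|\}]*)', re.IGNORECASE)
  ISBN = re.compile(r'\|(\s*)(isbn)(\s*)=(\s*)([^\s\|\}]+)', re.IGNORECASE)

  def strip_asin(citation):
      """Remove |asin= only when a non-empty |isbn= is present."""
      return ASIN.sub('', citation) if ISBN.search(citation) else citation

  print(strip_asin('{{cite book |title=Example |asin=B00EXAMPLE |isbn=978-3-16-148410-0}}'))
  # -> {{cite book |title=Example |isbn=978-3-16-148410-0}}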

Discussion

  • This looks like a straightforward task to me. It will be helpful if the edit summary points to this BRFA. Will the search be case-insensitive so that it correctly catches both |ASIN= and |asin= (and the same for ISBN)? – Jonesey95 (talk) 04:08, 19 July 2018 (UTC)
It's case insensitive, yes. Headbomb {t · c · p · b} 04:40, 19 July 2018 (UTC)
Approved for trial (40 edits). Please report back here with diff range when trial has run. — xaosflux Talk 01:55, 21 July 2018 (UTC)
Trial complete. @Xaosflux: done. See 40 most recent edits. No errors to report, although I was operating semi-automatically with slightly more aggressive rules than I'd use automatically (e.g. it looked inside {{reflist}} / {{quote box}}, etc., which it wouldn't do during an automated run; this is currently unsafe because of T159958). However, I didn't have to modify anything the bot would not have touched during an automated run. Headbomb {t · c · p · b} 02:40, 21 July 2018 (UTC)
This one is technically cosmetic, but unavoidable due to the way AWB is coded. Headbomb {t · c · p · b} 02:48, 21 July 2018 (UTC)

RonBot 7

Operator: Ronhjones (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:50, Thursday, July 19, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: User:RonBot/7/Source1

Function overview: Adds Category:Association footballers not categorized by position to footballers' articles where their position has not been categorised.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Association_footballers_not_categorized_by_position

Edit period(s): Monthly

Estimated number of pages affected: Initial run 47,000, subsequent runs will be very small.

Namespace(s): Mainspace/Articles

Exclusion compliant (Yes/No): Yes

Function details:

  1. Constructs a list(1) of players already categorised with a position.
  2. To that list(1), adds a list of pages that do not need any attention (Category:Association football player non-biographical articles).
  3. Checks the contents of the existing Category:Association footballers not categorized by position for correctness. If pages should be there, they are added to list(1) (so we don't process them again); if they have since been categorised, the category is removed.
  4. Gets a list(2) of all football players. If they are in list(1), ignore them; otherwise add them to a list(3) for action later.
  5. When all are checked, uses list(3) to add the category.

The adding and removing subroutines were tested on userspace pages - Special:Contributions/RonBot at 18:15, 16 July 2018 (a sketch of this step follows below). A dummy run, saving the list of players to a file and pasting it, gave User:Ronhjones/Sandbox5 as the potential list of pages to be changed on the first run.
NB: The list in Item 2 is currently at Category:Association football player support pages; we are moving the category to a more meaningful name. Cat rename done
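An editor's sketch of the add/remove step (the full source is at User:RonBot/7/Source1; pywikibot assumed, and the has_position_cat flag stands in for the list(1) membership test):

  import pywikibot

  site = pywikibot.Site('en', 'wikipedia')
  MAINT = 'Category:Association footballers not categorized by position'

  def ensure_maintenance_cat(page, has_position_cat):
      """Steps 3 and 5: add the category if missing, remove it once a position exists."""
      tag = '[[%s]]' % MAINT
      if has_position_cat and tag in page.text:
          page.text = page.text.replace('\n' + tag, '').replace(tag, '')
          page.save(summary='Position now categorised; removing [[:%s]]' % MAINT)
      elif not has_position_cat and tag not in page.text:
          page.text += '\n' + tag
          page.save(summary='Adding [[:%s]]' % MAINT)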


Discussion

Approved for trial (50 edits). please report back here after trial is done with diff range. — xaosflux Talk 16:09, 20 July 2018 (UTC)
Trial complete. Five random pages were artificially "fixed" with Category:Association footballers not categorized by position (Brigitte Klinz, Nancy Gutiérrez, Laura Bassett, Vince Bartram, Denis Adamov); the bot removed the category correctly at 16:54, 20 July 2018. It then correctly added the category to 45 pages; see Special:Contributions/RonBot from 19:15 to 19:16, 20 July 2018. Page User:Ronhjones/Sandbox5 contains the full list of pages (44,503) needing the category (including the 45 done in the trial).
Some example diffs:-
Removals...
https://en.wikipedia.org/w/index.php?title=Vince_Bartram&type=revision&diff=851186215&oldid=851185086
https://en.wikipedia.org/w/index.php?title=Nancy_Guti%C3%A9rrez&type=revision&diff=851186223&oldid=851185104
Additions...
https://en.wikipedia.org/w/index.php?title=Aaron_Maund&type=revision&diff=851202988&oldid=835723371
https://en.wikipedia.org/w/index.php?title=Aaron_Williams_%28footballer%29&type=revision&diff=851203072&oldid=849151601
Ronhjones  (Talk) 19:30, 20 July 2018 (UTC)

Usernamekiran BOT 2

Operator: Usernamekiran (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 04:00, Wednesday, February 7, 2018 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): WP:AWB

Source code available: Yes.

Function overview: Insert banner on the talkpages of pages that come under the scope of wikiproject organised crime.

Links to relevant discussions (where appropriate):

Edit period(s): 3-4 times a week.

Estimated number of pages affected: around 2,000 to 3,000.

Namespace(s): article talk, category talk, file talk, template talk.

Exclusion compliant (Yes/No): No.

Function details: For the last few months, I have been using my other account (Usernamekiran (AWB)) to insert {{WikiProject Organized crime}} on talk pages that fall under the scope of the WikiProject. So far I have inserted the banner on thousands of pages; there have been no mistakes, and nobody has objected yet. I can sort the targets properly (Wikipedia:WikiProject Organized crime/Bot tagging categories). The bot will make no changes other than adding this banner, plus basic things like adding a banner shell if the banners exceed 3, or handling old PRODs.

Discussion

Is your only estimate that this will be 6 to 12000 edits per week? — xaosflux Talk 02:10, 8 February 2018 (UTC)

@Xaosflux: No. At most, there are around 4,000 pages remaining that need to be tagged with the organised crime banner. I think, using the bot, I will tag around 3,000 pages. The ones that need human judgement will be done semi-automatically from the non-bot account. I meant that I will run the bot 3-4 times a week. If I get around 1,000 pages done in one day/session, the entire task will be done in about 4 days; if not, I will be using the bot 3-4 times a week and the task will take about 2 weeks to finish. I apologise for the confusion. —usernamekiran(talk) 08:11, 8 February 2018 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── If necessary: I have been using this module for months. I mean, I created the module a long time ago, but I barely used it until the discussion with Primefac (special:diff/824201270).

  // AWB custom module: prepends the Organized crime banner unless the page
  // already carries it or one of the listed fiction/media project banners.
  public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
        {
            // Any of these template openings means the page should be skipped
            // (case-insensitive, so case variants of the same name are covered).
            Regex header = new Regex(@"\{\{(WikiProject Organized crime|WikiProject Fictional characters|Comicsproj|WikiProject Film|Film|WikiProject Video games|WikiProject Television|WPTV|WP Fictional|WikiProject Novels|WikiProject Anime|TelevisionWikiProject|WPFILM|WikiProject Songs|Songs|album|WikiProject Hip hop|WP film|WPBooks)", RegexOptions.IgnoreCase);
            Summary = "Added banner for [[WP:WikiProject Organized Crime]]";
            // Skip when a listed banner is already present or this is not a talk page.
            Skip = (header.Match(ArticleText).Success || !Namespace.IsTalk(ArticleTitle));
            if (!Skip)
                ArticleText = "{{WikiProject Organized Crime}} \r" + ArticleText;
            return ArticleText;
        }

Also, I re-checked. I can't be sure about the exact number of pages to be tagged with the banner, but it appears to be more than 7,000. In the previous calculations, I had not included the terrorist organisations. —usernamekiran(talk) 13:27, 12 February 2018 (UTC)

{{BAG assistance needed}}

@Usernamekiran: I see this kind of stalled out, is there still work that would be done on this task? If so, can you update the summary above to reflect the current estimates ("6 to 12000" edits a week is too vague). — xaosflux Talk 17:29, 26 May 2018 (UTC)
@Xaosflux: Currently, there are around 5,000 pages tagged with the banner. I am not sure how many more there are, but a big chunk consisting of terrorism groups is remaining. I am guessing around 4,000ish pages in total: 3,500 can be tagged with the bot, and a few (roughly 500) would require human judgement; these can be tagged with the non-bot account. These are the estimated numbers of pages; unfortunately I can't say much about my timing, though. Once done with that WikiProject banner, I will start with WikiProject Espionage, which really needs it (special:diff/842199888 and special:diff/842065860). —usernamekiran(talk) 18:16, 26 May 2018 (UTC)
Approved for trial (60 edits or 14 days).xaosflux Talk 18:51, 26 May 2018 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── Trial complete. The only unexpected edit was this one: special:diff/843217671, as the {{WikiProject banner shell}} template didn't have the |1= parameter in it. —usernamekiran(talk) 18:31, 27 May 2018 (UTC)

@Usernamekiran: Have you fixed the code to handle that edge-case in future? ƒirefly ( t · c · who? ) 21:39, 16 June 2018 (UTC)
Unfortunately, I haven't had much access to a computer recently. I have a solution, but only in my head, so I am sort of stuck there. I contacted Primefac about it a couple of days ago, but I won't be able to say anything for sure until I get hold of my computer again. I hope I will be able to find a solution then. —usernamekiran(talk) 19:51, 29 June 2018 (UTC)

GreenC bot 5

Operator: GreenC (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:52, Tuesday, April 24, 2018 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): BotWikiAwk

Source code available: accdate.awk

Function overview: The proposal is for 'accdate bot' to remove |access-date= from citations in the tracking category Category:Pages using citations with accessdate and no URL using targeted strategies.

Links to relevant discussions (where appropriate): Help_talk:Citation_Style_1#Clearing Category Pages using citations with accessdate and no URL - also CS1 documentation which supports use of |access-date= for |url= only.

Edit period(s): one-time run during first pass as standalone bot; then semi-continually as part of a module of WaybackMedic

Estimated number of pages affected: 25,000 (57% of 43,719)

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details:

Of the Category:CS1 errors, the tracking category with the most entries is Category:Pages using citations with accessdate and no URL (43,719). There is no silver bullet solution to clearing the cat, so this will break it down by targeting known types of problems within that category. There have been many discussions about it over the years.

The proposal is for 'accdate bot' to remove |access-date= from citations in the tracking category Category:Pages using citations with accessdate and no URL using the following strategies:

  • 1. Remove |accessdate= in CS1|2 templates that don't have a |url= but do have a value assigned to any of the various 'permanent-record' identifiers. Excluding templates {{cite web}}, {{cite podcast}}, and {{cite mailing list}}. Normally |isbn= would be excluded from the identifier list, but if a {{cite book}} it would be included.
  • 2. Remove |accessdate= in {{Cite book}}, {{Cite news}} and {{Cite journal}} with no |url=. Per the documentation, "Access dates are not required for links to published research papers, published books, or news articles with publication dates." If a publication date is provided, remove |accessdate=.
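An editor's illustration of strategy 2 using mwparserfromhell (the actual bot is written with BotWikiAwk, source linked above; this is a sketch, not the bot's code):

  import mwparserfromhell

  BOOKISH = ('cite book', 'cite news', 'cite journal')

  def drop_accessdate(wikitext):
      """Strategy 2: drop |access-date= from book/news/journal cites lacking a URL."""
      code = mwparserfromhell.parse(wikitext)
      for tpl in code.filter_templates():
          if (any(tpl.name.matches(n) for n in BOOKISH)
                  and not tpl.has('url', ignore_empty=True)
                  and tpl.has('date', ignore_empty=True)):     # publication date present
              for name in ('access-date', 'accessdate'):
                  if tpl.has(name):
                      tpl.remove(name)
      return str(code)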


Discussion

  • The bot has been updated to the specifications above. A dry run of 1,000 articles found a fix in 574 or about 57%. The total cites fixed is 1165, of those 1121 are of type #2 and 44 are of type #1. I manually checked about 100 diffs offline and don't see any problems but will manually check these 574 once they are uploaded. Or whatever number is approved for trial. -- GreenC 15:44, 1 May 2018 (UTC)
Agreed, it's a good idea to have a FAQ, since |access-date= is a common source of confusion as to what it's for and why it exists. -- GreenC 04:11, 3 May 2018 (UTC)

Approved for trial (50 edits). Since no one from BAG seems interested in this, I'll take it despite having been involved in the discussion a bit. Headbomb {t · c · p · b} 16:45, 16 May 2018 (UTC)

Trial complete. Edits (toolserver). Or Special:Contributions of May 7. -- GreenC 21:06, 17 May 2018 (UTC)
The edits look good to me. A very minor cosmetic issue: for edits like these where the accessdate parameter is the last parameter in the citation, ideally the bot should also be removing the white space in front of the pipe character rather than leaving some extra white space at the end of the citation. —RP88 (talk) 21:37, 17 May 2018 (UTC)
The space is there because the preceding argument has a trailing space, and the bot leaves other arguments alone for safety. I understand personal preferences for spacing, but I can't program for every contingency; cites are often a mix of spacing styles. Whether removing the preceding argument's trailing space is always the right decision, I don't know. Arguably, in this case the spacing is consistent, because every other argument has both a leading and a trailing space. The bot retained the existing style, though it was coincidence. -- GreenC 22:15, 17 May 2018 (UTC)

@GreenC: In edits like these [6] (and I could pick several examples), the bot also removes empty |url= parameters, and I do not see the wisdom in doing that. This discourages finding free URLs and makes it (slightly) harder to add them. Empty parameters should be left alone. Headbomb {t · c · p · b} 16:23, 18 May 2018 (UTC)

I concur with User:Trappist the monk in the discussion, and also generally about removing them when they might cause confusion - in this case empty |url= have actually created some of the problem this bot is attempting to resolve. There is no evidence empty arguments encourage users to fill them in (nudge theory); there's no way future editors can know why the empty argument exists: did it once have something and was deleted? Was the citation copy-pasted in with other empty args and lazily the empties were kept? Was it always empty? There's no nudge factor because there are so many possibilities of why it exists. If the empty |url= included a wikicomment saying "A URL might exist; please fill me in, or delete this notice and empty arg" that would be more clear. Do we want to do it? It seems like it would be true for any citation without a |url= and goes down the rabbit hole of trying to direct users what to do. -- GreenC 18:09, 18 May 2018 (UTC)
By that rationale, every empty parameter should be removed, and that's not something I feel bots should be doing, save in fairly controlled situations, or with strong consensus to do so (in which case the functionality could be implemented in AWB). I picked a clean edit, but I could have picked an edit where the bot removed an empty url parameter but left a slew of other empty parameters alone (jstor/zbl/etc...), such as [7]. The problem the bot is trying to solve is stray accessdates, so it should stick to that IMO. Open to other BAG opinions here since I'm partly involved. I will point out that in the discussion that led to this, no one suggested/supported removing empty url parameters from citations. Headbomb {t · c · p · b} 18:18, 18 May 2018 (UTC)
"In this case empty |url= have actually created some of the problem this bot is attempting to resolve." Removal is relevant to the purpose of the bot, and it's limited to the citation it edits as a secondary - it doesn't seek out other empty arguments in other citations. To nudge the community to do things with signals of encouragement is not the bot's intention. OTOH removal of |url= within the citations its edits is relevant to the bot's purpose. -- GreenC 19:41, 18 May 2018 (UTC)
Personally, I have no issue with removal. The empty args are a waste of space and accomplish nothing from my viewpoint. Also basic bots working with cite templates, may encounter issues with empty URL parameters, though good coding can easily work around that.—CYBERPOWER (Chat) 20:30, 18 May 2018 (UTC)
WP:COSMETICBOT says «changes that do not [change output] are typically considered cosmetic». Sometimes this means that it's taken for granted they can be performed alongside bigger changes, sometimes it means they raise more complaints than the bigger change. :) --Nemo 23:58, 18 May 2018 (UTC)

A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag. To be clear I'm recusing myself from making the final call here. I have listed some objections above, but I'll note for the record they are not a personal deal breaker for me, simply a concern I have. Headbomb {t · c · p · b} 16:37, 1 June 2018 (UTC)


Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here, while old requests can be found in the archives.


Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.