
Wikipedia:Bots/Requests for approval

From Wikipedia, the free encyclopedia


If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming it may be a good idea to ask someone else to run a bot for you, rather than running your own.

New to bots on Wikipedia? Read these primers!
 Instructions for bot operators

Current requests for approval

Dreamy Jazz Bot 6

Operator: Dreamy Jazz (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 17:55, Monday, November 30, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Pywikibot

Source code available: Will likely publish it when written

Function overview: Place {{nobots|deny=all|optout=MassMessage}} on pre-existing user talk pages of users beginning with "Vanished user" or "Renamed user" with or without the space.

Links to relevant discussions (where appropriate): Wikipedia_talk:Requests_for_comment/Arbitration_Committee_Elections_December_2020#Mass_message

Edit period(s): Daily

Estimated number of pages affected: From a quick count at Special:ListUsers, it seems around 2,000 usernames start with Vanished user or Renamed user. A proportion of those will have their talk pages edited once. I will get a more accurate number if this goes ahead.

Exclusion compliant (Yes/No): No (see more below)

Already has a bot flag (Yes/No): Yes

Function details: This proposed bot will add {{nobots|deny=all|optout=MassMessage}} to the user talk pages (and not user talk subpages) of users with usernames which start with Vanished user or Renamed user (case insensitive and with/without the space). This follows a discussion in November 2019 where it was thought to be a good idea to have vanished users not receive ACE election notices. To do this, Category:Wikipedians who opt out of message delivery would need to be added to their talk page. It was also thought that it would be best to have vanished users not receive any bot talk page messages, as they will not be read by the person they are intended for. To simplify this, the category can be included by providing a parameter to the nobots template. Therefore this bot, if approved, will add this template with the parameters all and MassMessage to prevent unnecessary posting of bot talk page messages to vanished users, who by the very nature of their vanishing won't need to see these notifications. To prevent the bot from re-adding the template if it is reverted, I will add a check to ensure that this bot task only ever edits a page once.

This bot task won't be completely exclusion compliant. This is because simply adding the {{nobots}} template will not prevent mass messages from being delivered. Per the documentation, MassMessage needs to be included in the optout parameter when the template is added to the talk page. Therefore, if the page is not in Category:Wikipedians who opt out of message delivery, the bot will add MassMessage to the optout parameter of the nobots template which is present. If the optout parameter is not included, the bot will add deny=all|optout=MassMessage as parameters to the nobots template. If nobots is present and the page is in Category:Wikipedians who opt out of message delivery, the bot will skip the page. Dreamy Jazz talk to me | my contributions 18:06, 30 November 2020 (UTC)

To add: after filing this, I've decided that the bot will only edit pre-existing talk pages, because talk pages of vanished users which don't exist are unlikely to be created. Dreamy Jazz talk to me | my contributions 18:46, 30 November 2020 (UTC)
I've corrected the parameters in nobots to be added. Dreamy Jazz talk to me | my contributions 12:14, 1 December 2020 (UTC)
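The skip-and-merge logic described in the function details can be sketched as a pure function over the page wikitext. This is an illustrative Python sketch, not the bot's actual code (which the operator has not yet published); the function names are invented, and a real run would fetch the page text and category membership via Pywikibot before applying it:

```python
import re
from typing import Optional

TEMPLATE = '{{bots|deny=all|optout=MassMessage}}'
# "Vanished user" / "Renamed user", case-insensitive, with or without the space
PREFIX_RE = re.compile(r'^(?:vanished ?user|renamed ?user)', re.IGNORECASE)

def is_target_user(name: str) -> bool:
    """Usernames the task targets, per the function overview."""
    return bool(PREFIX_RE.match(name))

def updated_talk_text(text: str, opted_out: bool) -> Optional[str]:
    """Return the new wikitext, or None if the page should be skipped.

    opted_out: whether the page is already in
    Category:Wikipedians who opt out of message delivery."""
    if opted_out:
        return None
    m = re.search(r'\{\{(?:no)?bots[^}]*\}\}', text)
    if m is None:
        # no exclusion template yet: prepend the full one
        return TEMPLATE + '\n' + text
    old = m.group(0)
    if 'optout=' in old:
        # append MassMessage to the existing optout parameter
        new = re.sub(r'optout=([^|}]*)', r'optout=\1,MassMessage', old, count=1)
    else:
        # add the parameters to the existing template (naive: does not
        # check for a pre-existing deny= parameter)
        new = old[:-2] + '|deny=all|optout=MassMessage}}'
    return text.replace(old, new, 1)
```

The single-edit-per-page guarantee mentioned above would sit outside this function, e.g. by logging processed titles and never revisiting them.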


An interesting idea, and not one that I'm necessarily opposed to. However, I do note that in the discussion you link to xaosflux says this should be discussed at WP:VPR first; has this been done? Primefac (talk) 18:01, 30 November 2020 (UTC) (please do not ping on reply)

No, but I could certainly start one. I will do so now. Dreamy Jazz talk to me | my contributions 18:07, 30 November 2020 (UTC)
@Dreamy Jazz: I haven't dug in to that template, and adding the mms-opt-out category seems just fine, but do you also want to force them to opt out of everything else? Why? If a file-fixing bot or a linter-fixing bot comes along, what's the problem? — xaosflux Talk 18:12, 30 November 2020 (UTC)
Xaosflux, I am personally happy with just mass message prevention, but thought that vanished users don't need to have file-fixing bots posting at their talk page. Although it isn't a big problem, I thought that while the mass message category was being added it made sense to add the nobots as well. Vanished users are very unlikely to be reading their talk page, so talk page messages don't serve a use to the intended target (the vanished user). Editors who watchlist the page may possibly find them useful, but I would say more times than not these messages go unread. SoWhy mentioned that it might be an idea to add nobots when adding the category. I'm currently writing the start of a VPR discussion, so hopefully some kind of consensus for what is wanted will emerge. I started the BRFA to get the ball rolling and did think that a VPR discussion / wider discussion would likely be needed. Dreamy Jazz talk to me | my contributions 18:17, 30 November 2020 (UTC)
Perhaps the bot could be constrained to only editing talk pages which exist too? That would reduce unnecessary edits, as talk pages which don't exist for vanished users are unlikely to be created. Dreamy Jazz talk to me | my contributions 18:28, 30 November 2020 (UTC)
I've added that the bot task will only edit pre-existing talk pages for vanished users. I've also started the discussion at VPR at Wikipedia:Village pump (proposals) § A bot to exclude vanished users from mass messages and/or bot talk page messages. Dreamy Jazz talk to me | my contributions 18:52, 30 November 2020 (UTC)

{{nobots|optout=all,MassMessage}} is not the correct syntax. {{nobots}} should never have parameters. If you want to deny all bots and opt them out of MassMessages, it should be {{bots|deny=all|optout=MassMessage}}. If you want to opt them out of all messages, it should be {{bots|optout=all,MassMessage}}. If you want to just opt them out of MassMessages, it should be {{bots|optout=MassMessage}}. — JJMC89(T·C) 05:49, 1 December 2020 (UTC)

Thanks for noticing this. I assumed that all could be put in optout based on the documentation. I'll update the BRFA details. I've gone with denying all bots and opting them out of mass messages, but if consensus at VPR or here wants something different, I am happy to modify it. Dreamy Jazz talk to me | my contributions 12:11, 1 December 2020 (UTC)

SDZeroBot 9

Operator: SD0001 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 12:57, Thursday, November 12, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): TypeScript

Source code available: GitHub

Function overview: Monitor activity of other bots. Issue alerts to bot operators via talk page or email if they subscribe.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests/Archive_80#A_bot_to_monitor_the_activity_level_of_other_bots

Edit period(s): Continuous

Estimated number of pages affected: -

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: Based on pre-configured information about bot tasks (name of bot account, what edit summaries it uses, what pages/namespaces it edits, how many edits are expected in the last x days, etc), it identifies bots and bot tasks which have stopped working. Stalled bot tasks can be identified even if the bot account is still running other tasks. Bots which perform actions other than editing (deletions/blocks/patrols etc) can also be monitored. A bot status table would be generated and posted to WP:Bot activity monitor.

If configured, this bot can also issue alerts to the operator to let them know that their bot tasks are not running. Alerts can be sent via talk page or email or, least intrusively, via a ping from a central page.

I expect anyone should be able to set up a bot for tracking (to be included in status table), but of course only the operator(s) should set up alerts for themselves.
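The core of the check described above is counting a bot's recent edits that match a task's configured summary pattern against an expected minimum. The sketch below is a Python illustration of that logic only; the real bot is written in TypeScript, the names `TaskSpec` and `is_stalled` are invented, and a real run would fetch the contributions from the MediaWiki API rather than take them as a list:

```python
import re
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class TaskSpec:
    bot: str            # name of the bot account
    summary: str        # regex the task's edit summaries match
    min_edits: int      # edits expected within the window
    window_days: int    # size of the look-back window

def is_stalled(spec: TaskSpec, edits, now: datetime) -> bool:
    """edits: iterable of (timestamp, summary) pairs for the bot account.
    A task is flagged as stalled when fewer than min_edits matching
    edits fall inside the last window_days."""
    cutoff = now - timedelta(days=spec.window_days)
    pattern = re.compile(spec.summary)
    matching = sum(1 for ts, summary in edits
                   if ts >= cutoff and pattern.search(summary))
    return matching < spec.min_edits
```

Because a task is identified by its summary pattern and pages edited rather than by the account alone, a stalled task can be detected even while the same account is busy with its other tasks.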


Pinging some users from the old BOTREQ discussion: @Sdkb, GreenC, Redrose64, Headbomb, Primefac, Majavah, and Amorymeltzer:. – SD0001 (talk) 15:34, 12 November 2020 (UTC)

  • The configuration parameters for describing bot tasks are given at WP:Bot activity monitor. However, I'm a bit confused about how and where people should set up these configurations. My initial thought was to have a central JSON page: Wikipedia:Bot activity monitor/config.json, but the problems with JSON are (i) it requires regexes to be weirdly escaped (e.g. \d needs to be written as \\d, though it will show up as \d when viewing the page) and (ii) it looks clumsy, especially when there would be 100s of tasks. It seems using template markup is better for describing the configurations – but should they go on a central page or be decentralized onto bot user pages? The drawback of the latter is that it discourages users from setting up tracking for others' bots. – SD0001 (talk) 15:35, 12 November 2020 (UTC)
  • I'm super happy to see this; thanks for your work on it, SD0001! I don't have the expertise to comment on the technical questions, but as far as monitoring goes, my sense is that many bots that stop working have retired operators, so it would be good for there to be notifications not just to the talk page of the operator. Looking forward to seeing this in operation! {{u|Sdkb}}talk 15:58, 12 November 2020 (UTC)
  • Okay, so am I reading it correctly that this is an opt-in situation, only "checking up" on whichever specific bots are listed? Also, why do we need a second process when Wikipedia:Bots/Requests for approval/MajavahBot 3 exists? Primefac (talk) 14:12, 13 November 2020 (UTC)
    @Primefac: This is a lot more advanced than MajavahBot 3. See User:SD0001/Bot_activity_monitor/Report for the kind of output it produces – that's based on the data for a cherry-picked set of bots at Wikipedia:Bot activity monitor/config.json. And as mentioned it also supports sending notifications to botops. Because all of this requires data about bot tasks in a machine-readable form, it necessarily has to be "opt-in" (though folks can opt in others' bots). – SD0001 (talk) 19:07, 13 November 2020 (UTC)
    Fair enough. Per the general precedent, there's no issue with creating a database-style report for these bots (i.e. "only edits one page") but when it starts getting towards notifications there come more questions. Speaking as a bot operator, I don't really care if someone keeps tabs on my bot, but I don't want any sort of automated notice if I happen to decide not to run one of my "active" tasks for some period of time, and I'd rather not find out after receiving a notification that someone's added my name to the "notify" list. Primefac (talk) 20:15, 13 November 2020 (UTC)
    Agreed. I myself wouldn't want these notifications – I've implemented error handling in my own bot tasks so that whenever an error occurs, I get an email with the stack trace of the error – which would be more useful than a generic message from a third-party bot saying the task didn't run. This is why I say above that notifications would (should) only be enabled by botops themselves. But I think we can just let this be a convention rather than try to restrict it at the technical level, and hope that people won't be jerks? Remember that it's technically also possible for a random guy to subscribe you to random wikiprojects' newsletters – but this doesn't seem to happen in practice. – SD0001 (talk) 11:33, 14 November 2020 (UTC)
  • I'll drop a note at WP:BON about this bot. One point worth noting, just in case it isn't obvious, is that the "monitoring" is intended for bots that run fully automatically – with zero human intervention. It wouldn't make sense to track bots that are one-time or on-demand, or even the ones which require any level of operator intervention to run. The intent is to "catch" bot stoppages which the operator may not be aware of, typically occurring due to unforeseen issues such as the ones Anomie mentioned at this discussion, quoting:
  • Something changes that causes the bot to fail unless its code or configuration is updated ...
  • A software update by the hosting provider breaks the bot's code, again requiring a code update.
  • The bot's process stops running or locks up, and the operator isn't paying attention to notice and restart it.
  • The hosting provider where the bot is being run closes the account (or closes entirely).

SD0001 (talk) 12:04, 14 November 2020 (UTC)

  • I think this is helpful and not particularly problematic. Of course, operators are not obligated to run tasks, but many times the downtime is accidental not intentional. For example, when my task 3 stops we lose a day of Main Page history. It did lock up once after some maintenance from my host, so Template:2020 Main Page history is missing November 9b and 10. Setting up good app/server monitoring is not what most bots do. Note I haven't looked too closely at the implementation yet to say if I have any concerns with that part. ProcrastinatingReader (talk) 15:21, 14 November 2020 (UTC)
  • Sounds like a great idea. I would prefer JSON where each bot monitor is its own object with fields for summary regexes, expected run times, number of runs per day, an array of pages that are expected to have been edited, etc. JSON has the added benefit of being highly extensible since it can contain other objects allowing for more complex configurations. That said it may not be widely accessible to less-technical botops, and the regex escape problem is always a nuisance. Either way, sounds great and I look forward to seeing it in operation! Wug·a·po·des 03:25, 15 November 2020 (UTC)
  • BAG question: is the notification system up and running? Primefac (talk) 10:52, 16 November 2020 (UTC)
    Not yet; the notifications code as presently written would keep spamming the botop every half an hour until their bot comes back up again! I'll probably have to use SQLite to keep track of notifications sent, to avoid repetition. – SD0001 (talk) 15:12, 16 November 2020 (UTC)
    If you can have the dbase/check running without the notifications enabled, feel free to start running that part while the rest gets hammered out. Primefac (talk) 17:40, 16 November 2020 (UTC)
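The JSON escaping nuisance raised in the configuration discussion above is easy to demonstrate: a regex like \d must be written as \\d in the JSON source, because a lone backslash is invalid JSON string syntax. A small Python illustration (the config key `summary` is hypothetical):

```python
import json
import re

# valid JSON: the backslash is doubled in the source text ...
config_json = r'{"summary": "Archiving \\d+ threads"}'
parsed = json.loads(config_json)

# ... but the parsed value is the plain regex, ready to use
assert parsed["summary"] == r"Archiving \d+ threads"
assert re.search(parsed["summary"], "Archiving 3 threads")

# writing the regex naturally, with a single backslash, is rejected
try:
    json.loads(r'{"summary": "Archiving \d+ threads"}')
except json.JSONDecodeError:
    print("invalid JSON: \\d must be escaped as \\\\d")
```

This is the "weird escaping" that template markup avoids, since wikitext template parameters take the pattern verbatim.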

Bots in a trial period

Bots that have completed the trial period

Usernamekiran BOT 4

Operator: Usernamekiran (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:47, Thursday, August 13, 2020 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): AWB

Source code available: AWB

Function overview: find and replace wikilinks

Links to relevant discussions (where appropriate): User talk:Xaosflux#bot task

Edit period(s): As required

Estimated number of pages affected: variable, based on task

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details: I became active again at closing page move discussions. If a DAB page is involved in the moves, sometimes the bot (RussBot) is unable to fix the links, as they are not double redirects. Recently, the issue came up with the RM at Talk:Chani (character)#Requested move 30 July 2020, but the links to be fixed were very few. Today, a similar issue came up with Talk:Bell Satellite TV#Requested move 6 August 2020. I had to update somewhere around 450 pages from "Bell TV" to "Bell Satellite TV": Bell TV was moved to Bell Satellite TV, and Bell TV was converted into a DAB pointing to Bell Satellite TV, Bell Fibe TV, and Bell Mobile TV. In this case, RussBot is not an option, as there wouldn't have been a double redirect; and no fully-automated bot will know which link to choose from the DAB page.

My method/task is pretty basic and simple. I added a basic find-and-replace rule: find [[Bell TV, and replace it with [[Bell Satellite TV. So far I have updated more than 150 articles, and there have been no issues. I have been checking the diff in AWB and hitting ctrl+S, without any issues.

I am aware this is a very basic task, but there is no bot at WP:RM with this function. There is a possibility that some other bot might be approved for this task, but that would mean I would have to wait till the bot operator comes online, whereas if I had the approval, I could do it whenever the need arises. As this is a basic and uncontroversial task, I thought I should ask about it. Regards, —usernamekiran (talk) 19:47, 13 August 2020 (UTC)

PS: This method also handles previously piped links: special:diff/972777535. —usernamekiran (talk) 19:49, 13 August 2020 (UTC)
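The rule described above (find [[Bell TV, replace with [[Bell Satellite TV) can also be expressed as a single regex. The sketch below is an illustrative Python version, not the AWB rule itself, and `retarget_links` is an invented name; it adds one refinement over a plain prefix replacement: unpiped links keep their displayed text by piping in the old title, while already-piped links just have their target swapped.

```python
import re

def retarget_links(text: str, old: str, new: str) -> str:
    """Retarget [[old]] and [[old|label]] wikilinks to `new`."""
    # match [[Old]] or [[Old|label]], tolerating whitespace around the title
    pattern = re.compile(r'\[\[\s*' + re.escape(old) + r'\s*(\|[^\]]*)?\]\]')

    def repl(m: re.Match) -> str:
        pipe = m.group(1)
        if pipe:                               # [[Old|label]] -> [[New|label]]
            return '[[' + new + pipe + ']]'
        return '[[' + new + '|' + old + ']]'   # keep the displayed text

    return pattern.sub(repl, text)
```

Note that longer titles sharing the prefix (e.g. a hypothetical [[Bell TVs]]) are left alone, which a raw "find [[Bell TV" rule would not guarantee.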


What's the general logic / criteria for the task? Would this be automatic or assisted somehow (i.e. triggered by moves, or human-supervised, telling it 'fix those specifically') ? Headbomb {t · c · p · b} 17:11, 20 August 2020 (UTC)

@Headbomb: Basically, I would be making the list in AWB after closing the RM, but before performing the actual move. I apologise for selecting the bot's mode as automatic above. The operation would be just like manual AWB editing, with the only exception being the "bot flag" to avoid hitting ctrl+S so many times. It would also be convenient and time-saving. —usernamekiran (talk) 23:18, 20 August 2020 (UTC)
For the record, what you just described is "automatic mode". Primefac (talk) 23:51, 20 August 2020 (UTC)
Then where do bots like MuzikBot and the archive bots fall? —usernamekiran (talk) 10:51, 21 August 2020 (UTC)
They are also fully automatic. Primefac (talk) 19:32, 21 August 2020 (UTC)

Approved for trial (1 move discussion). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Alright, then let's have a trial and see if bolts come off loose. Headbomb {t · c · p · b} 23:22, 20 August 2020 (UTC)

  • This is a classic case where context matters. There's a whole wikiproject whose goal is fixing links to dab pages (WP:DPL), and there are various tools they have developed over the years to help with aspects of the work. Sadly, a simple find and replace will not work, because the links are by definition ambiguous, so you can't know the intended target unless you examine the context. And you can't assume that the previous primary topic will always be it either. Fixing links in some very narrowly defined contexts can be helpful, like [[New York]] City -> [[New York City]], but situations where it's needed to do this at scale are quite rare. Also adding that, contrary to what was suggested in the Chani RM linked above, closers of RM discussions are not expected to fix dab links resulting from the moves. – Uanfala (talk) 10:19, 21 August 2020 (UTC)
    @Uanfala: Hi. I am aware of the issues that can arise, like the ones you mentioned. Like I stated in the comment above, I will be making the list in AWB (and then I will check the list, and the edits, once before making the actual edits); only once the list/edits are okay will I save the edits. It will not be a "blind" task where I make a list from "what links here" and hit save. —usernamekiran (talk) 10:50, 21 August 2020 (UTC)
    @Uanfala: Yes they are, see WP:FIXDABLINKS - "Before moving an article to a qualified name (in order to create a disambiguation page at the base name, to move an existing disambiguation page to that name, or to redirect that name to a disambiguation page), click on What links here to find all of the incoming links and repair them." Narky Blert (talk) 20:56, 22 August 2020 (UTC)
    Thank you for the link, Narky Blert. This is indeed what that page says. This is an abomination we should be thankful is never followed in practice – otherwise many moves wouldn't happen because the editors involved wouldn't ever have the time and inclination to fix those links and we would be stuck with all the bad primary topics for perpetuity. – Uanfala (talk) 21:38, 22 August 2020 (UTC)
    I'm having a look at the talk archives and I'm not surprised that this piece of nonsense doesn't appear to have been discussed much. The one discussion there was on the topic, however, is clear that closers of RM discussions should neither be required nor expected to fix those links. – Uanfala (talk) 21:44, 22 August 2020 (UTC)
    @Uanfala: FIXDABLINKS is idealistic and unworkable, even penal. If there is only a handful of links, it does make sense for the closer to fix the links, as being quicker and simpler than asking the proposer to do so and checking to see if they've done it. The problem is with moves which break several hundred or several thousand links. (Example: a move involving one of the Jammu and Kashmir pages at the beginning of June broke over 2,000 links; two-and-a-half months later, 495 are still broken, see DPwL.) It is unfair to expect that an editor who has made an administrative change reflecting a WP:CONSENSUS should have sole responsibility for cleaning up the mess; especially prior to making it, which is what FIXDABLINKS says.
    That 2014 discussion is still relevant. It includes a proposal which I came up with independently, that the onus should be on those who supported the move (and in big cases, there will always be more than one of them).
    I once proposed a change to a guideline; I won't be making that mistake again. My legal training taught me that if one party finds a clause ambiguous, it is ambiguous and it needs to be redrafted. I found the wording of a guideline ambiguous; some agreed, others didn't, and the result was WP:NOCONSENSUS. Narky Blert (talk) 07:00, 23 August 2020 (UTC)
    FIXDABLINKS is vague as to who should fix the links. In my opinion, it should be the person(s) requesting the move, rather than the closer who carries out the mechanics of moving the pages but may not be a subject expert. All too often, it gets left for a passing gnome. Certes (talk) 09:37, 23 August 2020 (UTC)
    Yeah, proposing changes to guidelines can be a pain (even more so, I would think, for those like me who lack any legal training), and it sometimes may be easier to just ignore antiquated guidelines than attempt changing them, but oh well, there we go: Wikipedia talk:Disambiguation#Rewriting WP:FIXDABLINKS (Links to disambiguated topics). – Uanfala (talk) 14:06, 23 August 2020 (UTC)
    @Uanfala: I've seen that discussion, I intend to comment in detail once I've gathered my wool and rowed up my ducks. Narky Blert (talk) 18:31, 24 August 2020 (UTC)
  • A comment in general: I am okay with fixing the links from my normal (non-bot) account if there are few pages to be fixed. If I am running the task from the bot account, I will make sure there are no issues after my edits. I am requesting the bot flag only for the cases where there are a lot of pages/links to be fixed. For example, in the case of Talk:Bell Satellite TV#Requested move 6 August 2020: in step one, I created the list after the pre-parse mode; there were around 450 articles to be fixed. In step two, I skimmed that list. In step three, I made around 150 edits, and in a second run around 50 edits, checking the difference before saving each edit. All were without any technical problems. I performed the remaining ~250 edits without checking the diffs, and after the entire session checked a lot of them at random. None of the diffs I checked had any problem. What I am trying to say is, I will make a logical judgement before and during step two. I will not be changing the links without getting to know the context. The only difference between a regular/normal AWB run and this bot run would be the last step of saving the edit manually vs automatically. Given my previous experience of moving pages, and of editing Wikipedia in general, I can figure out where to use caution and how to handle things. I am looking at this trial run as a means to check only the technical accuracy of my method. —usernamekiran (talk) 15:02, 21 August 2020 (UTC)
    Thank you for the detailed explanation. Just a clarifying question, what proportion of links do you manually examine the context of? By examining the context, I mean looking at the article text that has the link and reading the sentence in which the link is found (possibly also the sentences before and after). – Uanfala (talk) 16:10, 21 August 2020 (UTC)
    That is a very broad question. The solution, and the context, begins with the page that is being moved. Bell TV was the easiest one; there was not much scope for linking to the incorrect page (the incorrect page being the one that is to be moved). Biographies are also easy. Till now, whenever I came across such instances, I would skim the page which is to be moved. That gives you a general idea of what the topic is and where you can expect the links to come from. Also, after making the first list, I remove disambiguation pages from AWB's list and edit them manually. Most complicated are the ones involving communities/castes; sometimes an incorrect process can lead to an (intended) group being linked to a language (e.g. Tamils (people), with redirect Tamil people; then there is Tamil language, and Tamil script; a complete list can be found at the dab Tamil). I do have a method for weeding out the pages that might get incorrectly updated, but I don't know how to put it in words. I do it when I see the "what links here" list and/or the AWB list. It's based partly on hunch, and mostly on logic (I don't know how to put this logic in words either). The only thing I can say is that I will make the edits carefully, and there won't be any issues. Not trying to brag, but according to xtools I have moved 1,787 pages till now. That number is exaggerated, as it counts one page swap/round-robin move as 3 page moves. Assuming I have moved 400 pages, and performed this task (the one in discussion) a few times, there have never been any issues (however, there were two instances during my early days as a page mover where my closure was not perfect, but almost nothing after that). And I will try to keep things that way. —usernamekiran (talk) 18:42, 21 August 2020 (UTC)
    Thank you for the detailed reply, but I still don't see an answer to my question. Should I take it that you don't intend to examine each link in its context? In such a case, this task should definitely not go ahead in its present form. To decide the target of the link, you need to examine the link itself and the text around it; it's not enough to guess from the article title. You can use AWB or another tool in combination with whatever heuristics you choose, as long as you manually examine each link to be fixed: either before fixing it, or in the diff afterwards. A process that gets it right 99% of the time does not lead to a net improvement for readers: a link to a disambiguation page is merely an inconvenience that adds an extra step to a navigation path; an incorrectly disambiguated link, on the other hand, completely cuts off navigation to the intended article and also introduces a factual inaccuracy in the text. If you examine each and every link to be fixed, then it's up to you what technical means you use to save your edit; but whichever way you do it, you shouldn't be doing it under a bot flag, as this would be a manual edit that needs to invite the same amount, and type, of review from page watchers as any other manual edit. – Uanfala (talk) 19:10, 21 August 2020 (UTC)
    Adding that you can ask for more feedback at WT:DPL. – Uanfala (talk) 20:35, 21 August 2020 (UTC)
    @Uanfala: I have no problem with checking the edits. However, I can't understand why you are so reserved about this task. If [ABC] gets moved to [XYZ], and we have to divert the incoming links from ABC to XYZ in [foo] and [lorem], provided there are already wikilinks in foo and lorem pointing to ABC, I don't see much room for the error you are talking about, if any. —usernamekiran (talk) 08:25, 22 August 2020 (UTC)
    We're talking about moves involving primary topics, right? Say, ABC to ABC (foo) and then ABC (disambiguation) to ABC. Your assumption seems to be that all links to ABC will be intended for ABC (foo). This is not necessarily true, and in fact it is very rarely the case. There will be links ABC where the target is not ABC (foo), but ABC (bar) or ABC (buz). See for example the incoming links to Gaddi, an article about an ethnic group of India: there are some links intended for it (like at Kareri Lake), but there also links not intended for it (like at List of biblical names starting with G). – Uanfala (talk) 11:41, 22 August 2020 (UTC)
    That's what I was trying to say when I gave the example of Tamil. Anyway, as I said earlier, I don't have any problem checking the edits. —usernamekiran (talk) 11:58, 22 August 2020 (UTC)
  • When an article moves and a disambiguation page usurps its title, this is because that title is ambiguous. Many links will intend the moved article but some will have other meanings. It is necessary to examine each link manually. Requests to automatically change links which have been vetted by human eyes occasionally pop up at WP:AWB/TA, and I'm sure many editors use semi-automated tools for similar purposes. (I prefer WP:DisamAssist.) A bot to do that might be useful. However, I would oppose an automated change where links have not been checked manually or even where a sample has been checked. This would simply turn known unknowns into unknown unknowns, for example by replacing Foo by Foo (computing) where Adeline Foo was intended. I recommend automatically generating lists of potentially good changes, then applying them selectively with manual supervision. Certes (talk) 12:09, 22 August 2020 (UTC)
    I did a small amount of the cleanup after Talk:Vinyl#Requested move 19 June 2017, when a redirect was turned into a DAB page. That broke about 2,500 links. Only about 10% of them were intended for the original target, and the other 90% were mostly split three ways in roughly equal amounts. Eyeballs were essential. Narky Blert (talk) 21:05, 22 August 2020 (UTC)
    Another example is New York, which required checking and changing 119,000 wikilinks, of which more than 12,000 were not for the original target. (I did a tiny fraction of the work.) Certes (talk) 21:55, 22 August 2020 (UTC)
  • @Certes, Narky Blert, and Uanfala: Once my problem with AWB is resolved (I can't start it up), I could fix the links to a page from WP:DPL as a bot trial, or maybe some other page. @Headbomb: I know this is different from the proposed task involving RMs, but the task is the same. Would it be possible to run the trial on a page from DPL? —usernamekiran (talk) 09:25, 23 August 2020 (UTC)
    That sounds good to me, if you identify a DPL case where all links are intended for a single target. Do you actually need AWB for this? WP:JWB can perform many AWB tasks, though you will have to get the list of pages to fix from elsewhere such as Special:WhatLinksHere. Certes (talk) 09:32, 23 August 2020 (UTC)
    No, I meant links intended for 2–3 different target pages. —usernamekiran (talk) 09:36, 23 August 2020 (UTC)
    (ec) DPwL would be an excellent source for a test-run: it's updated twice daily, and you could choose the number of links from 1 upwards.
    A thought. If a fix isn't obvious, could the bot be programmed to add a {{disambiguation needed}} tag? It's the DABfixer's last resort, and it's remarkable how often problems in Category:Articles with links needing disambiguation get fixed. Narky Blert (talk) 09:44, 23 August 2020 (UTC)
    I will look into DPwL as soon as I get on a computer. The {{disambiguation needed}} tag can be added, but that would just be me adding it manually, hehe. Seriously speaking, I think all of this will become clearer after the bot's first 4–5 heavy runs. —usernamekiran (talk) 09:58, 23 August 2020 (UTC)
  • @Certes, Narky Blert, and Uanfala: I picked Manny Perez from DPwL, but all the incoming links to the disamb were intended for only one article. I can easily fix disambs like Chuuk, Stefan Marinović, and Wyn Jones. All of them have very few incoming links, but that's not the reason: as long as the targets are easily identifiable, I can do the task through AWB/bot, no matter how many linking pages are listed on the disamb page. For example, Cruel Summer has only 5 entries, but all of them are songs; for now I don't know how to approach such situations, but I can come up with some idea/solution as I keep working on the task. Tomorrow I will work on any two articles from the above, and then we can have a bot trial for the remaining third article. Note: there would be no difference in the working method at all, except for the automated saves. —usernamekiran (talk) 18:17, 24 August 2020 (UTC)
    I'm not the best person to consult about these newfangled tool and bot thingies. I fix links to DAB pages the way my father taught me, which is the way his father taught him, and his forefathers taught him, and so on time out of mind - using craftsman-knapped flint tools and low cunning.
    It's the bulk stuff - DAB pages with 10+ links, say - which is a problem. It's mind-blowingly tedious work, and I try to avoid it. Props to those who do firefight such links. Narky Blert (talk) 19:37, 24 August 2020 (UTC)
  • Done with Wyn Jones and Stefan Marinović. No technical issues found. There were a few articles where {{sortname}} was used; I fixed one instance manually, and then a couple through AWB. There was also this instance where the first and last names were used the other way around, and this instance where an initial was used. None of these instances would be edited, even by the automated account. I will fix one link manually (through the non-bot account) while making the lists for the other targets using AWB. The doubtful/problematic articles would be added to one list for manual editing; the other lists would have no chance of error. Links to Chuuk have been fixed by someone. I can fix the links to Big box; there are 4 targets and 109 incoming links. I am ready to do this with the automated/bot account. Pinging @Headbomb and Primefac: not sure if I should ping the other editors as well. —usernamekiran (talk) 16:38, 27 August 2020 (UTC)
  • Since my last comment here, I have been working on a custom/hybrid module for this task. I got input from David at their talk page: special:permalink/975938658#find and replace random piped wikilinks. I tested this module in my sandbox, and it worked successfully: special:diff/975939820. I then tested it on the Gaddi disamb. All the edits were as expected, including special:diff/977067060. The only problematic edit was special:diff/977066941, where it changed [[Gaddi]]s to [[Gaddis]]s. I updated my module and fixed the issue: special:diff/977067363. I should have anticipated that, though; anyway, the module now handles this scenario. In short: if there are many pages to be fixed, I will create the lists first and then handle it through the bot account. If there are not many pages, or creating the lists is not worth it, then I will do the task through the non-bot account. But now I can positively say that if I do it from the bot account, there will be no mistakes. —usernamekiran (talk) 19:45, 6 September 2020 (UTC)
    The Gaddi run has introduced a number of grammatical errors. Replacing [[Gaddi]] with [[Gaddis]] doesn't work in all contexts as the first word is singular and the second one – plural. I've checked the first 10 edits, and the following contain this error: [1] [2] [3] [4]. – Uanfala (talk) 20:04, 6 September 2020 (UTC)
    Yes. If I had been told sooner, I could easily have done that. I generally ask about this stuff while closing the RM (e.g. Talk:Neurolathyrism#Requested move 2 July 2020). In either case, there were no incorrect targets, and there were no technical errors :) —usernamekiran (talk) 20:34, 6 September 2020 (UTC)
    Well, editors are expected to figure this out by themselves as part of their preparation before fixing the dablinks, regardless of the method they're going to use (I only pointed that out to you after I noticed an error washing up on my watchlist). This particular kind of error can be avoided if you don't change the visible article text, but use piping in the link (though of course, there are cases where linking directly is preferable). – Uanfala (talk) 21:28, 6 September 2020 (UTC) Adding that I've now fixed the 13 such errors introduced in this run. – Uanfala (talk) 22:39, 6 September 2020 (UTC)
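Uanfala's suggestion above — retarget the link but leave the visible text alone by piping — can be expressed as a simple replacement rule. The following is an illustrative Python sketch (not usernamekiran's actual AWB module); by never rewriting the displayed label, it sidesteps both the [[Gaddi]]s → [[Gaddis]]s breakage and the singular/plural grammar errors:

```python
import re

def retarget_link(wikitext, old_target, new_target):
    """Point [[old_target]] links at new_target without altering the
    text the reader sees: always pipe, never rewrite the label."""
    pattern = re.compile(r"\[\[" + re.escape(old_target) + r"(?:\|([^\]]*))?\]\]")

    def repl(match):
        # Keep the existing label; for an unpiped link the label is the old title.
        label = match.group(1) if match.group(1) is not None else old_target
        return "[[{}|{}]]".format(new_target, label)

    return pattern.sub(repl, wikitext)
```

Because the pattern requires `|` or `]]` immediately after the old title, a plural like `[[Gaddi]]s` keeps its trailing "s" outside the link and `[[Gaddis]]` is left untouched. A real run would also need to handle MediaWiki's case-insensitive first letter and underscore/space equivalence, which this sketch ignores.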

BAG note Just as a note, I won't have time to check this for a couple of days/a week-ish so if some other BAG member wants to take a look, feel free. Headbomb {t · c · p · b} 22:04, 6 September 2020 (UTC)

Trial complete, through my alt account Usernamekiran (AWB), with this move. I checked all the diffs; no issues found at all. Sample diffs: sample diff 1, sample diff 2, and 3. —usernamekiran (talk) 18:58, 10 September 2020 (UTC)

So, um... why did this not get run with the bot?
Second question, which I only just realized now since I haven't really been paying much attention (since Headbomb's been the primary BAG) - is there going to be some sort of request process, or is this just so you don't have to hit "save" a bunch of times when you close an RM yourself? Primefac (talk) 22:02, 29 September 2020 (UTC)
Headbomb had approved the trial when the account didn't have a bot flag. I was not sure if I should have used the bot account, so I went with the non-bot account. I am also not sure whether the bot needs to be enabled somewhere (from your side) to be able to edit the mainspace.
I am willing to accept requests which I can handle without any problems. I can handle doubtful requests with the non-bot account (or the bot account in non-bot mode). If approved, I was thinking of posting a normal discussion thread, similar to special:permalink/978981536#bot for link fixing before/after page moves, on WT:RM and WT:Page mover, with a really long {{DNAU}}. —usernamekiran (talk) 03:17, 30 September 2020 (UTC)

{{BAGAssistanceNeeded}} Techie3 (talk) 06:48, 29 November 2020 (UTC)

  • Hmm. @Headbomb: thoughts on this? As far as I can see, there are a few arguments from editors for and against the consensus and technical viability of this task; some concerns seem more valid to me than others, though I'll admit I have exactly zero involvement in dab pages. I've advertised this to WT:DPL for comments as well. I've only checked a couple of diffs, but I have a few thoughts. First, the proposal says "replace [[Bell TV"; I presume you mean [[Bell TV| (so it won't match [[Bell TV 2, for example)? Second, edits like Special:Diff/977744374 or Special:Diff/977743962 are introducing the subject's first name, when it seems the article content is explicitly choosing to use last names only? ProcrastinatingReader (talk) 13:46, 29 November 2020 (UTC)

Symbol tick plus blue.svg Approved for extended trial (5 dab runs). Please provide a link to the relevant contributions and/or diffs when the trial is complete. @Usernamekiran: says they can restrict the bot to non-problematic cases, so let's see that in action. One thing though, concerning [[Jonathan David (soccer)|Jonathan David]] → [[Jonathan David]], I really don't believe those are needed, especially given Jonathan David might become ambiguous in the future. Same for [[Jonathan David (soccer)|David]] → [[Jonathan David]], which is straight up an error. Headbomb {t · c · p · b} 15:25, 29 November 2020 (UTC)

Just based on this note it's starting to sound more like a CONTEXT issue... let's see how well this trial goes. Primefac (talk) 15:53, 29 November 2020 (UTC)
@Headbomb: thank you. [[Jonathan David (soccer)|Jonathan David]] → [[Jonathan David]] was part of the list. At the time, I thought it would be better to use the actual target page instead of the redirect. Also, should I use the bot account for trial runs, or the normal account? @ProcrastinatingReader: Yes, I meant [[Bell TV|, sorry about that. Regarding the use of only the last name (or a partial name of the target page): such instances can be handled in later runs, where only the last name is required. As mentioned in the comments above, I will need to create different lists in AWB; one list can be created for partial names. @Primefac: I will use the same conventions already used in the articles; the only context issue here would be the one pointed out by Uanfala above, regarding targets. I will avoid such tasks. —usernamekiran (talk) 16:02, 29 November 2020 (UTC)
I've said this multiple times - it's a BRFA, it's a bot trial, so use your bot account. Primefac (talk) 16:03, 29 November 2020 (UTC)
Bot account, yes. Headbomb {t · c · p · b} 16:04, 29 November 2020 (UTC)

I'm a little concerned over the technical soundness here given the David error, though, and have a gut feeling the replacement may be overcomplicating things. Kiran, can you publish the module code here? ProcrastinatingReader (talk) 16:14, 30 November 2020 (UTC)

I am a little confused; I'm not sure how [[Jonathan David (soccer)|David]] → [[Jonathan David]] is an error. —usernamekiran (talk) 21:10, 30 November 2020 (UTC)
The link originally displayed as "David"; after the edit it displays as "Jonathan David". The display of "David" was intentional on those articles, because the scoreboards used players' last names only. As I understand it, this bot task should never change how a link displays; it should only change the target (removing the pipe when the display equals the target is merely a technical point). ProcrastinatingReader (talk) 21:13, 30 November 2020 (UTC)
Indeed. Whether to write According to Einstein, ... or According to Albert Einstein, ... is not something the bot can decide. Headbomb {t · c · p · b} 22:35, 30 November 2020 (UTC)
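The invariant Headbomb and ProcrastinatingReader describe — an edit may change a link's target but never its rendered text — can be checked mechanically before saving. A hypothetical pre-save guard (not part of either bot's published code) might look like:

```python
import re

# Matches [[target]] and [[target|label]] wikilinks.
WIKILINK = re.compile(r"\[\[([^|\]]+)(?:\|([^\]]*))?\]\]")

def visible_text(wikitext):
    """Collapse every wikilink to the text a reader actually sees."""
    return WIKILINK.sub(
        lambda m: m.group(2) if m.group(2) is not None else m.group(1),
        wikitext,
    )

def display_unchanged(before, after):
    """True if a proposed edit leaves the rendered prose untouched."""
    return visible_text(before) == visible_text(after)
```

Under this check, retargeting the pipe passes, while unpiping to the full name fails — exactly the Einstein/Albert Einstein distinction above.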

If you'd like something to practice on from the most recent update of Disambiguation Pages with Links, try Argos (3 links in, each needing a different fix, one of them a MOS:OVERLINK), DSS (3 links in, all the same, but you'll need to find which entry on the DAB page it is), Elizabeth Ferris (1 easy fix, 2 will need redlinking) and Saint Mary's College (1 easy fix, 2 might be soluble by googling). Narky Blert (talk) 15:17, 2 December 2020 (UTC)


Operator: BJackJS (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 18:21, Wednesday, October 21, 2020 (UTC)

Automatic, Supervised, or Manual: supervised

Programming language(s): Node.JS with MWN Library

Source code available: github

Function overview: Repair broken links that occur due to page moves. This would be done for broken links that occur in large volumes and cannot be fixed by individual editors efficiently and in a timely manner.

Links to relevant discussions (where appropriate): Wikipedia:Bot_requests#Bot_to_fix_broken_peer_review_links was the primary reason for this bot, but it could be extended.

Edit period(s): Likely weekly or when a need arises to repair a group of links.

Estimated number of pages affected: 1000+

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: Scans categories identified as containing broken links (such as Category:Pages using Template:Old peer review with broken archive link). Once broken links are found, it locates redirects and fixes each link using the most recent redirect/move. It rescans after all of the links have been fixed and re-processes any with a later redirect if needed.
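The core of the flow above is rebuilding the archive path from the page's post-move title. The actual bot is Node.js with MWN; the helper below is an illustrative Python sketch, and the "Wikipedia:Peer review/<title>/archiveN" path shape is the one used by {{Old peer review}} archive links:

```python
def fixed_archive_link(broken_link, current_title):
    """Rebuild a peer-review archive link after a page move, e.g.
    'Wikipedia:Peer review/Old name/archive1' with current_title
    'New name' -> 'Wikipedia:Peer review/New name/archive1'.
    Returns None if the link doesn't look like an archive path."""
    prefix = "Wikipedia:Peer review/"
    if not broken_link.startswith(prefix):
        return None
    rest = broken_link[len(prefix):]
    if "/" not in rest:
        return None
    archive_part = rest.rsplit("/", 1)[1]
    if not archive_part.startswith("archive"):
        return None
    return prefix + current_title + "/" + archive_part
```

In the bot's loop, `current_title` would come from following the redirect left behind by the move; pages whose rebuilt link still doesn't exist would be skipped (see the discussion of existence checks below the second trial).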


  • This is a very useful bot. Relevant details:--Tom (LT) (talk) 20:59, 22 October 2020 (UTC)
  • This is a tricky issue, so a trial run of around 10 articles is likely necessary to make sure any unanticipated issues are ironed out. Thanks again to BJackJS for proposing this bot.--Tom (LT) (talk) 20:59, 22 October 2020 (UTC)
  • Note: This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT 21:52, 23 October 2020 (UTC)
  • Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. @BJackJS:, for future reference, please do not edit with the bot account until a trial has been approved by a BAG member (limiting the edits to the number requested) and until it is fully approved. Not doing so is a violation of bot policy and can result in the account being blocked as an unauthorized bot. --TheSandDoctor Talk 18:40, 27 October 2020 (UTC)
    • Thanks. I apologize for the bot edits. I was testing the bot's ability to pick out the templates but forgot to remove the edit function. BJackJS talk 20:36, 27 October 2020 (UTC)
      • @TheSandDoctor: After issues with repeated args, loop problems, undefined being added as args, and many other errors, I have completed the 50 actual edits from the bot, as shown here with diff ids: 986969783, 986968303, 986969709, 986969783, 986967683. As my recent edits show, the bot does need some help occasionally, but it would cut down on the backlog very quickly, to the point that simple code changes and human work could repair the remaining links that are not fixable by a bot.
        • I agree, this has been much appreciated and I've checked around ten of the entries all of which seem to have been properly fixed. --Tom (LT) (talk) 00:49, 5 November 2020 (UTC)
  • Trial complete. [5] — Preceding unsigned comment added by BJackJS (talkcontribs) 17:48, 4 November 2020 (UTC)
Regarding Special:Diff/986964116: a few things could also be addressed as a side note, like ensuring the bot doesn't use the same parameter twice. It may also help users if the edit summary has (Bot) or similar appended to it. It looks like a great bot though! — Yours, Berrely • TalkContribs 19:42, 4 November 2020 (UTC)

Symbol tick plus blue.svg Approved for extended trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. The double-parameter issue is a concern that should be fixed. While I know it's a GIGO issue, it would be nice if changes like Special:Diff/987701203 were skipped, since neither the old nor the new link exists. Additionally, as mentioned by Berrely, please include a link to this BRFA in your edit summaries. Primefac (talk) 01:33, 16 November 2020 (UTC)

  • Ah, I think this may well be the first time the MWN library is being used by someone other than me. Please feel free to reach out for any issues you face. – SD0001 (talk) 04:38, 16 November 2020 (UTC)

Trial complete. [6] There was an error in which it put the argument in twice, which I didn't notice at first. When I saw it, I was worried about mass-reverting a good number of edits that did fix the problem. The code problem has been fixed, and I would do another trial if necessary. BJackJS talk 07:15, 18 November 2020 (UTC)

  • Hmm, based on analysis of around 5–10 edits from that last batch, the double-parameter problem is still quite prevalent. The two most recent changes don't have it, but it seems a high proportion of the other edits do. That said, we only need another 14 trials before the backlog is cleared ;)! --Tom (LT) (talk) 07:41, 18 November 2020 (UTC)
  • @BJackJS: Just to be clear, are you saying the double-parameter issue is now fixed after that trial and shouldn't happen in the next one? For future reference, you can break a trial up into multiple runs (e.g. do 5 or 10 at a time and evaluate those), so you're checking it as you go along. Also, what happened here: Special:Diff/989310175? And here: Special:Diff/989309968/Special:Diff/989309969? ProcrastinatingReader (talk) 07:57, 26 November 2020 (UTC)
    • Yes. Those broken edits were caused by strange arguments that had been added previously; because the bot doesn't actually replace the whole tag, that combined with the code error made those edits possible. BJackJS talk 18:12, 29 November 2020 (UTC)
      @BJackJS: that explains the first diff, but what about the second and third diffs? Those base pages themselves are valid redirects, and valid parameters, but there is no actual peer review page there? ProcrastinatingReader (talk) 06:41, 30 November 2020 (UTC)
      Those pages had broken links, which added them to the category. The bot handles all pages in the category the same way, by adding the most recent redirect. I can't think of any other way to make it do the task so that it completely repairs the link. BJackJS talk 00:16, 2 December 2020 (UTC)
      They seem to still be in the category; changing one broken link for another isn't ideal. How about making a request to the API to check if the generated page title is valid? ProcrastinatingReader (talk) 00:25, 2 December 2020 (UTC)
      The problem is that it adds a second request per edit, and I'm not sure I want to increase that. BJackJS talk 03:48, 2 December 2020 (UTC)
      I don't think it's much of a problem. Your only per-edit request currently is the edit itself, so adding a data fetch for each, whilst not ideal, is not a dealbreaker; it's certainly better than making bad edits. That being said, you can instead make a single request for all your pages: loop through your pages, fetch the redirect and generate the predicted archive page title as you go, add it to an array, then make a single mw:API:Query with all the page names stuffed in as "titles". You know a title is valid if the resulting response (when plucked for page titles) contains it; if it does, make the edit. Sample ProcrastinatingReader (talk) 13:17, 2 December 2020 (UTC)
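The batched existence check described above amounts to one action=query call carrying many titles, then filtering out pages flagged missing or invalid. A minimal Python sketch of the filtering step; the response shape is the standard MediaWiki formatversion=2 JSON, and the sample payload is illustrative:

```python
def existing_titles(query_json):
    """From the JSON of action=query&formatversion=2&titles=A|B|...,
    return the set of titles that actually exist on the wiki."""
    return {
        page["title"]
        for page in query_json.get("query", {}).get("pages", [])
        if not page.get("missing") and not page.get("invalid")
    }

# Illustrative response for two predicted archive titles, one missing:
sample = {
    "query": {
        "pages": [
            {"title": "Wikipedia:Peer review/Foo/archive1", "missing": True},
            {"title": "Wikipedia:Peer review/Bar/archive1", "pageid": 123},
        ]
    }
}
```

Two practical caveats: the API normalizes titles (first-letter capitalization, underscores to spaces), so predicted titles should be compared in normalized form, and non-bot accounts are limited to 50 titles per query, so long lists need chunking.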

Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.

Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required by the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.