
Wikipedia:Bots/Requests for approval


If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming, it may be a good idea to ask someone else to run a bot for you, rather than running your own.

 Instructions for bot operators


Current requests for approval


B-bot 3

Operator: B (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:39, Sunday, May 24, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): C#

Source code available: User:B-bot/source/OTRS Dated Filer

Function overview: 1. If the monthly OTRS pending and OTRS received categories, e.g. Category:Items pending OTRS confirmation of permission as of May 2015 and Category:Wikipedia files with unconfirmed permission received by OTRS as of May 2015, do not exist for either the current month or for next month, create them. 2. For any {{OTRS pending}} or {{OTRS received}} tag, if the tag is not dated, add the appropriate date.

Links to relevant discussions (where appropriate): Wikipedia:OTRS_noticeboard#Proposal_to_move_to_dated_pending_and_received_categories

Edit period(s): Daily

Estimated number of pages affected: Lots initially (there are 400 or so pages to initially be tagged) and thereafter probably around 10 per day on average.

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): Yes

Function details:

  1. Check to see if the current month's OTRS pending category exists and if not, create it using {{subst:OTRS pending subcat starter|date=2015-05-31}}. Do the same for next month.
  2. Check to see if the current month's OTRS received category exists and if not, create it using {{subst:OTRS received subcat starter|date=2015-05-31}}. Do the same for next month.
  3. Patrol the category Category:Items pending OTRS confirmation of permission as of unknown date, adding year|month|day based on the last edit date.
  4. Patrol the category Category:Wikipedia files with unconfirmed permission received by OTRS as of unknown date, adding year|month|day based on the last edit date.

This is a precursor to the next task, which will be automatically giving a final warning / request for follow-up if we have not received appropriate permission in more than {{OTRS backlog}} days.
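A minimal Python (pywikibot) sketch of step 1, the monthly category creation, is below. B-bot itself is written in C#, so the function name, structure and edit summary here are assumptions for illustration only, not the bot's actual code.

# Illustrative only: B-bot is written in C#; this pywikibot sketch mirrors step 1.
from datetime import date
import calendar
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def ensure_pending_category(year, month):
    # Create the month's OTRS pending category if it does not exist yet.
    title = ('Category:Items pending OTRS confirmation of permission as of %s %d'
             % (calendar.month_name[month], year))
    last_day = calendar.monthrange(year, month)[1]
    cat = pywikibot.Page(site, title)
    if not cat.exists():
        cat.text = '{{subst:OTRS pending subcat starter|date=%04d-%02d-%02d}}' % (
            year, month, last_day)
        cat.save(summary='Creating monthly OTRS pending category')  # assumed summary

today = date.today()
ensure_pending_category(today.year, today.month)
# Step 2 would repeat the same check with the "OTRS received" starter template.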

Discussion

This is running in test mode at User:B-bot/Test page. --B (talk) 23:56, 24 May 2015 (UTC)


Mdann52 bot 8

Operator: Mdann52 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 20:04, Wednesday, May 20, 2015 (UTC)

Automatic, Supervised, or Manual: Auto

Programming language(s): AWB

Source code available: standard AWB

Function overview: Updates links from http://patient.co.uk to http://patient.info/

Links to relevant discussions (where appropriate): NA

Edit period(s): One time run

Estimated number of pages affected: From link searches, around 500-1000

Exclusion compliant (Yes/No): No, no need

Already has a bot flag (Yes/No): Yes

Function details: Per an OTRS ticket, the website appears to be moving to a new domain, and the previous links are likely to go dead. As the URL scheme seems to follow the same format, this should be a simple run.
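The actual run uses AWB's find-and-replace rules; the Python regex below is only a sketch of the kind of substitution involved, assuming the path portion of each URL carries over unchanged.

import re

def update_patient_links(wikitext):
    # Rewrite the domain while keeping the rest of the URL intact (assumed unchanged).
    return re.sub(r'https?://(?:www\.)?patient\.co\.uk/', 'http://patient.info/', wikitext)

print(update_patient_links('http://patient.co.uk/health/example'))
# -> http://patient.info/health/example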

Discussion


BD2412bot

Operator: BD2412 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 18:15, Thursday, May 14, 2015 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): AutoWikiBrowser.

Source code available: AWB.

Function overview: I frequently clean up links left from disambiguation page moves. For example, the page Epping previously was an article on a town in England. This page was moved to Epping, Essex, and Epping became a disambiguation page with several hundred incoming links. As is commonly found in such cases, most of the links intended the town in England, and many were found in formulations like "[[Epping]], Essex", or "[[Epping]], [[Essex]]". A similar issue is the recurring creation of common patterns of disambiguation links to heavily linked articles; for example editors will often make edits creating disambiguation links like "[[heavy metal]] music" and "the [[French]] language", which can easily be resolved as "[[heavy metal music]]" and "the [[French language]]". Over time, large numbers of these links may build up. I would like permission to run AWB as a bot so that when page moves are made or common disambiguation targets become heavily linked, obvious formulations like these can be changed with less of a direct investment of my time.

Links to relevant discussions (where appropriate): Wikipedia:Disambiguation pages with links generally contains the protocol for repairing links to disambiguation pages.

Edit period(s): Intermittent; I intend to run this when a page move creates a large number of disambiguation links, for which obvious formulations for a large number of fixes can be seen.

Estimated number of pages affected: New disambiguation pages are created frequently. I would guess that between a few dozen pages and a few hundred pages might require this kind of attention on any given day, although there are likely to be days where no pages require such attention.

Exclusion compliant (Yes/No): Yes, as AWB does this automatically.

Already has a bot flag (Yes/No):

Function details: When large numbers of links to new disambiguation pages are created from existing pages having been moved to disambiguated titles, or from the buildup of common patterns of editing behavior over time, I will determine if there are obvious patterns of links to be fixed, for example changing instances of "[[Epping]], Essex" or "[[Epping]], [[Essex]]" to "[[Epping, Essex|Epping]], Essex", or "[[Epping, Essex|Epping]], [[Essex]]". I will then run AWB in bot mode to make these changes, and review the changes once made.
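Since the edits are made with AWB in bot mode, the find-and-replace rules are configured per page move; the Python regexes below only illustrate the Epping example above and are not the actual AWB settings.

import re

# More specific pattern first so "[[Epping]], [[Essex]]" is handled before "[[Epping]], Essex".
rules = [
    (re.compile(r'\[\[Epping\]\], \[\[Essex\]\]'), '[[Epping, Essex|Epping]], [[Essex]]'),
    (re.compile(r'\[\[Epping\]\], Essex'), '[[Epping, Essex|Epping]], Essex'),
]

def fix_links(wikitext):
    for pattern, replacement in rules:
        wikitext = pattern.sub(replacement, wikitext)
    return wikitext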

Discussion

BD2412 I like the idea of this bot but I think similar proposals have been rejected in the past as WP:CONTEXTBOT. Could you please raise a discussion at WP:VILLAGEPUMP so that we check whether there is consensus for these changes or not? There might be traps I can't think of right now. -- Magioladitis (talk) 12:57, 16 May 2015 (UTC)

Which Village Pump page would that go to? bd2412 T 15:12, 16 May 2015 (UTC)
BD2412 Let's start from Wikipedia:Village pump (miscellaneous). -- Magioladitis (talk) 21:50, 16 May 2015 (UTC)

Wikipedia:Village_pump_(miscellaneous)#Bot_request_for_disambiguation_link_fixing_issue. -- Magioladitis (talk) 11:11, 21 May 2015 (UTC)


MoohanBOT 8

Operator: Jamesmcmahon0 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 08:33, Sunday, May 10, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): AWB

Source code available: AWB

Function overview: Creating redirects from [[Foo]] to [[List of Foo]].

Links to relevant discussions (where appropriate):

Edit period(s): one-time run, then weekly/monthly depending on how many new lists are created without redirects

Estimated number of pages affected: Initially 12617

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: I have compiled a list of pages where there exists a [[List of Foo]] page but no [[Foo]] page, as a redirect or otherwise. My bot will create all of these pages as redirects to their lists, specifically with the content:

#REDIRECT [[List of Foo]]
{{R from list topic}} 
{{R with possibilities}}

[[Category:Bot created redirects]]

This is per Pigsonthewing's request at Wikipedia:Bot requests#Redirects to lists, from the things they are lists of.
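The run itself uses AWB; the pywikibot sketch below only illustrates the creation step, assuming the compiled list is a plain text file of "List of Foo" titles (the input file name is hypothetical).

import pywikibot

REDIRECT_BODY = (
    '#REDIRECT [[%s]]\n'
    '{{R from list topic}}\n'
    '{{R with possibilities}}\n\n'
    '[[Category:Bot created redirects]]'
)

site = pywikibot.Site('en', 'wikipedia')

with open('list_targets.txt') as f:  # hypothetical input: one "List of Foo" title per line
    for list_title in (line.strip() for line in f if line.strip()):
        topic = pywikibot.Page(site, list_title.replace('List of ', '', 1))
        if not topic.exists():  # only create genuinely missing pages
            topic.text = REDIRECT_BODY % list_title
            topic.save(summary='Creating redirect to [[%s]]' % list_title)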


Discussion

You say that you've made a list of all relevant pages; can we see it? עוד מישהו Od Mishehu 08:32, 12 May 2015 (UTC)
The list is here. Jamesmcmahon0 (talk) 12:31, 12 May 2015 (UTC)


ThePhantomBot

Operator: PhantomTech (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:11, Thursday, March 19, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: No, not now at least, though I'll share some of the regex if asked during the approval process

Function overview: Monitors recent changes for possible vandalism and edits from long term abuse users, logs findings and (sometimes) gives information to AN/I for review by users.

Links to relevant discussions (where appropriate): Not sure whether this would require consensus from AN/I (since the bot would be posting there) or not (since the posting task is simple and likely to be uncontroversial).

Edit period(s): daily (while I have a computer on) with plans to make it continuous

Estimated number of pages affected: 1 (AN/I) not counting pages in its own user space

Exclusion compliant (Yes/No): no

Already has a bot flag (Yes/No): no

Function details: This bot is meant to allow a decrease in the number of edit filters and to identify abuse that can't be reverted by bots like ClueBot due to lack of certainty. Every 60 seconds (that interval might be lowered to 20-40 seconds to spread load), a list of changes since the last check is compiled. On a separate thread, the bot goes through the list and decides whether each action matches a set filter; these filters are usually similar to the edit filters in what they check, but are not limited to the same constraints. If a filter is matched, the associated actions are taken, usually logging to the bot's user space and sometimes a noticeboard report. Essentially, this bot acts as a post-edit filter, currently targeting long-term abuse but technically able to act on any identifiable action. Since it runs after edits, as opposed to "during" them, it doesn't slow down editing for users, so problematic edits don't have to be frequent (as they do to be edit-filter worthy) for it to be worth having this bot check for them. In its current state I have two LTA matches set up, one stolen from a log-only edit filter and another stolen from edit filter requests, and a general abuse match, also stolen from edit filter requests. If the bot is accepted, I plan on going through all the active long term abuse cases and adding whichever ones I can, along with some edit filter requests that weren't accepted due to infrequency.
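A rough Python sketch of that polling loop is below, using the MediaWiki recent changes API. The real filters and report formats are not public, so the filter list, the choice to match on edit summaries, and the log/report step are all placeholders.

import re
import time
import requests

API = 'https://en.wikipedia.org/w/api.php'
FILTERS = [
    ('example-lta', re.compile(r'example pattern', re.I)),  # placeholder pattern only
]

def recent_changes(since):
    params = {'action': 'query', 'list': 'recentchanges', 'rcdir': 'newer',
              'rcstart': since, 'rcprop': 'title|user|comment|timestamp',
              'rclimit': 'max', 'format': 'json'}
    return requests.get(API, params=params).json()['query']['recentchanges']

last_check = '2015-05-24T00:00:00Z'
while True:
    changes = recent_changes(last_check)
    for change in changes:
        for name, pattern in FILTERS:
            # matching on the edit summary here for simplicity; real filters could
            # inspect the diff text, the user, the page title, and so on
            if pattern.search(change.get('comment', '')):
                print('filter %s matched %s by %s' % (name, change['title'], change['user']))
                # log to the bot's user space and/or report to a noticeboard here
    if changes:
        last_check = changes[-1]['timestamp']
    time.sleep(60)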

Discussion

Vandalism/abuse monitoring is a difficult area; I suggest that you write your bot and have it edit a page in its or your userspace (no approval necessary unless edit rates are high) as if it were ANI, and monitor what it reports. You can in turn pass the valid observations it makes along to ANI, and if the quality of the reporting is high enough you may find other people watching the page to see what it finds. I expect you'll get a high false-positive rate, which you'll need to analyse to improve the performance of your algorithms, and eventually you'll get to a point where regexes just don't cut it for detecting the long-term, low-frequency abuse you're targeting - and you'll have to look at more sophisticated processes. This is the technological evolution that ClueBot went through, but it catches more egregious and obvious vandalism.

Do you think operating in your or the bot's own userspace would be an acceptable stepping stone? Josh Parris 22:18, 20 March 2015 (UTC)

I realize that there is lots of long term abuse that can't be solved by regex alone; this bot will never be able to handle every LTA case, but I do plan on implementing more advanced checks in the future. I have no problem running my bot for a bit with it doing nothing but logging to User:ThePhantomBot/log. PhantomTech (talk) 22:36, 20 March 2015 (UTC)
I would want to see a community consensus that bot generated ANI reports are wanted, please discuss and link that discussion here. — xaosflux Talk 05:43, 26 March 2015 (UTC)
@Xaosflux: As I've been working on my bot I've been adding more functionality and thinking about the best ways to have the bot's reports dealt with. Here's my current plan for how it will report things:
  • Bad page recreation - Log to user space
  • High probability sockpuppets - Report to SPI
  • Lower probability sockpuppets - Log to user space
  • LTA detection - Report to AIV or report to AN/I where certainty is reasonably low (not too low, don't want to waste people's time)
  • Newly added LTA filters, including ones being tested - Log to user space
  • IPs using administrative templates - Report to AN/I
  • Sleeper account detection - Not implemented yet, so I don't know how often it will go off; if it's often, log to user space, otherwise report to AN/I
I assume you still want to see a discussion for the AN/I reports, but do you want to see any for the other places? I'm guessing you'll want SPI mentioned in the discussion too, since I don't think any bots currently report there. Also, do you have any suggestions on where to report these things or how to report them? Admittedly AN/I does feel like a weird place for bot reports, but the goal is to get the attention of editors who may not be aware of the bot's existence. PhantomTech (talk) 07:03, 26 March 2015 (UTC)
Start reading AIV archives such as Wikipedia_talk:Administrator_intervention_against_vandalism/Archive_3#Suggested_merge_of_Wikipedia:Administrator_intervention_against_vandalism.2FTB2 for some suggestions. WP:AIV/TB2 is probably the oldest 'bot reported' noticeboard right now. — xaosflux Talk 10:23, 26 March 2015 (UTC)
@Xaosflux: Are you suggesting that if my bot were to report to ANI it should do so via a transcluded page? I like that idea; using transclusion to put the bot's reports somewhere they'll be seen keeps the bot's updates off the watchlists of people who don't care. PhantomTech (talk) 15:32, 26 March 2015 (UTC)
I'm suggesting that prior community discussion on ANI bot reports came to that conclusion - and that after reading up on it you start new discussions to find out where people would make best use of your reports. For ANI it could be the existing TB2 subpage, but they might want it on its OWN subpage; for the other forums people might want subpages, might want main, or might not want bot reports at all. I am not trying to dictate the solution, just that whatever it is should enjoy community consensus before integrating to existing forums. — xaosflux Talk 16:59, 26 March 2015 (UTC)

I posted a discussion at Wikipedia:Village_pump_(idea_lab)#ThePhantomBot_reporting_to_noticeboards to help get an idea of what kind of reporting users would like. Depending on how that goes I'll post something to village pump proposals with notifications on the relevant noticeboard's talk pages. PhantomTech (talk) 05:57, 27 March 2015 (UTC)


EnzetBot

Operator: Enzet (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 00:18, Sunday, March 8, 2015 (UTC)

Automatic, Supervised, or Manual: supervised.

Programming language(s): Python.

Source code available: no source code available, since the bot is part of a major project.

Function overview: fixes inconsistencies and formatting in metro station articles (in station infoboxes and S-rail templates).

Links to relevant discussions (where appropriate):

Edit period(s): every time the bot finds an inconsistency in metro pages.

Estimated number of pages affected: about 1000 pages. There are about 10 K metro stations in the world, so no more than 10 K pages should be affected.

Exclusion compliant (Yes/No): yes.

Already has a bot flag (Yes/No): no.

Function details: I have The Metro Project for automated metro map drawing. It uses Wikipedia to check metro system graphs and sometimes encounters inconsistencies and bad formatting in Wikipedia articles. Currently I fix them manually (see my contributions) but want to entrust this to my bot.

Tasks for this request:

  • wrap dates in station infoboxes with the date template, e.g. 2000-03-30 to {{date|30-03-2000|mdy}} (see the sketch below);
  • add links to station structure types and platform types, e.g. Shallow single-vault to [[Single-vault station|Shallow single-vault]];
  • fix redirects in S-rail template.
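For the first task, a hedged sketch of the date wrapping is below; the bot's real rules live in the operator's closed-source project, so the regex and function name are assumptions.

import re

ISO_DATE = re.compile(r'\b(\d{4})-(\d{2})-(\d{2})\b')

def wrap_dates(infobox_wikitext):
    # 2000-03-30 -> {{date|30-03-2000|mdy}}, matching the example above
    return ISO_DATE.sub(lambda m: '{{date|%s-%s-%s|mdy}}' % (m.group(3), m.group(2), m.group(1)),
                        infobox_wikitext)

print(wrap_dates('| opened = 2000-03-30'))
# -> | opened = {{date|30-03-2000|mdy}}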

Discussion

I see the bot account has been editing articles. It is not yet approved for that.

I note you want to edit dates, but I see from your recent edit to Klovska (Kiev Metro) and your function details (above) you haven't realised the importance of ISO 8601 date formatting. I also note that you did not elect to use the style reportedly preferred by Ukrainians in your edit; is there a reason for this? Out of interest, why are these dates of interest to your bot?

The bot fixes inconsistencies between articles; how does it know which one is correct?

The links to station structure types and platform types you're proposing to link - are they the ones in infoboxes, or article text?

What major project is bot a part of, and why does that make the source code unavailable? Josh Parris 14:32, 9 March 2015 (UTC)

I'm sorry for editing without approval. It was a test to make sure the bot works. I'll never do it again.
Yeah, I see, date changes seem to be a bad idea. I think I should remove them from the task list. Should I undo my edits (there are only 5 of them)?
About inconsistencies: the bot doesn't know which one is correct, it can only detect wrong or possibly wrong things. For example, a wrong date format (the month number can't be greater than 12), a wrong terminus (a station cannot be the next or previous station for itself), if station A is next for station B then station B should be previous for station A, wrong S-rail values (if they conflict with the station lists on the metro or line page), and so on. That's why the bot is not automatic; I supervise every edit. I don't know how to formulate this as a task since there are so many types of inconsistencies. Maybe you can help me?
Yes, the bot will add links to the infobox only if there is no such link in the article text.
My major project is not open source for now. It generates very simple suggestions for the bot, as in the examples above—what to replace in which article. If the bot source code is important, I can push it to a public repository, but it is trivial since it uses pywikibot (no more than 100 LOC). Enzet (talk) 17:01, 9 March 2015 (UTC)
If you're supervising every edit, then this is a "manual bot" and can be run using your own account without approval. Would you like to do so? Josh Parris 11:30, 13 March 2015 (UTC)
OK, I understand all about inconsistencies. If I don't want to use the Enzet account for semi-automated editing, can I use the EnzetBot account (with the {{Bot}} template removed and without approval), or should I register a new account without the bot keyword? What is good practice for that? Also, are there criteria for semi-automated editing (no faster than 1 edit per 5 seconds, no more than 100 edits in a row, or something like that)? (Sorry if I missed this in the rules.)
Also, I realized that (1) wrapping station structure and platform types with links and (2) fixing S-rail redirects may be done without supervision, or with supervision that is really fast (checking is trivial). Can I get approval or disapproval for these tasks in this request, or should I create a new one? Enzet (talk) 09:27, 17 March 2015 (UTC)

Josh Parris any further comments? -- Magioladitis (talk) 18:44, 19 May 2015 (UTC)

Bots in a trial period


MusikBot

Operator: MusikAnimal (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 06:23, Wednesday, April 22, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Ruby

Source code available: GitHub

Function overview: Bot clerking at WP:PERM pages.

Links to relevant discussions (where appropriate): Special:PermaLink/655854110

Edit period(s): Continuous

Estimated number of pages affected: Up to six during one run (one for each PERM page, except Confirmed and AWB requests)

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): No

Function details: This bot works very much like Cyberbot I does at WP:RPP. It monitors all the Requests for Permissions pages for new requests, and checks whether there were previously declined requests for that user and permission. If matches are found, an automated comment is left linking to those declined requests. Eventually it may also ping the declining admin, but I've sidestepped that for now. There are two exceptions: the AWB checkpage, which does not have the same structure as the other requests for permissions pages (though I might implement special-case handling for this at some point), and requests for confirmed, where it's very unlikely we'll see multiple requests by the same user, so the bot clerking is not that helpful there. A few notes:

  • It works by using regex to parse out all the necessary info, and constructs the automated comment(s) to be saved. As long as Template:Request for permission generates a level 4 heading and Template:Rfplinks is used, it shouldn't flake out.
  • Thoroughly tested on test-wiki, see testwiki:Wikipedia:Requests for permissions/Rollback (and here).
  • Operates on wmflabs, with a crontab running the script every 10 minutes or so, or whatever we decide on.
  • The perm clerking task can be turned off by changing User:MusikBot/PermClerk/Run to anything other than true.
  • For all six permission pages, it should take less than a minute to complete, with a 2 second pause between processing each page, and it will edit no more than 6 times total. However, given the nature of the task, you will probably see only a few edits per day at most.
  • Checks for edit conflicts. If one is detected it will re-attempt to process that permission page up to a total of three times, waiting progressively longer each time (see the sketch after this list). So after attempt #1 it will wait 1 second before trying again, after attempt #2 two seconds, etc.
  • Caching is in place where appropriate, such as fetching the declined pages and any declined permalinks for a user.
  • There is verbose logging that I can make publicly accessible.
  • Full exception handling. If a critical error is encountered (e.g. more than 3 failed attempts to edit a page), the script will proceed to process the next permission page rather than abort the task altogether. Fatal errors, such as when the API is down, will result in a full abort of the task until it is run again by the cron job.
  • To be clear, the "cron" jobs are actually submitted to the grid, which helps allocate resources so the bot doesn't get in the way of other jobs on tool labs.
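MusikBot is written in Ruby; the Python sketch below mirrors only the edit-conflict retry timing described above, with hypothetical names.

import time

class EditConflict(Exception):
    pass

def save_with_retries(save_page, attempts=3):
    # Call save_page(), waiting 1s, then 2s, ... between edit-conflict retries.
    for attempt in range(1, attempts + 1):
        try:
            return save_page()
        except EditConflict:
            if attempt == attempts:
                raise              # give up after the final failed attempt
            time.sleep(attempt)    # progressively longer waits: 1s, 2s, ...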

Thank you! MusikAnimal talk 06:23, 22 April 2015 (UTC)

Discussion

{{BAG assistance needed}}

Approved for trial (50 edits). Looks sane; has support from the target audience; reasonable logic; trusted user. The thing I was actually going to ask about (i.e., pointless edits on already-handled entries) looks like it's already covered:
if section.match(/{{(?:template\:)?(done|not done|already done)}}/i)
--slakrtalk / 07:27, 29 April 2015 (UTC)
Thank you! It is now running, processing the PERM pages once every 10 minutes. 50 edits could take a while, but I'm in no hurry. In the meantime, allow me to note that I am implementing another clerking feature, where it will remove extraneous headers (e.g. see the bottom request at testwiki:Wikipedia:Requests for permissions/Rollback). This happens a fair amount with new users, who do not read the instructions stating not to put anything in the heading field. This development is happening completely in my local environment and will not interfere with the currently running bot, which is running off of code on tool labs. MusikAnimal talk 16:14, 29 April 2015 (UTC)
Just letting you know I've updated the bot to remove extraneous headers when present. This requires no additional edits should there also be previously declined requests for a user – the bot will simply make all changes to the page at once. Thanks MusikAnimal talk 15:35, 30 April 2015 (UTC)
@MusikAnimal: This message is however totally misplaced, see this edit. It's also incorrectly indented. Armbrust The Homunculus 05:34, 1 May 2015 (UTC)
@Armbrust: The bot acted exactly as programmed, only removing the level 2 header. The rest of the text was left as is. Here the user also ignored the 2= parameter of {{rfp}} and instead wrote the request body on the line below it. I am working on a more intelligent regex solution that can fix this common scenario in full. The incorrectly added level 2 heading is more common, however, so the bot is at least addressing that. Anyway, there's clearly discussion needed so I've disabled that feature for now. Let's talk more at WT:PERM#Bot clerking so others can chime in. MusikAnimal talk 06:03, 1 May 2015 (UTC)

Approved for extended trial (50 edits). With the updated regex please. Thanks, Magioladitis (talk) 11:42, 8 May 2015 (UTC)

@Magioladitis: Thank you for the endorsement. Just to be sure, has MusikBot been approved for a total of 100 edits? The new regex is now in place and working nicely. An important note: I will be on holiday starting this Friday through the rest of the month. I am programming the bot to automatically shut off when it reaches 50 edits, or 100, as advised. I will still be able to occasionally check its activity for accuracy and act accordingly. Thanks MusikAnimal talk 16:41, 11 May 2015 (UTC)
MusikAnimal please make 50 additional edits. -- Magioladitis (talk) 21:01, 13 May 2015 (UTC)
Thank you, will do MusikAnimal talk 21:03, 13 May 2015 (UTC)


HostBot 7

Operator: Jmorgan (WMF) (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 21:11, Thursday, March 5, 2015 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: repo on GitHub

Function overview: Matches users who have created a profile page at the Wikipedia Co-op with mentors who can teach them particular editing skills. When a match is made, the bot posts a message on the profile talkpage—creating that page as a Flow board (request amended; see discussion) if that talkpage does not already exist. The bot needs to be granted the flow-create-board user right to accomplish this.

This bot request is twofold:

  1. we're requesting the right to implement this matching functionality of the Co-op
  2. we're also requesting that HostBot be added to the flow-bot group so that it can deliver these matches on Flow-enabled talkpages, rather than standard talkpages.

Links to relevant discussions (where appropriate):

Edit period(s): as requested: the bot checks for new mentorship requests every ~5 minutes, and attempts to match each new request with a mentor immediately

Estimated number of pages affected: 10-20 per month

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details:

HostBot is hosted on Labs. Every five minutes, it will query the Wikipedia API for new pages in Co-op-specific categories. These pages will either be new Co-op member profiles (example: Wikipedia:Co-op/AMemberUserName) or existing member profiles where a user has recently added a new Co-op specific category (indicating a change in learning interests). It will also query the API for a list of Co-op mentors with profile pages (Wikipedia:Co-op/AMentorUserName) in the corresponding mentor category who have not opted out of receiving new matches. In both cases, Hostbot/Co-op checks that the category members are in fact subpages of Wikipedia:Co-op.

For each newly-categorized Co-op member, the bot chooses a random mentor from the list of corresponding mentors. If none are available for the given interest, it chooses a random mentor from the fallback category "General editing skills". Once the match is found, HostBot leaves a post on the talk page corresponding to the Co-op member's profile page (example: Wikipedia_talk:Co-op/AMemberUserName). If this page does not already exist, the bot uses the Flow API's new-topic submodule to post its welcome message as a new topic, thus creating the talk page as a Flow board; otherwise, the bot edits the talk page normally. The message mentions the mentor and posts on the Flow-enabled talk page of a page that the member created, and so generates an Echo notification for both of them. The member and the mentor are then free to follow up with each other, and the bot's involvement is finished.
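A minimal pywikibot sketch of the matching step is below; the category names, section text and summaries are assumptions, the real code is in the GitHub repo linked above, and the Flow new-topic call is left as a placeholder.

import random
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def pick_mentor(skill_category, fallback='General editing skills'):
    # Choose a random mentor profile from the skill category, else from the fallback.
    for cat_name in (skill_category, fallback):
        cat = pywikibot.Category(site, 'Category:Co-op mentors - ' + cat_name)  # hypothetical names
        mentors = [p for p in cat.members() if p.title().startswith('Wikipedia:Co-op/')]
        if mentors:
            return random.choice(mentors)
    return None

def match(member_profile_title, skill_category):
    mentor = pick_mentor(skill_category)
    if mentor is None:
        return
    talk = pywikibot.Page(site, 'Wikipedia talk:' + member_profile_title.split(':', 1)[1])
    if talk.exists():
        talk.text += '\n\n== Your Co-op mentor ==\nYou have been matched with %s.' % mentor.title()
        talk.save(summary='Co-op mentorship match')  # assumed summary
    else:
        pass  # would use the Flow API's new-topic submodule here to create the board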

Constraints.

  1. The bot's posting activity is limited to subpages of Wikipedia_talk:Co-op. If a user adds a relevant category to their user page instead of their Co-op profile page, HostBot will ignore it and that user will not receive a match.
  2. HostBot will not—and cannot—convert existing talk pages to Flow boards. If a Co-op profile talk page exists at the time of matching, HostBot will simply edit that page and post its welcome message in a new section.
  3. HostBot can only create a new Flow board if it has the flow-create-board right. A bureaucrat will need to add HostBot to the flow-bot group which has this right. More details here: https://phabricator.wikimedia.org/T76785

Demonstration. This workflow is running on test.wikipedia.org already, by MatchBot. Here's an example learner profile page and talk page. You can sample it yourself if you want: click the "find a mentor" button on the co-op testwiki portal to create a sample profile and wait a few minutes to receive your matches. Note that approval for the FormWizard gadget used to create profiles on testwiki is not part of this request; matching works just as well with manually-created profile pages.

Discussion

  • The Wikimedia Foundation's Collaboration team has been working closely with the Co-op project to use Flow as the discussion system for the new program. We're excited to be a part of this new mentoring project, and we'll continue to provide technical support for the Co-op to deal with any issues that might arise from using this bot. DannyH (WMF) (talk) 22:23, 5 March 2015 (UTC)
  • Comment. I am in charge of managing the development of The Co-op, and wanted to briefly discuss the reasoning behind these proposed bot tasks. The rationale of the matching component is to drastically reduce the amount of effort required on the part of mentors and editors seeking mentors to find each other, and generally, prevent a lot of unnecessary waiting. The rationale behind the use of Flow is to facilitate communication during mentorship. As Flow is designed to address perceived issues with conventional talk pages, particularly with regard to newer editors, and because we are specifically inviting newer editors to use the space, our team is interested in testing it out. That said, using Flow for mentorship-related communication is not required; mentors and editors can use conventional talk pages or whatever communication system they find most convenient. Our pilot for the space is running for about one month, and so a 30-day trial would be appreciated. Thanks for your consideration. I, JethroBT drop me a line 22:40, 5 March 2015 (UTC)
  • Has this functionality been demonstrated on testwiki or anywhere else yet? Currently no one on enwiki has flow-create-board access. What is the rollback plan for actions made in error? — xaosflux Talk 05:33, 8 March 2015 (UTC)
  • Question on operator account: I see this operator listing is under your official WMF staff account - is this purposeful? Will these bot edits represent official WMF actions? — xaosflux Talk 19:13, 8 March 2015 (UTC)
Doing a bit of research; please provide updated data if this is not still blocked by T90077? — xaosflux Talk 01:08, 9 March 2015 (UTC)
Hi Xaosflux. Excellent questions! The functionality is in evidence on testwiki (details under the heading Demonstration above). Give it a try and let me know if there's anything else I can clear up. The actions of HostBot won't be official WMF actions--though I'm not sure what would constitute an official WMF action in this scenario. Do you mean Office Actions? In any case, no, HostBot's actions would be held to the same standards as any other bot, and I to the same standards as any bot-wrangler.
As to why I'm writing under my staff account: the Co-op is a grant funded experimental project, and part of this grant proposal involved a trial of the system. I work for the team at WMF (formerly known as Grantmaking, now Community Resources) that disburses and oversees grants, and in this case one of the resources they were able to offer the grantee team was some of my time to help with the trial. So right now I'm participating in the Co-op project as a staff member. I have no authority, just the responsibility to make sure the bot does what it's supposed to do, to respond to community input, and to fix anything that breaks. If this HostBot task is approved on an ongoing basis after the trial is concluded, I will continue to manage the task, but in a volunteer capacity. That's what I did with the Teahouse, which also started out as a Foundation-funded venture, but has been completely community-run for more than two years.
Regarding status of blocking bugs: I'll let I_JethroBT handle that one. He's the PM. Regards, Jmorgan (WMF) (talk) 23:43, 9 March 2015 (UTC)
OK, operator question is not an issue; I'm running through the demo on testwiki now. — xaosflux Talk 00:12, 10 March 2015 (UTC)
@Xaosflux: These blocking tasks are due to be addressed in a sprint from the Flow dev team on 11 March, although T90970 and T90969 are being prioritized to allow for deletion of Flowboards that is consistent with how deletion is normally handled, and should be easy to resolve. I expect these tasks to be resolved in about 2-3 weeks time, according to estimates from DannyH (WMF). T90973 is more complicated, and is unlikely to be addressed until much later. We have also de-prioritized T90972 for the purposes of the pilot as it is unlikely for Flow-enabled pages that are deleted to be restored. I, JethroBT drop me a line 02:34, 10 March 2015 (UTC)
  • Couple of issues, please address:
  1. Why can users direct the bot to create arbitrary page names? (By allowing free-text co-op "nicknames" instead of their usernames)? (example: testwiki:Wikipedia talk:Co-op/OffensiveTitleNameHere) — xaosflux Talk 00:39, 10 March 2015 (UTC)
    Jmorgan (WMF) and I discussed the prospect of creating profile pages with the editor's username pre-filled to avoid this sort of thing, but there was not enough development time to implement it for the purposes of the pilot. If we were going to expand this space, I think adding this functionality would be great to avoid typos and more malicious activity like your example. It is important to keep in mind we are only allowing a maximum of 50 learners to create profiles for this pilot because there are just under 25 mentors in the space; it's also unlikely for newer editors to discover the Co-op outside of these invitations. I, JethroBT drop me a line 03:51, 10 March 2015 (UTC)
  2. Lack of "Special:Unflowify" or other means of undoing this bot's flow-create-board action (please correct me if this functionality is present).
    @DannyH (WMF): is in a better position to discuss this question in detail. As far as I am aware, the deletion tasks that are being worked on (T90970, T90969) would create the page as a standard talk page. I, JethroBT drop me a line 03:51, 10 March 2015 (UTC)
Thank you, will need to be seen - I tried on test: without luck - may be waiting for code changes; able to delete the Flow conversation, but still not the topics--but afterwards the page does not revert to standard talk: style. — xaosflux Talk 04:14, 10 March 2015 (UTC)
Thank you! — xaosflux Talk 00:39, 10 March 2015 (UTC)


The overall matching mechanism seems pretty straightforward and uncontroversial; a non-Flow-enabled trial should be easy to move forward (see page naming issue first). — xaosflux Talk 00:39, 10 March 2015 (UTC)
Would you rather try to resolve these flow-related questions first, or trial with Wikipedia_talk: pages for now? — xaosflux Talk 00:55, 10 March 2015 (UTC)
@Xaosflux: Given the timeline the Co-op grant is operating on, for which our report is due at the end of April, and the time it would require to resolve these blocks, I would prefer to trial the matching component on standard Wikipedia talk pages for now while these matters with Flow are resolved. Thanks, I, JethroBT drop me a line 03:51, 10 March 2015 (UTC)
  • Approved for trial (250 edits or 30 days). (Restricted to Wikipedia: and Wikipedia_talk: space only). NOT approved for Flowbots group addition under this trial. — xaosflux Talk 04:10, 10 March 2015 (UTC)
Thanks Xaosflux: one quick clarification. Is the bot only approved for trial if users are not allowed to specify the title of their profile page? Jmorgan (WMF) (talk) 15:33, 10 March 2015 (UTC)
No, you can trial as-is before rolling to full production; we can revisit. — xaosflux Talk 17:00, 10 March 2015 (UTC)
Post-trial follow up

Xaosflux, I_JethroBT Alright, trial period is over. I've turned off the matching script for now. What's next? Jmorgan (WMF) (talk) 18:38, 14 April 2015 (UTC)

Do you want to hold THIS request open until all the software changes are made, or change it to be for the non-Flow-related items and request those at a later time? — xaosflux Talk 01:07, 15 April 2015 (UTC)
@Xaosflux and Jmorgan (WMF): I think it will be best to close this request out for now while the other components (i.e. Flow) are being worked on. We can reference this discussion in a new request in regards to renewing the matching functions of the bot. Thanks, I, JethroBT drop me a line 19:22, 16 April 2015 (UTC)
Actually, Xaosflux I'd like to get this request resolved independently of Flow. I don't yet have confirmation from the Flow team on when they will have addressed the issues you raised, and in the meantime, I'd like the Co-op to continue running (it's basically shut down now, since we can't match mentors with learners. No invites have gone out for over a week.). So my request is: can HostBot be approved to match learners with mentors by posting on their non-Flow-enabled talkpage?
I probably should have separated the matching script request from the flow-bot right request at the beginning--I apologize. I will submit a new request as soon as I've had confirmation from my WMF colleagues that Flow has been patched to address the issues you raised. Best, Jmorgan (WMF) (talk) 17:27, 23 April 2015 (UTC)
@Xaosflux: I've struck out my request above; I'd also prefer that we have the ability to match and notify editors available to us. I, JethroBT drop me a line 19:21, 23 April 2015 (UTC)
@Xaosflux: Just a nudge here. Can we get approval to continue match editors again on non-flow enabled pages? I'm hoping we can get the Co-op back up and running again, hopefully by tomorrow. Thanks, I, JethroBT drop me a line 20:09, 6 May 2015 (UTC)

A user has requested the attention of a member of the Bot Approvals Group. Once assistance has been rendered, please deactivate this tag. Per Xaosflux's instructions I'm asking another member of the BAG to make the decision about final approval of this request. I've updated the description to reflect the more limited scope: we're just asking permission to implement the matching functionality as it was tested during the trial period; creating Flow-enabled talkpages is not part of this request. Thanks for your help! Cheers, Jmorgan (WMF) (talk) 21:32, 15 May 2015 (UTC)

Xaosflux Please check this. -- Magioladitis (talk) 13:01, 16 May 2015 (UTC)


Commons fair use upload bot 3

Operator: Fæ (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:49, Wednesday January 7, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available:

I have this working passively locally, but have yet to test it out on Labs with sample images. When I have significant updates to the code, I will consider uploading a new version of the source under https://github.com/faebug, as the wikigit repository is unlikely to be maintained. A migrated version of the code is on the github link above; test version only.

Function overview:

This is a cross-wiki bot to copy files at risk of deletion on Wikimedia Commons to local wikis where they can be retained, either under fair use or because the image is public domain in the source country but may be problematic under Commons interpretations (such as the URAA).

Links to relevant discussions (where appropriate):

Edit period(s):

  • Previously running hourly, without any issues, so I'm planning on doing the same.

Estimated number of pages affected:

  • The bot was successfully running prior to the toolserver shutdown; coincidentally, the last transferred files were some of my own. See ListFiles.

Exclusion compliant (Yes/No): Yes


Already has a bot flag (Yes/No): No. Effectively this was reset by the usurp process.

A trial may not be needed considering the track record; however, if there is one, I would prefer it to be a month or longer, as my availability may be patchy.

Function details: This bot re-uploads files that are deleted on Commons to projects where they are in use, if those projects accept non-free files. It goes over the files in Category:Pending fair use deletes, uploads them to local wikis, then marks them for speedy deletion on Commons when it's done. Any article using the images receives a notice that the file has been re-uploaded as a fair use candidate. Local wikis are responsible for determining if the re-uploaded image is eligible for their non-free content policy, and deleting it in a timely manner if it is not. If for some reason it's not able to upload the image, it will leave an error message on the file page and not mark it for deletion.
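A rough Python/pywikibot outline of that workflow is below; the bot's own code (linked above) differs in detail, and the upload, notification and tagging steps are shown only as placeholder comments.

import pywikibot

commons = pywikibot.Site('commons', 'commons')
pending = pywikibot.Category(commons, 'Category:Pending fair use deletes')

for member in pending.members():
    filepage = pywikibot.FilePage(commons, member.title())
    # amended local name to avoid clashing with the original title (assumes an extension)
    base, ext = filepage.title(with_ns=False).rsplit('.', 1)
    new_name = 'File:%s (from Commons).%s' % (base, ext)
    # 1. download the original and re-upload it locally under new_name
    # 2. create a local redirect from the old file name to new_name
    # 3. notify each article using the file that it is now a fair-use candidate
    # 4. tag the Commons copy for speedy deletion, or leave an error message on the
    #    file page if any upload failed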

Discussion

Arbcom exemption/requirements
  • The Arbitration Committee has passed the following motion which relates to this request for approval:
Despite the restrictions on his editing images related to sexuality, Fæ may operate the Commons fair use upload bot if the Bot Approvals Group approves it.

The bot may upload sexuality images that would, if Fæ himself had uploaded them to the English Wikipedia, breach Fæ's restriction, only if the upload is requested by a third party.

The bot shall maintain a log of: the images it uploads; the names of the articles on the English Wikipedia where the images appear at the time of upload; and the username of the Commons editor requesting the transfer to the English Wikipedia.

For the Arbitration Committee, Callanecc (talkcontribslogs) 01:24, 15 January 2015 (UTC)

Bot discussion
  • Can you please indicate on the local userpage who owns the account? Courcelles 22:26, 7 January 2015 (UTC)
    • Good point. Done, rather than relying on visiting the Commons page. -- (talk) 22:35, 7 January 2015 (UTC)
  • After playing around with the bot locally and having it fall over a few times, I am planning to rewrite it to rely on pywikibot rather than mwclient as its interface to the API. This will probably work far more reliably on WMFlabs and be much easier to maintain in future years. Though the code is not all that long, with other commitments and the increased testing needed, this will take weeks rather than a few days. -- (talk) 08:49, 12 January 2015 (UTC)
    Note, some 'real' images are ready for the bot to localize, see Commons:Deletion requests/Files uploaded by SPVII DrFresh26. I'm advising that the bot should be operational within a week or two.
    The account commonsfairuseupload has been set up on labs. I have a test version running under Pywikibot core on WMFlabs, however there is a fair amount of rewriting to be done before running it live and it makes sense to put a first snapshot up on github. -- (talk) 23:25, 13 January 2015 (UTC)
    Early snapshot now on github as above. -- (talk) 13:35, 14 January 2015 (UTC)
    A separate bot flag restoration request has been raised on Commons, c:Commons:Bots/Requests/Commons fair use upload bot. -- (talk) 12:52, 15 January 2015 (UTC)
  • This bot would just perform the same task as User:Dcoetzee's bot, right? How will the bot handle Bugzilla:61656? Dcoetzee's bot handled this by reverting CommonsDelinker, see e.g. Special:Diff/615048249. Ideally, this should be fixed in CommonsDelinker instead of the fair use upload bot, but nothing seems to have happened in CommonsDelinker since the bug was reported in 2010. --Stefan2 (talk) 14:53, 20 January 2015 (UTC)
    To be honest, I am not 100% sure I understand the issue, not having looked into the functionality of the delinker (note the bug was reported in 2010, but Dcoetzee's code was successfully running from 2012 to 2014 the way it was). However, the way the CFUUB behaves at the moment is that it locally uploads the file under an amended file name and inserts a redirect as the old local image page text. This should leave the old name untouched to avoid permission problems on local wikis. My understanding is that this precautionary step also avoids possible conflict with the delinker when the original is speedy deleted from Commons. If folks want this to work differently, then this might be something to amend in the delinker's behaviour, rather than building in odd intelligent reverts into CFUUB to undo the work of the delinker.
I have yet to convert this bit of code to pywikibot, but if you look in the current test status source code linked above for the two places that site.upload(open('/tmp/downloadedfile'), newfilename, newdesc, ignore=True) occurs, these are relevant.
As I am in regular dialogue with @Steinsplitter:, I would defer to his judgement, as he has recently been active in updating the delinker, and would welcome his advice during testing. Perhaps he could take ownership of this bug request too? I could do with some test images, so maybe we can agree on a few and demonstrate the system in the trial period. -- (talk) 16:28, 20 January 2015 (UTC)
When Dcoetzee's bot uploaded files, it worked like this:
  1. Someone on Commons requested a local upload
  2. Dcoetzee's bot uploaded the file under a slightly different name by inserting "from Commons" in the file name to avoid permission problems
  3. The bot created a redirect from the old name to the uploaded file
  4. The file was deleted on Commons by a Commons admin
  5. CommonsDelinker failed to notice that a redirect existed locally and therefore incorrectly removed the file from English Wikipedia
  6. Dcoetzee's bot reverted CommonsDelinker's incorrect removal
I think that step 5 should be fixed by correcting the bug in CommonsDelinker, but Dcoetzee decided to fix it by introducing step 6 because the CommonsDelinker programmers didn't fix the bug for several years. There is some discussion in Wikipedia:Bots/Requests for approval/Commons fair use upload bot 2, for example in the "Function details" section. If Steinsplitter can fix CommonsDelinker, then that would be much better. --Stefan2 (talk) 16:49, 20 January 2015 (UTC)
Agreed. I'll pay attention to testing this out based on the fact that Steinsplitter believes this bug has been addressed in Magnus' new version of the delinker (Phabricator:T63656). -- (talk) 18:16, 20 January 2015 (UTC)
See my comment here --Steinsplitter (talk) 18:17, 20 January 2015 (UTC)

As best I can tell, there's no reason to delay a trial. Is that the case? Josh Parris 06:35, 20 February 2015 (UTC)

I'm considering putting some time aside to trial the code in about a week. -- (talk) 09:44, 20 February 2015 (UTC)
My understanding is that you intend to monitor closely, but this is a rewritten bot. I'm also under the impression that there won't be a huge number of edits. As such, Approved for trial (30 edits or 30 days)., commencing sometime in the next couple of weeks. Josh Parris 19:24, 23 February 2015 (UTC)

Any news on that? Trial period has expired. -- Magioladitis (talk) 22:08, 22 March 2015 (UTC)

I have pushed the code forward a little bit this weekend. A family issue has taken priority. I am writing it with a reasonable test mode, which I think will help when adding more Wikipedias downstream. 30 days was obviously a bit of an aggressive target for my availability. I would expect to be able to run this live in April. -- (talk) 23:04, 22 March 2015 (UTC)

Hi again! Any news on that? -- Magioladitis (talk) 22:17, 3 May 2015 (UTC)

Hi, I'm sorting out clearing a house and the admin burden of much delayed probate, which is eating up my free time. Sorry about these delays. A couple more weeks before I can focus again? -- (talk) 22:35, 3 May 2015 (UTC)

? -- Magioladitis (talk) 11:13, 21 May 2015 (UTC)


JhealdBot

Operator: Jheald (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:36, Monday December 8, 2014 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): Perl

Source code available: Still under development.

Function overview: Maintenance of subpages of Wikipedia:GLAM/Your_paintings, in particular the subpages listed at Wikipedia:GLAM/Your_paintings#Artists_by_birth_period. There is currently a drive to identify Wikidata entries for the entries on this list not yet matched. I seek approval to keep these corresponding pages on Wikipedia up to date.

Initially I would just use the bot as an uploader, to transfer wikipages edited off-line into these pages (including fixing some anomalies in the present pages -- which I would probably do sequentially, through more than one stage, reviewing each fix stage before moving on to the next).

Once the off-line code is proven, I would then propose to move to a semi-automated mode, automatically updating the pages to reflect new instances of items with d:Property:P1367 and/or corresponding Wikipedia and Commons pages.

Links to relevant discussions (where appropriate):

Edit period(s): Occasional (perhaps once a fortnight), once the initial updating has been completed. And on request.

Estimated number of pages affected: 17

Exclusion compliant (Yes/No): No. These are purely project tracking pages. No reason to expect a {{bots}} template. If anyone has any issues with what the bot does, they should talk to me directly and I'll either change it or stop running it.


Already has a bot flag (Yes/No): No. I have one on Commons, but not yet here.

Function details:

  • Initially: simple multiple uploader bot -- take updated versions of the 17 pages prepared and reviewed offline, and upload them here.
  • Subsequently: obtain a list of all Wikidata items with property P1367. Use the list to regenerate the "Wikidata" column of the tables, plus corresponding sitelinked Wikipedia and Commons pages.
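The operator's scripts are in Perl and (per the discussion below) use WDQ; the Python sketch below only illustrates fetching the P1367 items via the Wikidata Query Service, as one possible way to drive the regeneration step.

import requests

SPARQL = 'https://query.wikidata.org/sparql'
QUERY = '''
SELECT ?item ?ypid WHERE {
  ?item wdt:P1367 ?ypid .   # P1367 = the "Your Paintings" artist identifier
}
'''

def items_with_p1367():
    r = requests.get(SPARQL, params={'query': QUERY, 'format': 'json'})
    for row in r.json()['results']['bindings']:
        qid = row['item']['value'].rsplit('/', 1)[-1]   # e.g. Q123456
        yield qid, row['ypid']['value']

# Each (item, identifier) pair would then feed the regeneration of the "Wikidata"
# column in the tracking tables.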

Discussion

Regarding uploading offline edits: Are these being made by anyone besides the operator? What license are they being made under? — xaosflux Talk 23:44, 18 December 2014 (UTC)
@Xaosflux: The pages have been being prepared by me using perl scripts, drawing from Wikidata.
I've slowly been making the scripts more sophisticated -- so I've recently added columns for VIAF and RKDartists links, both taken from Wikidata, defaulting to searches if there's no link, or no Wikidata item yet identified. Content not drawn from Wikidata (typically legacy entries from the pages as I first found them) I have prefixed with a question mark in the pages, meaning to be confirmed. For the most part these are blue links, which may go to completely the wrong people.
So at the moment I'm running a WDQ search to pull out all Wikidata entries with one (or more) values for the P1367 "BBC Your Paintings identifier" property, along with the properties for Commons category name (P373), VIAF (P214) and RKDartists (P650). I'm also running an Autolist search to get en-wiki article names for all Wikidata items with a P1367. Plus I have run a look-up to get Wikidata item numbers for all other en-wiki bluelinks on the page (this gives the Q-numbers marked with question marks). But the latter was quite slow, so I have only run it the once. At the moment I'm still launching these searches by hand, and making sure they've come back properly, before updating & re-uploading the pages.
As to the licensing -- Wikidata is licensed CC0. My uploads here are licensed CCSA like any other upload to the site (though in reality there is very little originality, creativity or expression, apart from the choice of design of the page overall, so probably (under U.S. law at least) there quite possibly is no new copyrightable content in the diffs). Various people of course are updating Wikidata -- I've been slowly working down this list (well, so far only to the middle of the 1600s page), though unfortunately not all of the Wikidata updates seem to be being picked up by WDQ at the moment; the Your Paintings list is also on Magnus's Mix-and-Match tool; and various others are working at the moment, particularly to add RKD entries to painters with works in the Rijksmuseum in Amsterdam. But Wikidata is all CC0, so that all ought to be fine.
What would help though, would be having the permission for a (limited) multiple uploader, so I could then upload the updates to all 17 pages just by launching a script, rather than laboriously having to upload all 17 by hand each time I want to refresh them, or slightly improve the treatment of one of the columns.
I'm not sure if that entirely answers your question, but I hope does make clearer what I've been doing. All best, Jheald (talk) 00:45, 19 December 2014 (UTC)
Approved for trial (25 edits or 10 days). Please post your results here after the trial. — xaosflux Talk 01:48, 19 December 2014 (UTC)
@Xaosflux: First run of 16 edits made successfully -- see contribs for 19 December, from 15:59 to 16:55.
(Links to RKD streamlined + data updated; one page unaffected).
All the Captchas were a bit of a pain to have to deal with; but they will go away. Otherwise, all fine. Jheald (talk) 17:31, 19 December 2014 (UTC)
Sorry about that, I added confirmed flag to avoid this for now. — xaosflux Talk 17:34, 19 December 2014 (UTC)
New trial run carried out smoothly (see this related changes page).
Update still prepared by executing several scripts manually, before a final uploader script; but I should have these all rolled together into a single process for the next test. Jheald (talk) 09:11, 11 January 2015 (UTC)
Run again on January 21st, adding a column with the total number of paintings in the PCF for each artist. Jheald (talk) 17:13, 24 January 2015 (UTC)

Have you completed the trial? Josh Parris 10:20, 4 March 2015 (UTC)

I was going to go on running it once a month or so, the next one probably in a day or two, until anyone progressed this any further, possibly making tweaks to my offline processing scripts as I went along. Obviously I'm open to suggestions as to anything I can improve or do better; though the actual unsupervised bit itself is just an upload script, refreshing a dozen or so pages, so nothing very complicated. (The off-line preprocessing is a bit more involved, but still pretty trivial). Jheald (talk) 00:33, 5 March 2015 (UTC)
I note that further edits have been made. Out of interest, why do http://viaf.org IDs change? The painter's been dead for centuries. Are they merges of duplicates? Also, is the trial finished now? Josh Parris 14:54, 9 March 2015 (UTC)
@Josh Parris: Clearly there has been a significant update of VIAF ids on Wikidata in the last three weeks, with a lot of new VIAF ids added -- I think by one of Magnus Manske's bots. This is why there are significant reductions in length for a lot of pages, with VIAF searches being replaced by explicit VIAF links.
I imagine that this may be catch-up resynchronisation for several months of updates at VIAF; but it may also be that now VIAF is explicitly targeting Wikidata items rather than just en-wiki articles, and is actively doing matching at the VIAF end, that may be why there now seems to be a sudden rush of new VIAF <--> Wikidata matches.
You're right that there are a few VIAF matches that have changed. I haven't looked in to any in detail, but two strong possibilities would be either erroneous matches that have been corrected (ie we used to point to the VIAF for somebody quite different); or alternatively that a group of duplicate entries on VIAF may have been merged -- eg if there had been a VIAF for the Library of Congress id, and another for the Getty ULAN id, and the two had not previously been connected.
As to where we're at, matching of the Your Paintings painter identifiers continues to move forwards using mix-n-match. About 80% of the YP identifiers have now been triaged into has / doesn't have / shouldn't have a Wikidata item, with progress ongoing; plus I've now got as far as painters born before 1825, using mix-n-match search to match to RKDartists and other databases. Then there will also be a stage where new Wikidata items are created for YP ids that currently don't have them but should; and these new ids in turn will also have RKD artists (etc) that they match. So there's still a lot to do going forward, and the tracking pages will continue to need updates if they are to reflect that.
At the moment it's still done using about four scripts that I run by hand sequentially on an occasional basis. The one I'd have to write a bit more code to integrate is the one that merges in the article names on en-wiki for the Wikidata items, because these are currently obtained using an Autolist query which is then saved manually. I'd need to look into how to replace that batch look-up with an API call if I was to make the whole thing more integrated and run on a regular basis (weekly?). I'm happy to do that work if anybody wants it, but for the time being it's also as easy just to go on doing what I've been doing, generating the updates in a partially manual way. So I'm happy to be open to views, if anybody has got any strong preferences either way. Jheald (talk) 23:27, 4 May 2015 (UTC)

Bots that have completed the trial period

Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.


Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information required of the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.