Wikipedia:Bots/Requests for approval

If you want to run a bot on the English Wikipedia, you must first get it approved. To do so, follow the instructions below to add a request. If you are not familiar with programming, it may be a good idea to ask someone else to run a bot for you rather than running your own.

 Instructions for bot operators


Current requests for approval

WP:BRFA/Yobot_24

Yobot 24

Operator: Magioladitis (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 21:43, Monday, June 1, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): AWB

Source code available: open source

Function overview: Remove {{Persondata}}

Links to relevant discussions (where appropriate): Wikipedia:Village_pump_(proposals)/Archive_122#RfC:_Should_Persondata_template_be_deprecated_and_methodically_removed_from_articles.3F concluded that "Consensus is to deprecate and remove."

Edit period(s): One time run

Estimated number of pages affected: 1.5 million pages

Exclusion compliant (Yes/No):

Already has a bot flag (Yes/No): Yes

Function details: Straightforward

Discussion

@Magioladitis: Two questions:

  1. Should this bot wait until AWB has been changed to stop adding/updating Persondata?
  2. Since Persondata is not visible in the article, does WP:COSMETICBOT apply? Would it be better to include Persondata removal in AWB general fixes, for other bots & users to remove as they make substantial changes?

Thanks! GoingBatty (talk) 23:30, 1 June 2015 (UTC)

GoingBatty I'll be doing general fixes at the same time. I applied for this since I have control of AWB's code. The bot won't start until we are 100% sure that mass removal is a good thing to do. Before starting, I'll modify AWB's code not to add Persondata, and we'll probably do a new release so that no other editors will add it. -- Magioladitis (talk) 05:41, 2 June 2015 (UTC)
We already have consensus for the removal of Persondata. If the addition of Persondata by automated tools hasn't been a breach of COSMETICBOT, then neither should its removal be. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 07:40, 2 June 2015 (UTC)
@Pigsonthewing: Hi Andy! You make a good point. Are there any bots that have been adding Persondata as their primary approved task? GoingBatty (talk) 12:27, 2 June 2015 (UTC)
Yes, RjwilmsiBot used to add it, but not anymore. I have already contacted Rjwilmsi about the RfC. -- Magioladitis (talk) 12:57, 2 June 2015 (UTC)
Found the approval at Wikipedia:Bots/Requests for approval/RjwilmsiBot 4. Thanks! GoingBatty (talk) 13:33, 2 June 2015 (UTC)

The RFC mentioned above has a section (not an actual wiki markup heading) "Rough plan" which says in part

1. Transfer |SHORT DESCRIPTION= across to Wikidata. Done

...

4. Transfer any new data to Wikidata, then remove methodically.

I don't see any agreement to modify the rough plan, so I suppose that is the plan. Will this bot transfer new data to Wikidata, or just "remove methodically"? If this bot just removes, how will the part about transferring new data be done? Also, does #1 mean that if any new data is found, only the SHORT DESCRIPTION will be transferred, and other, more suspect, data such as birth and death dates will not be transferred? Jc3s5h (talk) 16:03, 2 June 2015 (UTC)

Whoah! I second that concern. The five-point plan presented at the RfC was expressly conditioned on transferring any new data to Wikidata before the systematic removal is implemented. This immediate removal without transfer of newly input persondata to Wikidata violates the conditions upon which the RfC was approved. Please adhere to the RfC "rough plan" as presented. Dirtlawyer1 (talk) 18:23, 2 June 2015 (UTC)
I have already suggested that you read the lengthy and detailed discussion of data import under the RfC, and on the pages linked from there on Wikidata. The RfC was concluded as "deprecate and remove", with no conditions attached, in the light of that discussion. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:55, 2 June 2015 (UTC)
To add, it became apparent during the course of the RfC that no more data would be transferred to Wikidata, all other PD fields having been deemed unreliable. I can't imagine what "remove methodically" might entail. Alakzi (talk) 20:38, 2 June 2015 (UTC) Oh, I see what you mean now, Dirtlawyer1. I agree that the (or a) bot should migrate any descriptions added after PLbot's last run; that would be eminently sensible. Alakzi (talk) 20:56, 2 June 2015 (UTC)
@Alakzi: Indeed. I didn't just fall off the wiki-turnip truck yesterday. In addition to the recently added persondata descriptions, I have also raised a concern about the married name variations of female bio subjects listed under alternative names. Dirtlawyer1 (talk) 21:13, 2 June 2015 (UTC)
The outcome of the RfC is "deprecate and remove", not "deprecate and remove with caveats". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:55, 2 June 2015 (UTC)

WP:BRFA/Humbot

Humbot

Operator: Rhumidian (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 09:25, Monday, June 1, 2015 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): AWB

Source code available: NO

Function overview: Used to make edits on behalf of Rhumidian. The bot will be used to transfer information from the libraries of the British Geological Society onto Wikipedia, such as detailed information on various strata in the British Isles, as well as other information contained within said libraries.

Links to relevant discussions (where appropriate):

Edit period(s): continuous

Estimated number of pages affected: ?

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details:

Discussion

  • Note: This request specifies the bot account as the operator. A bot may not operate itself; please update the "Operator" field to indicate the account of the human running this bot. AnomieBOT 09:36, 1 June 2015 (UTC)
  • @Rhumidian: Could you please fill in the Function details section above to state the changes this bot would make? Thanks! GoingBatty (talk) 16:46, 1 June 2015 (UTC)
  • Comment: This user has 1,173 edits, of which 675 are in article space. Reading through the requester's talk page, it appears that additional WP editing experience is needed before this editor is qualified to operate a bot. – Jonesey95 (talk) 03:56, 3 June 2015 (UTC)
  • Suggestion: The requester should read, or re-read, the first paragraph of Wikipedia:Bot policy#Requests for approval and follow the instructions therein. Perhaps the editor should read "considerations before creating a bot" as well, which will probably guide this request toward the Bot Requests page. – Jonesey95 (talk) 03:56, 3 June 2015 (UTC)

WP:BRFA/B-bot_4

B-bot 4

Operator: B (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 11:11, Friday, May 29, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): C#

Source code available: User:B-bot/source/Expired OTRS pending tagger

Function overview: If an image has been tagged with {{OTRS pending}} longer than {{OTRS backlog}} days, tag the image with {{subst:npd}} and warn the uploader with {{subst:di-no permission-notice-final}}.

Links to relevant discussions (where appropriate): Wikipedia:OTRS_noticeboard#Proposal_to_move_to_dated_pending_and_received_categories

Edit period(s): Daily

Estimated number of pages affected: Capped at 10 files per day (for now) so as not to overwhelm the daily deletion queues.

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: This task patrols Category:Items pending OTRS confirmation of permission by date for images that have been tagged with {{OTRS pending}} longer than {{OTRS backlog}}.

If the {{OTRS pending}} tag contains either the date parameter or the month|day|year parameters, then we will use that as the tagging date. (Once Wikipedia:Bots/Requests for approval/B-bot 3 is approved, all images this job ever processes will have month|day|year tags in place.) If neither parameter is in place, then we will treat the last revision date as though it were the date that {{OTRS pending}} was added.

In computing the expiration time, we will always wait at least 30 days, even if the backlog is under 30 days. (WP:CSD#F11 only permits deletion of images with {{OTRS pending}} that have been tagged more than 30 days.)

We will also include a grace period of 7 days beyond the backlog time (under the theory that you might add {{OTRS pending}} before actually forwarding the messages to the permissions account) and will add to that number the number of days since the {{OTRS backlog}} template was updated.

In other words, suppose that the template says there is a backlog of 50 days, and the template was updated 2 days ago. We will only tag images with {{subst:npd}} if they have been tagged with {{OTRS pending}} for 50 + 2 + 7 = 59 days.
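A minimal Python sketch of this arithmetic (the bot itself is written in C#; the function and variable names here are illustrative, not the bot's actual code):

from datetime import date, timedelta

MIN_WAIT_DAYS = 30  # WP:CSD#F11 only permits deletion 30+ days after tagging
GRACE_DAYS = 7      # allowance for tagging before the permission email is sent

def npd_due_date(tag_date, backlog_days, backlog_updated, today):
    """Earliest date a file tagged {{OTRS pending}} may be tagged {{npd}}."""
    staleness = (today - backlog_updated).days  # days since {{OTRS backlog}} was updated
    wait_days = max(backlog_days + staleness + GRACE_DAYS, MIN_WAIT_DAYS)
    return tag_date + timedelta(days=wait_days)

# Worked example from above: a 50-day backlog updated 2 days ago gives
# 50 + 2 + 7 = 59 days after the {{OTRS pending}} tag was added.
today = date(2015, 5, 29)
print(npd_due_date(date(2015, 4, 1), 50, today - timedelta(days=2), today))  # 2015-05-30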

When an image is tagged, we notify the uploader with a new template, {{subst:di-no permission-notice-final}}. The purpose of this template is to NOT inundate the user with scary copyright symbols or 1000 policies to read. It delivers a very simple message that looks like a human actually wrote it: please forward the email and let us know you have done so. (The hope is that users whose eyes glaze over at the copyright templates will be able to understand this straightforward message.)

Discussion

A test is underway at User:B-bot/Test page. --B (talk) 11:11, 29 May 2015 (UTC)

1. When reading the date parameter, you use dtmDate = DateTime.Parse(arrNameValue[1]). Which date formats are allowed here? For example, {{OTRS pending}} allows the date to be entered as ~~~~~, which is a somewhat unusual date format but easy for Wikipedians to use.
2. {{subst:npd}} has a source parameter, and "OTRS pending" files have sometimes been tagged with {{subst:npd|source={{NoOTRS60}}}}. Would it be a good idea to put something in the source parameter, for example 'Tagged with {{OTRS pending}} for 123 days'?
3. Can you provide a log of all tagged files somewhere? It could be useful to check after one week has passed to see if there are some files which were not deleted but which should have been deleted, for example because the uploader removed some templates.
4. This is only about {{OTRS pending}} but not about {{OTRS received}}, right? --Stefan2 (talk) 21:02, 29 May 2015 (UTC)
Yes. At OTRSN, we haven't really decided on what to do for OTRS received. --B (talk) 01:38, 30 May 2015 (UTC)
  • @Stefan2: (1) I have added the categories you mentioned. (2) I have changed my npd tag to {{subst:npd|source={{NoOTRS60|days={{subst:OTRS backlog}}}}}}. (3) DateTime.Parse() will accept any of the standard formats shown at [1] as en-US formats. All of these work correctly: "1/20/1983", "1983-10-20", "December 3, 2010", "15 JUN 2009". It will not handle Euro dates (today is 29/05/2015 in Europe) because, obviously, those are indistinguishable from US ones (05/01/2015 = May 1 in the US and January 5 in Europe). But I imagine that the Wikipedia software does the same thing. {{#time:Y-m-d|5/1/1983}} (1983-05-01) shows up as May 1, 1983 for me, and hopefully for everyone, or else we have some real problems with the software. (4) The log is on my to-do list. It's independent of this process because we still want to log even those images where someone manually puts in the date themselves. --B (talk) 02:08, 30 May 2015 (UTC)
There is no 'European' date format as each European country has its own date format. I'm mainly concerned about {{#time:~~~~~}} which is accepted as 2015-05-30. This date format is not listed on that Microsoft page. --Stefan2 (talk) 06:28, 30 May 2015 (UTC)
I have added support for that string. --B (talk) 12:20, 30 May 2015 (UTC)
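For illustration, here is a Python sketch of tolerant date parsing covering the formats discussed above, including the rendered five-tilde signature timestamp (the bot's actual implementation is C#'s DateTime.Parse plus a special case; this equivalent is an assumption, not its real code):

from datetime import datetime

# Formats tried in order; the last one matches a ~~~~~ signature timestamp,
# e.g. "06:28, 30 May 2015 (UTC)".
FORMATS = [
    "%m/%d/%Y",               # 1/20/1983 (US order, as with en-US DateTime.Parse)
    "%Y-%m-%d",               # 1983-10-20
    "%B %d, %Y",              # December 3, 2010
    "%d %b %Y",               # 15 JUN 2009
    "%H:%M, %d %B %Y (UTC)",  # signature timestamp
]

def parse_tag_date(raw):
    """Return the tagging date, or None to fall back to the last-revision date."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt)
        except ValueError:
            continue
    return None

print(parse_tag_date("06:28, 30 May 2015 (UTC)"))  # 2015-05-30 06:28:00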

WP:BRFA/BD2412bot

BD2412bot

Operator: BD2412 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 18:15, Thursday, May 14, 2015 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): AutoWikiBrowser.

Source code available: AWB.

Function overview: I frequently clean up links left from disambiguation page moves. For example, the page Epping previously was an article on a town in England. This page was moved to Epping, Essex, and Epping became a disambiguation page with several hundred incoming links. As is commonly found in such cases, most of the links intended the town in England, and many were found in formulations like "[[Epping]], Essex", or "[[Epping]], [[Essex]]". A similar issue is the recurring creation of common patterns of disambiguation links to heavily linked articles; for example editors will often make edits creating disambiguation links like "[[heavy metal]] music" and "the [[French]] language", which can easily be resolved as "[[heavy metal music]]" and "the [[French language]]". Over time, large numbers of these links may build up. I would like permission to run AWB as a bot so that when page moves are made or common disambiguation targets become heavily linked, obvious formulations like these can be changed with less of a direct investment of my time.

Links to relevant discussions (where appropriate): Wikipedia:Disambiguation pages with links generally contains the protocol for repairing links to disambiguation pages.

Edit period(s): Intermittent; I intend to run this when a page move creates a large number of disambiguation links, for which obvious formulations for a large number of fixes can be seen.

Estimated number of pages affected: New disambiguation pages are created frequently. I would guess that between a few dozen pages and a few hundred pages might require this kind of attention on any given day, although there are likely to be days where no pages require such attention.

Exclusion compliant (Yes/No): Yes, as AWB does this automatically.

Already has a bot flag (Yes/No):

Function details: When large numbers of links to new disambiguation pages are created from existing pages having been moved to disambiguated titles, or from the buildup of common patterns of editing behavior over time, I will determine if there are obvious patterns of links to be fixed, for example changing instances of "[[Epping]], Essex" or "[[Epping]], [[Essex]]" to "[[Epping, Essex|Epping]], Essex", or "[[Epping, Essex|Epping]], [[Essex]]". I will then run AWB in bot mode to make these changes, and review the changes once made.
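As an illustration only (the actual edits are made through AWB find-and-replace rules, not this code), a Python regex sketch of the substitution pattern described above, assuming the target title has the form "<dab>, <qualifier>":

import re

def fix_comma_links(text, dab, target):
    """Re-point '[[Epping]], Essex'-style links after a move to 'Epping, Essex'."""
    qualifier = target.split(", ", 1)[1]  # e.g. "Essex"
    pattern = re.compile(
        r"\[\[%s\]\], (\[\[)?%s(\]\])?" % (re.escape(dab), re.escape(qualifier))
    )
    # Keep the optional brackets around the qualifier exactly as they were found.
    return pattern.sub(
        lambda m: "[[%s|%s]], %s%s%s"
        % (target, dab, m.group(1) or "", qualifier, m.group(2) or ""),
        text,
    )

print(fix_comma_links("born in [[Epping]], [[Essex]]", "Epping", "Epping, Essex"))
# -> born in [[Epping, Essex|Epping]], [[Essex]]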

Discussion

BD2412 I like the idea of this bot but I think similar proposals have been rejected in the past as WP:CONTEXTBOT. Could you please raise a discussion at WP:VILLAGEPUMP so that we check whether there is consensus for these changes or not? There might be traps I can't think of right now. -- Magioladitis (talk) 12:57, 16 May 2015 (UTC)

Which Village Pump page would that go to? bd2412 T 15:12, 16 May 2015 (UTC)
BD2412 Let's start from Wikipedia:Village pump (miscellaneous). -- Magioladitis (talk) 21:50, 16 May 2015 (UTC)

Wikipedia:Village_pump_(miscellaneous)#Bot_request_for_disambiguation_link_fixing_issue. -- Magioladitis (talk) 11:11, 21 May 2015 (UTC)

As I was afraid... Wikipedia:Village_pump_(miscellaneous)/Archive_49#Bot_request_for_disambiguation_link_fixing_issue. I see no actual consensus there. -- Magioladitis (talk) 23:19, 27 May 2015 (UTC)

BD2412 can you provide me a list of 50 manual edits doing this task? I would like to judge reactions. I do not guarantee approval. In fact, while I like this task a lot, I think it will get a lot of reactions. Still, I think you can try to make 50 edits so we can really see the reactions. Take it as an unofficial bot trial. -- Magioladitis (talk) 23:22, 27 May 2015 (UTC)

I recently did a run of about 10,000 fixes to links to Striker (which is soon to be turned into a disambiguation page). Not all of these fall into the pattern that I have discussed here, but those that changed [[Midfielder]]/[[Striker]] to [[Midfielder]]/[[Striker (association football)|Striker]] would. There were probably a few hundred of those in the mix. This span of my contributions was in the thick of that run. bd2412 T 23:40, 27 May 2015 (UTC)

BD2412 My experience shows that there will be a lot of reaction. I'll reject the bot request, and I encourage you to keep doing these kinds of changes supervised from your normal account using AWB. Unless, of course, at some point there is clear consensus for this kind of thing. Some editors in the past have even complained about orphaning a link before an XfD closes. Just a general remark for other editors who may be reading this: BRFA is not the place to gain consensus but a place to make requests based on consensus. -- Magioladitis (talk) 23:14, 30 May 2015 (UTC)

I am not proposing to orphan links prior to an XfD closing - I generally don't, in fact. Striker was an exceptional case based on the volume of links, and the fact that the RM had run its time with multiple votes of support and no objections. My proposal is directed solely to link fixes needing to be made after a consensus-based page move has been carried out. I have had very few reactions to runs of thousands of fixes made using AWB, and I have never had a reaction when making obvious fixes of the type I propose. I would be glad to keep doing it this way, but I have actually physically burned out computer mice and had wrist aches that lasted for days! bd2412 T 00:37, 31 May 2015 (UTC)

BD2412 Any ideas of how we can ensure there is consensus for this task? I hope you understand my position. -- Magioladitis (talk) 18:51, 31 May 2015 (UTC)

There is a longstanding consensus for fixing disambiguation links, which is the foundation of Wikipedia:WikiProject Disambiguation. bd2412 T 19:02, 31 May 2015 (UTC)

WP:BRFA/MoohanBOT_8

MoohanBOT 8

Operator: Jamesmcmahon0 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 08:33, Sunday, May 10, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): AWB

Source code available: AWB

Function overview: Creating redirects from [[Foo]] to [[List of Foo]].

Links to relevant discussions (where appropriate):

Edit period(s): one-time run, then weekly/monthly depending on how many new lists are created without redirects

Estimated number of pages affected: Initially 12617

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: I have compiled a list of pages where there exists a [[List of Foo]] page but no [[Foo]] page, as a redirect or otherwise. My bot will create all of these pages as redirects to their lists, specifically with the content:

#REDIRECT [[List of Foo]]
{{R from list topic}} 
{{R with possibilities}}

[[Category:Bot created redirects]]

This is per Pigsonthewing's request at Wikipedia:Bot requests#Redirects to lists, from the things they are lists of.
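For illustration, a minimal pywikibot sketch of the run (the input file name is hypothetical, and the bot actually runs under AWB; the redirect text matches the block above):

import pywikibot

REDIRECT_TEXT = """#REDIRECT [[List of %s]]
{{R from list topic}}
{{R with possibilities}}

[[Category:Bot created redirects]]"""

site = pywikibot.Site("en", "wikipedia")

# One "Foo" per line, compiled offline: "List of Foo" exists but "Foo" does not.
with open("missing_list_topics.txt") as f:
    topics = [line.strip() for line in f if line.strip()]

for topic in topics:
    page = pywikibot.Page(site, topic)
    if page.exists():  # re-check, since the offline list may be stale
        continue
    page.text = REDIRECT_TEXT % topic
    page.save(summary="Creating redirect to [[List of %s]]" % topic)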


Discussion

You say that you've made a list of all relevant pages; can we see it? עוד מישהו Od Mishehu 08:32, 12 May 2015 (UTC)
The list is here. Jamesmcmahon0 (talk) 12:31, 12 May 2015 (UTC)

Anomie how do we proceed here? -- Magioladitis (talk) 23:09, 30 May 2015 (UTC)

The "needs advertisement" seems to have been satisfied, it got some support, and it got some good suggestions that reduced the list dramatically. Proceed as you see fit. Anomie 01:19, 2 June 2015 (UTC)

WP:BRFA/ThePhantomBot

ThePhantomBot

Operator: PhantomTech (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:11, Thursday, March 19, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available: No, not now at least, though I'll share some of the regex if asked during the approval process

Function overview: Monitors recent changes for possible vandalism and edits from long term abuse users, logs findings and (sometimes) gives information to AN/I for review by users.

Links to relevant discussions (where appropriate): Not sure whether this would require consensus from AN/I (since it would be posting there) or not (since the posting task is simple and likely to be uncontroversial).

Edit period(s): daily (while I have a computer on) with plans to make it continuous

Estimated number of pages affected: 1 (AN/I) not counting pages in its own user space

Exclusion compliant (Yes/No): no

Already has a bot flag (Yes/No): no

Function details: This bot is meant to allow a decrease in the number of edit filters and to identify abuse that can't be reverted by bots like ClueBot due to lack of certainty. Every 60 seconds (that time might be lowered to 20-40 seconds to spread load), a list of changes since the time of the last check is compiled. On a separate thread, the bot goes through the list and decides whether the actions match a set filter; these filters are usually similar in what they check to the edit filters, but are not limited to the same constraints. If a filter is matched, the associated actions are taken, usually logging to the user space and sometimes a noticeboard report. Essentially, this bot acts as a post-edit filter, currently targeting long-term abuse but technically able to act on any identifiable action. Since it runs after edits, as opposed to "during" edits, it doesn't slow down editing for users, so problematic edits don't have to be frequent (as they do to be edit-filter-worthy) for it to be worth having this bot check for them. In its current state I have two LTA matches set up, one stolen from a log-only edit filter and another stolen from edit filter requests, and a general abuse match, also stolen from edit filter requests. If the bot is accepted, I plan on going through all the active long-term abuse cases and adding whichever ones I can, along with some edit filter requests that weren't accepted due to infrequency. A sketch of the polling loop follows.
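For illustration, a minimal Python sketch of such a post-edit filter loop, polling the recent-changes API and testing each change against regex filters (the operator's real filters and actions are not public; the filter pattern here is a placeholder):

import re
import time
import requests

API = "https://en.wikipedia.org/w/api.php"

# Hypothetical filters: a regex tested against the edit summary, and an action.
FILTERS = [
    (re.compile(r"example-lta-signature", re.I), "log"),
]

last_seen = None
while True:
    params = {
        "action": "query", "list": "recentchanges", "format": "json",
        "rcprop": "title|user|comment|timestamp", "rclimit": 500,
    }
    if last_seen:
        params["rcend"] = last_seen  # stop at the last change already processed
    changes = requests.get(API, params=params).json()["query"]["recentchanges"]
    for rc in changes:
        for pattern, action in FILTERS:
            if pattern.search(rc.get("comment", "")):
                print(action, rc["title"], rc["user"])  # stand-in for log/report
    if changes:
        last_seen = changes[0]["timestamp"]  # newest change in this batch
    time.sleep(60)  # the 60-second cycle described above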

Discussion

Vandalism/abuse monitoring is a difficult area; I suggest that you write your bot and have it edit a page in its or your userspace (no approval necessary unless edit rates are high) as if it were ANI, and monitor what it reports. You can in turn pass the valid observations it makes along to ANI, and if the quality of the reporting is high enough you may find other people watching the page to see what it finds. I expect you'll get a high false-positive rate, which you'll need to analyse to improve the performance of your algorithms, and eventually you'll get to a point where regexes just don't cut it for detecting the long-term, low-frequency abuse you're targeting - and you'll have to look at more sophisticated processes. This is the technological evolution that ClueBot went through, though it catches more egregious and obvious vandalism.

Do you think operating in your or the bot's own userspace would be an acceptable stepping stone? Josh Parris 22:18, 20 March 2015 (UTC)

I realize that there is lots of long-term abuse that can't be solved by regex alone; this bot will never be able to handle every LTA case, but I do plan on implementing more advanced checks in the future. I have no problem running my bot for a bit with it doing nothing but logging to User:ThePhantomBot/log. PhantomTech (talk) 22:36, 20 March 2015 (UTC)
I would want to see a community consensus that bot generated ANI reports are wanted, please discuss and link that discussion here. — xaosflux Talk 05:43, 26 March 2015 (UTC)
@Xaosflux: As I've been working on my bot I've been adding more functionality and thinking about the best ways to have the bot's reports dealt with. Here's my current plan for how it will report things:
  • Bad page recreation - Log to user space
  • High probability sockpuppets - Report to SPI
  • Lower probability sockpuppets - Log to user space
  • LTA detection - Report to AIV or report to AN/I where certainty is reasonably low (not too low, don't want to waste people's time)
  • Newly added LTA filters, including ones being tested - Log to user space
  • IPs using administrative templates - Report to AN/I
  • Sleeper account detection - Not implemented yet, so I don't know how often it will trigger; if it's often, log to user space, otherwise report to AN/I
I assume you still want to see a discussion for the AN/I reports, but do you want to see any for the other places? I'm guessing you'll want SPI mentioned in the discussion too, since I don't think any bots currently report there. Also, do you have any suggestions on where to report these things or how to report them? Admittedly, AN/I does feel like a weird place for bot reports, but the goal is to get the attention of editors who may not be aware of the bot's existence. PhantomTech (talk) 07:03, 26 March 2015 (UTC)
Start reading AIV archives such as Wikipedia_talk:Administrator_intervention_against_vandalism/Archive_3#Suggested_merge_of_Wikipedia:Administrator_intervention_against_vandalism.2FTB2 for some suggestions. WP:AIV/TB2 is probably the oldest 'bot reported' noticeboard right now. — xaosflux Talk 10:23, 26 March 2015 (UTC)
@Xaosflux: Are you suggesting that if my bot were to report to ANI it should do so via a transcluded page? I like that idea; using transclusion to put the bot's reports somewhere they'll be seen keeps the bot's updates off the watchlists of people who don't care. PhantomTech (talk) 15:32, 26 March 2015 (UTC)
I'm suggesting that prior community discussion on ANI bot reports came to that conclusion - and that after reading up on it you start new discussions to find out where people would make best use of your reports. For ANI it could be the existing TB2 subpage, but they might want it on its OWN subpage; for the other forums, people might want subpages, might want the main page, or might not want bot reports at all. I am not trying to dictate the solution, just that whatever it is should enjoy community consensus before integrating with existing forums. — xaosflux Talk 16:59, 26 March 2015 (UTC)

I posted a discussion at Wikipedia:Village_pump_(idea_lab)#ThePhantomBot_reporting_to_noticeboards to help get an idea of what kind of reporting users would like. Depending on how that goes I'll post something to village pump proposals with notifications on the relevant noticeboard's talk pages. PhantomTech (talk) 05:57, 27 March 2015 (UTC)

WP:BRFA/EnzetBot

EnzetBot

Operator: Enzet (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 00:18, Sunday, March 8, 2015 (UTC)

Automatic, Supervised, or Manual: supervised.

Programming language(s): Python.

Source code available: no source code available, since the bot is part of a larger project.

Function overview: fixes inconsistencies and formatting in metro station articles (in station infoboxes and S-rail templates).

Links to relevant discussions (where appropriate):

Edit period(s): whenever the bot finds an inconsistency in metro pages.

Estimated number of pages affected: about 1000 pages. There are about 10 K metro stations in the world, so no more than 10 K pages should be affected.

Exclusion compliant (Yes/No): yes.

Already has a bot flag (Yes/No): no.

Function details: I have The Metro Project for automated metro map drawing. It uses Wikipedia to check metro system graphs, and sometimes encounters inconsistencies and bad formatting in Wikipedia articles. Now I fix them manually (see my contributions), but I want to entrust this to my bot.

Tasks for this request:

  • wrap dates in station infobox with date template, e.g. 2000-03-30 to {{date|30-03-2000|mdy}};
  • add links to station structure types and platform types, e.g. Shallow single-vault to [[Single-vault station|Shallow single-vault]] (see the sketch after this list);
  • fix redirects in S-rail template.
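As referenced above, a small Python sketch of the structure/platform-type linking task (the mapping of descriptions to target articles is an assumed illustrative subset; the real bot derives its suggestions from The Metro Project rather than this code):

STRUCTURE_LINKS = {
    # Illustrative subset: infobox description -> target article
    "Shallow single-vault": "Single-vault station",
}

def link_structure(infobox_value, article_text):
    """Wrap a bare infobox structure/platform value in a wikilink, but only
    if the article body does not already link the target."""
    target = STRUCTURE_LINKS.get(infobox_value.strip())
    if target is None or ("[[%s" % target) in article_text:
        return infobox_value
    return "[[%s|%s]]" % (target, infobox_value.strip())

print(link_structure("Shallow single-vault", "...no existing link..."))
# -> [[Single-vault station|Shallow single-vault]]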

Discussion

I see the bot account has been editing articles. It is not yet approved for that.

I note you want to edit dates, but I see from your recent edit to Klovska (Kiev Metro) and your function details (above) you haven't realised the importance of ISO 8601 date formatting. I also note that you did not elect to use the style reportedly preferred by Ukrainians in your edit; is there a reason for this? Out of interest, why are these dates of interest to your bot?

The bot fixes inconsistencies between articles; how does it know which one is correct?

The links to station structure types and platform types you're proposing to link - are they the ones in infoboxes, or article text?

What major project is the bot a part of, and why does that make the source code unavailable? Josh Parris 14:32, 9 March 2015 (UTC)

I'm sorry for editing without approval. It was a test to make sure bot works. I'll never do it again.
Yeah, I see; date changes seem to be a bad idea. I think I should remove them from the task list. Should I undo my edits (there are only 5 of them)?
About inconsistencies: the bot doesn't know which one is correct; it can only detect wrong things or possibly wrong things. For example, a wrong date format (the month number can't be greater than 12), a wrong terminus (a station cannot be the next or previous station for itself), a missing mirror link (if station A is next for station B, station B should be previous for station A; a sketch of this check follows this reply), wrong S-rail values (if they conflict with the station lists on the metro page or line page), and so on. That's why the bot is not automatic; I supervise every edit. I don't know how to formulate this as a task since there are so many types of inconsistencies. Maybe you can help me?
Yes, the bot will add links to the infobox only if there is no such link in the article text.
My major project is not open source for now. It generates very simple suggestions for the bot, as illustrated above: what to replace in which article. If the bot's source code is important, I can push it to a public repository, but it is trivial since it uses pywikibot (no more than 100 LOC). Enzet (talk) 17:01, 9 March 2015 (UTC)
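A small Python sketch of the neighbour-symmetry check described in the reply above (the data shape is assumed for illustration; it is not the project's actual format):

def find_inconsistencies(stations):
    """stations: {name: {"next": [...], "prev": [...]}} taken from S-rail data."""
    problems = []
    for a, nbrs in stations.items():
        for b in nbrs["next"]:
            if b == a:
                problems.append("%s lists itself as its own next station" % a)
            elif a not in stations.get(b, {}).get("prev", []):
                problems.append("%s -> next %s, but %s lacks prev %s" % (a, b, b, a))
    return problems

print(find_inconsistencies({
    "A": {"next": ["B"], "prev": []},
    "B": {"next": [], "prev": []},  # missing "A" in prev: flagged
}))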
If you're supervising every edit, then this is a "manual bot" and can be run using your own account without approval. Would you like to do so? Josh Parris 11:30, 13 March 2015 (UTC)
OK, I understand all about the inconsistencies. If I don't want to use the Enzet account for semi-automated editing, can I use the EnzetBot account (with the {{Bot}} template removed and without approval), or should I register a new account without the "bot" keyword? What is good practice for that? Also, are there criteria for semi-automated editing (no faster than 1 edit per 5 seconds, no more than 100 edits in a row, or something like that)? (Sorry if I missed it in the rules.)
Also, I realized that the tasks of (1) wrapping station structure and platform types with links and (2) fixing S-rail redirects may be performed without supervision, or with supervision that is really fast (checking is trivial). Can I get approval or disapproval for these tasks in this request, or should I create a new one? Enzet (talk) 09:27, 17 March 2015 (UTC)

Josh Parris any further comments? -- Magioladitis (talk) 18:44, 19 May 2015 (UTC)

Bots in a trial period

WP:BRFA/Mdann52 bot_8

Mdann52 bot 8

Operator: Mdann52 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 20:04, Wednesday, May 20, 2015 (UTC)

Automatic, Supervised, or Manual: Auto

Programming language(s): AWB

Source code available: standard AWB

Function overview: Updates links from http://patient.co.uk to http://patient.info/

Links to relevant discussions (where appropriate): NA

Edit period(s): One time run

Estimated number of pages affected: From link searches, around 500-1000

Exclusion compliant (Yes/No): No, no need

Already has a bot flag (Yes/No): Yes

Function details: Per OTRS ticket, the website appears to be updating the URL, and the previous link is likely to go dead. As the URL scheme seems to follow the same format, this should be a simple run.

Discussion

Approved for trial (50 edits). -- Magioladitis (talk) 23:12, 27 May 2015 (UTC)

{{OperatorAssistanceNeeded}} Magioladitis (talk) 23:07, 30 May 2015 (UTC)

Withdrawn by operator. I've run the link search again, and there appear to be no links left; therefore, running a bot here is not needed. Mdann52 (talk) 17:51, 1 June 2015 (UTC)

WP:BRFA/MusikBot

MusikBot

Operator: MusikAnimal (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 06:23, Wednesday, April 22, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Ruby

Source code available: GitHub

Function overview: Bot clerking at WP:PERM pages.

Links to relevant discussions (where appropriate): Special:PermaLink/655854110

Edit period(s): Continuous

Estimated number of pages affected: Up to six during one run (one for each PERM page, except Confirmed and AWB requests)

Exclusion compliant (Yes/No): No

Already has a bot flag (Yes/No): No

Function details: This bot works very much like Cyberbot I does at WP:RPP. It monitors all the Requests for permissions pages for new requests and checks whether there were previously declined requests for that user and permission. If matches are found, an automated comment is left linking to those declined requests. Eventually it may also ping the declining admin, but I've sidestepped that for now. There are two exceptions: the AWB checkpage, which does not have the same structure as the other requests for permissions pages (though I might implement special-case handling for this at some point), and requests for confirmed, where it's very unlikely we'll see multiple requests by the same user, so the bot clerking is not that helpful there. A few notes:

  • It works by using regex to parse out all the necessary info and constructs the automated comment(s) to be saved. As long as Template:Request for permission generates a level 4 heading and Template:Rfplinks is used, it shouldn't flake out.
  • Thoroughly tested on test-wiki, see testwiki:Wikipedia:Requests for permissions/Rollback (and here).
  • Operates on wmflabs, with a crontab running the script every 10 minutes or so, or whatever we decide on.
  • The perm clerking task can be turned off by changing User:MusikBot/PermClerk/Run to anything other than true.
  • For all six permission pages, it should take less than a minute to complete, with a 2-second pause between processing each page, and it will edit no more than 6 times total. However, given the nature of the task, you probably won't see more than a few edits a day.
  • Checks for edit conflicts. If one is detected, it will re-attempt to process that permission page up to a total of three times, waiting progressively longer each time (see the sketch after this list): after attempt #1 it will wait 1 second before trying again, after attempt #2 two seconds, etc.
  • Caching is in place where appropriate, such as fetching the declined pages and any declined permalinks for a user.
  • There is verbose logging that I can make publicly accessible.
  • Full exception handling. If a critical error is encountered (e.g. more than 3 failed attempts to edit a page), the script will proceed to process the next permission page rather than abort the task altogether. Fatal errors, such as when the API is down, will result in a full abort of the task until it is run again by the cron job.
  • To be clear, the "cron" jobs are actually submitted to the grid, which helps allocate resources so the bot doesn't get in the way of other jobs on tool labs.
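As referenced in the list above, a minimal sketch of the edit-conflict retry logic (the bot itself is written in Ruby; this Python version, with EditConflict as a stand-in exception, only illustrates the timing):

import time

class EditConflict(Exception):
    """Stand-in for the API's edit-conflict error."""

MAX_ATTEMPTS = 3

def save_with_retry(save_page, page):
    """Retry a conflicted save, waiting progressively longer: 1s, then 2s."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        try:
            return save_page(page)
        except EditConflict:
            if attempt == MAX_ATTEMPTS:
                raise  # give up on this PERM page; the main loop moves on
            time.sleep(attempt)  # 1 second after attempt 1, 2 after attempt 2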

Thank you! MusikAnimal talk 06:23, 22 April 2015 (UTC)

Discussion

{{BAG assistance needed}}

Approved for trial (50 edits). Looks sane; has support from the target audience; reasonable logic; trusted user. The thing I was actually going to ask about (i.e., pointless edits on already-handled entries) looks like it's already covered:
if section.match(/{{(?:template\:)?(done|not done|already done)}}/i)
--slakrtalk / 07:27, 29 April 2015 (UTC)
Thank you! It is now running, processing the PERM pages once every 10 minutes. 50 edits could take a while, but I'm in no hurry. In the meantime, allow me to note that I am implementing another clerking feature, where it will remove extraneous headers (e.g. see the bottom request at testwiki:Wikipedia:Requests for permissions/Rollback). This happens a fair amount with new users, who do not read the instructions stating not to put anything in the heading field. This development is happening completely in my local environment and will not interfere with the currently running bot, which is running off of code on tool labs. MusikAnimal talk 16:14, 29 April 2015 (UTC)
Just letting you know I've updated the bot to remove extraneous headers when present. This requires no additional edits should there also be previously declined requests for a user – the bot will simply make all changes to the page at once. Thanks MusikAnimal talk 15:35, 30 April 2015 (UTC)
@MusikAnimal: This message is however totally misplaced, see this edit. It's also incorrectly indented. Armbrust The Homunculus 05:34, 1 May 2015 (UTC)
@Armbrust: The bot acted exactly as programmed, only removing the level 2 header. The rest of the text was left as is. Here the user also ignored the 2= parameter of {{rfp}} and instead wrote the request body on the line below it. I am working on a more intelligent regex solution that can fix this common scenario in full. The incorrectly added level 2 heading is more common, however, so the bot is at least addressing that. Anyway, there's clearly discussion needed so I've disabled that feature for now. Let's talk more at WT:PERM#Bot clerking so others can chime in. MusikAnimal talk 06:03, 1 May 2015 (UTC)

Approved for extended trial (50 edits). With the updated regex please. Thanks, Magioladitis (talk) 11:42, 8 May 2015 (UTC)

@Magioladitis: Thank you for the endorsement. Just to be sure, has MusikBot been approved for a total of 100 edits? The new regex is now in place and working nicely. An important note: I will be on holiday starting this Friday through the rest of the month. I am programming the bot to automatically shut off when it reaches 50 edits, or 100, as advised. I will still be able to occasionally check its activity for accuracy and act accordingly. Thanks MusikAnimal talk 16:41, 11 May 2015 (UTC)
MusikAnimal please make 50 additional edits. -- Magioladitis (talk) 21:01, 13 May 2015 (UTC)
Thank you, will do MusikAnimal talk 21:03, 13 May 2015 (UTC)

A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) MusikAnimal Is the bot trial done? -- Magioladitis (talk) 23:17, 27 May 2015 (UTC)

@Magioladitis: No, and far from it, actually. I was going to bring a few ideas to your attention... There is now consensus (multiple supports, no opposition) for MusikBot to take on a new task which it is already capable of doing; that is, commenting on requests for permissions where the candidate does not meet some configured requirements. Please see User:MusikBot/PermClerk/Prerequisites for more information. If this task is approved for trial, it could be coupled in with the currently allotted 100 edits and we'd be able to meet that threshold quicker. Together those 100 edits should provide ample data to evaluate the bot's overall performance for all of its tasks. What do you think?
Finally, I believe MusikBot may be destined to take over KingpinBot's task of archiving requests, both because of the operator's inactivity and because MusikBot is already parsing the same pages. This has not been developed yet, however, and whether it should be a separate BRFA altogether is up to you. At the rate we're going, I'll have the archiving functionality ready for trial before the bot has made 100 edits. MusikAnimal talk 20:56, 28 May 2015 (UTC)

WP:BRFA/HostBot_7

HostBot 7

Operator: Jmorgan (WMF) (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 21:11, Thursday, March 5, 2015 (UTC)

Automatic, Supervised, or Manual: automatic

Programming language(s): Python

Source code available: repo on GitHub

Function overview: Matches users who have created a profile page at the Wikipedia Co-op with mentors who can teach them particular editing skills. When a match is made, the bot posts a message on the profile talkpage, creating that page as a Flow board (request amended; see discussion) if that talkpage does not already exist. The bot needs to be granted the flow-create-board user right to accomplish this.

This bot request is twofold:

  1. we're requesting the right to implement this matching functionality of the Co-op
  2. we're also requesting that HostBot be added to the flow-bot group so that it can deliver these matches on Flow-enabled talkpages, rather than standard talkpages.

Links to relevant discussions (where appropriate):

Edit period(s): as requested: the bot checks for new mentorship requests every ~5 minutes, and attempts to match each new request with a mentor immediately

Estimated number of pages affected: 10-20 per month

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details:

HostBot is hosted on Labs. Every five minutes, it will query the Wikipedia API for new pages in Co-op-specific categories. These pages will either be new Co-op member profiles (example: Wikipedia:Co-op/AMemberUserName) or existing member profiles where a user has recently added a new Co-op specific category (indicating a change in learning interests). It will also query the API for a list of Co-op mentors with profile pages (Wikipedia:Co-op/AMentorUserName) in the corresponding mentor category who have not opted out of receiving new matches. In both cases, Hostbot/Co-op checks that the category members are in fact subpages of Wikipedia:Co-op.

For each newly-categorized Co-op member, the bot chooses a random mentor from the list of corresponding mentors. If none are available for the given interest, it chooses a random mentor from the fallback category "General editing skills". Once the match is found, HostBot leaves a post on the talk page corresponding to the Co-op member's profile page (example: Wikipedia_talk:Co-op/AMemberUserName). If this page does not already exist, the bot uses the Flow API's new-topic submodule to post its welcome message as a new topic, thus creating the talk page as a Flow board; otherwise, the bot edits the talk page normally. The message mentions the mentor and posts on the Flow-enabled talk page of a page that the member created, and so generates an Echo notification for both of them. The member and the mentor are then free to follow up with each other, and the bot's involvement is finished.
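In outline, the matching step looks like the following Python sketch (skill and category names are illustrative; the real implementation is in the GitHub repository linked above):

import random

def match_members(new_members, mentors_by_skill):
    """Pair each newly categorized Co-op member with a random mentor.

    new_members: list of (profile_page, requested_skill) tuples
    mentors_by_skill: {skill: [mentors who have not opted out]}
    """
    matches = []
    for profile, skill in new_members:
        # Fall back to the general pool when no mentor covers the interest.
        pool = mentors_by_skill.get(skill) or mentors_by_skill["General editing skills"]
        matches.append((profile, random.choice(pool)))
    return matches

print(match_members(
    [("Wikipedia:Co-op/AMemberUserName", "Writing")],
    {"Writing": ["MentorA"], "General editing skills": ["MentorB"]},
))
# Each match then becomes a post on the profile's talk page (a new Flow
# topic if the page does not exist yet), @mentioning both parties.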

Constraints.

  1. The bot's posting activity is limited to subpages of Wikipedia_talk:Co-op. If a user adds a relevant category to their user page instead of their Co-op profile page, HostBot will ignore it and that user will not receive a match.
  2. HostBot will not—and cannot—convert existing talk pages to Flow boards. If a Co-op profile talk page exists at the time of matching, HostBot will simply edit that page and post its welcome message in a new section.
  3. HostBot can only create a new Flow board if it has the flow-create-board right. A bureaucrat will need to add HostBot to the flow-bot group which has this right. More details here: https://phabricator.wikimedia.org/T76785

Demonstration. This workflow is running on test.wikipedia.org already, by MatchBot. Here's an example learner profile page and talk page. You can sample it yourself if you want: click the "find a mentor" button on the co-op testwiki portal to create a sample profile and wait a few minutes to receive your matches. Note that approval for the FormWizard gadget used to create profiles on testwiki is not part of this request; matching works just as well with manually-created profile pages.

Discussion

  • The Wikimedia Foundation's Collaboration team has been working closely with the Co-op project to use Flow as the discussion system for the new program. We're excited to be a part of this new mentoring project, and we'll continue to provide technical support for the Co-op to deal with any issues that might arise from using this bot. DannyH (WMF) (talk) 22:23, 5 March 2015 (UTC)
  • Comment. I am in charge of managing the development of The Co-op, and wanted to briefly discuss the reasoning behind these proposed bot tasks. The rationale of the matching component is to drastically reduce the amount of effort required on the part of mentors and editors seeking mentors to find each other, and generally, prevent a lot of unnecessary waiting. The rationale behind the use of Flow is to facilitate communication during mentorship. As Flow is designed to address perceived issues with conventional talk pages, particularly with regard to newer editors, and because we are specifically inviting newer editors to use the space, our team is interested in testing it out. That said, using Flow for mentorship-related communication is not required; mentors and editors can use conventional talk pages or whatever communication system they find most convenient. Our pilot for the space is running for about one month, and so a 30-day trial would be appreciated. Thanks for your consideration. I, JethroBT drop me a line 22:40, 5 March 2015 (UTC)
  • Has this functionality been demonstrated on testwiki or anywhere else yet? Currently no one on enwiki has flow-create-board access. What is the rollback plan for actions made in error? — xaosflux Talk 05:33, 8 March 2015 (UTC)
  • Question on operator account: I see this operator listing is under your official WMF staff account - is this purposeful? Will these bot edits represent official WMF actions? — xaosflux Talk 19:13, 8 March 2015 (UTC)
Doing a bit of research; please provide updated data on whether this is still blocked by T90077. — xaosflux Talk 01:08, 9 March 2015 (UTC)
Hi Xaosflux. Excellent questions! The functionality is in evidence on testwiki (details under the heading Demonstration above). Give it a try and let me know if there's anything else I can clear up. The actions of HostBot won't be official WMF actions, though I'm not sure what would constitute an official WMF action in this scenario. Do you mean Office Actions? In any case, no, HostBot's actions would be held to the same standards as any other bot, and I to the same standards as any bot-wrangler.
As to why I'm writing under my staff account: the Co-op is a grant funded experimental project, and part of this grant proposal involved a trial of the system. I work for the team at WMF (formerly known as Grantmaking, now Community Resources) that disburses and oversees grants, and in this case one of the resources they were able to offer the grantee team was some of my time to help with the trial. So right now I'm participating in the Co-op project as a staff member. I have no authority, just the responsibility to make sure the bot does what it's supposed to do, to respond to community input, and to fix anything that breaks. If this HostBot task is approved on an ongoing basis after the trial is concluded, I will continue to manage the task, but in a volunteer capacity. That's what I did with the Teahouse, which also started out as a Foundation-funded venture, but has been completely community-run for more than two years.
Regarding status of blocking bugs: I'll let I_JethroBT handle that one. He's the PM. Regards, Jmorgan (WMF) (talk) 23:43, 9 March 2015 (UTC)
OK, operator question is not an issue; I'm running through the demo on testwiki now. — xaosflux Talk 00:12, 10 March 2015 (UTC)
@Xaosflux: These blocking tasks are due to be addressed in a sprint from the Flow dev team on 11 March, although T90970 and T90969 are being prioritized to allow for deletion of Flowboards that is consistent with how deletion is normally handled, and should be easy to resolve. I expect these tasks to be resolved in about 2-3 weeks time, according to estimates from DannyH (WMF). T90973 is more complicated, and is unlikely to be addressed until much later. We have also de-prioritized T90972 for the purposes of the pilot as it is unlikely for Flow-enabled pages that are deleted to be restored. I, JethroBT drop me a line 02:34, 10 March 2015 (UTC)
  • Couple of issues, please address:
  1. Why can users direct the bot to create arbitrary page names? (By allowing free-text co-op "nicknames" instead of their usernames)? (example: testwiki:Wikipedia talk:Co-op/OffensiveTitleNameHere) — xaosflux Talk 00:39, 10 March 2015 (UTC)
    Jmorgan (WMF) and I discussed the prospect of creating profile pages with the editor's username pre-filled to avoid this sort of thing, but there was not enough development time to implement it for the purposes of the pilot. If we were going to expand this space, I think adding this functionality would be great to avoid typos and more malicious activity like your example. It is important to keep in mind we are only allowing a maximum of 50 learners to create profiles for this pilot because there are just under 25 mentors in the space; it's also unlikely for newer editors to discover the Co-op outside of these invitations. I, JethroBT drop me a line 03:51, 10 March 2015 (UTC)
  2. Lack of "Special:Unflowify" or other means of undoing this bot's flow-create-board action (please correct me if this functionality is present).
    @DannyH (WMF): is in a better position to discuss this question in detail. As far as I am aware, the deletion tasks that are being worked on (T90970, T90969) would create the page as a standard talk page. I, JethroBT drop me a line 03:51, 10 March 2015 (UTC)
Thank you; this will need to be seen. I tried on test: without luck; it may be waiting for code changes. I was able to delete the Flow conversation, but still not the topics, and afterwards the page does not revert to standard talk: style. — xaosflux Talk 04:14, 10 March 2015 (UTC)
Thank you! — xaosflux Talk 00:39, 10 March 2015 (UTC)


The overall matching mechanism seems pretty straightforward and uncontroversial; a non-Flow-enabled trial should be easy to move forward (see the page naming issue first). — xaosflux Talk 00:39, 10 March 2015 (UTC)
Would you rather try to resolve these flow-related questions first, or trial with Wikipedia_talk: pages for now? — xaosflux Talk 00:55, 10 March 2015 (UTC)
@Xaosflux: Given the timeline the Co-op grant is operating on, for which our report is due at the end of April, and the time it would require to resolve these blocks, I would prefer to trial the matching component on standard Wikipedia talk pages for now while these matters with Flow are resolved. Thanks, I, JethroBT drop me a line 03:51, 10 March 2015 (UTC)
  • Approved for trial (250 edits or 30 days). (Restricted to Wikipedia: and Wikipedia_talk: space only). NOT approved for Flowbots group addition under this trial. — xaosflux Talk 04:10, 10 March 2015 (UTC)
Thanks Xaosflux: one quick clarification. Is the bot only approved for trial if users are not allowed to specify the title of their profile page? Jmorgan (WMF) (talk) 15:33, 10 March 2015 (UTC)
No, you can trial as-is before rolling to full production; we can revisit. — xaosflux Talk 17:00, 10 March 2015 (UTC)
Post-trial follow up

Xaosflux, I_JethroBT Alright, trial period is over. I've turned off the matching script for now. What's next? Jmorgan (WMF) (talk) 18:38, 14 April 2015 (UTC)

Do you want to hold THIS request open until all the software changes are made, or change it to be for the non-Flow related items and request those at a later time? — xaosflux Talk 01:07, 15 April 2015 (UTC)
@Xaosflux and Jmorgan (WMF): I think it will be best to close this request out for now while the other components (i.e. Flow) are being worked on. We can reference this discussion in a new request in regards to renewing the matching functions of the bot. Thanks, I, JethroBT drop me a line 19:22, 16 April 2015 (UTC)
Actually, Xaosflux I'd like to get this request resolved independently of Flow. I don't yet have confirmation from the Flow team on when they will have addressed the issues you raised, and in the meantime, I'd like the Co-op to continue running (it's basically shut down now, since we can't match mentors with learners. No invites have gone out for over a week.). So my request is: can HostBot be approved to match learners with mentors by posting on their non-Flow-enabled talkpage?
I probably should have separated the matching script request from the flow-bot right request at the beginning--I apologize. I will submit a new request as soon as I've had confirmation from my WMF colleagues that Flow has been patched to address the issues you raised. Best, Jmorgan (WMF) (talk) 17:27, 23 April 2015 (UTC)
@Xaosflux: I've struck out my request above; I'd also prefer that we have the ability to match and notify editors available to us. I, JethroBT drop me a line 19:21, 23 April 2015 (UTC)
@Xaosflux: Just a nudge here. Can we get approval to continue match editors again on non-flow enabled pages? I'm hoping we can get the Co-op back up and running again, hopefully by tomorrow. Thanks, I, JethroBT drop me a line 20:09, 6 May 2015 (UTC)

──────────────────────────────────────────────────────────────────────────────────────────────────── Per Xaosflux's instructions I'm asking another member of the BAG to make the decision about final approval of this request. I've updated the description to reflect the more limited scope: we're just asking permission to implement the matching functionality as it was tested during the trial period; creating Flow-enabled talkpages is not part of this request. Thanks for your help! Cheers, Jmorgan (WMF) (talk) 21:32, 15 May 2015 (UTC)

Xaosflux Please check this. -- Magioladitis (talk) 13:01, 16 May 2015 (UTC)

Magioladitis could you or another BAG member take this on? Xaosflux isn't wiki-active at the moment. We're trying to open the Co-op up again, and mentor-learner matching is the core of the project. We have all the other important pieces in place: HostBot invites are approved; profile creation gadget is activated; we just need final approval to have the bot post a message on the learner's profile talkpage (example). I'll help any way I can to make it happen--just let me know! Cheers, - J-Mo Talk to Me Email Me 20:22, 27 May 2015 (UTC)

J-Mo can you provide me a link to the bots edits related to this task and comments about them? -- Magioladitis (talk) 23:16, 27 May 2015 (UTC)

Magioladitis, for sure. Here are all of HostBot's edits during the trial. As you can see, the bot only edits the talk pages of learner profile pages in the Wikipedia talk namespace. The bot runs on a 5 minute cron job, checking the API for pages newly added to one of the Co-op "request" subcategories that are also members of Category:Co-op_learner. If the bot finds a newly-added request category, that means a learner is requesting mentorship. The bot then selects an available mentor and posts a message to the profile talk page, @mentioning the mentor and the learner so that they both know they've been matched.
That's pretty much it; if the talkpage doesn't exist yet, the bot creates it (as a regular talkpage). If it already exists, the bot starts a new thread. - J-Mo Talk to Me Email Me 03:21, 28 May 2015 (UTC)

WP:BRFA/Commons fair use upload bot_3

Commons fair use upload bot 3

Operator: Fæ (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 19:49, Wednesday January 7, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python

Source code available:

I have this working passively locally, but have yet to test it out on Labs with sample images. When I have significant updates to the code, I will consider uploading a new version of the source under https://github.com/faebug, as the wikigit repository is unlikely to be maintained. A migrated version of the code is at the GitHub link above; test version only.

Function overview:

This is a cross-wiki bot to copy files at risk of deletion on Wikimedia Commons to local wikis where they can be retained, either under fair use or where the image is public domain in the source country but may be problematic under Commons interpretations (such as the URAA).

Links to relevant discussions (where appropriate):

Edit period(s):

  • Previously running hourly, without any issues, so I'm planning on doing the same.

Estimated number of pages affected:

  • The bot was successfully running prior to the Toolserver shutdown; coincidentally, the last transferred files were some of my own. See ListFiles.

Exclusion compliant : Yes


Already has a bot flag : No. Effectively this was reset by the usurp process.

A trial may not be needed considering the track record; however, if there is one, I would prefer it to be a month or longer, as my availability may be patchy.

Function details: This bot re-uploads files that are deleted on Commons to projects where they are in use, if those projects accept non-free files. It goes over the files in Category:Pending fair use deletes, uploads them to local wikis, then marks them for speedy deletion on Commons when it's done. Any article using the images receives a notice that the file has been re-uploaded as a fair use candidate. Local wikis are responsible for determining if the re-uploaded image is eligible for their non-free content policy, and deleting it in a timely manner if it is not. If for some reason it's not able to upload the image, it will leave an error message on the file page and not mark it for deletion.
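
As a rough illustration of that cycle, a minimal pywikibot sketch (the helper function and the speedy-deletion template wording are illustrative, not the bot's actual code, which is linked above):

    import pywikibot

    commons = pywikibot.Site('commons', 'commons')
    pending = pywikibot.Category(commons, 'Category:Pending fair use deletes')

    def reupload_and_notify(filepage):
        """Hypothetical helper: re-upload to each local wiki where the file is
        in use, tag it as a fair use candidate, and note this on the articles
        using it."""
        ...

    for page in pending.members():
        filepage = pywikibot.FilePage(commons, page.title())
        try:
            reupload_and_notify(filepage)
            # Once re-uploaded everywhere, mark the Commons original for
            # speedy deletion (template wording illustrative).
            filepage.text = ('{{speedydelete|Localized where still in use}}\n'
                             + filepage.text)
            filepage.save(summary='Marking for deletion after local re-upload')
        except Exception as err:
            # On failure, leave an error note and do not mark for deletion.
            filepage.text += '\nLocal upload failed: %s' % err
            filepage.save(summary='Local re-upload failed; see note')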

Discussion

Arbcom exemption/requirements
  • The Arbitration Committee has passed the following motion, which relates to this request for approval:
Despite the restrictions on his editing images related to sexuality, Fæ may operate the Commons fair use upload bot if the Bot Approvals Group approves it.

The bot may upload sexuality images that would, if Fæ himself had uploaded them to the English Wikipedia, breach Fæ's restriction, only if the upload is requested by a third party.

The bot shall maintain a log of: the images it uploads; the names of the articles on the English Wikipedia where the images appear at the time of upload; and the username of the Commons editor requesting the transfer to the English Wikipedia.

For the Arbitration Committee, Callanecc (talkcontribslogs) 01:24, 15 January 2015 (UTC)

Bot discussion
  • Can you please indicate on the local userpage who owns the account? Courcelles 22:26, 7 January 2015 (UTC)
    • Good point. Done, rather than relying on visiting the Commons page. -- Fæ (talk) 22:35, 7 January 2015 (UTC)
  • After playing around with the bot locally and having it fall over a few times, I am planning to rewrite it to rely on pywikibot rather than mwclient as its interface to the API. This will probably work far more reliably on WMFlabs and be much easier to maintain in future years. Though the code is not all that long, with other commitments and the increased testing needed, this will take weeks rather than a few days. -- Fæ (talk) 08:49, 12 January 2015 (UTC)
    Note: some 'real' images are ready for the bot to localize; see Commons:Deletion requests/Files uploaded by SPVII DrFresh26. I'm advising that the bot should be operational within a week or two.
    The account commonsfairuseupload has been set up on labs. I have a test version running under Pywikibot core on WMFlabs; however, there is a fair amount of rewriting to be done before running it live, and it makes sense to put a first snapshot up on github. -- Fæ (talk) 23:25, 13 January 2015 (UTC)
    Early snapshot now on github as above. -- Fæ (talk) 13:35, 14 January 2015 (UTC)
    A separate bot flag restoration request has been raised on Commons, c:Commons:Bots/Requests/Commons fair use upload bot. -- Fæ (talk) 12:52, 15 January 2015 (UTC)
  • This bot would just perform the same task as User:Dcoetzee's bot, right? How will the bot handle Bugzilla:61656? Dcoetzee's bot handled this by reverting CommonsDelinker, see e.g. Special:Diff/615048249. Ideally, this should be fixed in CommonsDelinker instead of the fair use upload bot, but nothing seems to have happened in CommonsDelinker since the bug was reported in 2010. --Stefan2 (talk) 14:53, 20 January 2015 (UTC)
    To be honest, I am not 100% sure I understand the issue, not having looked into the functionality of the delinker (note the bug was reported in 2010, but Dcoetzee's code was successfully running from 2012 to 2014 the way it was). However, the way the CFUUB behaves at the moment is that it locally uploads the file under an amended file name and inserts a redirect as the old local image page text. This should leave the old name untouched to avoid permission problems on local wikis. My understanding is that this precautionary step also avoids possible conflict with the delinker when the original is speedy deleted from Commons. If folks want this to work differently, then this might be something to amend in the delinker's behaviour, rather than building odd intelligent reverts into CFUUB to undo the work of the delinker.
I have yet to convert this bit of code to pywikibot, but if you look in the current test status source code linked above for the two places that site.upload(open('/tmp/downloadedfile'), newfilename, newdesc, ignore=True) occurs, these are relevant.
As I am in regular dialogue with @Steinsplitter:, I would defer to his judgement, as he has recently been active in updating the delinker, and would welcome his advice during testing. Perhaps he could take ownership of this bug request too? I could do with some test images, so maybe we can agree on a few and demonstrate the system in the trial period. -- Fæ (talk) 16:28, 20 January 2015 (UTC)
When Dcoetzee's bot uploaded files, it worked like this:
  1. Someone on Commons requested a local upload
  2. Dcoetzee's bot uploaded the file under a slightly different name by inserting "from Commons" in the file name to avoid permission problems
  3. The bot created a redirect from the old name to the uploaded file
  4. The file was deleted on Commons by a Commons admin
  5. CommonsDelinker failed to notice that a redirect existed locally and therefore incorrectly removed the file from English Wikipedia
  6. Dcoetzee's bot reverted CommonsDelinker's incorrect removal
I think that step 5 should be fixed by correcting the bug in CommonsDelinker, but Dcoetzee decided to fix it by introducing step 6 because the CommonsDelinker programmers didn't fix the bug for several years. There is some discussion in Wikipedia:Bots/Requests for approval/Commons fair use upload bot 2, for example in the "Function details" section. If Steinsplitter can fix CommonsDelinker, then that would be much better. --Stefan2 (talk) 16:49, 20 January 2015 (UTC)
Agreed. I'll pay attention to testing this out based on the fact that Steinsplitter believes this bug has been addressed in Magnus' new version of the delinker (Phabricator:T63656). -- Fæ (talk) 18:16, 20 January 2015 (UTC)
See my comment here --Steinsplitter (talk) 18:17, 20 January 2015 (UTC)
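
For reference, the amended-name-plus-redirect behaviour discussed in this thread might look roughly like this (an mwclient sketch around the site.upload() call quoted above, with illustrative names; not the bot's actual code):

    import mwclient

    # site = mwclient.Site('en.wikipedia.org'), after logging in as the bot.

    def localize(site, name, tmp_path, newdesc):
        """Upload under an amended file name, then redirect the old title to it.

        `name` is the bare file name (no 'File:' prefix), as the upload API
        expects; the exact convention for the amended name may differ.
        """
        newname = 'From Commons - ' + name
        site.upload(open(tmp_path, 'rb'), newname, newdesc, ignore=True)
        # Leave a redirect at the old title so existing local usages keep
        # resolving, which should give CommonsDelinker no reason to unlink them.
        site.pages['File:' + name].save('#REDIRECT [[File:%s]]' % newname,
                                        summary='Redirect to locally uploaded copy')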

As best I can tell, there's no reason to delay a trial. Is that the case? Josh Parris 06:35, 20 February 2015 (UTC)

I'm considering putting some time aside to trial the code in about a week. -- Fæ (talk) 09:44, 20 February 2015 (UTC)
My understanding is that you intend to monitor closely, but this is a rewritten bot. I'm also under the impression that there won't be a huge number of edits. As such, Approved for trial (30 edits or 30 days), commencing sometime in the next couple of weeks. Josh Parris 19:24, 23 February 2015 (UTC)

Any news on that? Trial period has expired. -- Magioladitis (talk) 22:08, 22 March 2015 (UTC)

I have pushed the code forward a little bit this weekend; a family issue has taken priority. I am building in a reasonable test mode, which I think will help when adding more Wikipedias downstream. Thirty days was obviously a bit of an aggressive target for my availability. I would expect to be able to run this live in April. -- Fæ (talk) 23:04, 22 March 2015 (UTC)

Hi again! Any news on that? -- Magioladitis (talk) 22:17, 3 May 2015 (UTC)

Hi, I'm sorting out clearing a house and the admin burden of much-delayed probate, which is eating up my free time. Sorry about these delays. A couple more weeks before I can focus again? -- Fæ (talk) 22:35, 3 May 2015 (UTC)

? -- Magioladitis (talk) 11:13, 21 May 2015 (UTC)

Yes, still tied up with real life (which includes about 25 shopping bags full of unexpected correspondence to check over in relation to a sizeable fraud). If anyone else is keen to get the bot running earlier, I'll happily liaise as a co-operator; otherwise it may be another month yet before I put aside time to test it out. -- Fæ (talk) 12:45, 31 May 2015 (UTC)

edit WP:BRFA/JhealdBot

JhealdBot

Operator: Jheald (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:36, Monday December 8, 2014 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): Perl

Source code available: Still under development.

Function overview: Maintenance of subpages of Wikipedia:GLAM/Your_paintings, in particular the subpages listed at Wikipedia:GLAM/Your_paintings#Artists_by_birth_period. There is currently a drive to identify Wikidata items for the entries on this list that are not yet matched. I seek approval to keep the corresponding pages on Wikipedia up to date.

Initially I would just use the bot as an uploader, to transfer wikipages edited off-line into these pages (including fixing some anomalies in the present pages -- which I would probably do sequentially, through more than one stage, reviewing each fix stage before moving on to the next).

Once the off-line code is proven, I would then propose to move to a semi-automated mode, automatically updating the pages to reflect new instances of items with d:Property:P1367 and/or corresponding Wikipedia and Commons pages.

Links to relevant discussions (where appropriate):

Edit period(s): Occasional (perhaps once a fortnight), once the initial updating has been completed. And on request.

Estimated number of pages affected: 17

Exclusion compliant (Yes/No): No. These are purely project tracking pages. No reason to expect a {{bots}} template. If anyone has any issues with what the bot does, they should talk to me directly and I'll either change it or stop running it.


Already has a bot flag (Yes/No): No. I have one on Commons, but not yet here.

Function details:

  • Initially: simple multiple uploader bot -- take updated versions of the 17 pages prepared and reviewed offline, and upload them here.
  • Subsequently: obtain a list of all Wikidata items with property P1367. Use the list to regenerate the "Wikidata" column of the tables, plus corresponding sitelinked Wikipedia and Commons pages.
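
By way of illustration, the per-item lookup might look like the following pywikibot sketch (the live scripts are Perl driven by a WDQ claim[1367] query; the field names here are stand-ins):

    import pywikibot

    repo = pywikibot.Site('wikidata', 'wikidata')

    def row_for(qid):
        """Fields for one table row, from a Wikidata item with P1367."""
        item = pywikibot.ItemPage(repo, qid)
        item.get()

        def first(prop):
            claims = item.claims.get(prop, [])
            return claims[0].getTarget() if claims else None

        return {
            'your_paintings': first('P1367'),  # BBC Your Paintings identifier
            'commons_cat': first('P373'),
            'viaf': first('P214'),
            'rkd_artists': first('P650'),
            'enwiki': item.sitelinks.get('enwiki'),  # sitelinked article, if any
        }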

Discussion

Regarding uploading offline edits: Are these being made by anyone besides the operator? What license are they being made under? — xaosflux Talk 23:44, 18 December 2014 (UTC)
@Xaosflux: The pages have been prepared by me using Perl scripts, drawing from Wikidata.
I've slowly been making the scripts more sophisticated -- so I've recently added columns for VIAF and RKDartists links, both taken from Wikidata, defaulting to searches if there's no link, or no Wikidata item yet identified. Content not drawn from Wikidata (typically legacy entries from the pages as I first found them) I have prefixed with a question mark in the pages, meaning to be confirmed. For the most part these are blue links, which may go to completely the wrong people.
So at the moment I'm running a WDQ search to pull out all Wikidata entries with one (or more) values for the P1367 "BBC Your Paintings identifier" property, along with the properties for Commons category name (P373), VIAF (P214) and RKDartists (P650). I'm also running an Autolist search to get en-wiki article names for all Wikidata items with a P1367. Plus I have run a look-up to get Wikidata item numbers for all other en-wiki bluelinks on the page (this gives the Q-numbers marked with question marks). But the latter was quite slow, so I have only run it once. At the moment I'm still launching these searches by hand, and making sure they've come back properly, before updating & re-uploading the pages.
As to the licensing -- Wikidata is licensed CC0. My uploads here are licensed CC BY-SA like any other upload to the site (though in reality there is very little originality, creativity or expression, apart from the choice of design of the page overall, so, under U.S. law at least, there is quite possibly no new copyrightable content in the diffs). Various people of course are updating Wikidata -- I've been slowly working down this list (well, so far only to the middle of the 1600s page), though unfortunately not all of the Wikidata updates seem to be being picked up by WDQ at the moment; the Your Paintings list is also on Magnus's Mix-and-match tool; and various others are working at the moment, particularly to add RKD entries to painters with works in the Rijksmuseum in Amsterdam. But Wikidata is all CC0, so that all ought to be fine.
What would help though, would be having the permission for a (limited) multiple uploader, so I could then upload the updates to all 17 pages just by launching a script, rather than laboriously having to upload all 17 by hand each time I want to refresh them, or slightly improve the treatment of one of the columns.
I'm not sure if that entirely answers your question, but I hope does make clearer what I've been doing. All best, Jheald (talk) 00:45, 19 December 2014 (UTC)
Approved for trial (25 edits or 10 days). Please post your results here after the trial. — xaosflux Talk 01:48, 19 December 2014 (UTC)
@Xaosflux: First run of 16 edits made successfully -- see contribs for 19 December, from 15:59 to 16:55.
(Links to RKD streamlined + data updated; one page unaffected).
All the Captchas were a bit of a pain to have to deal with; but they will go away. Otherwise, all fine. Jheald (talk) 17:31, 19 December 2014 (UTC)
Sorry about that, I added the confirmed flag to avoid this for now. — xaosflux Talk 17:34, 19 December 2014 (UTC)
New trial run went smoothly (see this related changes page).
The update is still prepared by executing several scripts manually before a final uploader script, but I should have these all rolled together into a single process for the next test. Jheald (talk) 09:11, 11 January 2015 (UTC)
Run again on January 21st, adding a column with the total number of paintings in the PCF for each artist. Jheald (talk) 17:13, 24 January 2015 (UTC)

Have you completed the trial? Josh Parris 10:20, 4 March 2015 (UTC)

I was going to go on running it once a month or so, the next one probably in a day or two, until anyone progressed this any further, possibly making tweaks to my offline processing scripts as I went along. Obviously I'm open to suggestions as to anything I can improve or do better; though the actual unsupervised bit itself is just an upload script, refreshing a dozen or so pages, so nothing very complicated. (The off-line preprocessing is a bit more involved, but still pretty trivial). Jheald (talk) 00:33, 5 March 2015 (UTC)
I note that further edits have been made. Out of interest, why do http://viaf.org IDs change? The painter's been dead for centuries. Are they merges of duplicates? Also, is the trial finished now? Josh Parris 14:54, 9 March 2015 (UTC)
@Josh Parris: Clearly there has been a significant update of VIAF ids on Wikidata in the last three weeks, with a lot of new VIAF ids added -- I think by one of Magnus Manske's bots. This is why there are significant reductions in length for a lot of pages, with VIAF searches being replaced by explicit VIAF links.
I imagine that this may be catch-up resynchronisation for several months of updates at VIAF; but it may also be that, now VIAF is explicitly targeting Wikidata items rather than just en-wiki articles and is actively doing matching at the VIAF end, there is a sudden rush of new VIAF <--> Wikidata matches.
You're right that there are a few VIAF matches that have changed. I haven't looked into any in detail, but two strong possibilities would be either erroneous matches that have been corrected (i.e. we used to point to the VIAF for somebody quite different), or alternatively that a group of duplicate entries on VIAF may have been merged -- e.g. if there had been a VIAF for the Library of Congress id, and another for the Getty ULAN id, and the two had not previously been connected.
As to where we're at, matching of the Your Paintings painter identifiers continues to move forwards using mix-n-match. About 80% of the YP identifiers have now been triaged into has / doesn't have / shouldn't have a Wikidata item, with progress ongoing; plus I've now got as far as painters born before 1825, using mix-n-match search to match to RKDartists and other databases. Then there will also be a stage where new Wikidata items are created for YP ids that currently don't have them but should; and these new ids in turn will also have RKDartists (etc.) entries that they match. So there's still a lot to do going forward, and the tracking pages will continue to need updates if they are to reflect that.
At the moment it's still done using about four scripts that I sequentially run by hand on an occasional basis. The one I'd have to write a bit more code to integrate is the one that merges in the article names on en-wiki for the Wikidata items, because these are currently obtained using an Autolist query which is then saved manually. I'd need to look into how to replace that batch look-up with an API call if I was to make the whole thing more integrated and run on a regular basis (weekly?). I'm happy to do that work if anybody wants it, but for the time being it's also as easy just to go on doing what I've been doing, generating the updates in a partially manual way. So I'm happy to be open to views, if anybody has got any strong preferences either way. Jheald (talk) 23:27, 4 May 2015 (UTC)

Bots that have completed the trial period

edit WP:BRFA/B-bot_3

B-bot 3

Operator: B (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 23:39, Sunday, May 24, 2015 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): C#

Source code available: User:B-bot/source/OTRS Dated Filer

Function overview: 1. If the monthly OTRS pending and OTRS received categories, e.g. Category:Items pending OTRS confirmation of permission as of May 2015 and Category:Wikipedia files with unconfirmed permission received by OTRS as of May 2015, do not exist for either the current month or for next month, create them. 2. For any {{OTRS pending}} or {{OTRS received}} tag, if the tag is not dated, add the appropriate date.

Links to relevant discussions (where appropriate): Wikipedia:OTRS_noticeboard#Proposal_to_move_to_dated_pending_and_received_categories

Edit period(s): Daily

Estimated number of pages affected: Lots initially (there are 400 or so pages to initially be tagged) and thereafter probably around 10 per day on average.

Exclusion compliant (Yes/No): Not applicable; no users are notified by this task.

Already has a bot flag (Yes/No): Yes

Function details:

  1. Check to see if the current month's OTRS pending category exists and if not, create it using {{subst:OTRS pending subcat starter|date=2015-05-31}}. Do the same for next month.
  2. Check to see if the current month's OTRS received category exists and if not, create it using {{subst:OTRS received subcat starter|date=2015-05-31}}. Do the same for next month.
  3. Patrol the category Category:Items pending OTRS confirmation of permission as of unknown date, adding year|month|day based on the last edit date.
  4. Patrol the category Category:Wikipedia files with unconfirmed permission received by OTRS as of unknown date, adding year|month|day based on the last edit date.
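
Though B-bot itself is written in C#, the steps above might look roughly like this as a Python/pywikibot sketch (category handling per the names above; the naive tag match is illustrative):

    import datetime
    import pywikibot

    site = pywikibot.Site('en', 'wikipedia')

    def ensure_monthly_category(prefix, starter, day):
        """Steps 1-2: create e.g. 'Category:... as of May 2015' if missing."""
        title = 'Category:%s as of %s' % (prefix, day.strftime('%B %Y'))
        cat = pywikibot.Page(site, title)
        if not cat.exists():
            cat.text = '{{subst:%s|date=%s}}' % (starter, day.strftime('%Y-%m-%d'))
            cat.save(summary='Creating monthly OTRS tracking category')

    def date_undated_tags(cat_title, template):
        """Steps 3-4: date undated tags from the page's last edit timestamp."""
        for page in pywikibot.Category(site, cat_title).members():
            ts = page.latest_revision.timestamp
            old = '{{%s}}' % template
            new = '{{%s|year=%d|month=%d|day=%d}}' % (template, ts.year, ts.month, ts.day)
            # Naive exact match; the real bot must handle whitespace/parameter variants.
            if old in page.text:
                page.text = page.text.replace(old, new)
                page.save(summary='Dating %s tag from last edit date' % template)

    today = datetime.date.today()  # repeat with a date in next month as well
    ensure_monthly_category('Items pending OTRS confirmation of permission',
                            'OTRS pending subcat starter', today)
    ensure_monthly_category('Wikipedia files with unconfirmed permission received by OTRS',
                            'OTRS received subcat starter', today)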

This is a precursor to the next task, which will be automatically giving a final warning / request for follow-up if we have not received appropriate permission in more than {{OTRS backlog}} days.

Discussion

This is running in test mode at User:B-bot/Test page. --B (talk) 23:56, 24 May 2015 (UTC)

  • The test page looks fine. --Stefan2 (talk) 20:22, 27 May 2015 (UTC)
  • Note, I have, subsequent to this test, run another test on the test page. The relevant test can be found in the page history at [2]. --B (talk) 10:43, 29 May 2015 (UTC)

Approved for trial (50 edits). -- Magioladitis (talk) 11:15, 29 May 2015 (UTC)

  • @Magioladitis: After several attempts where I had to fix a few things, the process worked successfully at User:B-bot/Event_log#OTRS_dated_filer_-_19:47.2C_29_May_2015_.28UTC.29. (I had it capped at five images for each of the three segments.) I added a new function to auto-create monthly categories for old pages or files - I didn't realize until doing some of these that there are a lot of PAGES (not files) that have been tagged for a VERY long time - five years in some cases. So this process will be good for readily identifying those. [3] has the edits involved in this successful test. --B (talk) 20:23, 29 May 2015 (UTC)
  • (edit conflict) I have checked some random files in subcategories of Category:Items pending OTRS confirmation of permission to which the bot added dates, and those look fine.
What will happen if the bot can't edit a page which uses {{OTRS pending}}? For example, the bot will not be able to edit User:Rjd0060/OTRS-Text.js or MediaWiki:FileUploadWizard.js, which are both 'using' the template due to bad script coding. --Stefan2 (talk) 20:26, 29 May 2015 (UTC)
@Stefan2: I actually encountered that problem and it crashed. I put a try/catch around it so I will trap it and log an error. My intention is to modify the template so that it will not put the page in the category at all if the page name ends in .js, but for now, I'm just trapping the error. I also noticed that a handful of user talk pages had {{OTRS pending}} tags where users had incorrectly added the template there instead of on the image (or article talk page). The same problem could arise here where the user talk page is protected, and the behavior would be the same - it would get an exception, which is trapped and logged to its event log. Earlier today, I went through all of the user talk pages in the category that were actual user talk pages (not the talk page of a draft in user space), commented out the template, and left a message for the owner. A possible future feature of this bot would be to automate that process - there is never a case where a user talk page should have {{OTRS pending}} on it, and if we catch them quickly rather than months down the line, we might help them put it in the right place and prevent the image from being deleted prematurely. --B (talk) 20:47, 29 May 2015 (UTC)

Trial complete. --B (talk) 20:49, 29 May 2015 (UTC)

There is always a risk that the template is used in the wrong place, for example added to discussions by new users who don't know about {{tl}}. If you are somehow able to detect them, then fine. If not, then they will be spotted in the monthly categories after a long time, when the intended files or articles have already been deleted. --Stefan2 (talk) 21:26, 29 May 2015 (UTC)
If it is on a base user talk page (not a subpage), then I could simply always regard that as incorrect and log an error. (I don't really want to remove it automatically - I'd rather just notify myself so that I can take care of it and engage the user to ask what it was intended for, or, if I can tell myself, just add it to the right place myself.) If the tag is on an article talk page (when it was intended for an image page), I don't know that there's really a good way to automatically detect that, because it's perfectly legitimate for the tag to be on the article talk page. --B (talk) 22:01, 29 May 2015 (UTC)
I have added a check so that if the tag is on a user talk page, I will log an error (which I can manually follow up on to either remove the tag or move it to the image where the user really wants it) [4]. Also, it's not really related to the bot, but I have modified MediaWiki:FileUploadWizard.js to automatically add the date ... so anyone who uses the upload form to put an OTRS pending tag in place will get the date with it automatically. --B (talk) 13:58, 31 May 2015 (UTC)
User:B: Instead of doing this, it may be a good idea to put nowiki tags on the page. It currently looks as if the script uses a huge variety of different templates, which might put the script in other unexpected categories. Nowiki would solve this. Compare with how I am doing it at User:Stefan2/common.js. --Stefan2 (talk) 16:07, 31 May 2015 (UTC)
@Magioladitis: Is there anything else to do for approval? --B (talk) 13:58, 31 May 2015 (UTC)


Approved requests

Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.


Denied requests

Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.

Expired/withdrawn requests

These requests have either expired, as information requested of the operator was not provided, or been withdrawn. These tasks are not authorized to run, but such lack of authorization does not necessarily follow from a finding as to merit. A bot that, having been approved for testing, was not tested by an editor, or one for which the results of testing were not posted, for example, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.