Wikipedia:Bot requests

This is a page for requesting work to be done by bots per the bot policy. This is an appropriate place to simply put ideas for bots. If you need a piece of software written for a specific article, you may get a faster response time at the computer help desk. You might also check Wikipedia:Bots/Status to see if the bot you are looking for already exists, in which case you can contact the operator directly on his or her talk page.

If you have a question about one particular bot, it should be directed to the bot owner's talk page or to the Bot Owners' Noticeboard. If a bot is acting improperly, a note about that should be posted to the owner's talk page and to the Administrators' Noticeboard. A link to that report may also be left at the Bot Owners' Noticeboard.

If you are a bot operator and you complete a request, note what you did, and archive it. {{BOTREQ}} can be used to give common responses, and to make it easier to see at-a-glance what the response is.

There are a number of common requests which are regularly denied, either because they are too complicated to program, or do not have consensus from the Wikipedia community. Please see Wikipedia:Bots/Frequently denied bots for a list of such requests, and ensure that your idea is not among them.

If you are requesting that a bot be used to add a WikiProject banner to the talk pages of all articles in a particular category or its subcategories, please be very careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively. Compare the difference between a recursive list and a properly vetted one.



De Hollandsche Molen

De Hollandsche Molen have changed their database. I'd like a bot to change URLs in the following lists:

The string http://www.molens.nl/molens.php?molenid= needs to be replaced with http://www.molens.nl/site/dbase/molen.php?mid=. I've done the Drenthe and Friesland lists already. Strings in articles will have to be a manual job, as they link to subpages which have been altered and in some cases combined. Mjroots (talk) 10:48, 5 October 2014 (UTC)
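
For whoever picks this up, a minimal sketch of the replacement, assuming pywikibot; the page titles in the list below are placeholders, not the actual set of windmill lists:

<syntaxhighlight lang="python">
# Minimal sketch of the requested URL replacement, assuming pywikibot.
# The titles below are placeholders; substitute the actual windmill list articles.
import pywikibot

OLD = "http://www.molens.nl/molens.php?molenid="
NEW = "http://www.molens.nl/site/dbase/molen.php?mid="

site = pywikibot.Site("en", "wikipedia")
titles = ["List of windmills in Groningen"]  # placeholder list

for title in titles:
    page = pywikibot.Page(site, title)
    if OLD in page.text:
        page.text = page.text.replace(OLD, NEW)
        page.save(summary="Update De Hollandsche Molen database URLs")
</syntaxhighlight>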

Maybe consider creating a template to avoid this in the future? -- Magioladitis (talk) 11:28, 8 October 2014 (UTC)

@Magioladitis: not sure what you mean by that. One would hope that now that DHM have made a major change like this, they won't be changing again anytime soon. Mjroots (talk) 18:53, 10 October 2014 (UTC)
A little under three years ago, Merseytravel changed the URL format on their website (from e.g. http://www.merseyrail.org/stations/?iStationId=2 to e.g. http://www.merseyrail.org/stations/station-details.html?station_id=2), which meant that 60 or so of our articles suddenly had dead links. I then created {{Merseyrail info lnk}} with the basic link and popped that into articles like this, so that if they change the URL format again, we would only need to amend that one template and not 60+ individual articles. It seems that I need to amend the template, because that URL is non-working again, and should become http://www.merseyrail.org/plan-your-journey/stations/ainsdale.aspx --Redrose64 (talk) 19:08, 10 October 2014 (UTC)
Thusly. --Redrose64 (talk) 20:07, 10 October 2014 (UTC)

Done. The only page that had more than one offending link was List of windmills in Groningen. I fixed them with a text editor. BMacZero (talk) 17:02, 18 October 2014 (UTC)

Bot for creating synonym redirects

Can a bot be made that finds all alternative scientific names listed in the synonym field of a taxobox or speciesbox and makes them into redirects to the article? This is something we normally do manually. It is useful in preventing duplicate articles from being created.

User:Peter coxhead points out that sometimes these boxes contain information like "Fooia bara Smith, non Fooia bara Jones". Normally, they contain items formatted like these examples:

''Slimus slimus'' <small>Betty Biologist, 1901</small><br />

*''Slimus slimus'' <small>Betty Biologist, 1901</small>

*''Slimus slimus'' Betty Biologist, 1901

The common feature is the italics.

A common alternative uses {{Specieslist}} e.g. {{Specieslist |Slimus slimus|Betty Biologist, 1901 |Slimus maximus |(Smith, 1898)}}. Also there's increasing use of {{small}} rather than <small>..</small>. Peter coxhead (talk) 08:17, 13 October 2014 (UTC)
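
For whoever writes this, a rough sketch of the extraction step, assuming plain regular expressions over the wikitext; the field name, sample text, and regexes are illustrative only, and real taxoboxes (including the {{Specieslist}} form above) would need more careful parsing:

<syntaxhighlight lang="python">
# Rough sketch: extract italicised synonyms from a taxobox "synonyms" field.
# Assumes plain regex over wikitext; real taxoboxes need more careful parsing.
import re

wikitext = """{{Taxobox
| synonyms = *''Slimus slimus'' <small>Betty Biologist, 1901</small>
*''Slimus maximus'' Smith, 1898
}}"""

# Grab the synonyms parameter (up to the next parameter or the end of the template).
match = re.search(r"\|\s*synonyms\s*=(.*?)(?=\n\s*\||\n}})", wikitext, re.DOTALL)
synonyms = []
if match:
    # The common feature is the italics: ''Name''.
    synonyms = re.findall(r"''([^']+?)''", match.group(1))

print(synonyms)  # ['Slimus slimus', 'Slimus maximus'] -> candidate redirect titles
</syntaxhighlight>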

Links:

Anna Frodesiak (talk) 01:11, 13 October 2014 (UTC)

Rjwilmsi has been performing similar tasks in the past. -- Magioladitis (talk) 10:10, 13 October 2014 (UTC)

Has Rjwilmsi performed these manually? Anna Frodesiak (talk) 01:52, 15 October 2014 (UTC)
I'm not aware that I've done anything specifically relating to taxoboxes. Magio probably meant that I have run tasks that involve extraction/manipulation of data in infoboxes. Rjwilmsi 12:15, 16 October 2014 (UTC)
I see. Okay. So, what about a bot? Wouldn't it save us all a lot of time and be useful? Anna Frodesiak (talk) 01:53, 17 October 2014 (UTC)

Main page image vandalism - adminbot request

See Talk:Main Page#Main page image vandalism. In short, because commons:user:KrinkleBot was inactive for a few days, images approaching their turn on our main page weren't getting protected at Commons, and someone exploited that vulnerability to change the TFA image to pornography. I have raised the need for a back-up image protection system in the past (Wikipedia:Bot requests/Archive 57#Bot to upload main page images in November 2013), and Legoktm got as far as Wikipedia:Bots/Requests for approval/TFA Protector Bot 2 which did not complete. Legoktm, do you fancy reviving this? Or does anyone else fancy adding this to their bot rota? Thanks, BencherliteTalk 05:36, 14 October 2014 (UTC)

TBH, I think a better use of time would be making KrinkleBot distributed so it doesn't have a single point of failure on Tool Labs. Trying to upload local copies of images just feels icky. Legoktm (talk) 07:09, 14 October 2014 (UTC)
And in practical terms that means we need to do what? Are you saying that Commons should have clones of Krinklebot running from different locations? (I have now left a message for Krinkle at Commons pointing to this discussion, incidentally.) BencherliteTalk 09:32, 14 October 2014 (UTC)
Since I am now paying for shared hosting anyway and only bandwidth is charged, I could offer to run it there, provided it's possible with reasonable effort. @Legoktm: The tricky thing here is that KrinkleBot is an admin bot, I think. Storing its password or OAuth access tokens somewhere on third-party servers could also be considered risky. -- Rillke (talk) 00:02, 15 October 2014 (UTC)
Krinkle has advised against running a duplicate bot, but has offered to let Commons admins / stewards / sysadmins have access to his Tool Labs user group. I qualify under all three heads... oh no, I don't. Does anyone here qualify and fancy a spare adminbot? BencherliteTalk 21:22, 15 October 2014 (UTC)
Krinkle has added me to the tool labs group, so I'll be able to restart/poke it if necessary. Legoktm (talk) 16:51, 17 October 2014 (UTC)

Images are tagged with "Image not protected as intended." Today's featured image seems to allow uploads at Commons. [1]. --DHeyward (talk) 17:11, 16 October 2014 (UTC)

The images do appear to have cascaded protection by Krinklebot. -mattbuck (Talk) 09:05, 17 October 2014 (UTC)
Ah, got it. That did stop uploads. Thx. --DHeyward (talk) 11:14, 17 October 2014 (UTC)

Question: how hard would it be to extend this cascaded protection to every image used on a Wikipedia page? Maybe a "revision accepted" privilege for updates at Commons? It wouldn't prevent new files, and it wouldn't prevent articles from pointing to new files, but it would prevent a bad revision from going live on a "protected" page and would also attract vandalism patrollers when an image reference is changed to a new name. It wouldn't lock any images that aren't used by Wikipedia. --DHeyward (talk) 05:18, 18 October 2014 (UTC)

Question: Add archiveurl and archivedate to links

What's the tool / gadget / script / bot that automatically or semi-automatically checks for archived versions of URLs and adds them to the citations with the archiveurl parameter?

Thank you for your time,

Cirt (talk) 00:06, 15 October 2014 (UTC)

Wikipedia:Link rot#Internet_archives lists bookmarklets that will check for archived versions of the page you are viewing, and Wikipedia:Citing sources/Further considerations#Archiving_bookmarklets lists bookmarklets that will (attempt to) create an archive version of the page you are viewing. See also the Wikipedia:Bot_requests/Archive_61#Replace_dead_links_with_Wayback_machine discussion. - Evad37 [talk] 03:01, 15 October 2014 (UTC)
Thank you! But, Evad37, what about a bot to do it? — Cirt (talk) 15:39, 16 October 2014 (UTC)
I don't think any bots currently do so. The main problem is that an automatic process could produce utter crap, such as 404 errors or "this page does not exist" notices; or the archived version could be the wrong version that doesn't verify any/all information originally obtained, i.e. if the information was removed prior to archiving, or was yet to be added when archiving occurred. The only way I can think of to get around that would be to have maintenance categories (e.g. Category:Pages with automatically added archive URLs from October 2014) track bot-added archive links, activated by a bot also adding another parameter (e.g. |auto-archived=October 2014) in the cite/citation templates, and have wikipedians working through the category, removing that parameter and any bad archive urls. - Evad37 [talk] 17:27, 16 October 2014 (UTC)
Ah okay, those don't seem like ideal solutions. Thank you, — Cirt (talk) 20:35, 16 October 2014 (UTC)
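
For reference, a minimal sketch of the lookup step such a bot would need, using the Wayback Machine availability API; inserting |archiveurl=/|archivedate= (and any tracking parameter as suggested above) into the citations is not shown, and the example URL is made up:

<syntaxhighlight lang="python">
# Minimal sketch of looking up an archived copy of a citation URL via the
# Wayback Machine availability API; editing the citation template to add
# |archiveurl=/|archivedate= (plus any tracking parameter) is not shown.
import requests

def find_archive(url, near_timestamp="20141015"):
    """Return (archive_url, timestamp) for the closest snapshot, or None."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url, "timestamp": near_timestamp},
        timeout=30,
    )
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"], closest["timestamp"]
    return None

print(find_archive("http://example.com/dead-page"))  # example URL only
</syntaxhighlight>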

Produce a list of articles

WP:NRHP maintains lists of historic sites throughout the United States, with one or more separate lists for each of the country's 3000+ counties. These lists employ {{NRHP row}}, which (among its many parameters) includes parameters to display latitude and longitude through {{coord}}. For most of the project's history, the lists used an older format with manually written coords (e.g. a page would include the code {{coord|40|30|0|N|95|30|0|W|name=House}}, whereas today they just have |lat=40.5 |lon=95.5), and when a bot was run to add the templates to the lists, it somehow didn't address the coordinates in some lists. With this in mind, I'd like it if someone could instruct a bot to discover all WP:NRHP lists that are currently using both {{NRHP row}} and {{coord}}. I tried to use Special:WhatLinksHere/Template:Coord, but it didn't produce good results: since {{NRHP row}} transcludes {{coord}} when it's correctly implemented, all of these lists have links to {{coord}}. As a result, I was imagining that the bot would perform the following procedure:

  • Go to each of the pages linked from WP:NRHPPROGRESS. All of our 3000+ lists are linked from this page, and virtually nothing else is, so this would reduce the number of false positives
  • Check to see if the page transcludes {{NRHP row}}
  • If the page does not transclude that template, record it in Results and go to the next WP:NRHPPROGRESS-linked page
  • If the page transcludes {{NRHP row}}, check to see if the characters {{coord| are present in the code of the page (basically a Ctrl+F for the code). My primary goal is to see which pages transclude {{coord}} directly, and searching for the string of text seems to be the simplest course
  • If the page transcludes {{coord}}, record it in Results; if not, don't. Either way, go to the next WP:NRHPPROGRESS-linked page

"Results" could be a spot in the bot's userspace. Since the bot won't be doing anything except editing the results page, you won't need to worry about opening a BRFA. Nyttend (talk) 01:49, 15 October 2014 (UTC)

@Nyttend: If you need these results updated regularly, then a bot or script is the way to go, and I'll leave that to the regular experts on this page. But as a one off, I've done it using AWB's "preparse mode" and created User:Nyttend/NRHP row and coord for you. -- John of Reading (talk) 13:48, 15 October 2014 (UTC)
Thank you! This is a one-off thing, since people no longer add coordinates in the old format to lists that don't already have them. And it definitely helps that you supplied the list of articles that didn't have {{NRHP row}}; I asked for this in case we had articles that never got converted to {{NRHP row}} in the first place, and it's good to know that everything has been converted properly. Nyttend (talk) 13:51, 15 October 2014 (UTC)

Add articles to the newly formed WP:Tejano

I don't know how to use bots, and the task force's scope is too large to add articles to WP:Tejano one by one. Best, .jonatalk 18:37, 15 October 2014 (UTC)

Here are some important categories for the bot to cover:

Tejano music, Banda, Duranguense, Jarocho, Ranchera, Mariachi, Norteño (music). Erick (talk) 21:17, 15 October 2014 (UTC)

Subtemplates used in mainspace

Done. -DePiep (talk) 15:29, 19 October 2014 (UTC)

In December 2013, Template:Convert was converted to Lua code (680k transclusions). The old wikicode template used subpages (subtemplates) of Template:Convert, like Template:Convert/flip2. There are some 3900 subtemplates in this pattern. To manage cleanup (e.g., improving Module:Convert or its /data page), we'd like to know which subtemplates are still used in mainspace.

Request: produce a list of all pages that have the pagename prefix (pattern) Template:Convert and that have transclusions in mainspace. Pages with zero transclusions in mainspace can be omitted (do not list).

Example:

Note 1: True subpages are listed by requiring Template:Convert/ (with slash). However, to cast the net a bit wider, that slash is omitted from the filter (we want to catch a page like "Template:Convertx" too).

Note 2: Format suggestion: add the transclusion count, link the template pagename, one bulleted line per page.

My bet would be: you'll find between 25 and 100 pages. -DePiep (talk) 19:10, 18 October 2014 (UTC)
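
In case an on-wiki approach is easier for anyone, a sketch of the same filter via pywikibot and the API; the output format is only a suggestion, and for the heavily transcluded main template a database query (as offered below) is the better tool:

<syntaxhighlight lang="python">
# Sketch of the requested report via pywikibot and the API rather than a Labs
# SQL query: every Template: page whose name starts with "Convert", with a count
# of its mainspace transclusions; pages with zero mainspace transclusions are
# skipped. Counting the 680k transclusions of {{Convert}} itself this way is
# very slow, so a database query is preferable for the full run.
import pywikibot

site = pywikibot.Site("en", "wikipedia")

for tpl in site.allpages(prefix="Convert", namespace=10):
    count = sum(1 for _ in tpl.getReferences(only_template_inclusion=True,
                                             namespaces=[0]))
    if count:
        print("* [[{}]] ({} mainspace transclusions)".format(tpl.title(), count))
</syntaxhighlight>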

Simple enough query to do on Tool Labs:
HTH Anomie 21:03, 18 October 2014 (UTC)
Thanks! You mean next time I could go to Tool Labs myself? -DePiep (talk) 21:58, 18 October 2014 (UTC)
I manually fixed the articles containing a redlink template of "Convert" followed by a number. GoingBatty (talk) 22:20, 18 October 2014 (UTC)
GoingBatty did you edit the articles? -DePiep (talk) 22:45, 18 October 2014 (UTC)
@DePiep: I only made these six edits, and had no plans to do anything else. From reading your request, I thought these edits would be outside of the scope of your subtemplate learnings. Apologies if I interrupted. GoingBatty (talk) 01:43, 19 October 2014 (UTC)
Thnx for this reply. Not a big issue and no harm, I can say now, but at the time I was surprised to find that errors I had seen a minute earlier were missing ;-). Closed, all fine. @GoingBatty:. -08:25, 19 October 2014 (UTC)
Anomie, a lot of the numbers seem off. If I go to a page with a few transclusions, the WLH page shows zero transclusions in mainspace. E.g., {{Convert/And1}} (1), {{Convert/Dual/Loff}} (1). Any explanation? (I don't think most of the drops towards zero transclusions are recent; that is, most changes are past any delay measured in days, so they should show correctly.) -DePiep (talk) 22:35, 18 October 2014 (UTC)
Huh, there is an entry in the templatelinks table that doesn't correspond with anything in the page table (and there's no index that I can use to try to search for it in the archive table). Here's a new version, which will also reflect any cleanup done since yesterday:
Template | Mainspace pages transcluding
{{Convert}} | 658760
{{Convert/CwtQtrLb_to_kg}} | 35
{{Convert/E}} | 2
{{Convert/TonCwt_to_t}} | 286
{{Convert/numdisp}} | 1
{{Convert/per}} | 1
{{Convert/words}} | 12
{{ConvertAbbrev}} | 40086
{{ConvertAbbrev/ISO_3166-1/alpha-2}} | 40086
{{ConvertAbbrev/ISO_3166-1/alpha-3}} | 629
{{ConvertAbbrev/ISO_3166-2/US}} | 39457
{{ConvertAbbrev/ISO_639-1}} | 629
{{ConvertAbbrev/ISO_639-2}} | 1
{{ConvertIPA-hu}} | 372
Anomie 12:54, 19 October 2014 (UTC)
Useful. Consider done. Thx. -DePiep (talk) 15:24, 19 October 2014 (UTC)

Bot or script to function like DASHBot

I used to really like DASHBot (talk · contribs) operated by Tim1357.

It would scan an article, find archive links, and automatically add them to the page.

Is there a bot, or script that I could even use semi-automatically, that could perform this function?

See for example DIFF.

Any help would be appreciated,

Cirt (talk) 02:05, 19 October 2014 (UTC)

Evad37, had you heard of this function by this bot before? — Cirt (talk) 02:07, 19 October 2014 (UTC)
I just found Wikipedia:WikiProject External links/Webcitebot2, which may be of interest to you or to bot programmers, but I think my previous comment still stands: an automatic process can't tell whether the archived versions actually verify the text in articles. - Evad37 [talk] 07:37, 23 October 2014 (UTC)
Thanks very much, Evad37. The fact is that DASHBot (talk · contribs) used to do this; can anyone take up the mantle and write a new bot to do the same thing as DASHBot (talk · contribs)? — Cirt (talk) 11:57, 23 October 2014 (UTC)
If the source code appeared, I'd happily take this over. As it is, I've emailed them and left them messages, with no response. I lack the time to write this from scratch, so it'll have to wait for a more enthusiastic operator, I suspect. --Mdann52talk to me! 12:55, 23 October 2014 (UTC)
Ah, I see, thank you. — Cirt (talk) 16:03, 23 October 2014 (UTC)

Bot for combining references

Today there is no bot that focuses solely on combining duplicate references. I think such a bot would be really useful both for article creators and for older, already existing articles that have new references added. That is why I now request that such a bot be created, and that it work in some way from a list of articles with non-combined references or similar. An option could be to add this task to an already existing bot. --BabbaQ (talk) 15:26, 19 October 2014 (UTC)
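
For illustration, a rough sketch of what the combining step involves (AWB's general fixes already do this more robustly, as noted below); the naming scheme and regex are assumptions, not how any existing bot does it:

<syntaxhighlight lang="python">
# Rough sketch of combining duplicate references: name the first occurrence of
# each repeated <ref>...</ref> and point later copies at it. Real wikitext needs
# more care (existing names, <ref group=...>, templates inside refs, etc.).
import re

def combine_refs(wikitext):
    pattern = re.compile(r"<ref>(.*?)</ref>", re.DOTALL)

    counts = {}
    for m in pattern.finditer(wikitext):
        content = m.group(1).strip()
        counts[content] = counts.get(content, 0) + 1

    names = {}
    def repl(m):
        content = m.group(1).strip()
        if counts[content] < 2:
            return m.group(0)          # unique reference, leave unchanged
        if content not in names:
            names[content] = "auto{}".format(len(names) + 1)
            return '<ref name="{}">{}</ref>'.format(names[content], content)
        return '<ref name="{}" />'.format(names[content])

    return pattern.sub(repl, wikitext)

text = 'A.<ref>Smith 2010, p. 1.</ref> B.<ref>Smith 2010, p. 1.</ref>'
print(combine_refs(text))
</syntaxhighlight>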

@BabbaQ: All bots that use AWB's general fixes already do this, but not as their primary task. How do you suggest a way that someone could create a "List of articles with non-combined references"? Thanks! GoingBatty (talk) 15:41, 19 October 2014 (UTC)
Is there already a similar list available for other bot tasks? In that case one could just add this task to such a list. I am no expert, but at least it is a suggestion. @GoingBatty:--BabbaQ (talk) 15:44, 19 October 2014 (UTC)
@BabbaQ: Some bots work off of template-created maintenance categories (e.g. Category:CS1 errors: dates or Category:Orphaned articles from October 2014) while others use alternate logic. GoingBatty (talk) 15:52, 19 October 2014 (UTC)

Category sort keys needed

Some 1500 stub articles have recently been added to Category:Megachile. Can any of your bots please add sortkeys to these pages so that they are sorted according to the species name like [[Category:Megachile|Mucida]]? The operation would be quite simple: If the page name begins with "Megachile" and contains two words, take the second word and use it as sort key beginning with an upper case letter. De728631 (talk) 18:58, 21 October 2014 (UTC)
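
A minimal sketch of that operation, assuming pywikibot and that the stubs carry a plain [[Category:Megachile]] tag without an existing sort key:

<syntaxhighlight lang="python">
# Minimal sketch of the requested sortkey run, assuming pywikibot and that the
# stubs use a plain [[Category:Megachile]] tag with no existing sort key.
import pywikibot

site = pywikibot.Site("en", "wikipedia")
cat = pywikibot.Category(site, "Category:Megachile")

for page in cat.articles(namespaces=[0]):
    parts = page.title().split()
    if len(parts) == 2 and parts[0] == "Megachile":
        sortkey = parts[1].capitalize()  # e.g. "mucida" -> "Mucida"
        new_text = page.text.replace("[[Category:Megachile]]",
                                     "[[Category:Megachile|%s]]" % sortkey)
        if new_text != page.text:
            page.text = new_text
            page.save(summary="Add species sort key to Category:Megachile")
</syntaxhighlight>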

I can do this. Hold on a day or two. Rcsprinter123 (gab) @ 20:59, 21 October 2014 (UTC)

Wikipedia:Village pump (proposals)/Archive 110#Bot blank and template really, really, really old IP talk pages.

The consensus of the discussion at Wikipedia:Village pump (proposals)/Archive 110#Bot blank and template really, really, really old IP talk pages. was never followed through on, so I am following through now. We would like a bot to blank and add the {{OW}} template to all IP user talk pages for which no edits have been made by the IP within the last seven years and the IP has not been blocked within the last five years. These time frames may be tightened further in future discussions. Cheers! bd2412 T 20:47, 22 October 2014 (UTC)
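
For whoever takes this on, a rough sketch of the per-IP check and blanking, assuming pywikibot; how the candidate IP talk pages are enumerated is left open, and the exact blanking text is an assumption:

<syntaxhighlight lang="python">
# Rough sketch of the per-IP check and blanking, assuming pywikibot.
# Enumerating candidate IP talk pages (e.g. from a dump or Labs query) is left
# out; "ips" is a placeholder list. Blanking text placement is an assumption.
from datetime import datetime, timedelta
import pywikibot

site = pywikibot.Site("en", "wikipedia")
now = datetime.utcnow()
ips = ["192.0.2.1"]  # placeholder

for ip in ips:
    user = pywikibot.User(site, ip)
    # Last edit by the IP, if any (tuple of page, revid, timestamp, comment).
    last_edit = next(iter(user.contributions(total=1)), None)
    if last_edit and last_edit[2] > now - timedelta(days=7 * 365):
        continue  # edited within the last seven years
    # Most recent block of the IP, if any.
    last_block = next(iter(site.logevents(logtype="block",
                                          page="User:" + ip, total=1)), None)
    if last_block and last_block.timestamp() > now - timedelta(days=5 * 365):
        continue  # blocked within the last five years
    talk = user.getUserTalkPage()
    if talk.exists():
        talk.text = "{{OW}}"  # blank the page and leave only the template
        talk.save(summary="Blank stale IP talk page and add {{OW}} per consensus")
</syntaxhighlight>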