Wikipedia:Bot requests: Difference between revisions

Revision as of 02:06, 13 February 2011

This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).

You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.

Before making a request, please see the list of frequently denied bots: requests are typically denied either because they are too complicated to program or because they lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).

Alternatives to bot requests

Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).


Please add your bot requests to the bottom of this page.
Make a new request
# | Bot request | Status | 💬 | 👥 | 🙋 Last editor (🕒 UTC) | 🤖 Last botop editor (🕒 UTC)
1 | Automatic NOGALLERY keyword for categories containing non-free files (again) | | 25 | 10 | Thryduulf 2024-08-04 01:54 | Legoktm 2024-06-24 01:34
2 | Can we have an AIV feed a bot posts on IRC? | | 8 | 3 | Legoktm 2024-06-21 18:24 | Legoktm 2024-06-21 18:24
3 | Bot to update match reports to cite template | BRFA filed | 14 | 5 | Yoblyblob 2024-06-20 21:21 | Mdann52 2024-06-20 21:11
4 | Bot to mass tag California State University sports seasons | Doing... | 5 | 4 | Frostly 2024-06-10 17:05 | Headbomb 2024-06-09 17:28
5 | Clear Category:Unlinked Wikidata redirects | | 9 | 6 | Wikiwerner 2024-07-13 14:04 | DreamRimmer 2024-04-21 03:28
6 | Fixing stub tag placement on new articles | Declined. Not a good task for a bot. | 5 | 4 | Tom.Reding 2024-07-16 08:10 | Tom.Reding 2024-07-16 08:10
7 | Bot to change citations to list defined references | Declined. Not a good task for a bot. | 3 | 2 | Apoptheosis 2024-06-09 17:44 | Headbomb 2024-06-09 16:56
8 | Adding Facility IDs to AM/FM/LPFM station data | Done | 13 | 3 | HouseBlaster 2024-07-25 12:42 | Mdann52 2024-07-25 05:23
9 | Tagging women's basketball article talk pages with project tags | BRFA filed | 15 | 4 | Hmlarson 2024-07-18 17:13 | Usernamekiran 2024-07-18 17:10
10 | Adding links to previous TFDs | | 7 | 4 | Qwerfjkl 2024-06-20 18:02 | Qwerfjkl 2024-06-20 18:02
11 | Bot that condenses identical references | Coding... | 12 | 6 | ActivelyDisinterested 2024-08-03 20:48 | Headbomb 2024-06-18 00:34
12 | Convert external links within {{Music ratings}} to refs | | 2 | 2 | Mdann52 2024-06-23 10:11 | Mdann52 2024-06-23 10:11
13 | Stat.kg ---> Stat.gov.kg | | 2 | 2 | DreamRimmer 2024-06-23 09:21 | DreamRimmer 2024-06-23 09:21
14 | Add constituency numbers to Indian assembly constituency boxes | | 3 | 2 | C1MM 2024-06-25 03:59 | Primefac 2024-06-25 00:27
15 | Bot to remove template from articles it doesn't belong on? | | 3 | 3 | Thryduulf 2024-08-03 10:22 | Primefac 2024-07-24 20:15
16 | One-off: Adding all module doc pages to Category:Module documentation pages | | 6 | 2 | Nickps 2024-07-25 16:02 | Primefac 2024-07-25 12:22
17 | Draft Categories | | 7 | 4 | DannyS712 2024-07-27 07:30 | DannyS712 2024-07-27 07:30
18 | Remove new article comments | | 3 | 2 | 142.113.140.146 2024-07-28 22:33 | Usernamekiran 2024-07-27 07:50
19 | Removing Template:midsize from infobox parameters (violation of MOS:SMALLFONT) | Resolved | 14 | 2 | Qwerfjkl 2024-07-29 08:15 | Qwerfjkl 2024-07-29 08:15
20 | Change stadium to something else in the template:Infobox Olympic games | Needs wider discussion. | 8 | 5 | Jonesey95 2024-07-29 14:57 | Primefac 2024-07-29 13:48
21 | Change hyphens to en-dashes | | 16 | 7 | 1ctinus 2024-08-03 15:05 | Qwerfjkl 2024-07-31 09:09
22 | Consensus: Aldo, Giovanni e Giacomo | | 15 | 4 | Bsoyka 2024-08-02 20:48 | Qwerfjkl 2024-08-02 20:23
23 | Cyclones | | 1 | 1 | OhHaiMark 2024-08-04 01:47 |


WildBot tag cleanup

Since User:WildBot has not been working for a while and User:Josh Parris is no longer active, can another bot be used to clean up the tags that have been left on the talk pages? – Allen4names 03:16, 28 January 2011 (UTC)[reply]

Are you aware of a page that records which of WildBot's tags have become obsolete? WildBot was giving warnings for editors, so I don't think we should remove all tags en masse. Yobot is cleaning tags using the pages I linked to you, every 3 or 4 days. -- Magioladitis (talk) 09:21, 28 January 2011 (UTC)[reply]
No, I was not aware of that page. I do think that all the WildBot tags in the article talk space should be cleaned up unless WildBot is reactivated or replaced. Will Yobot do this? BTW, When I followed the link I got the following error message at the end of the file "ERROR 1317 (70100) at line 16: Query execution was interrupted". – Allen4names 17:56, 28 January 2011 (UTC)[reply]
I had mistakenly used /* SLOW OK */ instead of /* SLOW_OK */ and the query was killed during high replication. I have taken the liberty of updating WildBot's pages and template to reflect the inactive status. — Dispenser 22:02, 31 January 2011 (UTC)[reply]
Why would you need one? Just have a bot check the article, and if the disambig link has been cleaned up, remove the WildBot tag. bd2412 T 23:58, 5 February 2011 (UTC)[reply]

Page seems broken to me today. -- Magioladitis (talk) 12:58, 6 February 2011 (UTC)[reply]

I figured you'd download the original source file and open that in AWB. Anyway, I've removed the CSV autodetect which was screwing it up and made some small improvements. — Dispenser 03:00, 7 February 2011 (UTC)[reply]

Replacement of Template:oscoor with Template:Gbmappingsmall

Would it be possible for a bot to replace all instances of {{oscoor}} with {{gbmappingsmall}}? Mjroots (talk) 22:14, 30 January 2011 (UTC)[reply]

Is there consensus for this change? --Admrboltz (talk) 02:04, 31 January 2011 (UTC)[reply]
The main changes would be:
  1. {{oscoor|TQ518027|TQ 518 027}} → {{gbmappingsmall|TQ 518 027}} - example
  2. {{oscoor|SX148971|OS grid reference SX 148 971}} → {{gbmapping|SX 148 971}} - example
  3. {{oscoor|SU552175_region:GB_scale:100000|Map sources}} → {{oscoor gbx|SU552175}} - example
  4. osgridref = {{oscoor|TF464882|TF 464 882}} → osgraw = TF 464 882 - example
The scope of the edits can be judged from this list of all the pages involved. I shall wait a few days to see if any objections are raised on this page. Additionally, change number 4 involves a change to a template so I shall also watch template talk:infobox church. But it has already received a favourable comment.
There are several special cases so I shall probably implement it as a tool to assist hand editing rather than as a bot. — [[::User:RHaworth|RHaworth]] (talk · contribs) 22:08, 31 January 2011 (UTC)[reply]
Now implemented as a tool. See template talk:oscoor. — [[::User:RHaworth|RHaworth]] (talk · contribs) 13:25, 11 February 2011 (UTC)[reply]

Taxobox maintenance, one-time

After several years of using the "unranked taxon" parameters in the {{taxobox}}, someone's pointed out that a few parameters are inconsistent with the rest of the taxobox. I've spent the last few hours resolving this issue, and I've got a bit left to go before I'm done, but there's one gargantuan mountain standing in my way-- about 26K articles that all (thankfully) share exactly the same problem.

All the articles appearing in the automatically updated Category:Taxoboxes employing both unranked_familia and superfamilia need to have the text unranked_familia replaced with unranked_superfamilia. The text appears only once on each of these pages, so a search-and-replace with no second pass should suffice. There are currently 25,892 pages catalogued under this category. Once this category is emptied out, the task should be terminated.

I appreciate any help you can offer! Thanks! Bob the WikipediaN (talkcontribs) 23:56, 30 January 2011 (UTC)[reply]

This would be very easy to do in several frameworks. Could it work with pywiki with python replace.py -cat:Taxoboxes_employing_both_unranked_familia_and_superfamilia "unranked_familia" "unranked_superfamilia"? Smallman12q (talk) 00:36, 31 January 2011 (UTC)[reply]
I can help with this task. So, there is no other logic, just replace unranked_familia with unranked_superfamilia in all cases in that category? Plastikspork ―Œ(talk) 01:01, 31 January 2011 (UTC)[reply]
No strings attached, that's all it is. I appreciate it! Bob the WikipediaN (talkcontribs) 02:01, 31 January 2011 (UTC)[reply]
Actually, I just remembered there may be a second parameter needing to be changed: |unranked_familia_authority= would need to be changed to |unranked_superfamilia_authority=. Bob the WikipediaN (talkcontribs) 06:38, 31 January 2011 (UTC)[reply]
Okay, will do that as well. Plastikspork ―Œ(talk) 06:43, 31 January 2011 (UTC)[reply]
Thanks in advance! Bob the WikipediaN (talkcontribs) 06:52, 31 January 2011 (UTC)[reply]
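For reference, a minimal sketch of what such a run could look like. It assumes the pywikibot framework (the command quoted above uses the older replace.py syntax); the category name and the two parameter renames come straight from this thread, and the edit summary is only illustrative.

import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site('en', 'wikipedia')
cat = pywikibot.Category(site, 'Category:Taxoboxes employing both unranked_familia and superfamilia')
for page in pagegenerators.CategorizedPageGenerator(cat):
    text = page.text
    # Rename the longer parameter first so the plain rename below cannot clobber it
    text = text.replace('unranked_familia_authority', 'unranked_superfamilia_authority')
    text = text.replace('unranked_familia', 'unranked_superfamilia')
    if text != page.text:
        page.text = text
        page.save(summary='Rename unranked_familia parameters to unranked_superfamilia')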
Comment These are in a category: Taxoboxes which can be fixed by being automated. Is this correct, then, that this fix would be repaired by automation? Then, is it necessary to run a bot 26,000X for a cosmetic change that will soon be superseded? These are not broken taxoboxes. The reader can read them. I don't think bots are supposed to be used for cosmetic changes. I disagree with running a bot to make a change that will, in the future, allow for ease of automating. --Kleopatra (talk) 14:54, 31 January 2011 (UTC)[reply]
Can we have more information about this "soon be superseded" claim? WhatamIdoing (talk) 19:42, 31 January 2011 (UTC)[reply]
I think there's a slight misunderstanding here. When categorizing that category, I placed it into Category:Taxobox cleanup as it's a taxobox style issue, but also placed it into the now-removed category you just mentioned. Automating the taxoboxes certainly wasn't the only means for doing this, but the category served as a means to categorize it as potential work for those looking for buggy taxoboxes in the Category:Automatic taxobox cleanup. The operative word was can. A human automating them would be ideal, but by no means do I expect anyone, even a bot, to roll out automated taxoboxes on 26K articles. The only other solution is far less controversial if even that-- to replace the hidden ranking by simply adding super in one or two places-- a change that will have zero effect on the appearance/functionality of the taxoboxes, but once completed, will allow for the final revision needing to be made to {{taxobox/core}} in order to normalize the functionality of unranked taxa. Bob the WikipediaN (talkcontribs) 00:13, 1 February 2011 (UTC)[reply]
Bot policy does not appear to allow for bot edits that are essentially null edits. I would be surprised if this obtained approval. "Cosmetic changes should only be applied when there is a substantial change to make at the same time." Bot edits also, like editing protected templates, require community consensus. Please link to the consensus discussion and the RfBA for this task. Thanks. --Kleopatra (talk) 05:16, 1 February 2011 (UTC)[reply]
I've not launched a bot. This is a formal request for a bot to carry out a series of noncontroversial edits. The bot owner has already launched the bot a good several thousand edits ago. It's not a null edit, either. And it's definitely not cosmetic; it's functional. Without this task, the last step of this normalization of the unranked taxa will cause superfamily-level unranked taxa to appear as daughters of the families. This is cleanup of poor coding, definitely not cosmetic. Bob the WikipediaN (talkcontribs) 06:31, 1 February 2011 (UTC)[reply]
Then where is the bot authorization for the task? Or is that, like your non-consensus edits to fully protected templates, something you're not required to have also?
This is why wikipedia loses editors. The policies aren't real. They're badger tools for some to use against others. And they're nothing to another entire group: administrators like you for whom policies don't apply. Fully protected? Doesn't matter. Bot authorization? Doesn't matter, it's just doing 27,000 unauthorized bot edits.
Apparently, though, I do have to follow policy simply because I'm not an administrator and can't use my administrative powers to do whatever I want.
Have it your way. I simply can't edit under these conditions where the rules are not the rules except for when it's convenient for you.— Preceding unsigned comment added by Kleopatra (talkcontribs) 06:45, 1 February 2011 (UTC)[reply]
1) If this bot task is getting opposition, the edits are by definition controversial. 2) A clearer explanation of exactly what the task is supposed to accomplish would be helpful. The vague description makes it sound like the goal might be obtainable with careful template programming instead of editing 27,000 articles. 71.141.88.54 (talk) 18:23, 2 February 2011 (UTC)[reply]
Yes, and the edits have stopped, now that I am aware of the objections. I think we should wait for the RFC to conclude, and if there is consensus, an official bot request can be filed. Thanks! Plastikspork ―Œ(talk) 01:12, 3 February 2011 (UTC)[reply]
You made over 1000 edits over many hours after I made my first objection. --Kleopatra (talk) 14:48, 3 February 2011 (UTC)[reply]
I believe this is a misunderstanding, responded at AN/I. Thanks! Plastikspork ―Œ(talk) 00:25, 5 February 2011 (UTC)[reply]
I posted my comment[1] at 07:54, 31 January 2011 with this edit summary: "whoa! 26,000 bot cosmetic changes to prep for a future not-yet-approved automation? no!" (See comment above, this thread.) You stopped making the edits at 17:07, 1 February 2011.[2] So, you're right, there is a misunderstanding, and I apologize for it. You made only about 975 edits from my whoa! post to your stopping your editing. Again, I apologize for misrepresenting the fact as "over 1000 edits" when it was under 1000 edits. --Kleopatra (talk) 01:31, 5 February 2011 (UTC)[reply]
Plastikspork didn't mean it that way. -- Magioladitis (talk) 01:37, 5 February 2011 (UTC)[reply]
You think it was just indented wrong? Could be; seemed like a strange off-target comment. --Kleopatra (talk) 01:56, 5 February 2011 (UTC)[reply]
I replied in AN/I too. Maybe you would like to take a look. [3]. -- Magioladitis (talk) 01:59, 5 February 2011 (UTC)[reply]

WebCite Bot

What about creating another bot that automatically archives urls at WebCite and adds the archive links to articles? I am aware of User:WebCiteBOT, but

I consider WP:LINKROT to be a major threat to Wikipedia that calls for a response, thus I hereby request an efficiently working bot for that purpose. Regards. Toshio Yamaguchi (talk) 16:59, 1 February 2011 (UTC)[reply]

This probably needs to be reiterated elsewhere. Checklinks is both a bot and a tool: when scanning selected web pages it proactively archives them using WebCite. When used as a tool, it semi-automatically repairs broken links using both WebCite and the Wayback Machine; if only a single repair is needed, try clicking "(info)" next to the link. — Dispenser 22:39, 1 February 2011 (UTC)[reply]
Where could I request such a bot? Toshio Yamaguchi (talk) 07:35, 3 February 2011 (UTC)[reply]
You can post to tools:~betacommand/webcite.html and I can take a look at it with a script I'm working on. ΔT The only constant 11:46, 3 February 2011 (UTC)[reply]
I have no specific page to archive right now. But thanks for the link, I will add it to my userpage for future use. I will try it as soon as I have something new to archive and compare with the archive interface of WebCite. Thanks. Toshio Yamaguchi (talk) 12:21, 3 February 2011 (UTC)[reply]

This category has an enormous backlog, and I think the clearing of the backlog could benefit from bot assistance. Would it be possible to code a bot to output possible coordinates (found by a bot search on google maps) into a list, and then have people go through the list to check these and manually add them to articles? The bot might also consider what the province is based on categories to do an even smarter search. (This might have to be worked out on a country-by-country basis.) Calliopejen1 (talk) 18:42, 2 February 2011 (UTC)[reply]

I might be able to help out with this (not committing to anything yet, if anyone else wants to have a go, feel free). I have a comment and a question though: My comment is that a cursory glance through the category reveals a whole lot of articles where a google map search isn't going to do much good (e.g. articles for events like historic battles, other articles that have no business being in the category like American Orthodox Catholic Church, etc.) We may be able to get some info for some of these articles, but my gut feeling is that the percentage of success will be fairly low. My question (and this is for the more experienced bot people) is whether a task like this one (which will be a one-time task that results in no edits to article space) requires approval to run, or if one can just run it without approval and dump the results into the bot's userspace. SnottyWong communicate 21:07, 2 February 2011 (UTC)[reply]
Google is evil and you cannot use a bot to search it, (violation of their ToS). ΔT The only constant 21:09, 2 February 2011 (UTC)[reply]
What they don't know won't hurt em... You could either search it slowly (once every 30 seconds or so), or you might be able to use their API depending on how easy it is to get a key from them. SnottyWong chat 21:14, 2 February 2011 (UTC)[reply]
Two things: they don't give out API keys any longer, and BAG cannot approve a bot that violates the ToS of a third-party website. ΔT The only constant 21:16, 2 February 2011 (UTC)[reply]
Argh. Is an approval necessary for this type of bot which doesn't edit article space? SnottyWong speak 21:23, 2 February 2011 (UTC)[reply]
From WP:BOTPOL#Approval: "any bot or automated editing process that affects only the operators', or their own, user and talk pages (or subpages thereof), and which are not otherwise disruptive, may be run without prior approval." If you need article text you'll want to use a database dump, though. Anomie 21:55, 2 February 2011 (UTC)[reply]
Thanks for the clarification. I think that in this instance, an unsophisticated bot would only be using article titles, not content. SnottyWong converse 23:02, 2 February 2011 (UTC)[reply]

Delta, you don't need an API key for this function. You can search their location database using this url: http://maps.google.com/maps/api/geocode/json?address=LOCATION&sensor=false. I have code written for this function that you can see here. It's pretty crappy looking, but it works. I'm going to try and contact google and see if we can get an exception to their TOS. Tim1357 talk 21:27, 2 February 2011 (UTC)[reply]
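A minimal sketch of calling that endpoint from Python, using only the standard library; the response fields (status, results[0].geometry.location) follow Google's published Geocoding API format, and as noted above any such use is still subject to their Terms of Service.

import json
import urllib.parse
import urllib.request

def geocode(address):
    url = ('http://maps.google.com/maps/api/geocode/json?address='
           + urllib.parse.quote(address) + '&sensor=false')
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    if data.get('status') != 'OK':
        return None                      # no match, over quota, etc.
    location = data['results'][0]['geometry']['location']
    return location['lat'], location['lng']

print(geocode('414 E. Chapman Ave, Orange, California'))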

That's a pretty good link. I don't see any other use for such a link besides automated access, so I don't understand why they'd make that link accessible to the public but then say you can't use it... SnottyWong spout 21:32, 2 February 2011 (UTC)[reply]
Well, it's supposed to be used with google maps, and only google maps. But I'm currently talking with someone from the foundation to see what we can do. Tim1357 talk 21:37, 2 February 2011 (UTC)[reply]
OK, so here we are: Currently there is a discussion with google to get a full export of the GeoCode data. If that's the case, we can find applicable coordinates for some of the articles with missing coordinates, and bypass the GeoCode API altogether. Tim1357 talk 21:40, 2 February 2011 (UTC)[reply]
Maybe there is a way to use a TIGER dump or some other such source to get coordinates out. 71.141.88.54 (talk) 22:57, 2 February 2011 (UTC)[reply]
In my opinion, at first we can use our own source (Wikipedia) instead of Google, and after that we can switch to Google search. My idea is to develop a bot that works like the interwiki bot: it searches all of the interwikis and compares coordinates; if they are the same, it fixes the languages that have incorrect coordinates or that don't have any coordinates. That is much easier, and after that we can switch to Google search. Reza1615 (talk) 18:02, 3 February 2011 (UTC) reza1615[reply]
That interwiki-like bot sounds interesting. This might also be useful: I didn't know about geonames.org. 71.141.88.54 (talk) 18:43, 3 February 2011 (UTC)[reply]
Might I gently point out that before anyone gets coding any bots, there is a WikiProject devoted to this stuff, and that it's probably best discussed with the experts over at Wikipedia talk:WikiProject Geographical coordinates? (and they're already talking to Google) You can't do coords completely automagically (User:The Anomebot2 makes valiant efforts, and even he can be 100's of miles out) and no source is completely accurate - IME a lot of Google coords are a bit out, and can be waay out; Wikipedia tends to be either spot on or badly out, on average it's a bit worse than Google. Also worth noting that if Bing doesn't know where somewhere is, it assumes Wikipedia coords are correct, which isn't always a sensible assumption.... There's a scorecard of how geocoding is getting on, the guys on their project are making manful progress. The UK is now pretty much done for instance (for real, not just an artifact of the way that scorecard records things). Le Deluge (talk) 02:28, 4 February 2011 (UTC)[reply]
  • Google maps loves Wikipedia so much that it includes a Wikipedia option[4] (NO other organization in the entire world is listed to get such free advertising from Google). The Wikipedia option taps into the coordinates listed in Wikipedia articles and displays a "W" at those coordinates. Mouse over/click on the "W" and you see the Wikipedia article. Google maps has area labels that might be used to obtain geo coordinates for large areas. For example, if you look at a Google map with the Wikipedia option selected for the article Mariposa Grove, [5] you will see Google's "Mariposa Grove" map label and see the "W" up and to the right of Google's "Mariposa Grove" map label. (Ideally, that "W" should appear closer to Google's "Mariposa Grove" map label, but that is a different issue.) OK, here is the bot idea. Get the label name and the geo coordinates for that Google label from Google for each feature (or whatever Google calls it) that Google has identified on its map. Use the bot to look for the matching-name Wikipedia article and, if it lacks coordinates, have the bot add it to a list (don't add it to the article) for someone to check. -- Uzma Gamal (talk) 14:39, 4 February 2011 (UTC)[reply]
I found a [http://py-googlemaps.sourceforge.net/ library for Python] that can use Google Maps coordinates. Reza1615 (talk) 14:54, 4 February 2011 (UTC)[reply]

Bot Request

I am requesting that a bot be created to clear the completed requests at Wikipedia:Requested_articles. For example, User:Jimbo Wales requests that the article Wikipedia be made, because it doesn't exist yet. Then I create that article, but do not delete the request at the project page. What I am requesting is a bot that automatically deletes the request for Wikipedia at that project page. I do not know whether or not a bot that does this already exists, but it could sure be helpful. Unfortunately, this could possibly also eliminate nonspecific requests, such as one that already has a Wikipedia page but where the requester is referring to something else. --114.250.35.13 (talk) 08:13, 3 February 2011 (UTC)
Strongly oppose-- humans are much more effective at creating stubs, and bots have failed numerous times in creating stubs within the WP:TOL WikiProject. If a branch of science with unique names for everything has issues, imagine the issues that would arise in other fields. Also, there is a very nice-sized article at Wikipedia and has been for a long time. Bob the WikipediaN (talkcontribs) 12:58, 3 February 2011 (UTC)[reply]

Sorry, but that is not what the request is for. The bot would not be creating stubs, just noting that they were created. ▫ JohnnyMrNinja 13:01, 3 February 2011 (UTC)[reply]
My apology! I shouldn't be allowed to edit Wikipedia so early in the morning! That proposal sounds much better.
  1. Provisional support: Provided the bot only clears blue links that link to the subject without redirect, I support. Bob the WikipediaN (talkcontribs) 13:54, 3 February 2011 (UTC)[reply]
  2. Support Totally useful. Sometimes the requests sit around for ages, or they used to. If that is still the case this would be a useful maintenance bot. --Kleopatra (talk) 15:23, 3 February 2011 (UTC)[reply]
  3. Weak oppose Doesn't need automation; it's easy enough to fix bluelinks manually once in a while. If the bluelinks sometimes sit in Requested Articles for a long time after article creation, they're obviously not causing problems. I actually like it when they stay around for a while, since they sometimes need human review (or are just interesting to look at). Also, given the occasional formatting errors in Requested Articles, it could be easy for the bot to clobber useful but malformed entries and surrounding entries by accident. This bot proposal seems like another example of a solution looking for a problem. 71.141.88.54 (talk) 18:51, 3 February 2011 (UTC)[reply]

Article page is a dabpage, talk page is a redirect

Request a bot which can detect when an article page is a dab page but the talk page associated with it is a redirect, and then fix it by replacing the talkpage redirect with the {{WikiProject Disambiguation}} template. If it could also tag the talk pages of all dab pages which are missing the template as well that would be even better :) Thanks, DuncanHill (talk) 15:17, 3 February 2011 (UTC)[reply]

Consensus for that? -- Magioladitis (talk) 16:02, 3 February 2011 (UTC)[reply]
There actually seems to be consensus against WikiProject tagging dab pages that have redlinked talk pages: Wikipedia talk:WikiProject Disambiguation/Archive 24#Talk page tagging. Changing a redirect to a project template seems reasonable, though - Ditto for pages already bluelinked - but you should try and get an affirmative consensus for that. –xenotalk 16:09, 3 February 2011 (UTC)[reply]
Well, I initially asked at VPT if there was a bot that could fix the redirects. There were two replies, one pointing me here and the other saying that it should tag all the talk pages, so I came here. DuncanHill (talk) 16:13, 3 February 2011 (UTC)[reply]
I would also prefer that it did all the pages. --Kumioko (talk) 16:34, 3 February 2011 (UTC)[reply]
This is a decision that needs to be made at the WikiProject talk page. They already had a previous consensus not to tag talk pages with their banner if the talk pages are red. –xenotalk 16:36, 3 February 2011 (UTC)[reply]
How about just fixing the bad redirects? Is that too controversial? DuncanHill (talk) 16:37, 3 February 2011 (UTC)[reply]
No, that seems fine. Desirable, even. –xenotalk 16:43, 3 February 2011 (UTC)[reply]
I could do it. AWB has a skip if doesn't exist feature. I need to be sure that the project agrees. -- Magioladitis (talk) 16:50, 3 February 2011 (UTC)[reply]
Since we are on the topic. I suggest if the dab page includes red links then get rid of those too. People shouldn't be adding articles that don't yet exist to the DAB pages per the MOS but they do it all the time. --Kumioko (talk) 18:14, 3 February 2011 (UTC)[reply]
Kumioko, redlinks in dab pages are perfectly fine, you will never get consensus to remove them with a bot. See Wikipedia:MOSDAB#Red links. On another topic, is this intended to be a one-time run, or something that should be run once per week or once per month? I'm not very familiar with AWB, but if this is more than a one-time thing, it might be better implemented using a more automated process. After all, there are almost 200,000 dab pages to check. Again, I'm not familiar with AWB, but my understanding is that it is something where each edit must be manually accepted by the user (please correct me if I'm wrong). Even if you could check a dab page on average every 5 seconds, it would take you over 11 days to finish the task (assuming you were working 24 hours a day). SnottyWong chat 18:20, 3 February 2011 (UTC)[reply]
AWB can scan the pages to check for redirects on its own (preparse) without human intervention, or a dump report can be requested to show how many dab pages have talk pages that are redirects. AWB can also be run in bot mode. –xenotalk 18:25, 3 February 2011 (UTC)[reply]
  • I support having a bot fix all the redirecting talk pages. I also support bot tagging all redlinked disambig talk pages. I oppose very strongly any scheme for automated removal of red links. Some red links are for articles that we unquestionably should have, but just have not made yet (for example, many names of people who served on U.S. state supreme courts). Perhaps we can generate a list of disambig pages containing red links, and go over the list manually. bd2412 T 23:55, 5 February 2011 (UTC)[reply]

(edit conflict) The SQL, so no one actually has to write it, is:

SELECT CONCAT("* [[Talk:", talk.page_title, "]]")
FROM page
JOIN categorylinks ON cl_from=page.page_id
JOIN page AS talk ON talk.page_title=page.page_title AND talk.page_namespace=1

WHERE page.page_namespace=0
AND page.page_is_redirect=0
/* in categories */
AND cl_to IN ("All_disambiguation_pages", "All_set_index_articles")
AND talk.page_is_redirect=1;

4607 rows in set (9.34 sec) if you're wondering. If you want redirects with tagged talk pages you can use catscan. — Dispenser 18:47, 3 February 2011 (UTC)[reply]
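If the project agrees, the fix itself could look something like the sketch below (pywikibot assumed). It walks the same category the query above uses and touches only talk pages that already exist as redirects, so red-linked talk pages are left alone per the earlier consensus.

import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site('en', 'wikipedia')
cat = pywikibot.Category(site, 'Category:All disambiguation pages')
for dab in pagegenerators.CategorizedPageGenerator(cat, namespaces=[0]):
    talk = dab.toggleTalkPage()
    if talk.exists() and talk.isRedirectPage():
        talk.text = '{{WikiProject Disambiguation}}'
        talk.save(summary='Replace talk page redirect with {{WikiProject Disambiguation}}')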

Tagging files for WikiProject Video games

Could someone please train a bot to make a one-time sweep for WP:WPVG? Basically, we are hoping that a bot will check all of the articles tagged with Template:WikiProject Video games, make note of all local images that are image-linked on those articles, and then tag the talk pages of those files with Template:WikiProject Video games. The template can automatically tell when it is on a file, and will place that image in a file category. Editors have done this manually for a while, but there are still a lot more to tag. This will help us maintain our images. We may want the bot to run again at some point, but I don't think we need one constantly sweeping. Thanks! ▫ JohnnyMrNinja 22:06, 3 February 2011 (UTC)[reply]

I count roughly 3,500 pages in the "File talk" namespace that have the WPVG banner. So you want a bot to identify all of the articles that link to those 3,500 images, and tag their talk pages with a WPVG banner? How can you be sure that all of the articles that link to these images are actually articles on video games (or that otherwise would be appropriate for inclusion in WPVG)? SnottyWong communicate 22:52, 3 February 2011 (UTC)[reply]
I thought this proposed it the other way around – go through VG articles and tag linked images (files) with the banner. —  HELLKNOWZ  ▎TALK 23:00, 3 February 2011 (UTC)[reply]
(edit conflict) I think Johnny is asking for a bot to tag all the images that are used in VG articles and aren't already tagged. –xenotalk 23:01, 3 February 2011 (UTC)[reply]
Ahh gotcha, I misread that. In that case, is it appropriate to assume that every image used on every video game article is appropriate for inclusion into WPVG? SnottyWong spout 23:03, 3 February 2011 (UTC)[reply]
For instance, just looking at some random articles, I find that Cybersex is part of WPVG, and the image File:Cybersex pic.JPG is part of that article. Should that image really be tagged with a WPVG banner? Some other random images I found that would be tagged: File:Nvidiaheadquarters.jpg, File:Nvidia logo.svg (from Nvidia), and File:Closed captioning symbol.svg, File:Closed Caption Demonstration Still-Felix.png, File:Cc3tout.jpg (from Closed captioning). Also, the majority of these images do not have a talk page at this point. Would it be appropriate for a bot to create potentially thousands of new talk pages solely to add a wikiproject banner to them? SnottyWong confabulate 23:16, 3 February 2011 (UTC)[reply]
For the record, I'm not opposed to the idea, I just want to make sure it's fully thought out before it is implemented. SnottyWong comment 23:53, 3 February 2011 (UTC)[reply]
I thought I responded already, but I think I had the same edit conflict as xeno above. The only thing that is vital to be avoided is that the bot must skip any image in a template, any image that is not image-linked in the text of the article. There are certainly images that will be tagged that shouldn't be, but as no other projects tag images in this way, and nobody looks at image talk pages anyway, I don't think anyone will mind besides us. And again, it should be local files. These will be either A:Non-free or B:moved to Commons. This tagging is useful to us because by the nature of our project, we have a TON of non-free files that need to be wrangled. ▫ JohnnyMrNinja 01:34, 4 February 2011 (UTC)[reply]
At this time, there are 24363 locally-uploaded files used in WPVG articles: 1, 2, 3, 4, 5. 3242 files are currently tagged for WPVG. I made no attempt to exclude images in templates or non-linked images, it's just the results of a query on all images used in the pages. Anomie 02:50, 4 February 2011 (UTC)[reply]
Okay, so there could be 20,000 images to tag (though there may be significantly fewer due to template images). A one-time bot to tag these specific photos; any takers? ▫ JohnnyMrNinja 22:55, 4 February 2011 (UTC)[reply]
I don't have time today, but I could probably start the process to make this happen on Monday if no one else has done it before then. However, I'd want to be sure that there are no problems with a bot creating thousands of talk pages solely for the purpose of adding a wikiproject banner. Can anyone comment on that? SnottyWong confer 23:21, 4 February 2011 (UTC)[reply]
You may want to see the discussion at Wikipedia talk:Bot policy about that. :) Avicennasis @ 18:05, 2 Adar I 5771 / 6 February 2011 (UTC)
As far as having community support, this was discussed at the main project page, and creating these talk pages for VG articles has been standard practice for some time (that's why there are so many already tagged, as mentioned above). ▫ JohnnyMrNinja 02:31, 7 February 2011 (UTC)[reply]
Made a request at Wikipedia:Bots/Requests for approval/Snotbot 2. SnottyWong squeal 18:11, 8 February 2011 (UTC)[reply]
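For anyone picking this up, a rough sketch of the tagging pass, assuming pywikibot (exact method names vary between versions and should be double-checked). Note that imagelinks() also returns images pulled in through templates, so the "skip images in templates" filter requested above still needs to be added.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')
banner = pywikibot.Page(site, 'Template:WikiProject Video games')
for talk in banner.getReferences(only_template_inclusion=True, namespaces=[1]):
    article = talk.toggleTalkPage()
    for image in article.imagelinks():
        if image.fileIsShared():                 # hosted on Commons, skip
            continue
        file_talk = image.toggleTalkPage()
        existing = file_talk.text if file_talk.exists() else ''
        if 'WikiProject Video games' in existing:
            continue                             # already tagged
        file_talk.text = '{{WikiProject Video games}}\n' + existing
        file_talk.save(summary='Tag file used in a WPVG article (one-time run)')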

Bot for geo coordinates where article lists street address

Category:Articles needing coordinates includes articles that need to have the relevant coordinates (latitude;longitude) added. Lewis Ainsworth House is listed in a subcategory of Category:Articles needing coordinates. In addition, Lewis Ainsworth House uses a template where an address location is listed. In particular, its {{Infobox nrhp}} lists "location = 414 E. Chapman Ave<br>[[Orange, California]]." Now, if you add 414 E. Chapman Ave, Orange, California to gpsvisualizer.com, you get 33.787698,-117.849617. So here's the bot idea: Have the bot search out all articles listed in Category:All articles needing coordinates that also use {{Infobox nrhp}} AND have the parameter "location =" filled in. The bot should then get the location info, pass the street address information through a program such as gpsvisualizer.com to find its latitude and longitude. Then, take that latitude and longitude and add it to that article's {{Infobox nrhp}}. Then remove the article from Category:Articles needing coordinates. There might be other templates that use address locations that also are missing their geo coordinates. -- Uzma Gamal (talk) 14:09, 4 February 2011 (UTC)[reply]
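A sketch of the first half of this idea, building the review list rather than editing articles, assuming pywikibot and a simple regex for the |location= parameter; the address cleanup is deliberately crude and the geocoding step is left to whatever service turns out to be acceptable.

import re
import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site('en', 'wikipedia')
cat = pywikibot.Category(site, 'Category:All articles needing coordinates')
for page in pagegenerators.CategorizedPageGenerator(cat, namespaces=[0]):
    text = page.text
    if '{{infobox nrhp' not in text.lower():
        continue
    match = re.search(r'\|\s*location\s*=\s*(.+)', text, re.IGNORECASE)
    if not match:
        continue
    # Turn "414 E. Chapman Ave<br>[[Orange, California]]" into a plain address
    address = re.sub(r'<br\s*/?>', ', ', match.group(1))
    address = re.sub(r'\[\[(?:[^|\]]*\|)?([^\]]*)\]\]', r'\1', address).strip()
    print('%s\t%s' % (page.title(), address))    # feed this list to a geocoder, then check by hand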

The external link Brazilian Tourism Portal, present in about 700 articles, is broken. Even the "good" link [6] does not seem to be a good link to the articles (no useful info). Could anyone create a bot to solve this problem? I think the best thing to do is just remove all links. Caiaffa (talk) 14:21, 4 February 2011 (UTC)[reply]

I only found about 15 or so using that exact link, and about 40-50 using the domain (direct links to files broken). Noom talk contribs 15:57, 6 February 2011 (UTC)[reply]
You are probably right, I used the "search button" and didn't check the differences. Sorry for the inconvenience. Caiaffa (talk) 17:24, 6 February 2011 (UTC)[reply]

interwiki coordination bot

Has anyone developed an interwiki coordinate bot that can check other wikis and, like the interwiki bot, compare them and add or remove the coordinate template? Also, I found a [http://py-googlemaps.sourceforge.net/ library for Python] that can use Google Maps coordinates. Reza1615 (talk) 14:56, 4 February 2011 (UTC)[reply]

Bot Request, Please

Hey All, I was wondering if someone could do an assessment job on all the articles in Category:Unassessed Albemarle County articles via a bot. They would just need to match the assessments of the already existing templates. Like if WP:FOO is Class C with Low Importance, WP:ALVA (the WP link for the project connected to this category) would be the same. Could someone do this? - NeutralhomerTalk02:36, 7 February 2011 (UTC)Go Steelers![reply]

There is a bot that already does this. It's pretty easy and the owner (xeno) is prompt and gets the job done in a reasonable timeframe. Click here to file a new request. Make sure you say "yes" under the section asking about default logic. Good luck, and ask questions if you don't understand the form. Tim1357 talk 23:53, 8 February 2011 (UTC)[reply]
I think User:Anomie (who I asked first, on Sunday, at the beginning of the Super Bowl) is going to take care of it. She has a page up and everything. Thanks though. :) - NeutralhomerTalk00:06, 9 February 2011 (UTC)[reply]

Cleanup {{cite doi}} templates

{{cite doi}} templates should have the doi as an argument. Some of these templates were malformed, so they need to be cleaned up.

The full list is

2

These templates would need to be moved from {{cite doi/doi:foobar}} to {{cite doi/foobar}}. Then, the articles linking to {{cite doi/doi:foobar}} should have their {{cite doi}} template updated from {{cite doi|doi:foobar}} to {{cite doi|foobar}}. When that's done, the {{cite doi/doi:foobar}} should be tagged as {{db-g6}} per uncontroversial maintenance, as it would be an unlikely and unused redirect. Headbomb {talk / contribs / physics / books} 04:38, 9 February 2011 (UTC)[reply]
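A sketch of those three steps, assuming pywikibot; variants with extra whitespace such as {{cite doi | doi:...}} are not handled here, and a real run would of course go through BRFA first.

import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site('en', 'wikipedia')
for subpage in pagegenerators.PrefixingPageGenerator('Cite doi/doi:', namespace=10, site=site):
    fixed_title = subpage.title().replace('/doi:', '/', 1)
    subpage.move(fixed_title, reason='cite doi subpages should not repeat the doi: prefix')
    for article in subpage.getReferences(only_template_inclusion=True, namespaces=[0]):
        article.text = article.text.replace('{{cite doi|doi:', '{{cite doi|')
        article.save(summary='Fix malformed {{cite doi}} argument')
    # Once nothing transcludes the old name, the leftover redirect can be tagged {{db-g6}}.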

Done. Many of these I had done before, but they were subsequently recreated. Perhaps we could make {{cite doi}} strip off the "doi:" and/or flag these. Plastikspork ―Œ(talk) 06:41, 9 February 2011 (UTC)[reply]
Doubtful, as this would require parser functions which I don't think have been enabled here. Regardless, many thanks for the quick job! Headbomb {talk / contribs / physics / books} 06:42, 9 February 2011 (UTC)[reply]
I was thinking we could add a {{#ifeq:{{lc:{{str left| {{{1}}} |4 }} }} | doi: | [[Category:Cite doi requiring repair]] }} or something simlar to {{cite doi}}. But, if it doesn't happen that often, then it's probably not worth it. Plastikspork ―Œ(talk) 06:46, 9 February 2011 (UTC)[reply]

Searching for bot

hello,

I need a bot which archives my user talk page each month (and don't ask me why each month). Thank you.-- ♫Greatorangepumpkin♫ T 15:20, 10 February 2011 (UTC)[reply]

See User:MiszaBot/Archive HowTo or User:ClueBot III#How to archive your page. Anomie 17:04, 10 February 2011 (UTC)[reply]

Would a bot kindly update Portal:Tropical cyclones/Active tropical cyclones? --Perseus8235 17:45, 10 February 2011 (UTC)[reply]

Links:
RSMC/TCWC: NHC, CPHC, JMA, IMD, MFR, BMG, BOM, FMS, Well
NMHSS: Met Office, JTWC, NWS Guam, PAGASA, CWB, HKO, CMA STI, KMA, TMD, VMD, CMA
Other links: IRC, NDCC, GMDSS, APECDI, WX Trop, MT Archive, NRL imagery
Running Best Tracks: NRL RBT, NHC RBT.

Help with citation cleanup (tables)

I'd like for someone to go through Category:Cite doi templates / Category:Cite hdl templates / Category:Cite jstor templates / Category:Cite pmc templates / Category:Cite pmid templates and build tables akin to

Etc...

These tables could be hosted somewhere at WP:JOURNALS and maybe in parallel on the toolserver? These tables would immensely help with template cleanup/completion. Headbomb {talk / contribs / physics / books} 22:46, 10 February 2011 (UTC)[reply]

Help with citation cleanup (templates)

Several templates created by Citation bot have had their structure rot over time. Some of it was due to sloppy bot edits, others to human mistakes, and others to sloppy human editing. There's a general dislike for bot edits which do not create actual changes in appearance, but these are all hosted in the template space, with very very few people watching them (aka, this would really not annoy a lot of people, and the people likely to be watching these templates would also be the ones to appreciate their tidying up). A "good" template should be formatted in this manner (aka in this order):

{{cite journal

 |last= |first= |authorlink=
 |last1= |first1= |author1link=        Remove empty parameters, if no author-related parameter remains, keep |last1= and |first1=
 |last2= |first2= |author2link=

...

 |author= |authorlink=
 |author1= |author1link=               Remove empty parameters, if no author-related parameter remains, add  |last1= and |first1=
 |author2= |author2link=

...

 |date=
 |year=                                Remove empty parameters, if no date-related parameter remains, add |year=
 |month=

...

 |title=                               Remove empty parameters, |title= should always be present
 |language=
 |transtitle=
 |url=
 |format=

...

 |journal=                             Remove |series= if empty; the others should always be present
 |series=
 |volume=
 |issue=
 |pages=

...

 |publisher=                           Remove empty parameters, also remove |location= if |publisher= is empty
 |location=

...

 |bibcode=                             Remove |isbn=, |issn=, |oclc=, and |id= if empty,
 |doi=                                 add |bibcode=, |doi=, |jstor=, |pmc=, and |pmid= if missing
 |isbn=
 |issn=
 |jstor=
 |oclc=
 |pmc=
 |pmid=
 |id=

...

 |accessdate=                          Remove empty parameters, except |accessdate= if |url= is present
 |archiveurl=
 |archivedate=
 |laysource=
 |laysummary=
 |laydate=

...

 |quote=                               Remove empty parameters
 |ref=
 |separator=
 |postscript=

...
<!--UNUSED DATA-->
                                       All other parameters should be moved here
}}

This would have two great benefits. If the templates are formatted consistently and legibly, newcomers would be much less intimidated by the structure of these templates, and both regulars and newcomers will benefit from the improved readability. Compare [7] and [8] for instance. Plus, with the parameter pruning, you can immediately see what is missing, and what should be added, instead of being misled into finding a journal's publisher or wondering what a "separator" is or what the "series" refers to. This would in turn facilitate and encourage good citation completion/maintenance by humans. A daily run wouldn't be needed for this, but a one-time run over all citations combined with monthly runs would be so incredibly helpful here. Once we do {{cite journal}}, we could move on to the other templates ({{citation}}, {{cite book}}, etc...). I've notified Smith609 (who runs Citation bot) for feedback here. Headbomb {talk / contribs / physics / books} 23:42, 10 February 2011 (UTC)[reply]
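To make the proposal concrete, here is a small Python sketch of the reordering/pruning pass for a single {{cite journal}} transclusion. It is only illustrative: it splits on "|" (so pipes inside values, e.g. in wikilinks, would need a real parser such as mwparserfromhell), and the keep-if-empty set is a simplification of the rules in the layout above.

CANONICAL_ORDER = [
    'last1', 'first1', 'author1link', 'last2', 'first2', 'author2link',
    'author', 'authorlink', 'author1', 'author2',
    'date', 'year', 'month',
    'title', 'language', 'transtitle', 'url', 'format',
    'journal', 'series', 'volume', 'issue', 'pages',
    'publisher', 'location',
    'bibcode', 'doi', 'isbn', 'issn', 'jstor', 'oclc', 'pmc', 'pmid', 'id',
    'accessdate', 'archiveurl', 'archivedate', 'laysource', 'laysummary', 'laydate',
    'quote', 'ref', 'separator', 'postscript',
]
KEEP_IF_EMPTY = {'last1', 'first1', 'year', 'title', 'journal', 'volume', 'issue', 'pages'}

def tidy(wikitext):
    body = wikitext.strip()[2:-2]                 # drop the surrounding {{ }}
    parts = [p.strip() for p in body.split('|')]  # naive split, see note above
    name, params = parts[0], {}
    for part in parts[1:]:
        if '=' in part:
            key, value = part.split('=', 1)
            params[key.strip()] = value.strip()
    lines = ['{{' + name]
    for key in CANONICAL_ORDER:
        value = params.pop(key, '')
        if value or key in KEEP_IF_EMPTY:
            lines.append(' |%s=%s' % (key, value))
    if params:                                    # anything unrecognised goes to the end
        lines.append('<!--UNUSED DATA-->')
        lines.extend(' |%s=%s' % item for item in params.items())
    lines.append('}}')
    return '\n'.join(lines)

print(tidy('{{cite journal |author1=Smith, J. |year=2010 |title=Example |journal=Foo |volume=12 |pages=1-5 |doi=10.1000/xyz |oddball=keep me}}'))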

Sounds like a great plan! Citation bot now formats new cite doi templates in this general manner, but I don't think that it modifies existing parameter orders, for fear of upsetting editors. Martin (Smith609 – Talk) 23:54, 10 February 2011 (UTC)[reply]
In articles it would be a problem as you often have 10-20 references formatted in a specific way. For instance {{cite book}}/{{cite conference}}/{{cite press}}/{{cite journal}} all have different parameters, and a global parameter ordering might have been imposed, instead of a local ordering (aka, {{cite journal}} are ordered one way, but {{cite press}} ordered in another way...). Watchlists get cluttered with edits which some feel are low-value, and will often disturb a conscious decision about how to present citation templates in the edit window. In the {{cite doi/...}} templates however, virtually no one watches them, and you can't "upset" an article's consistency.Headbomb {talk / contribs / physics / books} 00:05, 11 February 2011 (UTC)[reply]

Help with citation cleanup (updated and withdrawn papers)

At User talk:Citation bot#Withdrawn_papers it is noted that on occasion cited papers are updated or withdrawn. A maintenance bot or other tool could follow cited pubmed, doi, or other database identifiers to check for such, then (where appropriate) tag the citation for human attention, possibly amending the citation in the process (e.g. changing |title=Dewey Wins! to |title=Withdrawn: Dewey Wins!, or whatever the database indicates). Martin advises this is beyond Citation bot's scope, so it would need to be a different tool. Given that {{cite doi}} and its ilk bury information in subpages where it is rarely seen, these should get priority. LeadSongDog come howl! 16:24, 11 February 2011 (UTC)[reply]

One example of the above is DOI 10.1002.2F14651858.CD000007 PMID 19588315 as seen at
  • Attention: This template ({{cite doi}}) is deprecated. To cite the publication identified by doi:10.1002.2F14651858.CD000007, please use {{cite journal}} (if it was published in a bona fide academic journal, otherwise {{cite report}}) with |doi=10.1002.2F14651858.CD000007 instead.

and

  • Attention: This template ({{cite pmid}}) is deprecated. To cite the publication identified by PMID 19588315, please use {{cite journal}} with |pmid=19588315 instead.

LeadSongDog come howl! 16:50, 11 February 2011 (UTC)[reply]

Book report bot, take two

I posted something similar a while ago, but I guess it was rather a daunting task, so I'm scaling it down a bit. Previously, I wanted a bot that creates a report of all problems with the books ({{citation needed}} tags, {{POV}} tags ...), but that doesn't look like it'll happen. So, I'm scaling the request down to only reporting what assessment class the articles of a book are.

Book:Helium Talk page report
Helium
An overview
Overview
Helium
Helium atom
Liquid helium
Isotopes
Isotopes of helium
Helium-2
Helium-3
Helium-4
Miscellany
Alpha particle
Heliair
Heliox
Helium–neon laser
Helium hydride ion
Hydreliox
National Helium Reserve
RasGas
Rollin film
Trimix

[Example report table, rendered with {{Book report start}}, one {{Book report}} row per article, and {{Book report end}}]

These reports would be done for all Category:Wikipedia books (community books), and updated daily. Of course, if someone feels like taking up the original request, that would also be peachy. WikiProject Wikipedia-Books has been notified of this request. Headbomb {talk / contribs / physics / books} 04:08, 11 February 2011 (UTC)[reply]

Hmm, I read your other request too, and I agree with the previous comment about daily updates being too often. When the bot generates a report for a book, where should the report be posted? Noom talk contribs 18:36, 11 February 2011 (UTC)[reply]
On the book's talk page. Daily updates might have been overkill for all the cleanup tags (mostly because I have no idea how often these would change), but for the assessment classes I don't think it's too often, as reassessments are usually few and far between. If daily turns out to be too often, it shouldn't be a big deal to make it wait two/three/seven days between report updates. Headbomb {talk / contribs / physics / books} 18:44, 11 February 2011 (UTC)[reply]
I don't think it should be too hard to fulfil your first request. As for run rates, I would probably say once per week for article assessments, indexing them for the next run to see if there was any change, but I'm not sure how often to run a check on the articles' contents. Noom talk contribs 19:00, 11 February 2011 (UTC)[reply]
Well that could be figured out during trial. Headbomb {talk / contribs / physics / books} 20:29, 11 February 2011 (UTC)[reply]
Coding... I'll get started then. Noom talk contribs 00:01, 12 February 2011 (UTC)[reply]
However, on some article pages, there are multiple ratings for different wikiprojects. The book you gave me seems to use the V0.5 template, but the book I was testing the bot code on used different templates. Also, should a report be opt-in or opt-out? Noom talk contribs 02:33, 12 February 2011 (UTC)[reply]
It should be exclusion-compliant (aka, responsive to ), but nothing special other than that. Headbomb {talk / contribs / physics / books} 03:18, 12 February 2011 (UTC)[reply]


See User:NoomBot/BookTest for 3-4 examples of book reports. Going to set the bot to append a couple more reports to see if the formatting works for several of them. Also adding more 'problem' templates to detect. Noom talk contribs 18:35, 12 February 2011 (UTC)[reply]
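For what it's worth, the core of the report generation can stay quite small. A sketch assuming pywikibot; it only reads the first |class= it finds on each talk page, so articles carrying several project banners (the case raised just above) would need smarter handling.

import re
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
book = pywikibot.Page(site, 'Book:Helium')
for article in book.linkedPages(namespaces=[0]):
    talk = article.toggleTalkPage()
    text = talk.text if talk.exists() else ''
    match = re.search(r'\|\s*class\s*=\s*([A-Za-z]+)', text)
    rating = match.group(1) if match else 'Unassessed'
    print('%-40s %s' % (article.title(), rating))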

Thousands of find/replace edits may be needed

If the requested move discussion at Talk:New York City Subway succeeds, there will need to be a couple of hundred of page moves and several thousands of find/replace runs for "New York City Subway". The find/replace runs can be done automatically. If the bot ignores image/interwiki links (but not wikilinks), there shouldn't be any false positives. It's too much for assisted AWB to do, so can someone with a bot take on this task, assuming the discussion results in "move"? — Train2104 (talk • contribs • count) 16:41, 12 February 2011 (UTC)[reply]

Hi all,

I have the problem described in this post, but am not clear on how to apply a solution: http://en.wikipedia.org/wiki/Wikipedia:Bots/Requests_for_approval/Erik9bot_5

Like the Irish Times mentioned in this post, my news website has changed its URL. How do I update the 2,000+ links to my site in Wikipedia?

Apparently this fix was created by someone who has been banned so I can't ask him to explain. "This account is a sock puppet of John254 and has been blocked indefinitely."

Thanks much Michelle — Preceding unsigned comment added by Mnicolosi (talkcontribs) 22:59, 12 February 2011 (UTC)[reply]

Signed your comment. Could you also provide the link that needs replacing? Noom talk contribs 23:11, 12 February 2011 (UTC)[reply]


Thanks for the help with the signature. Sorry, not quite used to this system. All URLs in Wikipedia that point to seattlepi.nwsource.com need to be updated to seattlepi.com. Details on the background behind our URL change are here if you're interested: http://en.wikipedia.org/wiki/Seattle_Post-Intelligencer

Thanks much, Michelle Mnicolosi (talk) 02:06, 13 February 2011 (UTC)[reply]
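For whoever takes this on, a minimal sketch of the replacement run, assuming pywikibot's external-link lookup (exturlusage, which wraps Special:LinkSearch); the edit summary is illustrative and a real run would need bot approval first.

import pywikibot

site = pywikibot.Site('en', 'wikipedia')
old_domain, new_domain = 'seattlepi.nwsource.com', 'seattlepi.com'
for page in site.exturlusage(old_domain, namespaces=[0]):
    new_text = page.text.replace(old_domain, new_domain)
    if new_text != page.text:
        page.text = new_text
        page.save(summary='Update seattlepi.nwsource.com links to seattlepi.com')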