Wikipedia:Bot requests
Commonly Requested Bots
This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).
You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.
Before making a request, please see the list of frequently denied bots; requests there have been declined either because they are too complicated to program or because they lack consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) be added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).
- Alternatives to bot requests
- WP:AWBREQ, for simple tasks that involve a handful of articles and/or need to be done only once (e.g. adding a category to a few articles).
- WP:URLREQ, for tasks involving changing or updating URLs to prevent link rot (specialized bots deal with this).
- WP:USURPREQ, for reporting that a domain has been usurped, e.g. so its links can be marked with |url-status=usurped.
- WP:SQLREQ, for tasks which might be solved with an SQL query (e.g. compiling a list of articles according to certain criteria).
- WP:TEMPREQ, to request a new template written in wiki code or Lua.
- WP:SCRIPTREQ, to request a new user script. Many useful scripts already exist, see Wikipedia:User scripts/List.
- WP:CITEBOTREQ, to request a new feature for WP:Citation bot, a user-initiated bot that fixes citations.
Note to bot operators: The {{BOTREQ}} template can be used to give common responses and to make it easier to keep track of a task's current status. If you complete a request, note that with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).
Press multi adder
I'd like to request a bot that would go through the news, discover when links are made to Wikipedia articles, and create a press multi on the talk page. Ideally, the bot would scrape Google News and see which articles contain a Wikipedia link. Does anyone support this? Smallman12q (talk) 20:09, 5 May 2009 (UTC)
- Any comments? Smallman12q (talk) 01:32, 11 May 2009 (UTC)
- Possible. I suppose it could be done and would be a decent idea, but I'm not going to do it. I already have one bot that's in trial, one whose case is open, and one that I'm coding right now. I'd do it, but I have too many things to handle right now. Sorry. The Earwig (Talk | Contributions) 01:40, 11 May 2009 (UTC)
- Well at least I know it's possible now =D. Smallman12q (talk) 14:18, 16 May 2009 (UTC)
I like the idea - I'm going to have a look at it over the next 1/2/3 weeks! Anyone else welcome to help! dottydotdot (talk) 20:01, 19 May 2009 (UTC)
- Although, having had a quick look into it, it looks like that's already being done on some articles - for instance the Ted Kennedy article & the Maurice Jarre article talk pages, so probably not needed. dottydotdot (talk) 20:36, 19 May 2009 (UTC)
- What do you mean, "It's already being done on some articles?" This request is for a bot to scan Google News (or something) and find Wikipedia links, not for users to discover links themselves. Just because Ted Kennedy has a note on it about being linked to on MSNBC doesn't mean every link will be found. Am I correct? The Earwig (Talk | Editor review) 20:44, 19 May 2009 (UTC)
(New indent) No, I suppose - I think I just assumed; well, I don't really know! (I was a bit rushed!) OK, I'll keep looking! dottydotdot (talk) 07:27, 20 May 2009 (UTC)
- I'm looking at Google News currently & it would appear that it is very rare for news organisations to link directly to a Wikipedia article-[1]. Any ideas on how best to find when news organisations are talking about a specific article/maybe I've missed something in Google news search etc.?! dottydotdot (talk) 11:10, 20 May 2009 (UTC)
- Google tends not to pick them up, as the links are not explicit but rather appear as hyperlinks with different display text (I don't remember the term). For starters, have a look at Wikipedia in Guardian and Wikipedia in New York Times. An example of an article containing wikilinks is example. Smallman12q (talk) 01:28, 25 May 2009 (UTC)
Regardless, accessing Google News for links to Wikipedia via a bot would be a breach of their Terms of Service:
"You agree not to access (or attempt to access) any of the Services by any means other than through the interface that is provided by Google, unless you have been specifically allowed to do so in a separate agreement with Google. You specifically agree not to access (or attempt to access) any of the Services through any automated means (including use of scripts or web crawlers) and shall ensure that you comply with the instructions set out in any robots.txt file present on the Services." — Google Terms of Service, Section 5.3
You'd have to do it manually, as Google does not issue API keys anymore. Another option is to try another news search engine, such as Yahoo News, that does give out API keys. I ran into the same problem with EarwigBot II (BRFA · contribs · actions log · block log · flag log · user rights), as did Coren with CorenSearchBot (BRFA · contribs · actions log · block log · flag log · user rights). I hope that helps! The Earwig (Talk | Editor review) 02:58, 25 May 2009 (UTC)
- Another option, rather than scraping the search engines, would be to scrape it directly off the news sites (the top 20 or so), but Yahoo News would probably be best. So, are there any takers yet? Smallman12q (talk) 02:29, 30 May 2009 (UTC)
- I would do this, but I can't. I'm currently using the Yahoo API to run EarwigBot II, so if I write another bot to use the Yahoo API, I will be exceeding Yahoo's maximum query limit. I can certainly help another user write the code, though. The Earwig (Talk | Editor review) 02:35, 30 May 2009 (UTC)
- Perhaps you could write the code and then someone else (a willing admin) could run it. Or you could write it in php and it could run on the toolserver? Smallman12q (talk) 18:00, 30 May 2009 (UTC)
- I'm pretty much a beginner when it comes to PHP, so that might be a little difficult, and I don't have a toolserver account. I'd much rather do it in Python, because I already have some of the code for it. Hopefully I can get another user to run it for me. Let's see how this turns out: Coding... The Earwig (Talk | Editor review) 18:09, 30 May 2009 (UTC)
- Scratch that, I can run the bot. Yahoo issues multiple API keys to the same user. The bot will be run as EarwigBot III. The Earwig (Talk | Editor review) 18:36, 30 May 2009 (UTC)
Still doing... Most of the code is done. I just have to finish writing two of the functions (query() and makechanges()), and then I'll put up a BRFA for it. The Earwig (Talk | Editor review) 23:56, 30 May 2009 (UTC)
- I would be interested in seeing the code once it's done if that's possible? Dottydotdot (talk) 00:01, 31 May 2009 (UTC)
- Scrap that, found it! Thanks. Dottydotdot (talk) 00:04, 31 May 2009 (UTC)
- Is it possible to also write a bot for Template:Onlinesource, for online articles referencing Wikipedia articles? Smallman12q (talk) 18:04, 31 May 2009 (UTC)
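For reference, a rough sketch of the talk-page half of the task discussed in this thread. The news-search step is stubbed out, the story URL is hypothetical, and the {{Press}} parameter names are indicative only; this is not EarwigBot III's code.

```python
# Rough sketch: given a news story that links to a Wikipedia article, build a
# {{Press}} line for that article's talk page. The news-search step is stubbed
# out, since the services discussed above (Google News, Yahoo) have their own
# terms of service and API keys.
import re
from urllib.parse import unquote
import requests

STORY_URL = "https://example.org/some-news-story"   # hypothetical story URL

def wikipedia_titles_in(html):
    """Extract en.wikipedia article titles linked from a news story's HTML."""
    links = re.findall(r'https?://en\.wikipedia\.org/wiki/([^"#?\s]+)', html)
    return {unquote(t).replace("_", " ") for t in links}

def press_line(story_url, story_title, org, date):
    """A {{Press}} entry a human (or an approved bot) could add to the talk page.
    Parameter names are indicative only."""
    return "{{Press | url = %s | title = %s | org = %s | date = %s}}" % (
        story_url, story_title, org, date)

html = requests.get(STORY_URL).text   # dry run: print, never edit
for title in wikipedia_titles_in(html):
    print("Talk:%s -> %s" % (title, press_line(STORY_URL, "Some story", "Example News", "May 2009")))
```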
Bot to repair tables created before Aug 2008, for Accessibility
In Bugzilla:18829 we learn that
- "Note: As of August 20, 2008 new tables created by using the Wikipedia table button include border="1" and so they do not have this problem."
however many tables created prior to that date lack border="1", and will be barely readable when cut and pasted outside Wikipedia, or by text browser users.
By writing and deploying this bot, you will help Wikipedia better fulfil its Wikipedia:Accessibility#Users_with_limited_CSS.2FJavaScript_support goals of Wikipedia:Accessibility#Tables.
The tables in question, those with class wikitable, are all expected to have borders on them, as that is what is in the stylesheet. So all the author needs to do is check for the class wikitable, and then double-check whether there is already a border=... parameter; only if not, go ahead and add border="1". (There is no need to also check for "before Aug 2008", as that is not an exact check anyway; people might have cloned the bad tables from elsewhere later.)
Also, tables without class wikitable do not have the problem (they look the same, borders or not, stylesheets or not), so their border= choices should be respected and not tampered with.
Jidanni (talk) 14:13, 18 May 2009 (UTC)
- I would like to see a policy/proposal for this type of change over at the Village Pump somewhere first before a change/task like this is undertaken. 203.25.140.97 (talk) 23:06, 18 May 2009 (UTC)
- As near as I can tell, this only affects people who copy & paste the text. If that is actually the case, then I think it would be a waste of resources to mass-add border=1 to every wikitable that lacks one. I also don't understand why it would only affect tables classed as wikitables - why wouldn't every table without border=1 be affected? --ThaddeusB (talk) 23:47, 18 May 2009 (UTC)
- It also affects people using non-graphical web browsers (lynx for example) and other possible portable devices in the future that don't read the CSS files, which is where we style everything these days. 203.25.140.97 (talk) 03:58, 19 May 2009 (UTC)
- The chat on the bugzilla page seems to indicate this is not actually the case, although it is possible those people are mistaken. --ThaddeusB (talk) 04:01, 19 May 2009 (UTC)
- From my reading of the chat, having border="1" does make a difference for some text browsers. Aside from saying specifically that lynx was not affected, they did not mention any specific browsers that this does in fact affect. Nor did they mention how prevalent said browsers are. I think we need more information on just how many users this is affecting and to what degree before we go making more than 140,000 edits over this. (I did a rough search to come up with that number, and if anything it's overly conservative.) --Dycedarg ж 04:49, 19 May 2009 (UTC)
- They (I) did too mention w3m there in Bugzilla. But I am hoping you will at least sympathize with the more numerous "cut and paste from Wikipedia" cases, and see the need. Jidanni (talk) 02:40, 27 May 2009 (UTC)
- An analysis of server logs was done a few months ago. The stats for browser are at [2]. Text browsers would be included in "non-mobile, other" which make up 0.40% of requests. Mr.Z-man 05:03, 19 May 2009 (UTC)
- Why not simply add this to AWB's general fixes? There doesn't appear to be any opposition to the addition in and of itself; it's merely that people don't seem to like the idea of a bot specifically doing it. 「ダイノガイ千?!」(Dinoguy1000) 05:28, 21 May 2009 (UTC)
- While we are waiting, I have fixed Help:Table by hand. Jidanni (talk) 07:32, 27 May 2009 (UTC)
- This request is for a bot to go through all 6,912,559 articles (or all 61,862,387 pages?) just to add "border=1" to every table with class wikitable? Even working from a database dump to avoid loading all those pages, this seems unnecessary. I second Dinoguy1000's suggestion: just have it added to AWB's general fixes. Anomie⚔ 12:07, 27 May 2009 (UTC)
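For reference, a minimal sketch of the rule described above (look for class="wikitable" on a table's start line and add border="1" only when no border attribute is present), as it might look if added as a general fix; illustrative only.

```python
# Sketch of the rule described above: for table start lines that use
# class="wikitable" but carry no border= attribute, add border="1".
# Tables without the wikitable class are left untouched.
import re

TABLE_START = re.compile(r'^(\{\|[^\n]*)$', re.MULTILINE)

def add_borders(wikitext):
    def fix(match):
        line = match.group(1)
        if "wikitable" in line and "border" not in line:
            return line + ' border="1"'
        return line
    return TABLE_START.sub(fix, wikitext)

print(add_borders('{| class="wikitable"\n|-\n| cell\n|}'))
# -> {| class="wikitable" border="1" ... (rest of the table unchanged)
```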
URL change for Template:Amg movie & Template:Amg name
AMG has changed their URL format. Because of this, the links may soon no longer work. So can you search and change the two templates, removing the 1: and 2:?
changing them from:
- {{Amg movie|1:##### |Name}} to {{Amg movie|##### |Name}}
- {{Amg name |2:##### |Name}} to {{Amg name |##### |Name}}
This would be a GREAT help for us. Thanks. -- Phoenix (talk) 04:31, 19 May 2009 (UTC)
- This is not the best solution because the current template will not work with the new format. For a clean transition, I would suggest using a different template. I have set up {{AMG}} to handle all the AMG website links. So the required changes would be:
- {{Amg movie|1:##### |Title}} to {{AMG|movie|id=#####|label=Title}}
- {{Amg name |2:##### |Name}} to {{AMG|name|id=#####|label=Name}}
- — Martin (MSGJ · talk) 07:22, 19 May 2009 (UTC)
- It's just a simple change. Once the bot is finished we just change the URL... It's not that complicated. I am very proud of the template I created and how popular it has become. I would love for it to be continued. There are a few services that AMG offers, and it is very well known for its music reviews. The Amg name template is similar to Template:IMDB name, which is for actors. Can the quick change be created to fix these two templates? -- Phoenix (talk) 21:30, 19 May 2009 (UTC)
- I'm sure it can be done with the same name for the template... but I don't understand what the "1:" and "2:" are for. And how are the URLs changing? – Quadell (talk) 22:04, 19 May 2009 (UTC)
- Thanks :-D The reason that I made the user input 1:###### or 2:###### before was due to the odd URL, e.g. http://www.allmovie.com/cg/avg.dll?p=avg&sql=1:###### I made it so it had to have the full number, in case the numbers before the colon changed, so the whole template wouldn't be broken... It looks like I should have just hard-coded it as http://www.allmovie.com/cg/avg.dll?p=avg&sql=1:###### because the new URLs are http://www.allmovie.com/work/###### and http://www.allmovie.com/artist/##### respectively. -- Phoenix (talk) 01:02, 20 May 2009 (UTC)
Currently the movie templates are used like {{Amg movie |1:356351 |Quantum of Solace}}, which creates the link http://www.allmovie.com/cg/avg.dll?p=avg&sql=1:356351 which redirects to http://www.allmovie.com/work/356351. Luckily the link http://www.allmovie.com/work/1:356351 also pulls up the page just fine (at the moment), so if you change the template to link to http://www.allmovie.com/work/{{{1}}} it shouldn't break any links. Then someone (I can do it) could run AWB to remove 1: and 2: from the templates. – Quadell (talk) 01:26, 20 May 2009 (UTC)
- Ok Quadell, the templates have been changed. You can run AWB whenever you wish. Please let us know when it's done so that we can change the templates' documentation :-) -- Phoenix (talk) 04:16, 21 May 2009 (UTC)
- Done You can now change the documentation. – Quadell (talk) 18:29, 26 May 2009 (UTC)
Thanks :-) -- Phoenix (talk) 01:14, 28 May 2009 (UTC)
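For reference, a regex sketch of the find/replace described above (stripping the leading 1:/2: from the template's first parameter); this is illustrative only, not the actual AWB rule that was run.

```python
# Sketch of the find/replace described above: strip the leading "1:" or "2:"
# from the first parameter of {{Amg movie}} / {{Amg name}}.
import re

AMG_PARAM = re.compile(r'(\{\{\s*[Aa]mg[ _](?:movie|name)\s*\|\s*)[12]:')

def fix_amg(wikitext):
    return AMG_PARAM.sub(r'\1', wikitext)

print(fix_amg("{{Amg movie |1:356351 |Quantum of Solace}}"))
# -> {{Amg movie |356351 |Quantum of Solace}}
```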
Tagging uncategorized pages
I propose a task to find pages without any categories and tag them with {{subst:dated|uncategorized}}. Beagel (talk) 19:14, 23 May 2009 (UTC)
- Like this? - Jarry1250 (t, c) 19:23, 23 May 2009 (UTC)
- Exactly. Somehow I've never seen it operating. Beagel (talk) 19:47, 23 May 2009 (UTC)
- I could code this. --Chris 13:27, 24 May 2009 (UTC)
- There are a number of bots that have been approved for this, but I am not sure if any are active. You might be interested in this Untagged Uncategorized Articles toolserver report. It is a "live" report vs. Special:UncategorizedPages. -- JLaTondre (talk) 13:49, 24 May 2009 (UTC)
- If you want I can run a bot like this. I'm running one at the Dutch Wikipedia, so I'd just need to exclude "inuse" from my database query and apply for approval. You would need to explain to me what is meant by inuse though. --Erwin (talk) 20:24, 24 May 2009 (UTC)
- BRFA filed at Wikipedia:Bots/Requests for approval/Erik9bot 8. Erik9 (talk) 22:00, 25 May 2009 (UTC)
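For reference, a dry-run sketch of how such a bot could pull its worklist, using the API equivalent of Special:UncategorizedPages mentioned above; the tagging itself would be done by an approved bot.

```python
# Dry-run sketch: list articles that Special:UncategorizedPages reports, i.e.
# the candidates a tagging bot would check before adding {{subst:dated|uncategorized}}.
import requests

API = "https://en.wikipedia.org/w/api.php"
resp = requests.get(API, params={
    "action": "query",
    "list": "querypage",
    "qppage": "Uncategorizedpages",
    "qplimit": "50",
    "format": "json",
}).json()

for row in resp["query"]["querypage"]["results"]:
    print(row["title"])  # a real bot would skip {{inuse}} pages, dab pages, etc.
```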
Need a bot to modify some templates
I need a bot to go through the articles in Category:Mixed Drinks articles by quality and change any banner with the |focus=bar switch so that the banner will contain the |bar=yes switch.
e.g.
{{WikiProject Mixed Drinks|focus=bar}}
→ {{WikiProject Mixed Drinks|bar=yes}}
The total number of articles is around three hundred (300), and about 30% have the |focus=bar switch.
Thank you for your time, --Jeremy (blah blah) 02:31, 25 May 2009 (UTC)
- Possible. 100 changes? Looks like an easier job for someone with AutoWikiBrowser than a bot, but it certainly is doable both ways. Contact me if AutoWikiBrowser can't be used, and I'd be happy to make a bot for it. The Earwig (Talk | Editor review) 03:05, 25 May 2009 (UTC)
I'd love to do this using AWB, I even downloaded it but cannot figure out how to use it. --Jeremy (blah blah) 06:18, 25 May 2009 (UTC)
- Hi there, my bot Thehelpfulbot has been approved to do find/replace tasks like this one. I can do this task if you like, after my bot has finished running the task that it is currently on. The Helpful One 14:17, 25 May 2009 (UTC)
Thank you, that would be greatly appreciated. --Jeremy (blah blah) 14:30, 25 May 2009 (UTC)
{{Doing}}
- I ran a filter through the list, so it's going through all the talk pages in the Category - a total of 298. As you say there are only about 100 find/replaces to be done, it should go through the list and complete the task soon! :) The Helpful One 14:58, 25 May 2009 (UTC)
- Done! Hmm, that was certainly odd; the bot went through the entire list but only made 6 find/replace edits. I think that this might mean that someone else might have already run through the list? The Helpful One 15:05, 25 May 2009 (UTC)
- Or it simply could be an over-estimation (although unlikely). I didn't know that we approved bots for generic find/replace tasks— I thought we only approved them for specific find/replace tasks. Gee, if I had known about this with EarwigBot I's 3rd task... The Earwig (Talk | Editor review) 15:18, 25 May 2009 (UTC)
Could someone tag all of the articles under Category:Bartending with {{WPMIX|bar=yes}}? The reason this was a problem is that not all articles have the |bar= switch on them. --Jeremy (blah blah) 04:48, 27 May 2009 (UTC)
- Done I checked using AWB, and none of the pages under the category were missing the |bar parameter. MacMedtalkstalk 02:53, 30 May 2009 (UTC)
I thank you sir. --Jeremy (blah blah) 04:07, 30 May 2009 (UTC)
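For reference, a regex sketch of the banner change requested above; illustrative only, not the actual find/replace rule Thehelpfulbot ran.

```python
# Sketch of the banner change requested above: |focus=bar -> |bar=yes inside
# the {{WikiProject Mixed Drinks}} banner.
import re

BANNER = re.compile(r'(\{\{\s*WikiProject Mixed Drinks\b[^}]*?)\|\s*focus\s*=\s*bar', re.IGNORECASE)

def fix_banner(wikitext):
    return BANNER.sub(r'\1|bar=yes', wikitext)

print(fix_banner("{{WikiProject Mixed Drinks|focus=bar}}"))
# -> {{WikiProject Mixed Drinks|bar=yes}}
```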
Stub tagging for WP:Food
This is a large request and I would like to know if this can be done easily...
I would like to have a bot to go through the Category:Unassessed Food and drink articles and assess the stubs. The problem is there are 10,322 unassessed articles and I am afraid this would bog the system down.
Could this be done without disturbing the system?
--Jeremy (blah blah) 02:46, 25 May 2009 (UTC)
- Possible. If I remember correctly, this is something BetacommandBot used to do before it and its owner were indef-blocked. Like the above task, it doesn't look that hard. Probably a little more complex, but still very doable. The Earwig (Talk | Editor review) 03:21, 25 May 2009 (UTC)
- Easy, really. Besides assessing stubs as "stub", it's also trivial to assess redirects (if any) as "redirect" and disambiguation pages as "disambig". Slightly more complicated, but still easy, is to also copy the highest or lowest rating from any other projects' banners on the talk page. Several bots can do this; AnomieBOT is busy with another assessment run at the moment, but if no one else has done this by the time that's done then I can take care of it. Anomie⚔ 04:37, 25 May 2009 (UTC)
Thank you, if that would not be a bother, it would be appreciated. --Jeremy (blah blah) 04:44, 25 May 2009 (UTC)
- Sambot has approval for exactly this kind of task. I'll do a few edits for you to check over first. [[Sam Korn]] (smoddy) 15:23, 25 May 2009 (UTC)
- See what you think of these edits. [[Sam Korn]] (smoddy) 15:43, 25 May 2009 (UTC)
Those are very good; most are dead on. There were a couple that were rated as Start that could be a C - the difference I use is the amount of content in the article. I would say go ahead and do that for us. --Jeremy (blah blah) 00:59, 26 May 2009 (UTC)
- Doing... [[Sam Korn]] (smoddy) 18:16, 26 May 2009 (UTC)
- Done About 7,200 articles tagged; 3046 left untagged. [[Sam Korn]] (smoddy) 11:09, 28 May 2009 (UTC)
Thank you very much sir! --Jeremy (blah blah) 04:06, 30 May 2009 (UTC)
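For reference, a dry-run sketch of the per-article classification described above (redirect / disambig / stub). The size cutoff is an arbitrary placeholder, and the actual banner edits would be made by an approved bot; this is not how Sambot necessarily does it.

```python
# Dry-run sketch of the classification described above: for each article behind
# an unassessed talk page, decide "redirect", "disambig" or "stub" (very short page).
import requests

API = "https://en.wikipedia.org/w/api.php"

def classify(title):
    info = requests.get(API, params={
        "action": "query", "titles": title, "redirects": "",
        "prop": "info|categories", "clcategories": "Category:All disambiguation pages",
        "format": "json",
    }).json()
    page = next(iter(info["query"]["pages"].values()))
    if "redirects" in info["query"]:
        return "redirect"
    if "categories" in page:
        return "disambig"
    return "stub" if page.get("length", 0) < 2500 else "unassessed"  # crude size cutoff

print(classify("Mojito"))
```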
Bot to regularly save data from Special:Statistics
As discussed here, it would be useful to see the evolution of the data from Special:Statistics. I'd like to request a bot regularly saving this data in Wikipedia space, so we can have regular data for each parameter, then create graphs, etc, with the exceptions of Founder, Stewards, Importers, Transwiki importers and Uploaders, as they are not changing or unused. The first eight parameters change more often, so they may need to be saved more often than others, say every day, while others maybe every week. Cenarium (talk) 13:42, 27 May 2009 (UTC)
- Hello, this is a pretty easy task and I'd be glad to do it. I just need to know how/where you want the data stored. The bot could generate some graphs as well. One possibility is to have a page with some stats graphs and a link to the raw data, which could be saved in a very simple (text) database so that anyone can download and analyze the data however they see fit. --ThaddeusB (talk) 20:04, 27 May 2009 (UTC)
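For reference, a sketch of the data-collection step: the numbers behind Special:Statistics are available from the API, so a bot could fetch them on a schedule and append a dated line to a log page or text file. The output format below is only a placeholder.

```python
# Sketch: fetch the numbers behind Special:Statistics via the API, as a bot
# could do on a daily schedule before appending them to a log page or text file.
import time
import requests

API = "https://en.wikipedia.org/w/api.php"
stats = requests.get(API, params={
    "action": "query", "meta": "siteinfo", "siprop": "statistics", "format": "json",
}).json()["query"]["statistics"]

line = time.strftime("%Y-%m-%d") + "\t" + "\t".join(
    "%s=%s" % (k, stats[k]) for k in sorted(stats))
print(line)  # e.g. append this to a dated log that the graphs are built from
```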
Licensing update bot
Okay, who wants to write and run a bot to retag 1.6 million files? ;-)
See commons:Commons:License Migration Task Force. Dragons flight (talk) 22:56, 27 May 2009 (UTC)
- I replied here: commons:Commons talk:License Migration Task Force#Bot request. --MZMcBride (talk) 23:01, 27 May 2009 (UTC)
Redirects/disambig from ticker symbols
Some ticker symbols (e.g. JAKK) do not link to the companies they represent. I'd like to see a bot that would ensure that every company that had an article and a ticker symbol had its article linked (either as a redirect or on a disambiguation page) from the ticker symbol. NeonMerlin 05:00, 28 May 2009 (UTC)
- Is there a list of these ticker symbols with the appropriate company articles somewhere? "Pages embedding certain templates (which?) in Category:Ticker symbol templates" would work as a list, if we could rely on any article about a publicly traded company using the appropriate template.
- A bot could easily enough create the redirect if the page is non-existent (i.e. JAKK), and it could possibly append to the list for a dab page formatted as MMM (although that one is already there), but for a categorized dab like A (disambiguation) it would be hard for the bot to know where to add it (although that too is already there) and thus would probably have to just log them for human attention. Anomie⚔ 10:59, 28 May 2009 (UTC)
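For reference, a dry-run sketch of the existence check described above, for a hypothetical (symbol, company) list supplied by the requester; the example pairs are illustrative.

```python
# Dry-run sketch of the logic described above, for a hypothetical
# (ticker symbol, company article) list supplied by the requester.
import requests

API = "https://en.wikipedia.org/w/api.php"

def symbol_status(symbol):
    data = requests.get(API, params={
        "action": "query", "titles": symbol, "prop": "categories",
        "clcategories": "Category:All disambiguation pages", "format": "json",
    }).json()
    page = next(iter(data["query"]["pages"].values()))
    if "missing" in page:
        return "create #REDIRECT"            # bot could create the redirect
    if "categories" in page:
        return "dab page - log for a human"
    return "existing page - log for a human"

for symbol, company in [("JAKK", "JAKKS Pacific"), ("MMM", "3M")]:
    print(symbol, "->", symbol_status(symbol), "(target:", company + ")")
```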
<ref> tags and punctuation/whitespace
Hi there, I often find myself obsessive-compulsively moving around <ref> tags because I believe they don't comply with what I think I once read are their usage guidelines.
So, if we denote our citation with "[1]", I believe the following applies (please kindly point me to the written and agreed-upon guidelines, because I can't find them any more):
1. Ref should go after punctuation
- Wrong: Bla bla bla[1].
- Correct: Bla bla bla.[1]
- Wrong: Bla bla bla[1], but also bla.
- Correct: Bla bla bla,[1] but also bla.
2. No whitespace between punctuation and ref, or between ref and ref.
- Wrong: Bla bla bla. [1]
- Correct: Bla bla bla.[1]
- Wrong: Bla bla bla.[1] [2]
- Correct: Bla bla bla.[1][2]
Would this be something that could be done by a bot? Or does it fall into the "cosmetics" class, which I understand is not bottable? If this is possible and considered useful, please let me know. I could have a go and do it myself. Thank you. 114.150.83.231 (talk) 18:49, 28 May 2009 (UTC)
- No, a bot can't do this. There are too many corner cases to do this automatically. I have a script which works fairly well, but the edits need to be previewed because it screws up occasionally. Unfortunately, I think you would need an account to run a script. Gimmetrow 18:52, 28 May 2009 (UTC)
- Do you think that a bot could reliably remove spaces from in front of reference tags, but without changing punctuation? That would probably have far fewer false positives and it would also be super-easy to code. –Drilnoth (T • C • L) 18:55, 28 May 2009 (UTC)
- If an article already has a consistent style with spaces between punctuation and ref tag, then it should usually be left alone. I don't think a bot (ie, automatic) is a good idea here. I personally think it's better that editors use a script and take responsibility for checking the edits. Gimmetrow 19:06, 28 May 2009 (UTC)
Thanks. Could you please give me examples of the possible false positives you have in mind, so I can understand? Cheers. 114.150.83.231 (talk) 18:59, 28 May 2009 (UTC)
- Some examples are given on the talk page of the script. Most situations I notice involve html comments and ellipses. It's also possible that line breaks can interfere. Gimmetrow 19:06, 28 May 2009 (UTC)
- Sorry to be thick, but where is the script you are referring to? 114.150.83.231 (talk) 19:09, 28 May 2009 (UTC)
I think at least automatically eating any whitespace to the left of any reference tag (item 2 above) should be fairly safe, no? 114.150.83.231 (talk) 19:03, 28 May 2009 (UTC)
- Not if an article already has a consistent style with spaces between punctuation and ref tag. Gimmetrow 19:11, 28 May 2009 (UTC)
- "If an article already has a consistent style with spaces between punctuation and ref tag, then it should usually be left alone." - I actually disagree to this. This is an area where a consensus should be reached if there isn't one already, and all articles should stick to it, leaving WP as a whole looking more consistent and professional. 114.150.83.231 (talk) 19:11, 28 May 2009 (UTC)
- See WP:REFPUNC. And even if there were a consensus, it would be a cleanup thing that isn't worth having a bot running around to do only that. Anomie⚔ 19:53, 28 May 2009 (UTC)
Thanks. It looks to me like there is a clear consensus for point 2 above (spaces). I'm new to bots. Could you kindly summarize (or point me to relevant discussion/policy) for the rationale behind "it would be a cleanup thing that isn't worth having a bot"? Thank you. 114.150.72.41 (talk) 00:26, 29 May 2009 (UTC)
- Making large numbers of edits for the sole purpose of the addition or removal of whitespace is strongly discouraged, as it increases server load with little resulting benefit. Erik9 (talk) 00:52, 29 May 2009 (UTC)
- Agreed. With all due respect, this bot does not seem very helpful. Having ref tags in different places is not a serious problem, and will result in a large number of edits that are, frankly, unnecessary. The Earwig (Talk | Editor review) 00:58, 29 May 2009 (UTC)
I'm not sure I understand. It's bad to have a bot do this, but nobody would object to users wasting their time effecting the same number of edits manually fixing the refs in compliance with agreed-upon guidelines. If server load is the problem, can we not throttle the bot to whatever level is perceived necessary? 114.150.72.41 (talk) 02:27, 29 May 2009 (UTC) ...and as for the usefulness of such a bot, I guess it's subjective. To me, it would be helping making Wikipedia look more consistent and professional. 114.150.72.41 (talk) 02:30, 29 May 2009 (UTC)
- I think that AutoWikiBrowser has something that deals with references and whitespace, but I'm not sure. It would be worth checking with the maintainers of AWB what the status is. Headbomb {ταλκκοντριβς – WP Physics} 04:11, 29 May 2009 (UTC)
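For reference, a regex sketch of item 2 only (whitespace removal). As noted above, it would still need previewing: HTML comments, ellipses, and deliberate line breaks can produce unwanted changes.

```python
# Sketch of item 2 only: drop whitespace between punctuation and <ref>, and
# between adjacent refs. As noted above this still needs previewing -- HTML
# comments, ellipses and deliberate line breaks can produce unwanted changes.
import re

def tighten_refs(wikitext):
    wikitext = re.sub(r'([.,;:!?])[ \t]+(<ref[ >/])', r'\1\2', wikitext)             # punctuation + ref
    wikitext = re.sub(r'(</ref>|<ref[^>]*/>)[ \t]+(<ref[ >/])', r'\1\2', wikitext)   # ref + ref
    return wikitext

print(tighten_refs("Bla bla bla. <ref>x</ref>"))
# -> Bla bla bla.<ref>x</ref>
```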
Unreferenced football BLPs
A bot is needed to update Wikipedia:WikiProject Football/Unreferenced BLPs/Sorted by country by computing the intersections of each subcategory of Category:Football (soccer) players by nationality with Category:All unreferenced BLPs, per the discussion on my talk page. Though I'm a bot operator myself, this is beyond what I can easily handle. Erik9 (talk) 23:51, 28 May 2009 (UTC)
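For reference, a sketch of one such intersection; a full run would loop over every subcategory of Category:Football (soccer) players by nationality and write the results to the sorted page. The example category is illustrative.

```python
# Sketch of one intersection from the request above: members of a nationality
# category that are also in Category:All unreferenced BLPs.
import requests

API = "https://en.wikipedia.org/w/api.php"

def unreferenced_blps_in(category):
    hits, cont = [], {}
    while True:
        batch = requests.get(API, params={
            "action": "query", "generator": "categorymembers",
            "gcmtitle": category, "gcmlimit": "200", "gcmnamespace": "0",
            "prop": "categories", "clcategories": "Category:All unreferenced BLPs",
            "format": "json", **cont}).json()
        for page in batch.get("query", {}).get("pages", {}).values():
            if "categories" in page:
                hits.append(page["title"])
        if "continue" not in batch:
            return hits
        cont = batch["continue"]

print(unreferenced_blps_in("Category:English footballers"))
```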
Journal compilation
It would be nice if a bot retrieved the |journal= parameter from {{citation}} and {{cite journal}} (probably using data dumps) and built a list of journals and journal abbreviations with the number of times they are found. This would be useful for Wikipedia:WikiProject Journals, so they could assess what are the high-priority missing journals, redirect to main articles, etc...
The list should be alphabetically ordered, with entries linked. Redirects should be italicized. Place the list at Wikipedia:WikiProject Academic Journals/Bot compilation/X1, where X is the appropriate letter. If articles start with The X, then classify according to X. A 500 entries per page limit would be a good idea (then go to X2, X3...). After this is done, any redlink with a count of over 10 hits (1 citation = 1 hit) should be placed at Wikipedia:WikiProject Academic Journals/Bot compilation/Missing articles and redirects/1. Again a 500 entries per page limit would be a good idea (then go to /2, /3, ...). Headbomb {ταλκκοντριβς – WP Physics} 03:10, 30 May 2009 (UTC)
- The bot would be run every time there is a dump. Headbomb {ταλκκοντριβς – WP Physics} 03:13, 30 May 2009 (UTC)
- Hello, I would be glad to take this on if you can wait a few days before I start on it. --ThaddeusB (talk) 17:57, 31 May 2009 (UTC)
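For reference, a sketch of the counting pass described above, run against a pages-articles dump (the filename below is the standard dump name, assumed here); splitting the output into /X1, /X2... subpages and italicising redirects is left out.

```python
# Sketch of the counting pass described above: scan a pages-articles dump for
# |journal= values inside citation templates and tally them.
import bz2
import re
from collections import Counter

JOURNAL = re.compile(r'\|\s*journal\s*=\s*([^|}\n]+)')

counts = Counter()
with bz2.open("enwiki-latest-pages-articles.xml.bz2", "rt", encoding="utf-8") as dump:
    for line in dump:
        for name in JOURNAL.findall(line):
            counts[name.strip()] += 1

for name, n in sorted(counts.items()):
    print("* [[%s]] - %d" % (name, n))
```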
Abuse filter reporter
We need a bot to report cases of certain abuse filters being tripped. I watch several AFs where the only activity is by easy-to-spot long-term-abuse socks. But if I am not actively watching them and do not have abuse IRC open, then the socks just keep trying and trying until they find a way around the filter. If a bot monitored such AFs and immediately reported them to WP:AIV/TB2 - which is watchlisted by many admins - the socks would be blocked very quickly. Recommend the bot operator have sole discretion over which AFs are reported immediately, taking discussion/consensus into account in contentious cases. As a bonus, maybe other AFs are reported only if tripped a certain number of times - as happens in the abuse IRC. Thanks! Wknight94 talk 18:32, 30 May 2009 (UTC)
- Coding... Mr.Z-man 20:47, 30 May 2009 (UTC)
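For reference, a sketch of the monitoring half only; the filter IDs are placeholders chosen by the operator, and the actual report to WP:AIV/TB2 would be made by the approved bot (Mr.Z-man's implementation may differ entirely).

```python
# Sketch of the monitoring half: poll the abuse log for a watched set of
# filters and print anything new. Reporting to WP:AIV/TB2 is not shown here.
# The filter IDs below are placeholders.
import time
import requests

API = "https://en.wikipedia.org/w/api.php"
WATCHED = "123|456"   # placeholder filter IDs chosen by the operator
seen = set()

while True:
    log = requests.get(API, params={
        "action": "query", "list": "abuselog", "aflfilter": WATCHED,
        "afllimit": "20", "aflprop": "ids|user|title",
        "format": "json"}).json()["query"]["abuselog"]
    for hit in log:
        if hit["id"] not in seen:
            seen.add(hit["id"])
            print("Filter %s tripped by %s on %s" % (hit["filter_id"], hit["user"], hit["title"]))
    time.sleep(60)
```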
List of articles
List all articles of the category Prince Edward Island and its subcategories alphabetically (no duplicates, please) here. Thanks! —Sniff (talk) 14:05, 31 May 2009 (UTC)
- Article #1
- Article #2
- ...
- Done I think. - Jarry1250 (t, c) 14:26, 31 May 2009 (UTC)
- Thank you, but could you do this request again? A bad category, Category:Aboriginal peoples in Atlantic Canada, was included in the first list. It's my error, sorry... Best regards, --Sniff (talk) 19:15, 31 May 2009 (UTC)
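For reference, a sketch of the kind of listing generated here: walk the category and its subcategories, skip unwanted branches such as the one mentioned above, de-duplicate, and sort. This is illustrative only, not Jarry1250's actual tool.

```python
# Sketch: walk Category:Prince Edward Island and its subcategories,
# de-duplicate, and sort alphabetically. Unwanted branches go in SKIP.
import requests

API = "https://en.wikipedia.org/w/api.php"
SKIP = {"Category:Aboriginal peoples in Atlantic Canada"}

def members(cat, seen_cats=None, articles=None):
    seen_cats = set() if seen_cats is None else seen_cats
    articles = set() if articles is None else articles
    if cat in SKIP or cat in seen_cats:
        return articles
    seen_cats.add(cat)
    cont = {}
    while True:
        batch = requests.get(API, params={
            "action": "query", "list": "categorymembers", "cmtitle": cat,
            "cmtype": "page|subcat", "cmlimit": "500", "format": "json", **cont}).json()
        for m in batch["query"]["categorymembers"]:
            if m["ns"] == 14:
                members(m["title"], seen_cats, articles)   # recurse into subcategory
            elif m["ns"] == 0:
                articles.add(m["title"])
        if "continue" not in batch:
            return articles
        cont = batch["continue"]

for title in sorted(members("Category:Prince Edward Island")):
    print("*", title)
```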
Longtalk Template
I'm requesting a bot to stick the {{Longtalk}} template onto talk pages that are indeed excessively long. This should be a very, very simple bot to code. =D Smallman12q (talk) 18:15, 31 May 2009 (UTC)
- How long is an excessively long piece of string? - Jarry1250 (t, c) 18:29, 31 May 2009 (UTC)
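For reference, a sketch of the length check: the API reports each page's size in bytes, so "excessively long" comes down to a cutoff the community would have to agree on. The threshold and example page below are placeholders.

```python
# Sketch of the check: the API reports each page's size in bytes, so
# "excessively long" is just a number to agree on (75,000 is a placeholder).
import requests

API = "https://en.wikipedia.org/w/api.php"
CUTOFF = 75000  # bytes -- placeholder threshold, not an agreed value

def needs_longtalk(talk_title):
    data = requests.get(API, params={
        "action": "query", "titles": talk_title, "prop": "info", "format": "json",
    }).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("length", 0) > CUTOFF

print(needs_longtalk("Talk:Evolution"))  # True would mean: tag with {{Longtalk}}
```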