Wikipedia:Bot requests
Commonly Requested Bots
This is a page for requesting tasks to be done by bots per the bot policy. This is an appropriate place to put ideas for uncontroversial bot tasks, to get early feedback on ideas for bot tasks (controversial or not), and to seek bot operators for bot tasks. Consensus-building discussions requiring large community input (such as request for comments) should normally be held at WP:VPPROP or other relevant pages (such as a WikiProject's talk page).
You can check the "Commonly Requested Bots" box above to see if a suitable bot already exists for the task you have in mind. If you have a question about a particular bot, contact the bot operator directly via their talk page or the bot's talk page. If a bot is acting improperly, follow the guidance outlined in WP:BOTISSUE. For broader issues and general discussion about bots, see the bot noticeboard.
Before making a request, please see the list of frequently denied bots, either because they are too complicated to program, or do not have consensus from the Wikipedia community. If you are requesting that a template (such as a WikiProject banner) is added to all pages in a particular category, please be careful to check the category tree for any unwanted subcategories. It is best to give a complete list of categories that should be worked through individually, rather than one category to be analyzed recursively (see example difference).
- Alternatives to bot requests
- WP:AWBREQ, for simple tasks that involve a handful of articles and/or only needs to be done once (e.g. adding a category to a few articles).
- WP:URLREQ, for tasks involving changing or updating URLs to prevent link rot (specialized bots deal with this).
- WP:USURPREQ, for reporting that a domain has been usurped, e.g. |url-status=usurped
- WP:SQLREQ, for tasks which might be solved with an SQL query (e.g. compiling a list of articles according to certain criteria).
- WP:TEMPREQ, to request a new template written in wiki code or Lua.
- WP:SCRIPTREQ, to request a new user script. Many useful scripts already exist, see Wikipedia:User scripts/List.
- WP:CITEBOTREQ, to request a new feature for WP:Citation bot, a user-initiated bot that fixes citations.
Note to bot operators: The {{BOTREQ}} template can be used to give common responses, and make it easier to keep track of the task's current status. If you complete a request, note that you did with {{BOTREQ|done}}, and archive the request after a few days (WP:1CA is useful here).
WP:CHICAGO bot assistance needed
A few weeks ago I made the same request below and curiously someone attempted to help using AWB. That is not what I want. I need a bot to run every few days. It seems User:SatyrTN and his bot User:SatyrBot are no longer active. WP:CHICAGO needs articles in WP:CHIBOTCATS tagged with {{WikiProject Chicago}}. It would also be helpful if the bot autostubbed talk pages that have templates from other projects with class=stub. I think his bot also tagged newly found articles with FA, FL, and GA parameters and added them to class when it added the template. Satyr is not responding to either wiki messages or email regarding this, so we cannot start with his old code.--TonyTheTiger (t/c/bio/WP:CHICAGO/WP:LOTM) 07:10, 2 June 2008 (UTC)
- I'll see what I can do. As for running it every few days: that is something that, at this time, I cannot do due to school. Weekly perhaps? CWii(Talk|Contribs) 13:26, 2 June 2008 (UTC)
Weekly would be fine. It is sure better than not once in the last six weeks.--TonyTheTiger (t/c/bio/WP:CHICAGO/WP:LOTM) 05:15, 6 June 2008 (UTC)
- Oh god. I forgot. Sorry. :P CWii(Talk|Contribs) 02:07, 12 June 2008 (UTC)
- Okay, I'll tell you what. I'll do a run right now without assessments. CWii(Talk|Contribs) 02:10, 12 June 2008 (UTC)
- I see you added Category:John Bot tagging 1. Do most tagging bots use AWB and categories? When will you actually search the cats for new articles?--TonyTheTiger (t/c/bio/WP:CHICAGO/WP:LOTM) 18:39, 12 June 2008 (UTC)
- My bot uses python so it will recurse the Category:John Bot tagging 1 and do the categories in it. I'm working on the assessment code right now and there is a bug that needs to be fixed so hang tight. CWii(Talk|Contribs) 19:41, 12 June 2008 (UTC)
- I don't know what python is or the meaning of the word recurse as used, but remain patient and hopeful.--TonyTheTiger (t/c/bio/WP:CHICAGO/WP:LOTM) 18:53, 13 June 2008 (UTC)
- Python is a programming language, used for programming Wikipedia bots with the pywikipedia libraries, just so you know =P RichardΩ612 Ɣ ɸ 19:46, June 13, 2008 (UTC)
- Thanks.--TonyTheTiger (t/c/bio/WP:CHICAGO/WP:LOTM) 15:55, 18 June 2008 (UTC)
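The tagging behaviour discussed in this thread amounts to a text transformation on each talk page found while recursing the categories. A minimal offline sketch (the function name and the autostub heuristic are illustrative, not SatyrBot's actual code; a pywikipedia wrapper would supply the page text):

```python
import re

BANNER = "{{WikiProject Chicago}}"

def tag_talk_page(text):
    """Prepend the project banner if it is not already present.

    If another WikiProject banner on the page carries class=stub,
    carry that assessment over (the 'autostub' behaviour requested above).
    """
    if "{{WikiProject Chicago" in text:
        return text  # already tagged, leave unchanged
    banner = BANNER
    # Hypothetical autostub: reuse an existing stub assessment if one exists.
    if re.search(r"\{\{WikiProject [^}]*\|\s*class\s*=\s*stub", text, re.I):
        banner = "{{WikiProject Chicago|class=stub}}"
    return banner + "\n" + text
```

The FA/FL/GA assessment carry-over would follow the same pattern, matching class=FA and so on in existing banners.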
Category:Fooian football clubs
Is there any chance that someone could create a bot that would move all "Fooian football clubs" categories (which can be found in Category:Football (soccer) clubs) to Category:Football (soccer) clubs by country? I have started the process off by doing all of the As and Bs manually, but to do the rest from C to Z is a bit too much to do manually, so a bot would be perfect. – PeeJay 10:03, 2 June 2008 (UTC)
- This could be done relatively easily with AWB, but you might want to point to an established consensus first. dihydrogen monoxide (H2O) 10:13, 2 June 2008 (UTC)
- I can probably get you one of those. I should also add to the request that each category should be sorted by the name of the country, rather than the demonym used in the category's name. – PeeJay 10:17, 2 June 2008 (UTC)
- That would be a bit more difficult to do with AWB (or at least, with my skills), though it probably could be done with something more complex. If you can point to some agreement amongst Soccer/Football editors that the cats should be the way you're asking for them, it should be a good start. dihydrogen monoxide (H2O) 10:24, 2 June 2008 (UTC)
- I've created a discussion at WP:FOOTY requesting consensus on this matter. – PeeJay 10:25, 2 June 2008 (UTC)
- My only worry is that I won't get (m)any replies to the discussion topic, due to the fact that no one at WP:FOOTY seems to care that much about the organisation of their sub-categories. I'm fairly sure I can guarantee there will be no opposition to the move, however. – PeeJay 12:00, 2 June 2008 (UTC)
- IF and when you get the consensus let me know. I could do it with python. CWii(Talk|Contribs) 13:23, 2 June 2008 (UTC)
- Well, after 10 hours, I currently have two users (including myself) in support of the move, and none opposed. – PeeJay 20:59, 2 June 2008 (UTC)
- OK, it's been over two days now, and I've only had one reply to the suggestion. Could we go ahead with making the bot now? – PeeJay 11:23, 5 June 2008 (UTC)
- Ah yes. BRFA filed CWii(Talk|Contribs) 01:56, 6 June 2008 (UTC)
- Doing... a trial CWii(Talk|Contribs) 16:13, 8 June 2008 (UTC)
- It doesn't seem to have worked properly. Instead of just moving some categories, it's also moved the articles that were in Category:Football (soccer) clubs. I'll undo this now. – PeeJay 14:49, 13 June 2008 (UTC)
OK, I've done all the moves manually now. No need for this request any more. – PeeJay 16:08, 18 June 2008 (UTC)
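For reference, the recategorization described above (including sorting by country name rather than demonym) could be driven by a small helper like this. The demonym table and function name are hypothetical; a real run would need the full demonym list and an AWB or pywikipedia wrapper around it:

```python
import re

# Hypothetical demonym-to-country table; a real run needs the full list.
DEMONYMS = {"Albanian": "Albania", "Brazilian": "Brazil", "Chilean": "Chile"}

def parent_category_line(cat_title):
    """Return the wikitext line that files a 'Fooian football clubs'
    category under the by-country parent, sorted by country name.
    Returns None for titles the table cannot resolve (manual handling)."""
    m = re.match(r"Category:(\w+) football clubs$", cat_title)
    if not m or m.group(1) not in DEMONYMS:
        return None
    country = DEMONYMS[m.group(1)]
    return "[[Category:Football (soccer) clubs by country|%s]]" % country
```

Restricting the edit to Category: pages would also have avoided the trial problem noted above, where member articles were moved along with the categories.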
{{Infobox rugby league biography}} conversion
As per consensus agreed upon by the Rugby league WikiProject, the template {{Infobox rugby league biography}} would be changed to the sandboxed version here. The newer template (sandboxed version), while different, makes it easier for users to change information than the current version. The consensus can be seen on the WikiProject talk page.
However, while the format of the template is our concern, our problem is that the current version is used on approximately 1,500 articles, and manually changing each article would be tedious and slow. This is why we ask for a bot to do this; all of the instructions are explained below.
Changes
This may be hard for a programmer, but I'm sure it can be done.
For every instance of the syntax of the {{Infobox rugby league biography}} on MainSpace articles, the following should be done:
The parameter years needs to be separated at each use of <br/> or <br>, then separated at each use of a dash, and placed in the corresponding parameters year1 ... and so on. For example:
- Old parameter years = 1914<br/>1915-1918<br/>1918-1920<br>1921
- Would become: year1 = 1914, year2 = 1915-1918, year3 = 1918-1920, year4 = 1921 (the template allows up to ten years)
THEN
- Would become: year1start = 1914, year2start = 1915, year2end = 1918, year3start = 1918, year3end = 1920, year4start = 1921
OK, so from the first line it would get to the third line. It needs to be separated first at each use of <br/> or <br>, then by each use of a dash (remembering dashes come in many forms).
The parameters transferred are:
years into year(counter)start and year(counter)end
The counter should increment at each use of <br/> or <br>, as an integer (1, 2, 3 ...)
Then the first year/word would become the start parameter, and the second term the end parameter
So "1914" would have no end parameter
"1914-1915" would have both a start and end parameter
"19??-present" use of strings in both start and end parameter
"1914-1915, 1917-1918" Uses two dashes, which should take the first date (1914) and the last date (1918) as start and finish.
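The year-splitting rule just described can be sketched in Python (the function name is illustrative; a pywikipedia wrapper would supply the template text). It splits on the break tags, then on any dash form, and keeps the first and last pieces per span:

```python
import re

def split_years(value, prefix="year"):
    """Split a legacy years value like '1914<br/>1915-1918<br>1921'
    into year1start/year1end/... parameters, as specified above."""
    params = {}
    spans = [s.strip() for s in re.split(r"<br\s*/?>", value) if s.strip()]
    for i, span in enumerate(spans, start=1):
        # Dashes come in many forms: hyphen, en dash, em dash.
        parts = [p.strip() for p in re.split(r"[-\u2013\u2014]", span) if p.strip()]
        params["%s%dstart" % (prefix, i)] = parts[0]
        if len(parts) > 1:
            # Multiple dashes: keep first and last pieces
            # (e.g. '1914-1915, 1917-1918' -> 1914 / 1918).
            params["%s%dend" % (prefix, i)] = parts[-1]
    return params
```

The lettered counters (repyears, repcoachyears) would use the same split with A, B, C ... substituted for the integer index.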
THE SAME EXAMPLES ABOVE NEED TO BE DONE AS WELL FOR THE FOLLOWING PARAMETERS:
- repyears into year(counter)start and year(counter)end WHERE the counter is a capital Letter (A, B, C ...) not a number
- coachyears into coachyear(counter)start and coachyear(counter)end where the counter is a Number
- repcoachyears into coachyear(counter)start and coachyear(counter)end where the counter is a capital Letter (A, B, C ...) not a number
- refereeyears into refereeyear(counter)start and refereeyear(counter)end where the counter is a number
The parameter clubs needs to be separated at each use of <br/> or <br> and placed in the corresponding parameters club1, club2... and so on. For example:
- clubs into club(counter) where counter is a number (starts at 1) which increments at the separation of each <br> or <br/>
- repteam into team(counter) where the counter is a LETTER (starts at A) and separated at each <br> or <br/>
- coachclubs into coachteam(counter) where the counter is a number (starts at 1) and separated at each <br> or <br/>
- refereecomps into refereecomp(counter) where the counter is a number (starts at 1) and separated at each <br> or <br/>
- refereecaps into refereeappearances(counter) where the counter is a number (starts at 1) and separated at each <br> or <br/>
The parameter caps(points) will be more tedious. It needs to be separated both at each use of <br/> or <br> AND at the brackets. These will be placed in the parameters appearances1 ... and so on AND points1 ... and so on. For example:
- Old parameter caps(points) = 12 (32)<br/>14 (324)<br>281 (1010)
- Would become: appearances1 = 12, points1 = 32, appearances2 = 14, points2 = 324, appearances3 = 281, points3 = 1010 (template allows ten times)
- So each would be separated by either a <br> or <br/>
- You would then get multiple values in the form of "x (y)" or just "x" or just "(y)"
- So x (y) would be separated into x and y.
- The x will go into the parameter appearances(counter) where the counter starts at one and goes up by 1 for each value (separated by the br)
- The y will go into points(counter), same as above.
This would have to be the same for:
- repcaps(points) into appearances(counter) and points(counter) where the counters ARE LETTERS (starting at A) and separated as above.
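The caps(points) rule can be sketched the same way (function name illustrative; spans that do not fit the "x (y)" shapes are skipped for manual review):

```python
import re

def split_caps_points(value):
    """Split a legacy caps(points) value like '12 (32)<br/>14 (324)'
    into appearancesN / pointsN parameters, as specified above."""
    params = {}
    spans = [s.strip() for s in re.split(r"<br\s*/?>", value) if s.strip()]
    for i, span in enumerate(spans, start=1):
        # Each span is 'x (y)', just 'x', or just '(y)'.
        m = re.match(r"^([^()]*?)\s*(?:\(([^()]*)\))?\s*$", span)
        if not m:
            continue  # malformed span: leave for the manual-fix list
        x, y = m.group(1).strip(), m.group(2)
        if x:
            params["appearances%d" % i] = x
        if y:
            params["points%d" % i] = y
    return params
```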
The birth and death dates may have full information for some articles, but limited or no information for others:
- The old parameter dateofbirth and dateofdeath respectively.
Now, there will be different formats for both of these old parameters. Some will be in {{birth date and age}}, others in other templates. Some will be just written out in date form. Others might just have a year. But they somehow have to be pushed into three separate parameters each: dayofbirth, monthofbirth, yearofbirth and dayofdeath, monthofdeath, yearofdeath.
Some other small tasks:
- The pcupdate and repupdate should get the most recent date (one may be 12 May 08, the other 12 June 08, in which case it would be 12 June 2008), then separate it into a year ("2008" in this instance) for the year parameter and the date ("12 June") for the date parameter, BUT NOT LINKED.
The following parameters are no longer used and should be deleted:
- | occupation =
- | school =
- | university =
- | spouse =
- | children =
- | relatives =
- | currentclub =
- | clubnumber =
- | youthyears =
- | youthclubs =
- | youthrepyears =
- | youthrepteam =
- | youthrepcaps(points) =
- | otheryears =
- | otherclubs =
- | url =
There may be a minority of articles that don't follow the old MOS for this infobox; the bot should create a list (somewhere) of articles that need to be changed manually. I don't know much about bots, but I think this may be better done in sections/separately. Note that the template is currently in the old format, and we won't change it until any bots are run.
I may be unclear in some parts, I'm here to go through with this, so I'll help to gain understanding, and this may not be as hard as it seems (to me any way). The Windler talk 09:29, 7 June 2008 (UTC)
- Any responses? The Windler talk 13:04, 11 June 2008 (UTC)
- I can do this, but not right now. If there are no responses by June 28 (first time I'm really free), I'll write something. Happy‑melon 14:35, 11 June 2008 (UTC)
- Thanks, I'm in no rush. I'll just add a comment every few days, to prevent this being archived. The Windler talk 21:22, 11 June 2008 (UTC)
- Preventing archiving The Windler talk 11:06, 15 June 2008 (UTC)
Bot needed to clean up 50,000+ disambiguation pages
Due to the fact that many disambiguation pages predate Wikipedia:Manual of Style (disambiguation pages), and the fact that many/most editors are not familiar with the manual of style, about 2/3 of our disambiguation pages need some sort of cleanup.
A sizeable amount of this cleanup (though not all) could be done with a bot. Each entry in a disambiguation page is supposed to start with an asterisk, end with no punctuation, contain only one blue linked term, and this blue link should generally not be piped (so clumsy article titles like Mercury (element) should look exactly like that). In addition, DAB pages should contain no external links, no references sections, and (with a few exceptions) be in no categories except Category:Disambiguation.
It would be very helpful to have a bot which could check all 100,000 DAB pages, removing periods from the end of entries, removing categories, converting numbered lists to bulleted lists, commenting out external links and references sections, and in those cases where it would be possible for the bot to do so, removing extra blue links and inappropriate piping of article titles. I estimate that the bot would end up doing cleanup on about 50,000 pages.
This has been discussed on Wikipedia:WikiProject Disambiguation and, while there is agreement that this needs doing, none of us have the technical knowhow to write our own bot. --Xyzzyplugh (talk) 17:40, 7 June 2008 (UTC)
If you can give some diffs of dabs that you have cleaned up, I'll take a look at the issue and see what I can do. βcommand 17:42, 7 June 2008 (UTC)
- [1], [2], [3]. These don't involve all of what needs to be done, but the other stuff (removing categories, commenting out external links, etc.) is pretty straightforward. --Xyzzyplugh (talk) 18:14, 7 June 2008 (UTC)
- How would categories on page like Auburn Township, Ohio be handled? §hep • ¡Talk to me! 20:35, 7 June 2008 (UTC)
- I'm not entirely up to speed on the exact set of categories which are currently considered appropriate, this would have to be figured out before the bot could start removing any categories, with the help of the folks at Wikipedia:WikiProject Disambiguation. The categories that article is in look appropriate to be left in place, though. --Xyzzyplugh (talk) 22:15, 7 June 2008 (UTC)
- There are some extra considerations:
- Sentences should end with a full stop, and there are disambiguation pages with sentences.(eg)
- There are disambiguation pages with two relevant (disambiguating) blue links on the same line (eg). New entries should be made for the second entry.
- Pipes are often used to good effect, and are fairly common when the blue link occurs after the red link. Removing the pipe without altering the description may result in loss of information, nonsensical descriptions,(eg) or a redlink.(eg)
- Where there are multiple blue links it is not always obvious which one should remain(eg)
- It may be better to restrict some of these tasks to leaving a cleanup tag. -- zzuuzz (talk) 21:21, 7 June 2008 (UTC)
- There are definitely extra considerations. In the first example you gave, though, the manual of style says "Even when the entry forms a complete sentence, do not include commas or periods at the end of the line". As to your other concerns, yes, the bot will not be able to clean up every instance of piping or every entry with multiple links, but will instead only be able to do so in situations which appear to be straightforward and which shouldn't cause problems.
- Basically, it should revolve around looking at whether linked terms match the title of the disambiguation page. If the page is Voodoo, then entries would only be cleaned of extra links if they have exactly one blue-linked term which contains the article title. If an entry has no blue linked terms which contain "Voodoo", or more than one, then the bot can't know what to do with that and should do nothing.
- As for piping, it should only remove piping in straightforward situations. Probably 95% of the inappropriate piping involves either changing something like [[Sequim, Washington]] to [[Sequim, Washington|Sequim]], or [[Tarzan (1999 film)]] to [[Tarzan (1999 film)|Tarzan]], and these should be fairly straightforward to fix (the only complication being the odd ways editors try to add italics around the name of the film, which a bot should also be able to deal with). (Note, for those who might not be familiar with the manual of style - the bot would be removing the piping, not adding it.) --Xyzzyplugh (talk) 22:06, 7 June 2008 (UTC)
- Just to clarify that first point - the solution is not to remove the full stop from the second sentence, but to convert the two sentences into one and then remove the full stop (that's a job for a human). Perhaps the bot could check for existence of two full stops and not remove the second one, but flag it accordingly. -- zzuuzz (talk) 22:19, 7 June 2008 (UTC)
- There are problems with that Voodoo example. For instance assume the James Kelly disambig page had only two links, it should still link to the article Jim Kelly (martial artist) which means the same thing, but doesn't include the phrase, as well as the article James M. Kelly (politician) which wouldn't be seen as a match following a simple comparison. --T-rex 00:08, 8 June 2008 (UTC)
- As I said above, if the entry does not have a blue linked term containing the article title, then the bot shouldn't be doing anything with the links in that entry. So the bot would probably do nothing regarding both of the entries that you mentioned. This is not a problem, as we would rather have a bot which fixes 90% of the problems correctly than have one which fixes 95% of problems, but messes up many of those other 5%. (A further comment, added afterwards - I'm not sure I'm understanding you here, but you may be thinking the bot will be removing entries entirely. It won't. The bot will only be de-linking and de-piping. If one of the entries on the James Kelly page says nothing but "Jammmees Kelly iz a rad dudde!!", the bot will not be removing that) --Xyzzyplugh (talk) 09:53, 8 June 2008 (UTC)
- Ok, make that three links then, adding one to James Kelly (pirate) as well. Fact is that a bot that creates some problems should never be used no matter how many it may fix. --T-rex 18:11, 8 June 2008 (UTC)
- You haven't explained what problems you think the bot would cause. If the bot saw "*[[James Kelly (pirate)]] (died 1701)", it wouldn't change that in any way. If it saw "*[[James Kelly (pirate)|James Kelly]] (died 1701)", it would remove the piping. If it saw "*[[James Kelly (pirate)]] (died 1701), English [[pirate]] active in the [[Indian Ocean]]", it would de-link "pirate" and "Indian Ocean" (Or at least, that's my plan). --Xyzzyplugh (talk) 18:30, 8 June 2008 (UTC)
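The conservative de-piping behaviour described in this thread can be sketched as a single regex pass (function name illustrative, not the actual bot; anything that doesn't match the two common patterns is left untouched):

```python
import re

def unpipe(wikitext):
    """Remove piping where the pipe text is just the article title minus
    a parenthetical or comma disambiguator, per the two common cases above."""
    def repl(m):
        target, label = m.group(1), m.group(2)
        # 'Tarzan (1999 film)' -> 'Tarzan'
        base = re.sub(r"\s*\([^)]*\)$", "", target)
        # 'Sequim, Washington' -> 'Sequim'
        base2 = target.split(",")[0].strip()
        if label in (base, base2):
            return "[[%s]]" % target
        return m.group(0)  # leave anything else alone
    return re.sub(r"\[\[([^\[\]|]+)\|([^\[\]|]+)\]\]", repl, wikitext)
```

Because the fallback is always "do nothing", this matches the 90%-fixed-correctly-rather-than-95%-with-breakage goal stated above.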
If you can come up with some very specific rules for what this bot should do, I can try implementing it. Kaldari (talk) 16:53, 9 June 2008 (UTC)
- Why not have a bot dump the differing types of pages into a list? Start with the simpler one (no bullets, nothing in bold) and then another for the piping issue (i.e., the first word after the asterisk or the word in bold should not be piped). That way, you can manually see what's wrong and if there's larger patterns. This seems too hard for a bot to fix but really easy for a bot to identify. -- Ricky81682 (talk) 18:27, 15 June 2008 (UTC)
- I'm not exactly sure what you're suggesting. Are you saying that the issues I described above would be too hard for a bot to fix, or that the bot could identify other issues which it couldn't fix and flag those pages? If the bot was going to clean up 50,000 pages and then mark thousands of others as needing cleanup, that would be fine. If the bot was merely going to mark every page which needed cleanup as needing cleanup, that would be totally useless, as we already know that 60+% of disambiguation pages need cleanup. Any editor who wants to clean up disambiguation pages merely needs to click on any 3 random DAB pages, 2 of them will need cleanup. --Xyzzyplugh (talk) 18:24, 16 June 2008 (UTC)
I'd be interested in doing this. I'm not a member of the WikiProject, but I did a lot of disambiguation cleanup at one point and the MoS is straightforward enough. I suggest that as a compromise, the bot should correct easily fixed formatting problems (inappropriate bolding, numbering, simple punctuation, etc.), while building a list of potentially problematic pages for the more complex issues (piped links, punctuation where multiple sentences are involved, and so forth). — xDanielx T/C\R 00:23, 17 June 2008 (UTC)
- It looks like I'm not going to be able to get into this project for a while, so if you're able to work on it, that would be great. I like your idea of fixing the obvious problems and building a list for the more complex issues. Unsupervised bot editing isn't always the best answer for fixing problems on Wikipedia. Kaldari (talk) 14:58, 17 June 2008 (UTC)
- Sounds good -- I put together a list of (~100k) articles with {{disambig}} transcluded (guess I'll need to do the other templates later) and I've been working on the programming. I should have something working within a few days. As a totally random aside, has anyone else found that libxml2 doesn't like the XML generated by the MediaWiki API? (W3C validates it with a no-DTD warning.) I can find another library or parse the responses manually, just wondering.... — xDanielx T/C\R 19:23, 17 June 2008 (UTC)
WP:LGBT newsletter delivery assistance needed
Well, I'm a big ol' dorkus. I don't know what I'm doing, and I'd rather get someone who is familiar with bots than end up spamming every English user on the site. Our erstwhile newsletter delivery method, Satyrbot, is on extended holiday with its owner, User:SatyrTN, until his house is completed. He is incommunicado until then. I have a newsletter I'd like to send out to our members, but I've never done this before. Help? --Moni3 (talk) 19:31, 9 June 2008 (UTC)
- I can send it out for you. (ShepBot (talk · contribs)) §hep • ¡Talk to me! 19:46, 9 June 2008 (UTC)
- Contact anyone at Category:Newsletter delivery bots -- TinuCherian (Wanna Talk?) - 12:24, 11 June 2008 (UTC)
- I would also offer to do this if I knew who to send it to, what letter it was, etc. ·Add§hore· Talk/Cont 13:55, 11 June 2008 (UTC)
- Wow, I had no idea our WP was so popular. User:Stepshep was the lucky dog who contacted me first, so I've asked him to deliver it. Perhaps next time I shall set up bidding...hmmm... I do appreciate the offers, though. Thank you for rescuing me in my hour of stupidity. --Moni3 (talk) 15:08, 11 June 2008 (UTC)
Signature Signing Bot
I'm not sure where else to ask, can I have a bot like SineBot to sign posts on my wiki?
SineBot does this when you forget to sign: —Preceding unsigned comment added by Kremzeek! (talk • contribs) 19:07, 11 June 2008 (UTC)
- Ask User talk:slakr for the source. Soxred 93 02:23, 12 June 2008 (UTC)
This list has hundreds of links, many of them dead. It would be helpful if a bot could remove the broken links, replacing them with this tag: [dead link]
Orjanlothe (talk) 11:50, 12 June 2008 (UTC)
- Done using the CheckLinks Tool. Is that what you wanted? §hep • ¡Talk to me! 17:06, 12 June 2008 (UTC)
- Fantastic!, thank you!
Orjanlothe (talk) 17:56, 12 June 2008 (UTC)
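For anyone wanting to script this rather than use the CheckLinks tool, the tagging step can be separated from the actual HTTP check (injected as a callback here so the wikitext logic is testable offline; names illustrative):

```python
import re

def tag_dead_links(lines, is_dead):
    """Append a {{dead link}} tag to each line whose URL fails the
    is_dead check. In practice is_dead would issue an HTTP request;
    it is injected here so the logic runs without network access."""
    out = []
    for line in lines:
        m = re.search(r"https?://\S+", line)
        if m and is_dead(m.group(0)) and "{{dead link}}" not in line:
            line += " {{dead link}}"
        out.append(line)
    return out
```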
Hello, I want to create a bot named UnknownBot. If you accept the request, I will give a barnstar to the users who approve my bot. Thanks! Unknownquinones (talk) 13:06, 13 June 2008 (UTC)
- This will have to be done at Wikipedia:Bots/Requests for approval. --T-rex 14:18, 13 June 2008 (UTC)
- Maybe you should also read Wikipedia:Bot policy. Best wishes -- TinuCherian (Wanna Talk?) - 14:22, 13 June 2008 (UTC)
UK station usage statistics
Following a discussion at Wikipedia talk:WikiProject UK Railways#05/06 usage update, there is a request for a bot that can look at all articles which transclude template:Infobox UK station, and produce a list of any articles for which the usage0506 parameter is blank. No changes to the articles would be required, as we can add these manually once the list has been produced. Does anyone have a suitable bot that can do this? — Tivedshambo (t/c) 23:20, 13 June 2008 (UTC)
- Also can the same be applied for usage0607 ? Simply south (talk) 09:04, 16 June 2008 (UTC)
- I'm not sure whether this is required now but neither am I sure whether it was ever to be honest. By adding hidden links to the template this information can be obtained without the need for a bot. Adambro (talk) 11:43, 17 June 2008 (UTC)
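Had the bot route been taken, the check itself is a simple wikitext scan per article; a minimal sketch (function name illustrative):

```python
import re

def param_is_blank(wikitext, param="usage0506"):
    """True if the infobox parameter is missing or has no value,
    so the article belongs on the worklist. Works the same for
    usage0607 by passing a different param name."""
    m = re.search(r"\|\s*%s\s*=\s*([^|}\n]*)" % re.escape(param), wikitext)
    return m is None or not m.group(1).strip()
```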
Open Refs
Is there a bot already in existence that can detect open refs? That is, detect that a <ref> is not followed by a </ref> (perhaps confined to recent edits, I don't know). If so, could it generate a list of such articles and deliver it to my talk page (or somewhere). I feel in a fixin' mood. Thanks, Phlegm Rooster (talk) 05:51, 14 June 2008 (UTC)
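A rough detector for this would count opening tags against closing tags, ignoring self-closing <ref name=... /> reuses (a sketch; a real bot would also need to handle refs inside comments and nowiki blocks):

```python
import re

def has_open_ref(wikitext):
    """True if a <ref> (or <ref name=...>) opens without a matching
    </ref>; self-closing <ref ... /> tags are not counted as openers."""
    opens = len(re.findall(r"<ref(?:\s[^>/]*)?>", wikitext))
    closes = len(re.findall(r"</ref\s*>", wikitext))
    return opens > closes
```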
Condolbot
I want to make a bot that archives pages according to the size of the page instead of elapsed time. Is that possible? Please help me create a bot because I don't know how to make one. Thank you!!! --LCondolence_ 00:01, 15 June 2008 (UTC)
- Aren't you looking for Wikipedia:Creating a bot then? -- Ricky81682 (talk) 00:05, 15 June 2008 (UTC)
- MiszaBot already does this, see here. BJTalk 00:21, 15 June 2008 (UTC)
I need a selected list of old AfDs
There is a Deletion sorting page for In popular culture articles, but through various other pages, I have discovered this list is vastly incomplete. I would like to be able to have a list of all the deleted IPC articles so I can go through page histories and see if there is possibly any content worth restoring. Making a list like this by hand (going through each and every day of the AfD history) would be extremely tedious. Can anyone help me? — Preceding unsigned comment added by NickPenguin (talk • contribs)
- Sorry, even if a list could be made, you couldn't see the deleted contributions because you are not an admin. You will have to trust the deletion decision. -- maelgwn - talk 10:12, 15 June 2008 (UTC)
- I do trust the admin's deletion decision, I trust they believed the article subject was not notable enough for a separate article to exist. However, that's not to say that some of this content isn't suitable for a focused and selective list in the parent article. For example, some of the deleted articles concern major works of literature, and I'm sure their importance is highlighted with a list of major derivative works.
- Also, I don't want to view the deleted article, I want to view the parent article history and see where the IPC content was split out. Based on the quality of the content at that point, I may selectively restore some content, or perhaps ask if the deleted article could be userfied. I simply want the list so I know what I'm looking at. --NickPenguin(contribs) 16:35, 15 June 2008 (UTC)
"1990's [sic]" etc. cleanup
I think that a relatively easy spelling correction that could be done would be changing "1990's" (etc.) to "1990s" (etc.). I say relatively easy as there are only so many decades that you would need to enter and as it is never "1990's", there should be no objections to changing such words (unlike an "its" / "it's" bot, say). It Is Me Here (talk) 16:18, 15 June 2008 (UTC)
- I know WP:AWB will do this if general fixes are turned on. §hep • ¡Talk to me! 18:01, 15 June 2008 (UTC)
- There definitely are instances in which "1990's" is the correct spelling. Please do not change all instances of "1990's" without manually checking them! For example: "1989's most successful athlete was Bo Diddly, while 1990's most successful athlete was Richard Nixon." Kaldari (talk) 15:04, 17 June 2008 (UTC)
- And what about quoted text that should preserve spelling and grammatical errors? 1 != 2 15:07, 17 June 2008 (UTC)
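Given these objections, a safer design would be a finder rather than a fixer: list decade-apostrophe occurrences for manual review instead of editing them blindly. A minimal sketch:

```python
import re

# Matches four-digit decades with an apostrophe, e.g. 1990's, 1850's.
DECADE = re.compile(r"\b(1[0-9]{2}|20[0-9])0's")

def decade_candidates(text):
    """List decade+apostrophe occurrences for manual review; per the
    objections above, a bot should not change these automatically,
    since possessives and quotations can be legitimate."""
    return [m.group(0) for m in DECADE.finditer(text)]
```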
Image tagging cleanup: Publicity photos with missing fair use rationale
I have been going through Category:Publicity_photographs_with_missing_fair-use_rationale and I realized that many of these images actually do have a fair use rationale but do not have the correct parameter added in to remove them from this category's list.
I would like to know if a bot could go through that category and do the following:
- IF the image has both {{Non-free promotional}} and the string "{{Non-free use rationale"
- THEN change {{Non-free promotional}} to {{Non-free promotional|image_has_rationale=yes}}
I would only be concerned about incorrectly formatted fair use rationales slipping through, but that is a risk I am willing to take because every page I have found with both parameters has a correctly created non-free rationale in place.
The goal of this request is to make Category:Publicity photographs with missing fair-use rationale more accurate by removing from the list the photos that actually do have a fair use rationale using the {{Non-free use rationale}} template. So far I have been manually going through each image, and every case like this just needed the parameter added to get it off the list.
Please let me know if a bot can streamline this action.
-- Guroadrunner (talk) 07:24, 16 June 2008 (UTC)
- Looking into this now -- maelgwn - talk 12:20, 16 June 2008 (UTC)
- BRFA filed -- maelgwn - talk 12:32, 16 June 2008 (UTC)
- Done -- maelgwn - talk 04:23, 18 June 2008 (UTC)
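For reference, the IF/THEN rule requested above amounts to something like the following sketch (template spellings and the `image_has_rationale` parameter are as given in the request):

```python
def add_rationale_flag(wikitext):
    # If the file page carries both {{Non-free promotional}} and a
    # {{Non-free use rationale}} template, flag the former so the page
    # drops out of the missing-rationale category.
    if ("{{Non-free use rationale" in wikitext
            and "{{Non-free promotional}}" in wikitext):
        return wikitext.replace(
            "{{Non-free promotional}}",
            "{{Non-free promotional|image_has_rationale=yes}}")
    return wikitext
```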
Rename/move bot
moved from WP:BRFA
I'm not sure if I need a new bot, because I don't know if a bot already exists that can do this or not.
Are all the bots on Wikipedia listed somewhere?
I need to retitle about 200 pages (don't worry, they are all drafts in project space).
Is there a rename-bot to make this chore easier?
Renaming these by hand would take me all day.
I look forward to your replies.
The Transhumanist 20:01, 16 June 2008 (UTC)
- Not sure about a rename bot, but this is the most current list of bots. §hep • ¡Talk to me! 21:17, 16 June 2008 (UTC)
- There are none. What's the next step? The Transhumanist 21:39, 16 June 2008 (UTC)
- Tell us what you want done and chances are someone hanging around here will do it! (eg list of pages to be moved, and where they should be moved to) -- maelgwn - talk 23:51, 16 June 2008 (UTC)
- I can do this, I just need more information. βcommand 12:45, 17 June 2008 (UTC)
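Whoever takes this on, a batch move could be sketched roughly as below. The `Old title -> New title` list format is hypothetical, and the commented Pywikibot calls are shown only as a sketch of how the parsed pairs might be applied:

```python
def parse_move_list(text):
    # Parse a (hypothetical) mapping file with one move per line,
    # written as "Old title -> New title"; blank or malformed lines
    # are skipped.
    moves = []
    for line in text.splitlines():
        if "->" not in line:
            continue
        old, new = (part.strip() for part in line.split("->", 1))
        if old and new:
            moves.append((old, new))
    return moves

# With Pywikibot (one common framework for such bots), the moves
# could then be applied roughly like this -- a sketch, not tested here:
#
#   import pywikibot
#   site = pywikibot.Site("en", "wikipedia")
#   for old, new in parse_move_list(open("moves.txt").read()):
#       pywikibot.Page(site, old).move(new, reason="Batch rename of drafts")
```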
{{ShouldBePNG}} backlog bot
I left a note on User:Remember the dot's talk page, but it doesn't appear the bot is running anymore (its last run was in mid-April). Is there anyone willing to run this bot, or a clone, on a semi-regular basis? The category of images tagged with {{ShouldBePNG}} is getting extremely backlogged. Kelly hi! 23:53, 16 June 2008 (UTC)
- I'll take a crack at coding this one. east.718 at 01:49, June 17, 2008
- There is a link to the source code on the bot's userpage. It's GPL so it's reusable. -- JLaTondre (talk) 03:16, 17 June 2008 (UTC)
- Unfortunately, 1/ it's written in what looks like VB, which I can't read, 2/ I don't trust anything I haven't coded, and 3/ I've already submitted a BRFA. :) east.718 at 05:36, June 17, 2008
"oldrequestbot" for Articles requested for more than ...
Wikipedia:Articles requested for more than a year and Wikipedia:Articles requested for more than two years are difficult to maintain because they require daily updates and decision making. I believe that a bot could perform the task. Basically, the bot would review all posts at Wikipedia:Requested articles (including all subpages) that are (as of today) dated between June 18, 2006 and June 17, 2007. If the requested article is still a red link, the bot would then determine whether the request had already been posted at Wikipedia:Articles requested for more than a year or Wikipedia:Articles requested for more than two years. If not, the bot would post the request at Wikipedia:Articles requested for more than a year in a predetermined format and predetermined location. The bot would also review all posts at Wikipedia:Requested articles that are (as of today) dated before June 17, 2006, performing the same screening as for the one-year-old requests. If a request is more than two years old, the bot would post it at Wikipedia:Articles requested for more than two years. There may be some missed articles/mistakes, but those can be addressed manually. The bot could run once a month or so. Bebestbe (talk) 17:28, 17 June 2008 (UTC)
- I've got a very good idea for how to do this, but we would need to implement a method for timestamping these requests and maintain consistent formatting of the pages. We would need a template {{requested article|John jane doe}} and an HTML comment for each line with a timestamp: <!--17:28, 17 June 2008 (UTC)-->. If we can do this, then creating a bot to maintain those pages will be no big deal. It will be able to maintain old lists and remove the pages that get created. βcommand 2 18:22, 17 June 2008 (UTC)
- Going forward, I don't think anyone will have any problem with maintaining consistent formatting of the pages in whatever way you suggest. The existing pages weren't so well maintained. Can you gather the old data for posting? Bebestbe (talk) 02:56, 18 June 2008 (UTC)
- I'll see what I can do. Re-configuring may require some crude timestamps, but it is doable. Can you bring this up on the talk page, just to give everyone a heads-up before we go live with the new method? βcommand 16:40, 18 June 2008 (UTC)
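As a sketch of how such per-line timestamp comments could be read back when the bot looks for old requests (the line format and the age calculation are assumptions based on the proposal above):

```python
import re
from datetime import datetime

# Matches a per-line timestamp comment such as
# <!--17:28, 17 June 2008 (UTC)-->
STAMP = re.compile(r"<!--\s*(\d{1,2}:\d{2}, \d{1,2} \w+ \d{4}) \(UTC\)\s*-->")

def request_age_days(line, now):
    # Return the age in days of a requested-article line, or None
    # if the line carries no timestamp comment.
    m = STAMP.search(line)
    if m is None:
        return None
    stamp = datetime.strptime(m.group(1), "%H:%M, %d %B %Y")
    return (now - stamp).days
```

A request would then qualify for the one-year list when its age exceeds 365 days, and for the two-year list when it exceeds 730.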
Archive bot for Wikipedia:Editor review
- Note: This is a renewed discussion from Wikipedia:Bot requests/Archive20. Yechiel (Shalom) 16:45, 18 June 2008 (UTC)
There are two parts to archiving an editor review, and the order in which they are done is inconsequential.
- Link to the editor review page in Wikipedia:Editor review/Archive.
- Untransclude the editor review from the main editor review page.
Previously I have done both tasks myself. I've stopped doing it, and it's apparent that nobody else wants to do it either.
There are a couple of possibilities. One way is to place archive templates, which have not yet been created for this specific purpose, on an expired editor review page. Then a bot sees the archive templates and untranscludes the page and links to it in an archive list. This is similar to the process at Wikipedia:Suspected sock puppets. It may be necessary in this scenario to switch to a "by month" format instead of the current "by username" format.
Another way is to have the bot automatically add a link to the archive in Wikipedia:Editor review/Archive as soon as the request is filed, or at a designated time thereafter, e.g. if the bot runs once a week. Then a human untranscludes the editor review page at his or her discretion. This might be better because it's a less drastic change from the current system.
I don't know whether bots can be programmed to archive pages based on the alphabetical order of a subpage title. In other words, will a bot know how to sort Wikipedia:Editor review/WikiMan53 before Wikipedia:Editor review/WikipedianProlific just based on the fact that M comes before p? If yes, that's probably what I want. If not, switching to archiving based on month of the request will preserve some archiving system without requiring undue human intervention. Shalom (Hello • Peace) 18:30, 12 May 2008 (UTC)
- Either way is doable. It is possible to maintain the current method and, if you want, even to archive by last-edit timestamp and then sort the entries onto the archive page. βcommand 2 18:40, 12 May 2008 (UTC)
- Thanks Betacommand. Right now the archive actually uses the first edit timestamp, i.e. when was the page created. I would think you could program that just as easily as the last edit timestamp. Hold off for now while I ask a couple of my friends if they think this is a good idea. Shalom (Hello • Peace) 03:48, 13 May 2008 (UTC)
- I'd agree with Shalom, preferring alphabetical order but eschewing it if it's easier to program chronologically. I don't know a lot about programming (only a rudimentary knowledge of TI-BASIC), so it would be up to the programmer. bibliomaniac15 03:56, 13 May 2008 (UTC)
- I would prefer alphabetical but that doesn't really matter to me. What I do think is important, though, is that the reviews that get archived have a decent amount of reviewing; frequently I won't archive an old review if it only has a few lines. Of course the bot won't be able to tell when someone's gotten a lengthy review but it's all total crap, but having the bot will probably be worth it, and at least a length requirement would be a start. Would this be difficult? delldot talk 05:40, 13 May 2008 (UTC)
- Tell me what you want and how, and I can normally do it. βcommand 22:39, 13 May 2008 (UTC)
(Unindent) I think the best way to do this is to create an archive template for editor reviews. That way, a human decides that the page should be archived, and the bot takes care of the rest - same as with WP:SSP. If someone wants me to modify one of the existing archive templates, I'll give it a try. Shalom (Hello • Peace) 02:30, 14 May 2008 (UTC)
Next steps
A lot has happened in the last month. It's probably best if someone other than Betacommand creates this bot.
I'm going to create Template:Era (stands for "Editor review archive") to be placed at the top of editor review pages. A human will add this archive. The bot will not do this automatically.
A bot should occasionally scan Wikipedia:Editor review for pages with the archive template, and will untransclude that editor review from the main page, and will add it to Wikipedia:Editor review/Archive according to alphabetical order. If possible, add it according to exact alphabetical order. (From the sorting at WP:O and WP:DEP, it would seem possible for a bot to do this.) If that's not possible, then ask the bot to find the first letter of the subpage, and edit the section of the archive page corresponding to that letter of the alphabet. (That should be easy to do.) Either way, it will more or less maintain the current system while not creating too much workload for humans.
Please leave me a message on my talk page with comments about the progress of this request, or if there is any need for clarification. Yechiel (Shalom) 16:45, 18 June 2008 (UTC)
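The alphabetical insertion described above could be sketched as follows. This is a minimal illustration assuming the archive is a flat, sorted list of link lines (the real archive layout may differ); "exact alphabetical order" is taken to mean code-point order, which matches the WikiMan53 / WikipedianProlific example in the request:

```python
import bisect

def insert_alphabetically(archive_lines, new_line):
    # Insert a link line into an already-sorted archive listing,
    # keeping exact code-point order: "WikiMan53" lands before
    # "WikipedianProlific" because "M" sorts before "p", just as
    # the request above anticipates.
    bisect.insort(archive_lines, new_line)
    return archive_lines
```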