User:Vicarious/bot ideas
automated automation syntax bot
a deleting bot that delinks removed items; a bot that fixes links to redirects
I know there are already auto-archive bots running, such as the one archiving this page, but I think a bot that operates a little differently could effectively archive all article talk pages. First, it would only archive talk pages that are very long, so three-year-old comments on a tiny talk page would be left untouched. When the bot runs across a very long talk page it would archive much like the current bots, but with a higher threshold, for example all sections inactive for more than 28 days (rather than the typical 7). Also, unlike the current bots, I'd suggest making this opt-out rather than opt-in; very busy talk pages and manually archived talk pages wouldn't be touched anyway, because they'd either be short enough or have no inactive sections.
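The two rules above (skip short pages entirely, archive only long-inactive sections) can be sketched as a single decision function. The byte cutoff for a "very long" page is my assumption; the 28-day threshold is from the proposal.

```python
from datetime import datetime, timedelta, timezone

MIN_PAGE_SIZE = 70_000            # bytes; assumed cutoff for a "very long" talk page
MAX_SECTION_AGE = timedelta(days=28)  # the proposal's high threshold

def sections_to_archive(page_size, sections, now=None):
    """Return section titles old enough to archive, or [] if the page is short.

    `sections` is a list of (title, last_comment_time) pairs.
    """
    now = now or datetime.now(timezone.utc)
    if page_size < MIN_PAGE_SIZE:
        return []  # small pages are left untouched, however old the comments
    return [title for title, last in sections if now - last > MAX_SECTION_AGE]
```

Because the short-page check comes first, the 28-day rule never fires on tiny talk pages, which is what makes an opt-out default tolerable.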
- As for bandwidth, I don't think it would be an issue. It could run once on a database dump to get the ball rolling, then patrol recent changes, looking only at "Talk:" edits. If it still seems like it could hog bandwidth, there are several more ways to cut down the number of pages it checks. First, ignore any edit that only removed characters rather than adding them. Second, check only every third page (or so); this works on the premise that big talk pages get big because they're edited often, so a page will pop up again soon if it's going to need archiving. Third, the bot could keep a local hash table of page lengths: rather than loading a page on every edit, it would add (or subtract) the byte change listed on Special:RecentChanges and load the page only when it needs archiving. This wouldn't be as hard on the bot as it sounds; the storage would only be a few megabytes, since all it needs per page is a hash of the title and a size. The computation is cheap too, because it hashes the title instead of searching for it, so lookups are O(1) and the arithmetic is trivial.
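A minimal sketch of that local size cache, assuming the bot can read a per-edit byte delta from the recent-changes feed (the threshold value and class name are illustrative):

```python
import hashlib

ARCHIVE_THRESHOLD = 70_000  # bytes; assumed cutoff for loading the page

class SizeCache:
    """Track approximate talk-page sizes without fetching page text."""

    def __init__(self):
        self.sizes = {}  # title hash -> last known page size in bytes

    @staticmethod
    def _key(title):
        # Hashing the title keeps entries small and lookups O(1).
        return hashlib.md5(title.encode("utf-8")).digest()

    def apply_delta(self, title, delta):
        """Apply a recent-changes byte delta; return True if the page has
        grown past the threshold and should actually be loaded."""
        k = self._key(title)
        self.sizes[k] = self.sizes.get(k, 0) + delta
        return self.sizes[k] >= ARCHIVE_THRESHOLD
```

Each entry is a 16-byte digest plus an integer, so even millions of talk pages fit in a few megabytes, as the proposal estimates.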
- a bot that automatically adds Template:Verylong to articles above the recommended size
- a bot that converts large blocks of quoted text to Template:Quote
a bot that deletes articles with a clear consensus at Wikipedia:Articles for deletion. For example, it's quite obvious that Wikipedia:Articles for deletion/Myspacephobia is going to be deleted, but it's currently waiting for an admin to do the work. Yes, I know this would mean an admin bot, but that's not without precedent. This bot would ONLY act on discussions with a very obvious consensus. As for vandals abusing the bot, I don't think it would be an issue: it would ignore IPs, it would require a minimum voting period, and there are enough legitimate voters to contest a bad-faith nomination that the result would no longer be clear-cut enough for the bot to touch it. Btw, this bot would also close candidates that are clear keeps.
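The safeguards described above (ignore IPs, enforce a minimum discussion time, act only on near-unanimity) could look something like this. The specific numbers and the IPv4-only check are assumptions for the sketch, not part of the proposal:

```python
import re
from datetime import timedelta

MIN_OPEN_TIME = timedelta(days=5)  # assumed minimum voting period
MIN_VOTES = 10                     # assumed; enough voters to drown out bad faith
REQUIRED_MAJORITY = 0.9            # assumed "very obvious consensus" cutoff

IP_RE = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")  # IPv4 only, for the sketch

def close_decision(votes, open_time):
    """votes: list of (username, 'delete' | 'keep') pairs.

    Returns 'delete', 'keep', or None (leave the debate for a human admin).
    """
    if open_time < MIN_OPEN_TIME:
        return None
    counted = [v for user, v in votes if not IP_RE.match(user)]
    if len(counted) < MIN_VOTES:
        return None
    for outcome in ("delete", "keep"):
        if counted.count(outcome) / len(counted) >= REQUIRED_MAJORITY:
            return outcome
    return None  # contested debates are never touched
```

Note that any contested result falls through to `None`, which is the property that keeps bad-faith votes from triggering the bot: one wave of objections is enough to push the ratio below the cutoff.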
bot-assisted AfD nomination of articles with no sources