This user is a member of CVU and contributes by writing bots.
This user has account creator rights on the English Wikipedia.
This user has 53% energy left.
This user uses Huggle to fight vandalism.
This user has pending changes reviewer rights on the English Wikipedia.
This user has rollback rights on the English Wikipedia.
Trout this user
This user uses Twinkle to fight vandalism.

User talk:cyberpower678

This user is online, or has forgotten to update this message after finishing a wikisession.
(If there have been no edits from this user in the last 60 minutes, it is safe to assume that this user either forgot or is lurking.)
cyberpowerChat:Online
Click here to find out why my signature changes color.

  • Hello!! I am Cyberpower678. I am your typical run-of-the-mill user here on Wikipedia.
  • I specialize in bot work and tools, but I lurk around RfPP, AfD, AIV, and AN/I, as well as RfA. If you have any questions in those areas, please feel free to ask. :-)
  • I also serve as a mailing list moderator and account creator over at the Account Creation Center. If you have any questions regarding an account I created for you, or the process itself, feel free to email the WP:ACC team or me personally.
  • At present, I have helped create accounts for 1932 different users.
  • Disputes or discussions that appear to have ended, or that are disputed, will be archived.

All the best.—cyberpower


View my talk page Archives.
RfA candidate S O N S% Ending (UTC) Time left Dups? Report
RfB candidate S O N S% Ending (UTC) Time left Dups? Report

No RfXs since 21:13, 30 August 2015 (UTC).—cyberbot I Talk to my owner: Online


broken archive links

This edit didn't turn out the way one might have hoped (the second {{cite web}} template). The new archive-url value addresses a 404 page while the original archive-url value addresses the apparently correct page.

Trappist the monk (talk) 18:24, 28 August 2015 (UTC)

Also compare old to new.

Trappist the monk (talk) 18:34, 28 August 2015 (UTC)

The bot does its best to find a working archive, but it's not 100% reliable.—cyberpowerChat:Limited Access 18:57, 28 August 2015 (UTC)
Looks like a minor bug. The bot attempts to normalize citation templates, but it shouldn't alter the archive. The bot is doing a single run at the moment, so you are free to revert or fix the citation.—cyberpowerChat:Online 19:43, 28 August 2015 (UTC)
I'm also coming up with some similar errors, see [1][2]: Cyberbot II pulls an archive that is actually a 404. Sometimes that's because the bot chose a later archive when only the earlier archives captured the page before it went dead. Also some HTTP 302 crawl errors [3][4]. Can the bot be coded to select archives, if possible, that are dated before the dead link tag, and preferably as close as possible to the date the original source was added to the wiki article? Altamel (talk) 00:07, 29 August 2015 (UTC)
The bot is already coded to do that. If there is no specified access date, it looks through the revision history to find the timestamp when it was added.—cyberpowerChat:Offline 05:55, 29 August 2015 (UTC)
Also 302 isn't an error. It's a redirect code and the archive will automatically redirect to the correct location.—cyberpowerChat:Offline 05:56, 29 August 2015 (UTC)
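
A minimal sketch of the closest-snapshot lookup described above, using the Wayback Machine's public availability API (illustrative only, not Cyberbot's actual code; the helper name is invented):

    import requests

    def closest_snapshot(url, timestamp):
        # Ask the availability API for the capture nearest to `timestamp`
        # (YYYYMMDDhhmmss), e.g. the accessdate or the date the link was added.
        resp = requests.get(
            "https://archive.org/wayback/available",
            params={"url": url, "timestamp": timestamp},
            timeout=30,
        )
        resp.raise_for_status()
        closest = resp.json().get("archived_snapshots", {}).get("closest")
        # When present, `closest` carries "url", "timestamp", and "status"
        # (e.g. "200" or "302"), so a caller can prefer captures dated before
        # the {{dead link}} tag was added.
        return closest if closest and closest.get("available") else None
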
What's the right way to deal with an attempt to redirect to an archive which doesn't work? E.g., here, where the Nashua Telegraph (site being referenced) explicitly prohibited archiving with a robots.txt? In this case, I reverted to the "dead link" notification; there doesn't appear to be a way to get a usable pointer to an article any more. Is there a preferred better action? Tarl.Neustaedter (talk) 21:43, 29 August 2015 (UTC)
I fixed the link – it wasn't really dead, just malformed (the trailing slash shouldn't have been there). But the validity of your question remains. Mojoworker (talk) 23:16, 29 August 2015 (UTC)

DeadLinksBot

Just saw that the BRFA was approved. Looking forward to seeing this bot in action. Cheers! Kaldari (talk) 00:25, 29 August 2015 (UTC)

The bot is running with limited speed at the moment. I'm waiting for the dedicated resources to let it run at full speed.—cyberpowerChat:Limited Access 02:05, 29 August 2015 (UTC)
Hey Max, see here and here. The bot was pretty awesome and did a great job, but there are some issues. First, with The Smashing Pumpkins, apparently the version that the Wayback Machine archived was actually a redirect. It continues to redirect a second time before landing on a working archive, so I think it's a matter of the bot checking for the 302 response code and following the redirect(s). Maybe the API works differently though (if there is an API), so I'm not sure. The same thing happened with The Rolling Stones with this archive URL. Similarly, this archive URL redirects to this non-working archive. So again it's the Wayback Machine returning 302's or 404's, which the bot simply needs to check for rather than assuming they are a working version. Otherwise I think the bot is pretty f-ing awesome and will be a great asset to the project. Hope this helps! MusikAnimal talk 18:25, 30 August 2015 (UTC)
Unfortunately, what you are suggesting makes sense, but the bot is already attempting that. The bot requests the closest archive to the accessdate that returns either a 200, 203, 206, 301, 302, 307, or 308, as they all resemble a working page in some manner. The bot cannot automatically follow a redirect due to the limitations of the Wayback API. Working with Ocaasi, on behalf of the WMF, I am hoping that the API's capabilities can be expanded. This is one feature I have not requested, so you should consider asking Ocaasi to ask IA to add that too.—cyberpowerChat:Online 19:49, 30 August 2015 (UTC)
Seems like a 200 would guarantee a working version, though, right? Or at least not a redirect or dead version. I figured there was some API limitation here. Anyway thanks for the hard work! MusikAnimal talk 19:53, 30 August 2015 (UTC)
True, but I've discovered that not all pages that 302 ever had a 200. A source on Wikipedia could have been created when the site always 302ed, and that's all there is in the archive when pulling a copy.—cyberpowerChat:Online 20:25, 30 August 2015 (UTC)
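
A hedged sketch of filtering captures by the status codes listed above, using the Wayback Machine's CDX API (an assumed approach; Cyberbot's own implementation may differ):

    import requests

    # The status codes treated above as "resembling a working page".
    OK_CODES = {"200", "203", "206", "301", "302", "307", "308"}

    def working_captures(url):
        # The CDX API lists every capture of `url`; `fl` limits the fields returned.
        resp = requests.get(
            "https://web.archive.org/cdx/search/cdx",
            params={"url": url, "output": "json", "fl": "timestamp,statuscode,original"},
            timeout=60,
        )
        resp.raise_for_status()
        rows = resp.json()
        if not rows:
            return []
        header, data = rows[0], rows[1:]  # the first row holds the field names
        records = (dict(zip(header, row)) for row in data)
        return [rec for rec in records if rec["statuscode"] in OK_CODES]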

Cyberbot II false edit summary?

In this edit to Born again (Christianity), the bot seems to have added a {{wayback}} template and archived a reference, which is fine, but its edit summary reads "Rescuing 1 sources, flagging 0 as dead, and archiving 2 sources." That doesn't seem to be true. I'm not quite sure what the "rescuing" refers to, but it only archived 1 source, not 2. This edit has the same problem, while this edit uses an identical summary, but it's actually true there. This edit summary doesn't sound right, either ("archiving 0 sources"). Unless I'm misunderstanding the meaning of the summaries, they don't all seem to be accurate. And on a separate note, is the removal of "Emmer, Michele. Mathematics and Cinema in" in this edit intentional? Bilorv(talk)(c)(e) 10:13, 29 August 2015 (UTC)

Yes, you are misinterpreting the edit summary. The rescuing count is the number of sources that are getting changed; rescuing means reviving sources. Archiving is a behind-the-scenes process you can't see. The bot takes links that are not yet in the Wayback Machine and archives them. This will allow sources that die in the future to remain accessible.—cyberpowerChat:Limited Access 15:08, 29 August 2015 (UTC)
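
As a rough sketch of the two separate actions just described (assumed logic with invented helper names, not the bot's actual source): "rescuing" edits citations in the wikitext, while "archiving" quietly submits still-live links to the Wayback Machine's save endpoint.

    import requests

    def process_sources(dead_links, live_links):
        rescued, flagged, archived = 0, 0, 0
        for link in dead_links:
            # ... swap the citation's URL for a working archive copy here ...
            rescued += 1
        for link in live_links:
            # Save Page Now: ask the Wayback Machine to capture the page so the
            # source stays recoverable if it dies later. Invisible in the diff.
            requests.get("https://web.archive.org/save/" + link, timeout=120)
            archived += 1
        return ("Rescuing %d sources, flagging %d as dead, and archiving %d sources."
                % (rescued, flagged, archived))
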
Ah, okay. Thanks for the explanation; that makes sense. Is there an FAQ page or anything for Cyberbot II's dead links task? Bilorv(talk)(c)(e) 15:51, 29 August 2015 (UTC)
Not yet. I haven't been asked the same question frequently enough yet. :p This is a brand new bot, one that probably has some bugs still needing to be worked out based on the above comments.—cyberpowerChat:Limited Access 16:03, 29 August 2015 (UTC)
I came to ask something that would be partially resolved by a FAQ entry about the meanings of the various terms used. The edit summary for this edit to Sucralose says "Rescuing 3 sources, flagging 0 as dead, and archiving 1" but the edit only added a Wayback link to one ref (a ref that was indeed 404 but had a viable Wayback copy). Based on your previous answer, it sounds instead like only 1 was rescued. The message posted to Talk:Sucralose#External links modified lists 3 entries as "Added archive", which is consistent with the edit summary. DMacks (talk) 03:44, 30 August 2015 (UTC)

AdminStats question

Hi again, Cyberpower678! I have a question about AdminStats – AdminStats used to include in its output list all those Admins that had done zero Admin actions during the searched time period. But I noticed today that it is no longer including Admins with zero Admin actions in the list, ending just with those that had done at least one Admin action during the searched time period. Was that a deliberate change? If so, what was the reasoning behind that change? (Personally, I found the inclusion of the Admins with zero Admin actions in the list useful info...) Thanks in advance! --IJBall (contribstalk) 17:05, 29 August 2015 (UTC)

If it changed, it certainly wasn't deliberate, considering nobody has been making alterations to xTools for a while now.—cyberpowerChat:Limited Access 18:09, 29 August 2015 (UTC)

cyberbot I stuck or stalled

Wikipedia:RFXR is out of date, and hasn't been updated in over seven hours. Thanks, Wbm1058 (talk) 17:41, 30 August 2015 (UTC)

Yes, and WP:RFPP doesn't appear to have been clerked since around the same time. However, the bot has been making other edits since then, so I'm not sure what's going on. MusikAnimal talk 18:19, 30 August 2015 (UTC)
The bot seems to have updated the page now: take a look at Special:Diff/678640812. Bilorv(talk)(c)(e) 18:20, 30 August 2015 (UTC)
The bot also updated RFPP at the same time: Special:Diff/678640814. Bilorv(talk)(c)(e) 18:22, 30 August 2015 (UTC)

Opinion on a bot policy thing

Hi Cyberpower. I'm approaching you as an uninvolved bot operator who's far more familiar with the precedents of how bot policy has been applied than I am. Would a bot task that removed deprecated parameters that have been entirely removed from a template (i.e. serve no function any longer) from mainspace articles fall afoul of WP:COSMETICBOT?

Several parameters related to debut and final teams were removed a while back with consensus from {{Infobox NFL player}} as they duplicated information in the parameter that displays all teams a player has played on (with years played). The parameters were completely removed from the template, and no longer have any effect on the template's output. The rationale for removing them is that their presence in articles is likely to confuse new editors or even experienced editors unfamiliar with this particular infobox, possibly causing them to add good information to a parameter that no longer displays rather than the appropriate one. In my opinion and those of other editors close to this particular template, this makes it worthwhile to remove the parameters from all articles even if it does not produce any change in the HTML output of the page.
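
A minimal sketch of such a removal task using mwparserfromhell (illustrative only; the parameter names below are assumed stand-ins for the deprecated debut/final-team fields):

    import mwparserfromhell

    # Assumed names for the removed debut/final-team parameters.
    DEPRECATED = ["debutteam", "debutyear", "finalteam", "finalyear"]

    def strip_deprecated(wikitext):
        code = mwparserfromhell.parse(wikitext)
        changed = False
        for tpl in code.filter_templates():
            if not tpl.name.matches("Infobox NFL player"):
                continue
            for param in DEPRECATED:
                if tpl.has(param):
                    tpl.remove(param)
                    changed = True
        # Save only when something actually changed, the usual COSMETICBOT caution.
        return str(code) if changed else None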

Your opinion on this would be appreciated, as I'm unsure how COSMETICBOT has been applied in the past and whether it tends to extend beyond things like removing white space. If there's a better person I should ask, let me know. Thanks! ~ RobTalk 20:28, 30 August 2015 (UTC)

WP:COSMETICBOT does apply here; however, if there is consensus for such a bot, then there is no reason why it shouldn't get approved.—cyberpowerChat:Online 20:50, 30 August 2015 (UTC)
Just to make sure I'm understanding your response in a more general sense, is it correct that COSMETICBOT applies only for tasks that do not have a clear consensus? Thanks again for your help. ~ RobTalk 21:14, 30 August 2015 (UTC)
Consensus is the driving force behind actions on Wikipedia. If consensus wants something done, it can override policy, since policy was established by consensus to cover most cases. So in other words, WP:COSMETICBOT only matters if there is no consensus for a bot affected by this policy. In short, WP:COSMETICBOT will result in all proposals it applies to being denied per policy, unless a discussion with a favorable outcome supports them.—cyberpowerChat:Online 22:57, 30 August 2015 (UTC)
Alright, I understand. Thanks again for your help! ~ RobTalk 00:26, 31 August 2015 (UTC)

Checked link; archiveurl is borked. Now what?

Hey, I think the prospect of programming a bot to automagically replace deadlink refs with archived URLs is a noble endeavor. However, it doesn't always work (scrolling up, I'm not alone in seeing this), yet the automated Talk Page message doesn't seem to allow for any other edit than 'checked=true', which causes a message to appear, stating "Archived sources has been checked to be working". OK, I checked the link, and it's bad, and there are no earlier archived versions... now what? Any chance there could be an option to note that: The archived source was checked, but found to be an archived 404 (or some such), and cannot be further rescued? I would imagine you are interested in the success rate of your bot, and at the very least this could help provide such stats. Thanks! Antepenultimate (talk) 00:24, 31 August 2015 (UTC)

I am very well aware the bot cannot always get it right, and it will probably never be able to reach 100% accuracy due to the many variables involved. In your case, you simply need to either remove the source, or restore it and add the {{cbignore}} tag. Follow the template instructions on how to use it.—cyberpowerChat:Limited Access 17:24, 31 August 2015 (UTC)

Force archive

Exciting to see the Internet Archive bot making its rounds. (Feature suggestion: it'd be nice if it also ran a WebCite archive on the link for good measure—sometimes archive.org drops whole sites from the archive. User:WebCiteBOT has been down for some time.) Question for your FAQ: Is there a way to call the bot over to a page? Or to specify which citations to expand when first adding them? – czar 03:44, 31 August 2015 (UTC)

Unfortunately, when the bot was built, WebCite was never mentioned, and therefore the code would need some major rewrites to incorporate WebCite. Also, as I mentioned elsewhere, WebCite has no API as far as I can tell. I'm not sure how User:WebCiteBOT works, but I'm not sure if WebCite is worth adding, since it is a privately funded site, and that to me is an indication that it can go down any day. Also, Cyberbot II saves unarchived pages into the Wayback Machine, so in the end it will be more likely that the Wayback Machine has the page we need rather than WebCite.—cyberpowerChat:Limited Access 17:36, 31 August 2015 (UTC)
Understandable—just wanted to bring it to your attention. WPVG has had all sorts of issues with sites getting dropped from archive.org in the past year. I think WebCiteBOT's maintainer had been in contact with the site. But my second question: Is there a way to call the bot over to a page? Or to specify which citations to expand when first adding them? Appreciate your help – czar 17:49, 31 August 2015 (UTC)
Cyberbot will run through millions of articles. I think it may start to cause problems when people start calling Cyberbot over to other areas, as it may never get done with a run. However, if you want it to add an archive, simply tag the link as dead. Alternatively, we can create a new tag such as {{forcearchive}} that we can add to the bot's list of templates to look for when modifying sources.—cyberpowerChat:Limited Access 17:56, 31 August 2015 (UTC)
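
If such a {{forcearchive}} tag were adopted (hypothetical, per the suggestion above), detecting it in an article's wikitext could be as simple as this sketch:

    import mwparserfromhell

    def wants_force_archive(wikitext):
        # True if the (hypothetical) {{forcearchive}} trigger template is present.
        code = mwparserfromhell.parse(wikitext)
        return any(t.name.matches("forcearchive") for t in code.filter_templates())
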
That sounds great. My ideal use case would be to tag a whole article ("forcearchive" or otherwise) when I'm finished for the night so the bot can come around in the next few hours. Thus archiving need not be part of my workflow. I'd hate to have an edit conflict in the middle of an expansion, though. – czar 18:23, 31 August 2015 (UTC)
If you had an edit conflict and the diffs looked like a complete mess, so you didn't want to spend ages merging the two versions together, could you not just save your version and wait for the bot to come around and archive everything again (forcing it to if possible)? It might add to its workflow a bit but when it's editing millions of pages, having to go over a few articles more than once isn't going to substantially impede anything. Bilorv(talk)(c)(e) 18:43, 31 August 2015 (UTC)
Yep, but I'd prefer to avoid that, if possible (like the other bots that do cleanup rounds). I suppose it depends on how long you anticipate it will take for the bot to come around again. – czar 18:52, 31 August 2015 (UTC)
My edit didn't go through, despite me clicking save twice. The bot will do an immense amount of work, so it can take days, maybe weeks, for the bot to finish a single run on 12 million+ articles. How many resources Labs is willing to give me determines how fast the bot can run. The more resources, the faster. Given that this bot is quickly gaining in popularity here, and the fact that some people over at the WMF have taken an interest in this bot, it's not unreasonable to assume that they're willing to provide what's needed to get the bot through a run in a reasonable amount of time. Also, I'm still working on improving its efficiency. The bot is pretty slow at the moment, working at several seconds per article. Ideally, I would want each article to be processed in a second or less.—cyberpowerChat:Online 19:06, 31 August 2015 (UTC)

XTool edit counter protocol bug

If I'm right, you are the maintainer of the XTools edit counter. A hu.wiki user reported a bug in the "Recent edits (global)" section, where page titles have a doubled protocol: "http://https://". Can you fix it? --BáthoryPéter (talk) 20:39, 31 August 2015 (UTC)

Ok, I see what he means: in JSoos's edit statistics, in the "Latest edits (global)" section, all the links (not only those in the "Page title" column) have the prefix "https//", though in other sections there is no such prefix and the links work correctly using the secure page. E.g., in the "Top edited pages" section, Page log example-1 works, while in the mentioned section Page log example-2 does not (the examples are the first "log" links in the cited sections of the statistics at "2015-09-01, 13:40"). JSoos (talk) 14:27, 1 September 2015 (UTC)
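
A hedged guess at the kind of fix involved, collapsing a doubled scheme down to a single one before the link is built (illustrative only, not xTools' actual code):

    import re

    def normalize_scheme(url):
        # Drop a leading scheme when it is immediately followed by another one,
        # e.g. "http://https://hu.wikipedia.org/..." -> "https://hu.wikipedia.org/..."
        return re.sub(r"^https?://(?=https?://)", "", url)

    assert normalize_scheme("http://https://hu.wikipedia.org/wiki/X") == "https://hu.wikipedia.org/wiki/X"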

Glitch during CyberBot I AfD tagging

Hello Cyberpower678, please see Wikipedia:Articles for deletion/Love Bites (band) and the history of Love Bites (band) and Love Bites. It looks like the bot tried to fix an incomplete nomination for "Love Bites (band)", but created some links in the nomination for "Love Bites" and mistagged the DAB-page "Love Bites" as well. I fixed (hopefully) the wrong details manually, but could you look into this case, please? Many thanks. GermanJoe (talk) 21:24, 31 August 2015 (UTC)

(talk page stalker) This appears to have been caused by a mistake made in the creation of that AfD. Everything in the AfD except its title refers to the disambiguation page, not the band page. This appears to be human error, not bot error. ~ RobTalk 22:02, 31 August 2015 (UTC)
I see, good catch. Apparently the nomination page itself was created inconsistently by the nominator. GermanJoe (talk) 22:15, 31 August 2015 (UTC)

Barnstar

The Computing Star
For creating the Cyberbots. Aero Slicer 12:58, 1 September 2015 (UTC)

Improper parameter usage

Hi, what's the idea with these posts, also these? You undid some, but a lot of duplicates are left over. --Redrose64 (talk) 20:54, 1 September 2015 (UTC)

(talk page stalker) This was related to the recent trial run approved in this BRFA, for context. ~ RobTalk 21:07, 1 September 2015 (UTC)
A bot that is supposed to be simple is turning out to be a pain. The bot is not cooperating.—cyberpowerChat:Online 21:19, 1 September 2015 (UTC)