User talk:Cyberpower678: Difference between revisions

Revision as of 10:38, 5 October 2013


Blacklist - bot request

First, I hope you're all right now.

I hope in any case that this is not giving you a headache. In the tag that you leave, would you mind adding two (unused) parameters, merely for our convenience:

  1. the actual rule(s) that block the link
  2. which blacklist it is

.. that is, if they are coming out of the db-requests. It would greatly help with finding the why of the blacklisting for those who are considering de-blacklisting/whitelisting. Some of the links are hitting strange rules which are not easily found. Thanks. --Dirk Beetstra T C 08:27, 29 September 2013 (UTC)[reply]
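The request above amounts to carrying two extra informational fields in the tag the bot leaves. A minimal sketch, purely as illustration: the parameter names `rule` and `blacklist` below are hypothetical, not the template's actual parameters.

```python
# Hypothetical sketch only: the parameter names "rule" and "blacklist"
# are invented for illustration; the real template may differ.
def build_tag(links, rule, blacklist):
    """Build a {{Blacklisted-links}} tag that also records the matched
    rule and which blacklist it came from, as unused extra parameters."""
    return ("{{Blacklisted-links|1=%s|rule=%s|blacklist=%s}}"
            % (", ".join(links), rule, blacklist))

tag = build_tag(["http://www.example.com/page"], r"\bexample\.com\b", "meta")
```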

Two related requests:

  • Having seen the AN/I thread: could you program the bot so that it does not add the template more than once every ## hours (I would suggest 48 or 72)? Some persistence is good, but we don't need to push it too hard either. If editors then indicate that the bot is re-tagging while they are waiting for the whitelisting to come through, please set the bot to ignore that page (but preferably not earlier; it is better that the issue is resolved at some point).
  • Related to the bot-ignore: maybe that should have a 'deadline' of, say, 3 months. If whitelisting is not done within three months, then another solution needs to be found. You could add a date tag to the ignore and enforce a deadline on that, or just manually, with a remark line every month, kill everything from 3 months ago.

Thanks for the hard work .. now let's hope that the issue gets resolved at some point. This is also good for finding the too-wide-net cases. --Dirk Beetstra T C 12:51, 29 September 2013 (UTC)[reply]
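The once-every-## hours suggestion could be implemented as a per-page timestamp check. A minimal sketch, assuming the bot keeps a record of when it last tagged each page (the 48-hour figure is one of the values suggested above):

```python
import time

# Illustrative sketch of the suggested rate limit: skip re-tagging a
# page that was last tagged less than MIN_GAP seconds ago (48 h here).
MIN_GAP = 48 * 3600
last_tagged = {}  # page title -> unix timestamp of the last tag edit

def may_tag(page, now=None):
    now = time.time() if now is None else now
    last = last_tagged.get(page)
    if last is not None and now - last < MIN_GAP:
        return False  # tagged too recently; leave the page alone
    last_tagged[page] = now
    return True
```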

(talk page stalker) I agree with the unused parameters, but I disagree that we should ever ignore blacklisted links long-term. They need to be either whitelisted or removed. Jackmcbarn (talk) 15:28, 29 September 2013 (UTC)[reply]
I don't think I suggested to ever ignore blacklisted links long-term. --Dirk Beetstra T C 15:44, 29 September 2013 (UTC)[reply]
3 months would be way too long a figure; if whitelisting hasn't happened in that length of time, it isn't ever going to happen. Again, though, editors equally shouldn't be edit warring with the bot, so would an automatic talk page message be worthwhile if the bot is reverted? Blethering Scot 17:13, 29 September 2013 (UTC)[reply]
  • @Beetstra: I think it's a great idea, and would love to add those additional bells and whistles. Now for the bad news: this script already eats 1.1 GB of my allotted 2 GB on Labs and constantly pushes the CPU usage beyond 100%. I wouldn't mind adding it, but I'm concerned about the additional resources it's going to consume. I'm already at 1.5 GB for all continuous tasks, and I also have scripts that are executed by cron. I will certainly look into it, but I can say for certain it will slow down the bot and eat more memory.—cyberpower ChatOnline 00:20, 30 September 2013 (UTC)[reply]
    • Most are just features; the once-every-## hours one, however, should not add too much overhead. I'd consider that the more important one. --Dirk Beetstra T C 04:05, 30 September 2013 (UTC)[reply]
      No, it shouldn't, but it's really finicky. The bot's run consists of: scanning the local database and removing any links that aren't blacklisted (about 3-4 hours); checking for new regex additions to the blacklist or removals from the whitelist; if there are none, proceeding to the tagging phase, and if there are, scanning the 60,000,000+ links on Wikipedia (about 16-20 hours); tagging pages (about 3 hours); removing invalid tags (about 30 minutes); and then sleeping for 15 minutes. Once every ## hours would be very difficult to enforce in this case, as the time needed to execute each part of the script constantly changes.—cyberpower ChatOnline 11:40, 30 September 2013 (UTC)[reply]
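For reference, the cycle described above can be modeled as a sketch; the durations in the strings are the approximate figures quoted in the discussion, not measured constants.

```python
# Rough model of the run cycle as described in this thread.
def run_cycle(blacklist_changed):
    steps = ["scan local database and drop de-listed links (~3-4 h)"]
    if blacklist_changed:
        # new blacklist regexes or whitelist removals force a full rescan
        steps.append("rescan the 60,000,000+ links on Wikipedia (~16-20 h)")
    steps += ["tag pages (~3 h)",
              "remove invalid tags (~30 min)",
              "sleep for 15 minutes"]
    return steps
```

Because the long rescan step only happens on some iterations, the wall-clock length of a cycle varies widely, which is why a fixed every-## hours cadence is hard to enforce.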
  • Surely the option in this case is just to run the bot once every three days. Liamdavies (talk) 15:06, 30 September 2013 (UTC)[reply]
    Huh? Where did you pull three days from? That means the tag-removal part of the script would run only once every three days as well, which is also an excessive time for the bot to just sit there and wait. Not to mention that when the bot runs a scan of the database, it will be more than 3 days before the bot does anything again. If anything, the bot should have no more than a 48-hour gap in editing.—cyberpower ChatOnline 15:11, 30 September 2013 (UTC)[reply]
  • Fine, every two days; the point is that limiting how often the bot runs - rather than running it in a loop - is a low-tech way of solving the problem in the interim. Liamdavies (talk) 15:37, 30 September 2013 (UTC)[reply]
  • @Beetstra: As you can see, I've managed to come up with a resource-conserving method of identifying the rule. But due to its design, I'd like to see the log output first before I unleash the bot to update all the tags.—cyberpower ChatOnline 21:13, 2 October 2013 (UTC)[reply]
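Whatever the bot's actual implementation, the rule-identification step can be pictured as matching each link against the blacklist regexes in turn. A sketch under that assumption: the rule soccer.+\.com is the one quoted later in this thread, while the URLs are made up for illustration.

```python
import re

# Hypothetical sketch: test a URL against each blacklist rule in turn
# and report the first rule that matches (or None if nothing matches).
def find_trigger(url, rules):
    for pattern in rules:
        if re.search(pattern, url):
            return pattern
    return None

rules = [r"soccer.+\.com", r"\bquazen\.com\b"]
hit = find_trigger("http://www.soccer-fans-forum.com/thread/1", rules)
```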
  • Bot war: Cyberbot insists on the tag being at the very beginning ([1], [2]), while other bots put it beneath hatnotes with the other maintenance templates ([3]).
    I would recommend a) replacing an existing tag in place if it's already on the page, which should be very simple and the least objectionable to other bots and editors; and b) adding new tags beneath hatnotes, to comply with guidelines (WP:HNP etc.). I would recommend using the Twinkle regex ([4]) as a guideline on how to recognize them.
    Amalthea 16:55, 4 October 2013 (UTC)[reply]
    I'll look into fixing that.—cyberpower ChatLimited Access 17:04, 4 October 2013 (UTC)[reply]
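Amalthea's recommendation (b) amounts to skipping any leading hatnotes before inserting the tag. A simplified sketch: the hatnote pattern below is a crude stand-in for the Twinkle regex mentioned above, not a copy of it.

```python
import re

# Crude stand-in for the Twinkle hatnote regex: matches a run of leading
# hatnote-style templates (each on its own line) at the top of the page.
HATNOTE = re.compile(
    r"^(?:\{\{\s*(?:hatnote|about|for|other uses)[^}]*\}\}\s*\n)*",
    re.IGNORECASE)

def place_tag(wikitext, tag):
    """Insert the maintenance tag after any leading hatnotes."""
    end = HATNOTE.match(wikitext).end()
    return wikitext[:end] + tag + "\n" + wikitext[end:]

out = place_tag("{{About|the city}}\nPenang is a state.",
                "{{Blacklisted-links|1=x}}")
```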

CyberBot II PCBot task

Hey, is there any way you could slow down this task to where the bot waits a period of time, say three minutes, before assuming the admin forgot to add the pending changes template? I very rarely forget to add it and it's somewhat of an inconvenience for the bot to try adding it for me so quickly when I'm already in the process of doing so. Ks0stm (TCGE) 20:01, 2 October 2013 (UTC)[reply]

I really ought to fix that. It keeps slipping my mind.—cyberpower ChatOnline 20:36, 2 October 2013 (UTC)[reply]
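The requested grace period is a small check; a sketch assuming the bot knows when protection was applied (three minutes, as suggested above; the function and parameter names are invented for illustration):

```python
import time

# Sketch of the requested grace period: only add the pending-changes
# template if the protecting admin hasn't done so within GRACE seconds.
GRACE = 3 * 60

def should_add_template(protection_time, has_template, now=None):
    now = time.time() if now is None else now
    if has_template:
        return False  # the admin already added it; nothing to do
    return now - protection_time >= GRACE
```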

FYI your bot is editing logged out

[5] and other contribs on the IP. See m:October 2013 private data security issue. Please use &assert=user in the future. Legoktm (talk) 06:07, 3 October 2013 (UTC)[reply]

I'm aware. These are non-Peachy-based scripts; Peachy didn't appear to have this problem. I've reinstated my bot's password, and it looks like Pywikipedia has assert issues of its own. :p—cyberpower ChatOnline 10:49, 3 October 2013 (UTC)[reply]
No it doesn't. pywikibot-core requires you to be logged in to edit. Legoktm (talk) 14:15, 3 October 2013 (UTC)[reply]
Obviously not since an edit went through as an IP. But I hardly know anything about Pywikibot, so I'm not going to make an argument of it.—cyberpower ChatOnline 14:19, 3 October 2013 (UTC)[reply]
The issue is that you're using pywikibot-compat instead of pywikibot-core. If/when you switch to core, it forces the account to be logged in and other improvements. Legoktm (talk) 18:39, 3 October 2013 (UTC)[reply]
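For reference, assert=user is a standard MediaWiki API parameter: when set, the API refuses the request with an assertuserfailed error rather than saving an edit while logged out. A sketch of attaching it to an edit request:

```python
import urllib.parse

# With "assert": "user", the MediaWiki API fails the request with an
# "assertuserfailed" error instead of saving the edit logged out.
params = {
    "action": "edit",
    "title": "Wikipedia:Sandbox",
    "text": "test",
    "assert": "user",
    "format": "json",
}
query = urllib.parse.urlencode(params)
```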

Blacklisted url

Hi, Cyberbot II, I was just wondering if quote=k-kristie.quazen.com/arts/architecture/the-largest-buildings-in-the-world on Kennedy Space Center is still on your blacklist. Lotje (talk) 16:14, 3 October 2013 (UTC)[reply]

Sorry. I don't control the blacklist. Please see Template:Blacklisted-links for more information on blacklists.—cyberpower ChatOnline 16:18, 3 October 2013 (UTC)[reply]

Penang

Hey Cyber - how's the head...? Better I hope...

This just came up on my watchlist: https://en.wikipedia.org/w/index.php?title=Penang&diff=575609318&oldid=575217841

Something seems to make the tag not include the link, and weirdly truncate stuff.

Just a heads up. Begoontalk 17:51, 3 October 2013 (UTC)[reply]

It's the pipe that's causing it. I need to convert it to an HTML character to fix that. And thank you, I'm 95% better; just a mild headache at this point.
Cool - I knew it would be something simple. But more importantly, glad you're ok. Still take care though...Begoontalk 19:13, 3 October 2013 (UTC)[reply]
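The fix described above can be sketched like this: a raw | inside a template parameter starts a new parameter, so it has to be replaced with its HTML entity (or the {{!}} magic word) before the URL goes into the tag. The helper name is invented for illustration.

```python
# A literal "|" inside a template parameter splits the parameter;
# replacing it with the HTML entity &#124; keeps the whole URL intact.
def escape_pipes(url):
    return url.replace("|", "&#124;")

escaped = escape_pipes("http://example.com/page?a=1|b=2")
```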

facts is still a trigger

Thanks for updating the bot; it's clear that this is a false positive on 2nd Chapter of Acts. Walter Görlitz (talk) 18:47, 3 October 2013 (UTC)[reply]

I see there is an open request on the whitelist concerning this link. I have added this to the exceptions list for the duration of the request. The page should no longer be tagged for the time being.—cyberpower ChatOffline 18:55, 3 October 2013 (UTC)[reply]
Thanks!
As a personal note, it's good to see you're feeling well enough to be on Wikipedia. Hoping for continued recovery and a speedy one at that. You do great work for the project. Walter Görlitz (talk) 19:04, 3 October 2013 (UTC)[reply]
Yep. 95% there. Nothing an Aspirin can't handle.—cyberpower ChatOffline 19:08, 3 October 2013 (UTC)[reply]
It seems that the 2ndchapterofacts.com is still triggering the blacklist today and canada-soccer.com is as well. Walter Görlitz (talk) 13:31, 4 October 2013 (UTC)[reply]
I can't de-blacklist or whitelist links. I can only tell the bot to ignore blacklisted links while a pending request exists. Sorry.—cyberpower ChatOffline 13:47, 4 October 2013 (UTC)[reply]
The issue is that there are links from this domain on all these pages and the bot has been tagging many of them; the exception list is either incomplete or not working. Is there no way to make the bot ignore all links from a domain globally? Liamdavies (talk) 14:32, 4 October 2013 (UTC)[reply]
That would require a regex setup. I'll add the remaining links you provided to the exceptions list. It usually helps if you provide me with the links that need ignoring.—cyberpower ChatOnline 14:39, 4 October 2013 (UTC)[reply]
Links added.—cyberpower ChatOnline 14:49, 4 October 2013 (UTC)[reply]
It was just triggered again here.
The site for the first is www.2ndchapterofacts.com. The site for the second is www.canadian-soccer.com as seen here and here. Walter Görlitz (talk) 19:23, 4 October 2013 (UTC)[reply]
That's a normal and currently unavoidable side effect. It takes some time before the exceptions go into effect. It shouldn't tag the first one again. As for the latter, can you point me to the whitelisting or de-blacklisting request? Simply ignoring the blacklisted links isn't the solution.—cyberpower ChatOnline 20:25, 4 October 2013 (UTC)[reply]
I've whitelisted 2ndchapterofacts.com; that was clear collateral damage from a global blacklist entry. I'll check out the second one; that's almost certainly a mistake in the blacklist entry -- I'll get back here. Amalthea 22:06, 4 October 2013 (UTC)[reply]
canadian-soccer.com should also cause no further problems, tags can be removed manually or will be removed automatically in the near future, thanks for the report -- but please request de-blacklisting (if you think the blacklist entry is wrong) or whitelisting (if you think the blacklist entry is right but should have a specific exemption) next time. Amalthea 22:38, 4 October 2013 (UTC)[reply]
... however, these are both links to forum entries (the forum is not reachable at the moment): I strongly doubt those are sources we want to use for anything! Amalthea 22:55, 4 October 2013 (UTC)[reply]
And forums aren't RSes, which is what I thought was triggering the issue. When it started explaining soccer.+\.com was the trigger I sat up and took notice. There were a few on canada-soccer that were flagged earlier that weren't in the forum section. I'll have to find them.
Thanks again. Walter Görlitz (talk) 23:22, 4 October 2013 (UTC)[reply]
But be warned, like I mentioned above, the bot may come back to tag it once more before it realizes it's not blacklisted.—cyberpower ChatOffline 10:38, 5 October 2013 (UTC)[reply]

Cyberbot I isn't up to something

Cyberbot I isn't clerking RFPP currently. I've not seen anywhere it's been mentioned, so is it a known issue? tutterMouse (talk) 07:27, 4 October 2013 (UTC)[reply]

Yes, it seems to have stopped for at least a day. I manually archived a big chunk yesterday, and it's starting to stack up again. GedUK 11:24, 4 October 2013 (UTC)[reply]
I'm really starting to dislike Pywikipedia.—cyberpower ChatLimited Access 12:49, 4 October 2013 (UTC)[reply]

What is the point in having the bot tag a page which discusses the link?--Launchballer 21:43, 4 October 2013 (UTC)[reply]

The link is blacklisted regardless. Submit a whitelisting request for that page if the link belongs. Jackmcbarn (talk) 22:34, 4 October 2013 (UTC)[reply]