Wikipedia:Bots/Requests for approval: Difference between revisions
Revision as of 16:40, 7 July 2015
All editors are encouraged to participate in the requests below – your comments are appreciated more than you may think!
New to bots on Wikipedia? Read these primers!
- Approval process – How these discussions work
- Overview/Policy – What bots are/What they can (or can't) do
- Dictionary – Explains bot-related jargon
To run a bot on the English Wikipedia, you must first get it approved. Follow the instructions below to add a request. If you are not familiar with programming, consider asking someone else to run a bot for you.
Bot Name | Status | Created | Last editor | Date/Time | Last BAG editor | Date/Time |
---|---|---|---|---|---|---|
BaranBOT 3 (T|C|B|F) | Open | 2024-06-19, 05:08:35 | MusikAnimal | 2024-06-20, 05:23:11 | MusikAnimal | 2024-06-20, 05:23:11 |
Mdann52 bot 15 (T|C|B|F) | Open | 2024-06-12, 09:24:31 | Mdann52 | 2024-06-12, 09:24:31 | Never edited by BAG | n/a |
RustyBot (T|C|B|F) | Open | 2024-06-10, 20:06:58 | Rusty4321 | 2024-06-17, 04:41:23 | Never edited by BAG | n/a |
Mdann52 bot 14 (T|C|B|F) | Open | 2024-06-10, 17:46:58 | Mdann52 | 2024-06-10, 17:46:58 | Never edited by BAG | n/a |
BaranBOT 2 (T|C|B|F) | Open | 2024-05-27, 14:01:46 | DreamRimmer | 2024-06-14, 16:57:15 | Primefac | 2024-06-06, 15:00:25 |
Dušan Kreheľ (bot) VIII (T|C|B|F) | Open | 2024-04-07, 05:50:30 | Dušan Kreheľ | 2024-04-07, 05:56:34 | Never edited by BAG | n/a |
Dušan Kreheľ (bot) VII (T|C|B|F) | Open | 2024-02-16, 09:24:40 | EggRoll97 | 2024-06-02, 04:21:37 | ProcrastinatingReader | 2024-02-20, 13:13:05 |
BattyBot 81 (T|C|B|F) | On hold | 2024-02-07, 14:12:49 | ProcrastinatingReader | 2024-02-15, 12:09:35 | ProcrastinatingReader | 2024-02-15, 12:09:35 |
DannyS712 bot III 74 (T|C|B|F) | In trial | 2024-05-09, 00:02:12 | DannyS712 | 2024-05-09, 16:13:34 | ProcrastinatingReader | 2024-05-09, 10:58:36 |
StradBot 2 (T|C|B|F) | In trial | 2024-02-17, 03:20:39 | SD0001 | 2024-02-17, 05:58:51 | SD0001 | 2024-02-17, 05:58:51 |
CapsuleBot 2 (T|C|B|F) | Extended trial | 2023-06-14, 00:14:29 | Capsulecap | 2024-01-20, 02:36:30 | Primefac | 2024-01-15, 07:40:39 |
AussieBot 1 (T|C|B|F) | Extended trial: User response needed! | 2023-03-22, 01:57:36 | Hawkeye7 | 2024-02-18, 23:33:13 | Primefac | 2024-02-18, 20:10:45 |
DoggoBot 10 (T|C|B|F) | In trial | 2023-03-02, 02:55:00 | Frostly | 2024-02-21, 22:41:18 | Primefac | 2024-01-15, 07:40:49 |
Qwerfjkl (bot) 30 (T|C|B|F) | Trial complete | 2024-06-05, 20:51:40 | Qwerfjkl | 2024-06-09, 17:26:10 | SD0001 | 2024-06-06, 04:07:27 |
PrimeBOT 39 (T|C|B|F) | On hold | 2023-05-11, 12:48:50 | Primefac | 2023-09-22, 10:51:59 | Headbomb | 2023-07-02, 17:38:58 |
Current requests for approval
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: MusikAnimal (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 16:35, Tuesday, July 7, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Source code available: GitHub
Function overview: Takes over the archiving of the WP:PERM pages, previously the responsibility of KingpinBot. Bot operator Kingpin13 is aware of and supports this change, as are the regular PERM admins.
Links to relevant discussions (where appropriate): [1]
Edit period(s): Continuous
Estimated number of pages affected: 8, most likely 9 (if we merge mass message sender into PERM)
Exclusion compliant (Yes/No): No
Already has a bot flag (Yes/No): Yes
Function details: MusikBot has been clerking the PERM pages for a while; that trial is now complete and awaiting review (see Wikipedia:Bots/Requests for approval/MusikBot). In the meantime I have developed the archiving task to be bundled in with the other tasks. Each task is independent of the others and can be turned off/on as needed. The bot already scrapes the pages every 10 minutes, so it will archive on the fly rather than once daily as KingpinBot did. The logic is otherwise exactly the same, just implemented differently. Some additions:
- The edit summary will state how many open requests there are left, or will say "list is clear"
- Updates the {{admin backlog}} to {{no admin backlog}} or vice versa, as needed
- Comments when a request has been marked as {{done}} but the user does not have the said permission
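The edit-summary and backlog-template behavior in the list above can be sketched in a few lines. This is a hypothetical Python illustration (function names invented here); MusikBot's actual implementation is in the linked GitHub repository and may differ.

```python
def archive_summary(open_requests):
    """Build the edit summary described above: count the open requests
    that remain, or note that the list is clear."""
    if open_requests == 0:
        return "Archiving; list is clear"
    return f"Archiving; {open_requests} open request(s) remaining"

def update_backlog_template(wikitext, backlogged):
    """Swap {{admin backlog}} and {{no admin backlog}} as needed."""
    if backlogged:
        return wikitext.replace("{{no admin backlog}}", "{{admin backlog}}")
    return wikitext.replace("{{admin backlog}}", "{{no admin backlog}}")

print(archive_summary(3))
print(update_backlog_template("{{admin backlog}}", backlogged=False))
```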
This new archiving task is badly needed... KingpinBot (all due respect to the operator) has become increasingly unreliable. Currently we are quite backed up — MusikAnimal talk 16:35, 7 July 2015 (UTC)[reply]
((BAGAssistanceNeeded)) Requesting immediate attention as the requests for permission pages badly need to be archived. Patrolling admins are finding it difficult to pick out open remaining requests ([2]). During this first run I can run the script locally, stepping through the code, to ensure nothing goes awry given how many requests there are — MusikAnimal talk 20:53, 7 July 2015 (UTC)[reply]
Discussion
Approved for trial (7 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Magioladitis (talk) 22:54, 7 July 2015 (UTC)[reply]
Trial complete. Alright! After 7 days, I think I'm convinced the bot is stable except for the AWB registration page. That did work, but then I realized I wasn't archiving bot requests (although there haven't been any). The AWB page requires special handling, as it is structured vastly differently from the other PERM pages. I have it fully working on testwiki, I believe, but I'll need an extended trial to prove it works in production.
As for the other PERM pages, they've been spot-on. When I first ran the script all the pages were backed up, so I manually ran it page by page. I saw that when a level 2 heading was not on its own line, the bot malfunctioned (skipped several requests). I think most bots would have too, actually... as it's the headings that make it possible to parse the page request by request. To remedy this for MusikBot, I implemented an initial check that reports the error, and the bot refrains from attempting to process the page until the error is fixed. The report shows exactly which line caused the error. Here's an example on testwiki: [3] I don't want the bot to fix the error automatically, as this scenario may come about when the page structure has been tampered with, and the bot could potentially make matters worse, so it will instead wait for a human to fix it.
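The heading sanity check described above can be sketched as a simple line scan. This is a hypothetical Python illustration, not MusikBot's actual code; like the real bot, it reports the offending line rather than attempting a fix.

```python
import re

# A level-2 heading: exactly two '=' on each side, e.g. '==Username=='.
HEADING = re.compile(r"(?<!=)==[^=\n]+==(?!=)")

def find_malformed_headings(wikitext):
    """Flag lines where a level-2 heading shares a line with other text.
    Per-request parsing relies on '==Username==' standing alone."""
    errors = []
    for number, line in enumerate(wikitext.splitlines(), start=1):
        stripped = line.strip()
        match = HEADING.search(stripped)
        if match and match.group(0) != stripped:
            errors.append((number, line))
    return errors

page = "==Alice==\nrequest body\nstray text ==Bob==\n"
print(find_malformed_headings(page))
# → [(3, 'stray text ==Bob==')]
```

Flagging rather than auto-fixing matches the rationale above: a stray heading usually means the page structure was tampered with, so a human should intervene.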
Anyway, aside from that little mishap we've had no issues with the core PERM pages. Here are some diffs:
To reduce edit count and server strain, the bot consolidates edits to archive pages when applicable: [8]
Diffs showing how the bot updates the {{admin backlog}} template: [9][10]
Let me know if you have any questions. Again, if you feel it is necessary, please allow for an extended trial to exemplify handling of the AWB registration page. Thanks! — MusikAnimal talk 21:52, 14 July 2015 (UTC)[reply]
Approved for extended trial (10 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Let's run the bot for another 10 days! -- Magioladitis (talk) 08:51, 15 July 2015 (UTC)[reply]
Trial complete. Alright! As mentioned before, the bot only had issues archiving the request for AWB access page, since it is formatted very differently from the other PERM pages. I'm happy to report it's now working as intended, and there have been no issues during the extended trial. Diffs: [11][12][13]. Notice how the bot handles other tasks in the same edit. Finally, there have been no AWB bot requests so I can't show you a diff of MusikBot archiving such a request in production, but its ability to handle this scenario can be demonstrated on testwiki here. All in all I think I can definitively say the bot is stable. — MusikAnimal talk 18:36, 25 July 2015 (UTC)[reply]
Approved. ·addshore· talk to me! 12:48, 28 July 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Speedily Approved.
Operator: Harej (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 22:25, Wednesday, July 1, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): Python
Source code available: new_discussions.py, members.py, assessment.py on GitHub
Function overview: Generates various automated lists for WikiProjects on an opt-in basis.
Links to relevant discussions (where appropriate): Part of the broader WikiProject X work. As an opt-in service, it would not be enabled for individual WikiProjects except upon request. Currently doing tests for WikiProjects Women in Technology (which I founded and de facto run), Cannabis, Cognitive Science, Evolutionary Biology, Ghana, and Hampshire, where members have requested that their WikiProject participate in pilot testing.
Edit period(s): New discussions script runs every 30 minutes and members script runs every hour, but edits aren't posted unless changes are made. Assessments script runs once a day.
Estimated number of pages affected: Around six pages per WikiProject opting in, depending on which WikiProjects sign up for what.
Exclusion compliant (Yes/No): Not applicable; bot edits very specific pages intended to be updated by bot
Already has a bot flag (Yes/No): Yes
Function details:
- New Discussions: Bot polls the recent changes feed every 30 minutes, and looks for the string "new section" in the edit summaries of talk pages. Bot searches its own database for WikiProjects associated with that talk page, and if that WikiProject has the new discussions feed enabled, the bot will update the feed accordingly. See demonstrations at WikiProject Cannabis, WikiProject Cognitive Science, Evolutionary Biology, Ghana, and Hampshire. Configuration settings are managed in Wikipedia:WikiProject X/wikiproject.json. Now that this task is pending before the BAG, I have discontinued testing until I get approved for further testing.
- Members: Maintains lists of WikiProject members. Members sign up for the WikiProject using a FormWizard form that creates a page in user space that uses the template {{WikiProjectCard}}; see User:Harej/WikiProjectCards/WikiProject Women in Technology for a demonstration. Bot checks for transclusions of the template and updates lists of WikiProject members and inactive members. (A member becomes inactive if they have not made any entry in the Recent Changes log in the past 30 days.) See demonstration at WikiProject Women in Technology. WikiProjects opt in by setting up the FormWizard for their own project; see MediaWiki:Gadget-formWizard/WikiProject Women in Technology/Join as an example. As with the new discussions feed, I am not going to test this any further until I get approved for further testing. See also the related BRFA for User:WikiProject Notification Service.
- Assessments: Produces article assessment-related worklists. Currently the bot produces a list of articles that need to be assessed quality (i.e. class) featuring article quality predictions made by ORES. See demonstration here. The bot also produces a list of articles and categories that are not tagged by the WikiProject but could potentially be in scope on the basis of belonging to categories that are tagged by the WikiProject. For example, The Hemp Trading Company is not tagged by WikiProject Cannabis, but it is sorted into Category:Hemp, which is. See demonstration here. I would also like to have a list of pages that need to be assigned importance/priority ratings, but I am still working on the system that prepares the priority predictions. Configuration settings are managed in Wikipedia:WikiProject X/wikiproject.json.
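The new-discussions polling in the first bullet can be sketched as a filter over a MediaWiki `list=recentchanges` API result. This is a hypothetical Python illustration: the dict fields follow the standard API response shape, but the bot's project database lookup is reduced here to a simple set of tracked talk pages.

```python
def find_new_discussions(recent_changes, tracked_talk_pages):
    """Pick out edits that started a new talk-page thread.

    recent_changes: dicts shaped like entries from the MediaWiki API's
    list=recentchanges (only 'title', 'comment', and 'ns' are used here).
    tracked_talk_pages: talk pages whose WikiProjects enabled the feed.
    """
    hits = []
    for change in recent_changes:
        is_talk = change.get("ns", 0) % 2 == 1   # odd namespaces are talk
        started_thread = "new section" in change.get("comment", "")
        if is_talk and started_thread and change["title"] in tracked_talk_pages:
            hits.append(change["title"])
    return hits

changes = [
    {"title": "Talk:Cannabis", "ns": 1, "comment": "/* Sources */ new section"},
    {"title": "Cannabis", "ns": 0, "comment": "copyedit"},
    {"title": "Talk:Ghana", "ns": 1, "comment": "fix typo"},
]
print(find_new_discussions(changes, {"Talk:Cannabis", "Talk:Ghana"}))
# → ['Talk:Cannabis']
```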
Discussion
Speedily Approved. Opt-in is optional so I think this is uncontroversial. -- Magioladitis (talk) 22:49, 7 July 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Speedily Approved.
Operator: Harej (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 18:14, Tuesday, June 30, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): Python
Source code available: notifications.py on GitHub
Function overview: Creates a notification service for WikiProject members for those WikiProjects that implement this feature.
Links to relevant discussions (where appropriate): Part of a broader project to improve WikiProjects; see Wikipedia:WikiProject X
Edit period(s): Daily
Estimated number of pages affected: Very few. Edits are limited to /Notifications subpages, which there is only one of at the moment: Wikipedia:WikiProject Women in Technology/Notifications
Exclusion compliant (Yes/No): No; bot does not edit any pages other than /Notifications pages
Already has a bot flag (Yes/No): No
Function details: I have developed a system called "WikiProjectCards" that facilitates automated, useful membership lists for WikiProjects. Each WikiProject using this system (currently just one, as of writing) has a FormWizard form which facilitates the creation of a page in userspace with the {{WikiProjectCard}} template. See User:Harej/WikiProjectCards/WikiProject Women in Technology as an example. As part of the form that generates the card, a user can opt in to notifications relevant to the WikiProject. I'm currently planning for two notifications—new members and new discussions—but I would like to have additional notification options in the future.
The notifications script works by pulling user-space transclusions of {{WikiProjectCard}}, checking the page name structure for the username and WikiProject it is associated with. It then parses the template to see which notifications the user has signed up for. Each notification type is generated for each WikiProject once a day, and it is saved to a notifications subpage for that WikiProject, e.g. Wikipedia:WikiProject Women in Technology/Notifications. (The one notification there is the only test I will do before I am cleared for more testing; I like to test my scripts before submitting them for approval to see that they work.) The bot tags the user in the notification message in such a way that it does not render in the message (i.e., [[User:Example| ]]). Users tagged in this way get a notification in their notification center, hence "WikiProject Notification Service". In the long term I would like the bot to send notifications directly to users instead of posting them to a page, but this works for now.
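The invisible-mention technique described above is just a piped user link whose label is a single space: it renders as nothing visible but still triggers a mention notification. A tiny hypothetical helper (names invented for illustration):

```python
def silent_mention(username):
    """Link to a user page with a blank label so the mention fires a
    notification without rendering any visible text."""
    return f"[[User:{username}| ]]"

def notification_line(username, message):
    """Prefix a notification message with an invisible mention."""
    return silent_mention(username) + message

print(notification_line("Example", "A new member joined your WikiProject."))
# → [[User:Example| ]]A new member joined your WikiProject.
```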
The bot is called "WikiProject Notification Service" instead of "bot" because "WikiProject Notification Service mentioned you..." sounds more natural as a notification in my opinion.
Discussion
- This looks extremely uncontroversial to the point where I would guess that a BRFA shouldn't be needed... →Σσς. (Sigma) 01:08, 3 July 2015 (UTC)[reply]
- ((BAGAssistanceNeeded)) Harej (talk) 21:00, 7 July 2015 (UTC)[reply]
Harej is this Opt-in or Opt-out? From the sample page I see it is Opt-out? -- Magioladitis (talk) 22:56, 7 July 2015 (UTC)[reply]
- Magioladitis, WikiProjects opt in to using the system (there is one form per project) and then each user can choose which notifications they want, including no notifications. Harej (talk) 23:12, 7 July 2015 (UTC)[reply]
Speedily Approved. Magioladitis (talk) 23:16, 7 July 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: Hasteur (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 19:30, Monday, June 22, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): Python
Source code available:
Function overview: Functional changes to Wikipedia:Bots/Requests for approval/HasteurBot, Wikipedia:Bots/Requests for approval/HasteurBot 2, and Wikipedia:Bots/Requests for approval/HasteurBot 9 after community-supported requests
Links to relevant discussions (where appropriate): Wikipedia talk:Criteria for speedy deletion#Sidebar proposal
Edit period(s): Continuous (sort of)
Estimated number of pages affected: Same as HasteurBot1/HasteurBot 2
Exclusion compliant (Yes/No): No
Already has a bot flag (Yes/No): Yes
Function details: Changing the way HasteurBot (Task 1, Task 2/Task 9) works so that we start giving notices at 5 months unedited (instead of the 6 months previously), so that when the article does become eligible for CSD:G13 the author will by then have had a month to come back and make even a single edit to the page to derail the G13 eligibility. If the page is still unedited a month after the user was notified (i.e. at 6 months) and the page is eligible, then go ahead and perform the G13 nomination. Hasteur (talk) 19:30, 22 June 2015 (UTC)[reply]
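The revised timeline amounts to simple date arithmetic. A hypothetical Python sketch of the five-month notice / six-month nomination windows (a "month" is approximated as 30 days here purely for illustration; the function name is invented):

```python
from datetime import datetime, timedelta

MONTH = timedelta(days=30)  # rough month, for illustration only

def g13_action(last_edit, now):
    """Return which HasteurBot step applies to a draft under the revised
    plan: notify the author at 5 months unedited, nominate under
    CSD:G13 at 6 months."""
    idle = now - last_edit
    if idle >= 6 * MONTH:
        return "nominate"   # author has had a full month since the notice
    if idle >= 5 * MONTH:
        return "notify"
    return "wait"

now = datetime(2015, 6, 22)
print(g13_action(datetime(2015, 1, 10), now))  # ~5.4 months idle
# → notify
```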
Discussion
I intend to file for early administrative closure once the underlying discussion has run for ~7 days, to give people an opportunity to object to my proposal. Once the closure takes effect, I will make the changes and have a diff of the code I intend to change. Because this task will change the rules for how we calculate which pages to deliver notices and nominate on, I intend to flush my current set of "notified" records so that we can have a clean break. This means that pages that were already inside their six-month window will receive at least one additional month of reprieve (and another notification) before we perform the CSD:G13 nomination. Hasteur (talk) 19:30, 22 June 2015 (UTC)[reply]
- I don't see any technical problems here with the way you have proposed the transition. So far consensus seems to be strong at the other discussion. This BRFA should prove uncontroversial. Thank you for this work. Gigs (talk) 23:46, 22 June 2015 (UTC)[reply]
I'll speedy approve this as soon as the on-going discussion is closed. -- Magioladitis (talk) 16:21, 28 June 2015 (UTC)[reply]
Hasteur maybe it's time the main discussion closes? -- Magioladitis (talk) 06:26, 10 July 2015 (UTC)[reply]
Approved. Magioladitis (talk) 08:41, 15 July 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Denied.
Operator: Magioladitis (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 21:43, Monday, June 1, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): AWB
Source code available: open source
Function overview: Remove {{Persondata}}
Links to relevant discussions (where appropriate): Wikipedia:Village_pump_(proposals)/Archive_122#RfC:_Should_Persondata_template_be_deprecated_and_methodically_removed_from_articles.3F concluded that "Consensus is to deprecate and remove."
Edit period(s): One time run
Estimated number of pages affected: 1.5 million pages
Exclusion compliant (Yes/No):
Already has a bot flag (Yes/No): Yes
Function details: Straightforward
Discussion
@Magioladitis: Two questions:
- Should this bot wait until AWB has been changed to stop adding/updating Persondata?
- Since Persondata is not visible in the article, does WP:COSMETICBOT apply? Would it be better to include Persondata removal in AWB general fixes, for other bots & users to remove as they make substantial changes?
Thanks! GoingBatty (talk) 23:30, 1 June 2015 (UTC)[reply]
- GoingBatty I'll be doing general fixes at the same time. I applied for this so I have control of AWB's code. The bot won't start until we are 100% sure that mass removal is a good thing to do. Before starting I'll modify AWB's code not to add Persondata, and we'll probably do a new release so that no other editors will add it. -- Magioladitis (talk) 05:41, 2 June 2015 (UTC)[reply]
- We already have consensus for the removal of Persondata. If the addition of Persondata by automated tools hasn't been a breach of COSMETICBOT, then neither should be its removal. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 07:40, 2 June 2015 (UTC)[reply]
- @Pigsonthewing: Hi Andy! You make a good point. Are there any bots that have been adding Persondata as their primary approved task? GoingBatty (talk) 12:27, 2 June 2015 (UTC)[reply]
- Yes, Rjwilmsibot used to add it, but not anymore. I already contacted Rjwilmsi about the RfC. -- Magioladitis (talk) 12:57, 2 June 2015 (UTC)[reply]
- Found the approval at Wikipedia:Bots/Requests for approval/RjwilmsiBot 4. Thanks! GoingBatty (talk) 13:33, 2 June 2015 (UTC)[reply]
The RFC mentioned above has a section (not an actual wiki markup heading) "Rough plan" which says in part
1. Transfer |SHORT DESCRIPTION=
across to Wikidata. Done
...
4. Transfer any new data to Wikidata, then remove methodically.
I don't see any agreement to modify the rough plan, so I suppose that is the plan. Will this bot transfer new data to Wikidata, or just "remove methodically"? If this bot just removes, how will the part about transferring new data be done? Also, does #1 mean that if any new data is found, only the SHORT DESCRIPTION will be transferred, and other, more suspect, data such as birth and death dates will not be transferred? Jc3s5h (talk) 16:03, 2 June 2015 (UTC)[reply]
- Whoah! I second that concern. The five-point plan presented at the RfC was expressly conditioned on transferring any new data to Wikidata before the systematic removal is implemented. This immediate removal without transfer of newly input persondata to Wikidata violates the conditions upon which the RfC was approved. Please adhere to the RfC "rough plan" as presented. Dirtlawyer1 (talk) 18:23, 2 June 2015 (UTC)[reply]
- I have already suggested that you read the lengthy and detailed discussion of data import under the RfC, and on the pages linked from there on Wikidata. The RfC was concluded as "deprecate and remove", with no conditions attached, in the light of that discussion. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:55, 2 June 2015 (UTC)[reply]
- To add, it became apparent during the course of the RfC that no more data would be transferred to Wikidata, all other PD fields having been deemed unreliable. I can't imagine what "remove methodically" might entail. Alakzi (talk) 20:38, 2 June 2015 (UTC) Oh, I see what you mean now, Dirtlawyer1. I agree that the (or a) bot should migrate any descriptions added after PLbot's last run; that would be eminently sensible. Alakzi (talk) 20:56, 2 June 2015 (UTC)[reply]
- @Alakzi: Indeed. I didn't just fall off the wiki-turnip truck yesterday. In addition to the recently added persondata descriptions, I have also raised a concern about the married name variations of female bio subjects listed under alternative names. Dirtlawyer1 (talk) 21:13, 2 June 2015 (UTC)[reply]
- The outcome of the RfC is "deprecate and remove", not "deprecate and remove with caveats". Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:55, 2 June 2015 (UTC)[reply]
That RFC closed 26 May 2015 (UTC). Wouldn't "deprecate and remove" imply a reasonable period of classic deprecation, possibly with warnings or something, while retaining functionality for some period of time before removal? --slakr\ talk / 05:27, 4 June 2015 (UTC)[reply]
- See proposal at Wikipedia:Bot requests#RfC: Remove persondata practical steps --Francis Schonken (talk) 05:34, 4 June 2015 (UTC)[reply]
- I'm not keen on the sound of "I'll be doing general fixes at the same time." Whatever this bot does, please don't do the other things AWB does at the same time, particularly changing reference positions. Magioladitis, if you're doing an AWB update, please remove that instruction. Also, only 35 people supported this RfC. Is that enough to be making mass changes? Sarah (SV) (talk) 20:43, 5 June 2015 (UTC)[reply]
- I won't do any ref reordering. I will wait until there is a stable consensus to remove. There is an active discussion in WP:BOTREQ at the moment. -- Magioladitis (talk) 06:28, 6 June 2015 (UTC)[reply]
First batch proposal
I had a look at Wikipedia:Bots/Requests for approval/RjwilmsiBot 4: this bot introduced persondata derived from the infobox. Afaics, this automatic removal task would be uncontroversial (apart from maybe cosmeticbot concerns):
- yobot removes persondata templates that meet all of the following conditions:
- the persondata was created by RjwilmsiBot 4, before the 2014 merge of persondata to wikidata;
- the persondata template/data has not been modified since its introduction
- the article still carries an infobox
- the persondata does not contain any pre-1924 dates
I see several advantages to carrying out such a task now, most importantly getting some feedback before carrying out possibly more intrusive tasks. --Francis Schonken (talk) 09:19, 6 June 2015 (UTC)
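The four conditions above amount to a conjunctive filter over per-page metadata. A hypothetical sketch (all field names invented for illustration; how each fact is actually determined, e.g. from page history, is out of scope here):

```python
def eligible_for_first_batch(page):
    """Apply the four proposed conditions for an uncontroversial
    first batch of Persondata removals. `page` is a dict of
    illustrative, invented metadata fields."""
    return (
        page["persondata_added_by"] == "RjwilmsiBot"      # condition 1a
        and page["added_before_2014_merge"]               # condition 1b
        and not page["persondata_modified_since"]         # condition 2
        and page["has_infobox"]                           # condition 3
        and not page["has_pre_1924_dates"]                # condition 4
    )

candidate = {
    "persondata_added_by": "RjwilmsiBot",
    "added_before_2014_merge": True,
    "persondata_modified_since": False,
    "has_infobox": True,
    "has_pre_1924_dates": False,
}
print(eligible_for_first_batch(candidate))
# → True
```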
- Added fourth condition per Wikipedia:Bot requests#Temporarily exclude pre-1924 births from persondata delete. I still think it useful to operate a first batch of uncontroversial persondata removals. --Francis Schonken (talk) 05:33, 7 June 2015 (UTC)[reply]
I think the bot should be approved with no restrictions. -- Magioladitis (talk) 08:07, 5 July 2015 (UTC)[reply]
- So, a couple of still-outstanding issues I'd like to get clarification on:
- Ensuring no data loss — if I'm interpreting Wikipedia:Bot_requests#RfC:_Remove_persondata_practical_steps correctly, I'm seeing some concerns about WP:CONTEXTBOT. For example, data contained in the still-present transclusions of {{persondata}} might need to be manually migrated to, e.g., Wikidata or the appropriate infoboxen. Is everyone certain that things are free and clear, including the prior actions of other bots (e.g., RjwilmsiBot 4)? I'm seeing uncertainty on this.
- Removing the template — The concern that the outcome of the RFC is to deprecate and remove the template is clear, but as several have mentioned, a sweeping, automated removal (i.e., the net effect of a template deletion) isn't the same thing, and it could create chaos if it results in data loss. I mean "loss" in the sense that it makes it extremely difficult to undo this action once done, as backlinks/transclusions will, in effect, be removed, and the edits will be intermixed with the rest of Yobot's edits, further complicating reverting or parsing disappeared data.
- RFC substituting for TFD — TFD is technically the appropriate venue—not a specialized RFC—for deciding whether a template is safe or desired to be fully unlinked and deleted, hence the difference between "deprecate (and eventually remove)" and "remove now." I say this not to be bureaucratic, but because global community consensus is to discuss actual template deletion at TFD, because that's where people will be more likely watching should something be an issue. The more eyes the better before actual removal, once again, because it'd be a giant pain in the ass to undo. This obviously ties back in to #1. Anecdotally, I think someone at one point TFDed one of the {{unsigned}} templates long ago, and it was only because of that venue that I knew about it (and obviously the effects it might have had on my bot should it have succeeded). Similarly, if there are bots or scripts out there still relying on this for some reason, are they going to feel any effects? In any case, TFD would be best before automated mass removal, most likely.
- What other general fixes are being done? — as editors have asked.
- Granted I could be missing stuff and being over-paranoid, but that's sort of the point before making 1.5 million edits. :P
- --slakr\ talk / 19:37, 2 August 2015 (UTC)[reply]
Slakr, on #4, see WP:AWB/GF. -- Magioladitis (talk) 17:30, 3 August 2015 (UTC)[reply]
- As a botop, I feel the level of concern is too great. No disrespect to the botop, and the others involved. While the idea may be good, it's my impression that as long as there isn't a high level of confidence that this is a good idea, and considering the nature of this task, and how much damage it could possibly cause as a result of a premature removal, I think we should hold off. We have to hold a discussion somewhere that asks the question, "Should the persondata be removed NOW?" As botops we are responsible for what kind of damage we can cause by running our high speed programs. BAG needs to consider the same thing when approving them. Right now, I'm of the opinion that the risk is higher than the benefit. No comment on whether removal is warranted or not, but it's definitely not the time to do it now. I feel BAG has been put in a difficult position here, so these are my 2 cents.—cyberpowerChat:Offline 04:01, 27 August 2015 (UTC)[reply]
@Magioladitis: — do we have any sort of update as to what's going on with this or any of the other concerns? --slakr\ talk / 00:49, 28 August 2015 (UTC)[reply]
Slakr About the data loss: According to the discussions, all info about birth/death date/place is already there, and there is no evidence that, in cases where the values differ, Persondata is more reliable than the other places we store similar info (Wikidata fields, infobox fields). How do you suggest that we solve this problem? We could create tracking categories before any bot action, similar to Category:Official website different in Wikidata and Wikipedia. -- Magioladitis (talk) 05:06, 28 August 2015 (UTC)[reply]
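For what a pre-removal tracking check might look like: a minimal sketch that flags pages where the Persondata value disagrees with the corresponding infobox or Wikidata value after trivial normalization. The function name and the normalization rules are assumptions for illustration, not part of the actual proposal:

```python
def needs_tracking(persondata_value, other_value):
    """Flag a page for a tracking category when the Persondata field and
    the corresponding infobox/Wikidata value both exist but disagree
    after trivial whitespace/case normalization."""
    a = " ".join(persondata_value.split()).lower()
    b = " ".join(other_value.split()).lower()
    return bool(a) and bool(b) and a != b
```

Pages flagged this way could be placed in a tracking category for manual review before any automated removal touches them.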
So we can tweak request to match the one approved for BattyBot and have two bots running. -- Magioladitis (talk) 19:45, 14 September 2015 (UTC)[reply]
- ? Why would we have two bots running? It's not as if persondata need to be removed twice...
- Recapitulating: above I proposed to start with an uncontroversial batch of removals, to which Magioladitis replied "the bot should be approved with no restrictions".
- At WP:VPPROP#Multiphase removal, GoingBatty is successfully negotiating with the community one uncontroversial batch after another.
- Magioladitis declined such a negotiated process, and now seems willing to ride on the success of a botop who is careful to take this step by step and negotiates every step.
- In sum: (1) we don't need two bots doing the same thing; better to go help the ongoing process of finding consensus for every step; (2) the step-by-step process works, and GoingBatty garnered more community confidence for it than Magioladitis; (3) Magioladitis seems to underestimate that "tweaking" the request also means going back to a process he declined and presupposes step-by-step negotiations, which is something completely different from adding a few words and then letting it go loose.
- For the Yobot 24 BRFA, with all due respect, I think it is time to close it as having gone stale after the recent developments. --Francis Schonken (talk) 03:42, 15 September 2015 (UTC)[reply]
- My negotiation resulted in less than 1,000 removals out of 1.2 million. Everyone's suggestions at negotiating small batches of removal will be appreciated. If Magioladitis is more successful than I am, I have no problems with him submitting a bot request to remove certain batches. GoingBatty (talk) 02:04, 16 September 2015 (UTC)[reply]
- I must concur with the above, emphasising that there is no absolute necessity to remove persondata immediately. →Σσς. (Sigma) 04:04, 15 September 2015 (UTC)[reply]
Denied. — Given the discussion, not to mention my own concerns, I can't confidently say that this bot has consensus to run. To help sum up: while there was an RFC that favored deprecation and eventual removal of the template, the latter's timeframe was not clearly established and otherwise appears to present no urgency. Furthermore, several issues are unresolved (or at the very least, not sufficiently reassured to be resolved) with respect to the practical upshot on other forms of automation and/or the process of migrating the now-deprecated template's data to the more appropriate and/or newer representations of the data (whatever that might entail). As such, bulk/blanket removal may serve to unexpectedly confound that process. Because of these issues and e.g., Wikipedia:Bots/Requests for approval/BattyBot 47, it seems the likely-less-problematic/likely-less-controversial approach for removal of the template is via piecemeal tasks rather than all at once (for now, anyway). --slakr\ talk / 21:36, 16 September 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Request Expired.
Operator: Jamesmcmahon0 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 08:33, Sunday, May 10, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Source code available: AWB
Function overview: Creating redirects from [[Foo]] to [[List of Foo]].
Links to relevant discussions (where appropriate):
Edit period(s): one-time run, then weekly/monthly depending on how many new lists are created without redirects
Estimated number of pages affected: Initially 12617
Exclusion compliant (Yes/No): Yes
Already has a bot flag (Yes/No): Yes
Function details: I have compiled a list of pages where there exists a [[List of Foo]] page but no [[Foo]] page, as a redirect or otherwise. My bot will create all of the pages as redirects to their lists, specifically with the content:
#REDIRECT [[List of Foo]] {{R from list topic}} {{R with possibilities}} [[Category:Bot created redirects]]
This is per Pigsonthewing's request at Wikipedia:Bot requests#Redirects to lists, from the things they are lists of.
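For illustration, the per-page step described above could look like the following in Python. The actual task used AWB, and `redirect_for_list` with its capitalization handling is an assumed sketch, not the operator's code:

```python
import re

# Wikitext for each new redirect, per the function details above.
REDIRECT_WIKITEXT = (
    "#REDIRECT [[{target}]] {{{{R from list topic}}}} "
    "{{{{R with possibilities}}}} [[Category:Bot created redirects]]"
)

def redirect_for_list(list_title):
    """Given a 'List of Foo' title, return ('Foo', wikitext) for the
    redirect to create, or None if the title doesn't fit the pattern."""
    m = re.match(r"List of (.+)$", list_title)
    if not m:
        return None
    topic = m.group(1)
    # Article titles begin with a capital letter on English Wikipedia.
    topic = topic[0].upper() + topic[1:]
    return topic, REDIRECT_WIKITEXT.format(target=list_title)
```

A run would then only create the page if it does not already exist, which is how the compiled list was built in the first place.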
Discussion
- Support as proposer. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:13, 10 May 2015 (UTC)[reply]
- @Jamesmcmahon0: I have created {{R from list topic}}, which you might like to use instead. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:19, 10 May 2015 (UTC)[reply]
- I agree that {{R from alternative spelling}} is the wrong rcat to use. If you want to include an existing template I would suggest {{R from short name}}, but I expanded {{R from list topic}} a little bit to make sure that it doesn't get confused with {{R to list entry}} and to point out that many of these redirects probably could be expanded into a full article and therefore need {{R with possibilities}} as well. --Ahecht (TALK
PAGE) 21:28, 11 May 2015 (UTC)[reply]- I have updated the details with these redirect templates. Jamesmcmahon0 (talk) 12:26, 12 May 2015 (UTC)[reply]
Needs wider discussion. Although WP:MASSCREATION doesn't specifically call out redirects, I'd say we need more than one user's request and a lack of any response on one WikiProject's talk page before bot-creating 77119 new redirects. Try WP:VPR. Anomie⚔ 11:52, 10 May 2015 (UTC)[reply]
- I agree with Anomie. For mass creation you'll have first to request a comment in the Village Pump. -- Magioladitis (talk) 21:02, 10 May 2015 (UTC)[reply]
- That's not my reading of that guideline; and it seems overly bureaucratic - do we seriously expect any objections? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 07:25, 12 May 2015 (UTC)[reply]
- I have submitted an RfC. WP:VPR. Jamesmcmahon0 (talk) 12:29, 12 May 2015 (UTC)[reply]
- You say that you've made a list of all relevant pages; can we see it? עוד מישהו Od Mishehu 08:32, 12 May 2015 (UTC)[reply]
- List is here Jamesmcmahon0 (talk) 12:31, 12 May 2015 (UTC)[reply]
- Support but the list needs to be filtered for ':' and singleton letters to catch Arcade video games: O, Athletes from Maryland A – M, Botanists by author abbreviation (B), Commodore 64 games (A–M), Compositions for viola: L to N, Country names in various languages (A–C), Craters on Mars: A-G, Craters on the Moon, A–B, DC Comics characters: L, Acronyms: A, Airports by IATA code: A, Albums containing a hidden track: A, American Civil War Medal of Honor recipients: A–F, Hieroglyphs/A, Allied military operations of the Vietnam War (G–L), Amiga games (A–H), Birmingham City F.C. players (25–99 appearances), Black metal bands, 0–K, ... Stuartyeates (talk) 02:37, 14 May 2015 (UTC)[reply]
- I have filtered the list for anything like the above as well as a few others that I don't think make sense as their own pages. The updated list is at the same location here. List is now around 12000. Jamesmcmahon0 (talk) 14:53, 16 May 2015 (UTC)[reply]
- Per some comments at the village pump, I have filtered out another 526 pages where "List of Foos" and "Foo" exist but "Foos" does not; in these cases it is not a given that "Foos" should be created. The list of these is here. The total number of pages to be created is now 12091 — Preceding unsigned comment added by Jamesmcmahon0 (talk • contribs) 11:11, 19 May 2015 (UTC)[reply]
- @Jamesmcmahon0: I'm still seeing examples in that list that might be better off being removed, such as All-American Girls Professional Baseball League players (A–C), Aircraft (B), Airports in Canada (A–B), Amiga games (A–H), Banks (alphabetically), British children's and young adults' authors (1900–49), CJK Unified Ideographs Extension B (Part 1 of 7), etc. I think you might be better off just removing anything with a parenthetical. There are also some without parentheses, such as CJK Unified Ideographs, part 1 of 4 and Candidates for U.S. Representative from Ohio, A–G (maybe removing anything with a comma would be for the best as well) and Canadian plants by family A. I also found Belgian flags, which should probably redirect to Belgian flag instead (I thought you caught plurals where the singular existed). --Ahecht (TALK
PAGE) 17:14, 19 May 2015 (UTC)[reply]- @Ahecht: - Thanks for pointing that out, I'll first check that I uploaded the correct list (I have quite a lot now, all with not-great names and different content) as the (A-C), A-C and (A) type ones should definitely not be in there anymore. I'll spend a bit of time looking through and clearing out ones that either should not be there or might be contentious. Stand by, I might be a few days. Jamesmcmahon0 (talk) 19:34, 19 May 2015 (UTC)[reply]
- @Jamesmcmahon0: Could you maybe put a list of the exceptions (those that you have removed) somewhere in my talk namespace please? There's some tidying up that I'd like to do on some of these articles. In particular I'm thinking of a hidden maintenance cat Category:Lists broken across multiple articles which can then be used to regularise the naming of these. Stuartyeates (talk) 21:00, 19 May 2015 (UTC)[reply]
Anomie how do we proceed here? -- Magioladitis (talk) 23:09, 30 May 2015 (UTC)[reply]
- The "needs advertisement" seems to have been satisfied, it got some support, and it got some good suggestions that reduced the list dramatically. Proceed as you see fit. Anomie⚔ 01:19, 2 June 2015 (UTC)[reply]
- Update - I've been away but am now redoing lists from the latest dumps and checking all suggestions etc. standby... Jamesmcmahon0 (talk) 13:11, 7 June 2015 (UTC)[reply]
- @Jamesmcmahon0: Any news? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:00, 25 June 2015 (UTC)[reply]
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) — @Jamesmcmahon0: I'm assuming, judging by your contribs, that real life has gotten a hold of you for now. :P We can stash the BRFA and either re-open it or make a new one if you wanna hold off for now (and to be honest, sometimes I'm just not in the mood to code, so again, no worries and no pressure). :D By default, we'll likely close this as expired in the next few days unless you drop us a quick "wait!" here --slakr\ talk / 04:21, 27 August 2015 (UTC)[reply]
Request Expired. (for now). We can revisit once the operator is active again. --slakr\ talk / 23:17, 2 September 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Request Expired.
Operator: PhantomTech (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 02:11, Thursday, March 19, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): Python
Source code available: No, not now at least, though I'll share some of the regex if asked during the approval process
Function overview: Monitors recent changes for possible vandalism and edits from long term abuse users, logs findings and (sometimes) gives information to AN/I for review by users.
Links to relevant discussions (where appropriate): Not sure whether this would require consensus from AN/I (since it would be posting there) or not (since the posting task is simple and likely to be uncontroversial).
Edit period(s): daily (while I have a computer on) with plans to make it continuous
Estimated number of pages affected: 1 (AN/I) not counting pages in its own user space
Exclusion compliant (Yes/No): no
Already has a bot flag (Yes/No): no
Function details: This bot is meant to allow a decrease in the number of edit filters and to identify abuse that can't be reverted by bots like ClueBot due to lack of certainty. Every 60 seconds (that time might be lowered to 20–40 seconds to spread load) a list of changes since the last check is built. On a separate thread, the bot goes through the list and decides whether the actions match a set filter; these filters are usually similar to the edit filters in what they check, but are not bound by the same constraints. If a filter is matched, the associated actions are taken, usually logging to the user space and sometimes a noticeboard report. Essentially, this bot acts as a post-edit filter, currently targeting long term abuse but technically able to act on any identifiable action. Since it runs after edits, as opposed to "during" edits, it doesn't slow down editing for users, so problematic edits don't have to be frequent, as they do to be edit-filter-worthy, for it to be worth this bot checking for them. In its current state I have two LTA matches set up, one stolen from a log-only edit filter and another stolen from edit filter requests, and a general abuse match, also stolen from edit filter requests. If the bot is accepted, I plan on going through all the active long term abuse cases and adding whichever ones I can, along with some edit filter requests that weren't accepted due to infrequency.
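As an illustration of the post-edit-filter idea described above (the operator's actual patterns are private, so the filter table below is a placeholder sketch, not the bot's real rules):

```python
import re

# Hypothetical filter table: name -> (compiled pattern, action).
# The real bot's LTA patterns are not public; these are placeholders.
FILTERS = {
    "lta-example": (re.compile(r"buy cheap \w+ online", re.I), "report"),
    "ip-admin-template": (re.compile(r"\{\{(?:uw-block|ANI-notice)", re.I), "log"),
}

def check_changes(changes):
    """Post-edit filtering: match each change's added text against every
    filter and collect (filter name, action, revid) for each hit."""
    hits = []
    for change in changes:
        for name, (pattern, action) in FILTERS.items():
            if pattern.search(change["added_text"]):
                hits.append((name, action, change["revid"]))
    return hits
```

Unlike an edit filter, nothing here runs in the save path, so even expensive checks (such as scanning a user's past contributions) only delay the bot's own report, not anyone's edit.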
Discussion
Vandalism/abuse monitoring is a difficult area; I suggest that you write your bot and have it edit a page in its or your userspace (no approval necessary unless edit rates are high) as if it were ANI, and monitor what it reports. You can in turn pass the valid observations it makes along to ANI, and if the quality of the reporting is high enough you may find other people watching the page to see what it finds. I expect you'll get a high false-positive rate which you'll need to analyse to improve the performance of your algorithms, and eventually you'll get to a point where regexes just don't cut it for detecting the long-term, low-frequency abuse you're targeting - and you'll have to look at more sophisticated processes. This is the technological evolution that ClueBot went through, but it catches more egregious and obvious vandalism.
Do you think operating in your or the bot's own userspace would be an acceptable stepping stone? Josh Parris 22:18, 20 March 2015 (UTC)[reply]
- I realize that there is lots of long term abuse that can't be solved by regex alone, this bot will never be able to handle every LTA case but I do plan on implementing more advanced checks in the future. I have no problem running my bot for a bit with it doing nothing but logging to User:ThePhantomBot/log. PhantomTech (talk) 22:36, 20 March 2015 (UTC)[reply]
- I would want to see a community consensus that bot generated ANI reports are wanted, please discuss and link that discussion here. — xaosflux Talk 05:43, 26 March 2015 (UTC)[reply]
- @Xaosflux: As I've been working on my bot I've been adding more functionality and thinking about the best ways to have the bot's reports dealt with. Here's my current plan for how it will report things:
- Bad page recreation - Log to user space
- High probability sockpuppets - Report to SPI
- Lower probability sockpuppets - Log to user space
- LTA detection - Report to AIV or report to AN/I where certainty is reasonably low (not too low, don't want to waste people's time)
- Newly added LTA filters, including ones being tested - Log to user space
- IPs using administrative templates - Report to AN/I
- Sleeper account detection - Not implemented yet, so I don't know how often it will go off; if it's often, log to user space, otherwise report to AN/I
- I assume you still want to see a discussion for the AN/I reports but do you want to see any for the other places? I'm guessing you'll want SPI mentioned in the discussion too since I don't think any bots currently report to there. Also, do you have any suggestions on where to report these things or how to report them? Admittedly AN/I does feel like a weird place for bot reports but the goal is to get the attention of editors who may not be aware of the bot's existence. PhantomTech (talk) 07:03, 26 March 2015 (UTC)[reply]
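The reporting plan above amounts to a small dispatch table. A hypothetical sketch (the detection-type keys and destination names are illustrative, not the bot's actual configuration):

```python
# Hypothetical routing of detection types to destinations, mirroring
# the plan listed above.
ROUTING = {
    "bad_page_recreation": "userspace_log",
    "sock_high_probability": "SPI",
    "sock_low_probability": "userspace_log",
    "lta": "AIV",
    "lta_low_certainty": "ANI",
    "new_lta_filter": "userspace_log",
    "ip_admin_template": "ANI",
}

def destination(detection_type, default="userspace_log"):
    """Pick where a detection should be reported; unknown or untested
    detection types fall back to the userspace log."""
    return ROUTING.get(detection_type, default)
```

Keeping the table in one place would also make it easy to demote a noisy detection back to the userspace log while it is being tuned.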
- Start reading AIV archives such as Wikipedia_talk:Administrator_intervention_against_vandalism/Archive_3#Suggested_merge_of_Wikipedia:Administrator_intervention_against_vandalism.2FTB2 for some suggestions. WP:AIV/TB2 is probably the oldest 'bot reported' noticeboard right now. — xaosflux Talk 10:23, 26 March 2015 (UTC)[reply]
- @Xaosflux: Are you suggesting that if my bot were to report to ANI it should do so via a transcluded page? I like that idea; using transclusion to put the bot's reports somewhere they'll be seen keeps the bot's updates off the watchlists of people who don't care. PhantomTech (talk) 15:32, 26 March 2015 (UTC)[reply]
- I'm suggesting that prior community discussion on ANI bot reports came to that conclusion - and that after reading up on it you start new discussions to find out where people would make best use of your reports. For ANI it could be the existing TB2 subpage, but they might want it on its OWN subpage; for the other forums people might want subpages, might want main, or might not want bot reports at all. I am not trying to dictate the solution, just that whatever it is should enjoy community consensus before integrating with existing forums. — xaosflux Talk 16:59, 26 March 2015 (UTC)[reply]
I posted a discussion at Wikipedia:Village_pump_(idea_lab)#ThePhantomBot_reporting_to_noticeboards to help get an idea of what kind of reporting users would like. Depending on how that goes I'll post something to village pump proposals with notifications on the relevant noticeboard's talk pages. PhantomTech (talk) 05:57, 27 March 2015 (UTC)[reply]
- There's a lot of room for fuzzy logic on some of the things you've mentioned. To name a few:
- High probability sockpuppets / Lower probability sockpuppets — exactly how are these detected?
- Bad page recreation — what's detected here?
- LTA detection — how does this magic work? Do you have any code already written, and if so, which forms of LTA did you have in mind? Any examples?
- A huge bonus of the abuse filter is that people can actually see what the filters are detecting and make tweaks if there are a crapton of false positives, for example.
- Also, any updates otherwise on the stuff raised above? It looks like this has been sitting here for a few months. :P
- --slakr\ talk / 05:48, 4 June 2015 (UTC)[reply]
- Note — There appears to be a discussion on Wikipedia:Bot_requests#Remove_persondata with an RFC subsection opened recently... that is, in case you're not coming here from there already. --slakr\ talk / 06:38, 4 June 2015 (UTC)[reply]
- @Slakr: Sorry about how long this has been sitting here, I've gotten busy since starting it which has slowed it down a bit, hopefully I'll be able to get more time to work on it more often soon. To your concerns about "fuzziness", you're right, there aren't any clear cut ways to tell what is considered, for example, "high probability" vs "low probability" which means that, to some extent, my judgment will have to be trusted in deciding where some things go. I don't expect too many people to just trust me without having something to look at so I think the best way to gain support for things outside of the bot's userspace is to be able to show some examples of real detections and where they would be reported. Currently, almost all detections are sent to the debug log, but I'm working on setting up different logging locations and don't plan on seeking consensus until after those have been reported to for a bit.
- Sockpuppets: I'm working on different ways to detect "general" (non-LTA case) sockpuppets, what the bot can detect right now is based on removing speedy deletion tags and I don't think it causes enough disruption to be worth an SPI. I'm not sure what other ways I'll come up with for general sockpuppet detection so, if the bot ends up being allowed to report to SPI, the early reports will all be LTA cases with new reports only happening after several successful manual reports.
- LTA: LTA cases are detected using anything from the same basic checks an abuse filter can do to ones that involve looking through all the user's contributions for patterns or past behavior. The ones that can be detected are any that I'm aware of and can think of a way to detect, which would include any that currently have abuse filters setup for them. I have detections setup for a few cases (see User:ThePhantomBot/Edit Codes#LTA: Long Term Abuse for a list) but a lot of them haven't been updated during my inactivity.
- Bad pages: Bad page detections are pages that probably have to be deleted, (or moved) currently bad recreates are the only things detected (and reported here) but I plan on setting up something to detect autobiographies better than the abuse filters do and have something "ready" to detect self-published drafts.
- Help from other users: I agree that a bonus to the abuse filters is that more than one user can edit them, since that isn't as easy with a bot I think it's something that will limit its usefulness a bit so I hope someone will be able to think of some way to allow others to help more directly without having to make all the detection algorithms public. Until then, I don't mind sharing detection algorithms with anyone who is currently trusted by the community and has a reason to see them (like abuse filter managers) via email or something. I'd also like to keep abuse filters setup and updated (but disabled) that would exist if my bot didn't, so they can easily be enabled in case something happens to my bot. Keeping the disabled abuse filters updated is something I wouldn't mind doing personally (if I can get the perm) or by working with abuse filter managers. To mitigate the issue of something causing false positive spam, I plan to have a fully protected page to allow specific detections to be turned off by administrators without having to turn them all off by blocking or something and testing detections in the debug log before having them file any reports.
- Other Updates: Sleeper account detection is something I have a few ideas for how to do but haven't worked on in a while, it probably won't be done till a while after everything else has been. I've been working on setting up what's needed for the ANI reports ("priority" reports) and I think the proposal will end up being to transclude a navbox (this one, once it's done) so that it is split from the actual noticeboard page. ANI is still the best place I can think of to have the "priority" reports go since the goal is to get experienced editors to see them soon after being reported and decide what (if anything) needs to be done, and I think using a navbox and transclusion will allow for minimal disruption, but I'm open to other ideas and realize it may not get enough support to happen.
- Hopefully that answered all your questions without being too long, if not, let me know. PHANTOMTECH (talk) 07:43, 4 June 2015 (UTC)[reply]
A user has requested the attention of the operator. Once the operator has seen this message and replied, please deactivate this tag. (user notified) There seem to be a lot of false positives or things I'm not sure about:
- 2015-07-08 11:30:31 Detected UA-6, pageid: 47166091 - a redirect or something?
- [16] - appears to be a new page, but the prior delete was for A7?
- 2015-07-08 15:41:08 Detected UA-6, pageid: 47149395 - are all previously-deleted/CSDed draft moves assumed to be abusive, or where were these going to be reported and how?
- These are only some of the ones I checked. There are some good detections in my sample, but false positives—rather, the proportion of them and the assumptions made—can be more problematic with the community. Granted, all these were well over a month ago, and it seems like you've been a little inactive (not to mention, nobody else has chimed in here in the meantime :P). Have there been any updates in the meantime? --slakr\ talk / 04:59, 27 August 2015 (UTC)[reply]
Request Expired. for now; can revisit when the operator becomes active again. --slakr\ talk / 23:19, 2 September 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Request Expired.
Operator: Enzet (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 00:18, Sunday, March 8, 2015 (UTC)
Automatic, Supervised, or Manual: supervised.
Programming language(s): Python.
Source code available: no source code available, since the bot is part of a larger project.
Function overview: fixes inconsistencies and formatting in metro station articles (in station infobox and S-rail templates).
Links to relevant discussions (where appropriate):
Edit period(s): every time the bot finds an inconsistency in metro pages.
Estimated number of pages affected: about 1000 pages. There are about 10 K metro stations in the world, so no more than 10 K pages should be affected.
Exclusion compliant (Yes/No): yes.
Already has a bot flag (Yes/No): no.
Function details: I have The Metro Project for automated metro map drawing. It uses Wikipedia to check metro system graphs and sometimes encounters inconsistencies and bad formatting in Wikipedia articles. Now I fix them manually (see my contributions) but want to entrust this to my bot.
Tasks for this request:
- wrap dates in the station infobox with the {{date}} template, e.g. 2000-03-30 to {{date|30-03-2000|mdy}};
- add links to station structure types and platform types, e.g. Shallow single-vault to [[Single-vault station|Shallow single-vault]];
- fix redirects in the S-rail template.
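Task 1 as stated could be a single regex substitution; a sketch following the example above, assuming dates appear as bare ISO strings in the infobox text (note the discussion below raises concerns about changing ISO 8601 dates at all, and the task was later withdrawn):

```python
import re

# Bare ISO dates like 2000-03-30, per task 1's example.
ISO_DATE = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")

def wrap_dates(infobox_text):
    """Rewrite bare ISO dates into the {{date}} template, following the
    example above (2000-03-30 -> {{date|30-03-2000|mdy}})."""
    return ISO_DATE.sub(r"{{date|\3-\2-\1|mdy}}", infobox_text)
```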
Discussion
I see the bot account has been editing articles. It is not yet approved for that.
I note you want to edit dates, but I see from your recent edit to Klovska (Kiev Metro) and your function details (above) you haven't realised the importance of ISO 8601 date formatting. I also note that you did not elect to use the style reportedly preferred by Ukrainians in your edit; is there a reason for this? Out of interest, why are these dates of interest to your bot?
The bot fixes inconsistencies between articles; how does it know which one is correct?
The links to station structure types and platform types you're proposing to link - are they the ones in infoboxes, or article text?
What major project is bot a part of, and why does that make the source code unavailable? Josh Parris 14:32, 9 March 2015 (UTC)[reply]
- I'm sorry for editing without approval. It was a test to make sure bot works. I'll never do it again.
- Yeah, I see, date changes seem to be a bad idea. I think, I should remove it from tasks list. Should I undo my edits (there are only 5 of them)?
- About inconsistencies. The bot doesn't know which one is correct; it can only detect wrong things or possibly wrong things. For example: a wrong date format (month number can't be greater than 12), a wrong terminus (a station cannot be a next or previous station for itself), if station A is next for station B then station B should be previous for station A, wrong S-rail values (if they conflict with station lists on the metro page or line page), and so on. That's why the bot is not automatic; I supervise every edit. I don't know how to formulate it as a task since there are so many types of inconsistencies. Maybe you can help me?
- Yes, the bot will add links to the infobox only if there is no such link in the article text.
- My major project is not open source for now. It generates very simple suggestions for the bot, as illustrated above: what to replace in which article. If the bot's source code is important, I can push it to a public repository, but it is trivial since it uses pywikibot (no more than 100 LOC). Enzet (talk) 17:01, 9 March 2015 (UTC)[reply]
- If you're supervising every edit, then this is a "manual bot" and can be run using your own account without approval. Would you like to do so? Josh Parris 11:30, 13 March 2015 (UTC)[reply]
- OK, I understand all about inconsistencies. If I don't want to use the Enzet account for semi-automated editing, can I use the EnzetBot account (with the {{Bot}} template removed and without approval), or should I register a new account without the bot keyword? What is good practice for that? Also, are there criteria for semi-automated editing (no faster than 1 edit per 5 seconds, no more than 100 edits in a row, or something like that)? (Sorry if I missed it in the rules.)
- Also, I realized that the tasks of (1) wrapping station structure and platform types with links and (2) fixing S-rail redirects could be performed without supervision, or with supervision that is really fast (checking is trivial). Can I get approval or disapproval for these tasks in this request, or should I create a new one? Enzet (talk) 09:27, 17 March 2015 (UTC)[reply]
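The inconsistency checks Enzet describes (invalid month numbers, stations that are their own neighbours, asymmetric next/previous relations) could be sketched as follows. This is a hypothetical illustration in Python; the function names and data shapes are not the project's actual code.

```python
# Hypothetical sketch of the consistency checks described above;
# names and data shapes are illustrative, not the project's code.

def plausible_date(date_str):
    """Reject YYYY-MM-DD dates whose fields cannot be valid (e.g. month > 12)."""
    try:
        year, month, day = (int(part) for part in date_str.split("-"))
    except ValueError:
        return False
    return 1 <= month <= 12 and 1 <= day <= 31

def adjacency_errors(next_of, prev_of):
    """If station B is the next stop after A, then A must be the previous
    stop before B, and no station may be its own neighbour."""
    errors = []
    for a, b in next_of.items():
        if a == b:
            errors.append(f"{a} cannot be its own next station")
        elif prev_of.get(b) != a:
            errors.append(f"{b} should list {a} as previous")
    return errors
```

Each flagged item would still be reviewed by the operator, matching the supervised workflow described above.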
Josh Parris any further comments? -- Magioladitis (talk) 18:44, 19 May 2015 (UTC)[reply]
Enzet What are the redirects of S-rail? -- Magioladitis (talk) 08:55, 22 June 2015 (UTC)[reply]
{{OperatorAssistanceNeeded|D}} Magioladitis (talk) 06:28, 10 July 2015 (UTC)[reply]
- Redirections in S-rail templates. See this, this (station redirect), or this edit (line redirect). Enzet (talk) 15:05, 14 July 2015 (UTC)[reply]
I asked WikiProject Stations to join the discussion. WikiProject transport too. -- Magioladitis (talk) 09:03, 15 July 2015 (UTC)[reply]
{{Date}} should not be used in articles; the template makes this clear. Alakzi (talk) 11:32, 15 July 2015 (UTC)[reply]
Alakzi the bot already changed 3 pages. Do you suggest the edits should be reverted? -- Magioladitis (talk) 14:15, 15 July 2015 (UTC)[reply]
I struck out this bot part. -- Magioladitis (talk) 17:10, 15 July 2015 (UTC)[reply]
Short of publishing the source code, I'd want to see a list of all replacements the bot would perform. Alakzi (talk) 17:44, 15 July 2015 (UTC)[reply]
Request Expired. — the operator's account appears to be inactive for over a month with outstanding issues / concerns here. Marking expired for now. --slakr\ talk / 04:26, 27 August 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Bots in a trial period
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: Cyberpower678 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 13:37, Saturday, June 6, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic and Supervised
Programming language(s): PHP
Source code available: Here
Function overview: Replace existing tagged links as dead with a viable copy of an archived page.
Links to relevant discussions (where appropriate): Here
Edit period(s): Daily, but it will likely run continuously.
Estimated number of pages affected: 130,000 to possibly a million.
Exclusion compliant (Yes/No): Yes
Already has a bot flag (Yes/No): Yes
Function details: The bot will crawl its way through articles on Wikipedia and attempt to retrieve an archived copy of dead links, at the time closest to the original access date if one is specified. To avoid persistent edit-warring, users have the option of placing a blank, non-breaking {{cbignore}} tag on the affected reference to tell Cyberbot to leave it alone. If the bot makes any changes to the page, a talk page notice is placed alerting the editors there that Cyberbot has tinkered with a ref.
The bot's detection of a dead link needs to be carefully thought out to avoid false positives, such as temporary site outages. Feel free to suggest algorithms to add to this detection function. At present the plan is to check for a 200 OK response in the header. On any kind of response that indicates downage, the bot proceeds to add the archived link if available, or otherwise tags the link as dead. A rule mechanism can be added to the configuration for sites that follow certain rules when they kill a link.
There is a configuration page that allows the bot to be configured to the desired specifications, which can be seen at User:Cyberbot II/Dead-links. The bot attempts to parse the various ways references have been formatted and tries to keep them consistent so as not to destroy the citation. Even though the option to not touch an archived source is available, Cyberbot II will attempt to repair misformatted sources using archives if it comes across any.
For any link/source that is still alive, Cyberbot can check for an available archive copy and request that the site be archived if it can't find any.
The bot can forcibly verify if the link is actually dead, or be set to blindly trust references tagged as dead.
The bot may need some further development depending on what additional issues crop up, but it is otherwise ready to be tested.
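The status-based detection described above can be sketched as a small classifier. This is a hedged simplification in Python; the actual bot is written in PHP and its real detection logic lives in the linked source.

```python
# Sketch of status-code-based dead-link classification, as described
# above. A simplification: the real bot is PHP and handles many more
# cases (soft 404s, redirects, per-site rules).

DEAD_STATUSES = {404, 410}  # resource is clearly gone

def classify_link(status_code):
    """Map an HTTP status code to 'alive', 'dead', or 'unknown'."""
    if status_code == 200:
        return "alive"
    if status_code in DEAD_STATUSES:
        return "dead"
    # Server errors (5xx), redirects, and anything unrecognised are left
    # for a later re-check rather than being tagged as dead immediately.
    return "unknown"
```

Treating ambiguous responses as "unknown" rather than "dead" is one way to reduce the false positives discussed below.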
Discussion
I think this is a great idea. One thought: there are several kinds of dead links - (a) sometimes the site is completely defunct and the domain simply doesn't work - there is no server there any more, (b) sometimes the site has been bought by another entity and whatever used to be there isn't there any more, so most things get a 404, (c) sometimes a news story is removed and now gets a 404, or (d) sometimes a news story is removed and is now a 30x redirect to another page.
For a, b, or c, what you are describing is a great idea and probably completely solves the problem. For (d), it may be tricky to resolve whether this is really a dead link or whether they merely relocated the article.
One thought/idea: can you have a maintainable list of newspapers that are known to only leave their articles available online for a certain amount of time? The Roanoke Times, for example, I think only leaves things up for maybe six months. Sometimes, they might redirect you to a list of other articles by the same person, e.g. [17] which was a specific article by Andy Bitter and now takes you to a list of Andy Bitter's latest articles. Other times you just get a 404, e.g. [18]. Since it is completely predictable that links from roanoke.com will disappear after six months, you could automatically replace 302s, whereas for some other sites you might tag the link for review instead of making the replacement on a 302. An additional possible enhancement would be that, knowing that the article is going to disappear in six months, you could even submit it to one of the web citation places so that we can know it will be archived, even if archive.org misses a particular article. --B (talk) 18:38, 9 June 2015 (UTC)[reply]
- First time comment on a BRFA. For d) one possibility (for cite templates with additional parameters such as author, date, location or title) would be for the bot to check if these words appear on the target page of the link. Non-404 dead links usually lack these. Jo-Jo Eumerus (talk) 18:54, 9 June 2015 (UTC)[reply]
- This is really great input, and it should be possible to maintain such a list via a control panel I've already set up. Right now, my development is focused on the various ways a reference can be formatted, and on appropriately parsing and modifying it when needed.—cyberpowerChat:Limited Access 20:42, 9 June 2015 (UTC)[reply]
- Another thing that came to mind: Can the bot check more than one archive service? Such as both the Wayback one and WebCite? Jo-Jo Eumerus (talk) 16:13, 13 June 2015 (UTC)[reply]
- I really wish you had informed me of that earlier; the bot has been written around the Wayback Machine. Also, WebCite doesn't seem to have an API, or a way to immediately look up a website. It seems to prefer to email the user with a list of archive links. My bot doesn't have email though, so WebCite is out atm. Screen scraping, by the looks of it, may not be effective either.—cyberpowerChat:Online 16:29, 13 June 2015 (UTC)[reply] - I can however program the bot to ignore any reference that has a webcite template or archive link.—cyberpowerChat:Online 16:35, 13 June 2015 (UTC)[reply]
- Rats. I know I have to suggest things more quickly. Category:Citation Style Vancouver templates should perhaps be included as well, some of them allow for URLs. Jo-Jo Eumerus (talk) 18:58, 13 June 2015 (UTC)[reply]
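Jo-Jo Eumerus's soft-404 idea above could be sketched as a metadata check. This is a hypothetical helper, not confirmed to be part of either bot's code; the citation field names are illustrative.

```python
# Hypothetical sketch of the suggestion above: when a page returns
# 200 OK, check whether the citation's metadata still appears in it.
# A missing title/author/date suggests a soft 404.

def looks_like_soft_404(page_text, citation):
    """Return True if none of the citation's known metadata (title,
    author, date) appears in the fetched page text."""
    clues = [v for v in (citation.get("title"),
                         citation.get("author"),
                         citation.get("date")) if v]
    if not clues:
        return False  # nothing to check against; assume the page is fine
    page = page_text.lower()
    return not any(clue.lower() in page for clue in clues)
```

A real implementation would need to tolerate reformatting of dates and names, but even this coarse check catches the "200 OK but article gone" case raised in the thread.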
Since you mentioned this, how do you avoid things like "temporary site outage"? Links temporarily go bad very often. Sometimes there's a DNS issue. Sometimes regional servers or cache servers are down. Sometimes clouds are having issues. There are scheduled maintenances and general user errors. It is definitely unreliable to check a link only once.
I don't want to repeat all the comments from previous BRFAs, but there are tons of exceptions that you have to monitor. I've had sites return 200 with a page-missing error, just as I've had sites return 404 with valid content. I've had sites ignore me because of not having some expected user agent, allowing/denying cookies, having/not having a referrer, being from a certain region, not viewing ads, not loading scripts, not redirecting or redirecting to a wrong place or failing to redirect in scripts, HEAD and GET returning different results, and a hundred other things. — HELLKNOWZ ▎TALK 18:03, 24 June 2015 (UTC)[reply]
- Those are issues that need to be controlled for even without a bot. If an average editor tries to follow an external link and comes to a 404 page, that editor is just as likely to replace the link with a working one, even if the 404 page only comes from a temporary site error. If there is an archived version of the page, and the link is changed to that, then no information is lost. bd2412 T 18:09, 24 June 2015 (UTC)[reply]
- Even if it is a temporary downage, Cyberbot will simply be adding an archived version of the link to the original citation, either through the wayback template or, for a cite template, through the archive-url parameter. Nothing is lost. The verification procedure will be very error-prone at first, but as I get more information, refinements can easily be added. Rules can be added to the bot's configuration page for sites with recurring problems. If the bot is being problematic with a source, users can attach a {{cbignore}} tag to the citation to tell the bot to go away.—cyberpowerChat:Limited Access 18:46, 24 June 2015 (UTC)[reply]
- Adding an archive url implies the link is dead, unless you add |deadurl=no, which implies it is not dead at this time. There was brief discussion on this (can't really recall where), and sending a user to a slower, cached version when a live one is available was deemed "bad". I would say you need consensus for making archive links the default links when the bot has known detection errors and links may be live. It may be low enough that people don't care as long as there are archives for really dead links. — HELLKNOWZ ▎TALK 19:35, 24 June 2015 (UTC)[reply]
- There is a clear consensus for a bot to do this at the Village Pump discussion. The problem of dead links is substantial, and the slim chance that a website will be down temporarily when the bot checks is vastly outweighed by the benefit of fixing links that are actually bad. I would also suggest that a site that goes down "temporarily" may not be the best site to link to either. bd2412 T 19:48, 24 June 2015 (UTC)[reply]
- I linked to the discussion that supports this bot. The bot leaves a message on the talk page advising the user to review the bot's edit and fix as needed. So any link changed that shouldn't have been changed can be fixed and tagged with {{cbignore}}.—cyberpowerChat:Online 20:03, 24 June 2015 (UTC)[reply]
- I see consensus for dead links, not most likely dead links though. We had such consensus already, and this is a previously approved bot task. There is no question that we need a bot, the question is what error rate in what areas is allowed? The VP proposal was worded "could we have such a bot in theory?", not "we have a bot that will have x% error rate, is this acceptable?" We are talking hundreds of thousands of links here. Even a 0.01% error rate is thousands of links. From what Cyberpower says, it would be higher and we know some cases cannot be avoided. BRFA needs to show either close to 0% error rate or clear consensus that an error rate is acceptable (see, for example, ClueBot NG BRFA). This is described as part of WP:CONTEXTBOT. — HELLKNOWZ ▎TALK 21:25, 24 June 2015 (UTC)[reply]
- If we are talking about links that sometimes work and sometimes don't (and therefore might not be working when the bot checks), I think it's pretty obvious that we are better off with a link to an archived page that works all the time. It's not an error at all to replace a questionable link with a stable link to the same content. bd2412 T 22:08, 24 June 2015 (UTC)[reply]
"If the bot makes any changes to the page, a talk page notice is placed alerting the editors there that Cyberbot has tinkered with a ref." -- Is there consensus for this? That's a lot of messages. — HELLKNOWZ ▎TALK 21:25, 24 June 2015 (UTC)[reply]
- Technically, 0.01% would be tens of links. I think we'll need a test run to establish how reliable the link replacement is, though. Jo-Jo Eumerus (talk) 21:45, 24 June 2015 (UTC)[reply]
- It's been asked for a couple times, it can be switched off. Link checking can be switched off too. It would drastically speed the bot up.—cyberpowerChat:Online 21:48, 24 June 2015 (UTC)[reply]
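One way to address the temporary-outage concern raised in this thread is to declare a link dead only after several independent failed checks. The policy below is a hypothetical sketch, not what either bot is confirmed to implement.

```python
# Hypothetical policy sketch: a link counts as dead only after several
# consecutive failed checks, to avoid tagging during a brief outage.

def confirmed_dead(check_history, required_failures=3):
    """check_history is a chronological list of 'dead'/'alive' results
    from separate checks (ideally spaced days apart). The link counts as
    dead only if the last `required_failures` checks all failed."""
    recent = check_history[-required_failures:]
    return (len(recent) == required_failures
            and all(result == "dead" for result in recent))
```

Spacing the checks out in time is what makes this useful: a DNS blip or scheduled maintenance rarely persists across checks made on different days.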
The main problem I see with this is automatically trying to identify whether a link is up or down. It's ridiculously tough for a bot to do it (reflinks had a ton of code for it), and IIRC sites like CNN and/or NYT blocked the toolserver in the past. I also don't see any advantage to using a special exclusion template and spamming talk pages. I also had written my own code for this (BRFA) which I'll resuscitate. It'll be great to have multiple bots working on this! Legoktm (talk) 22:22, 26 June 2015 (UTC)[reply]
- I have been discussing with Legoktm on IRC and I think 2 bots is a lovely idea. More coverage quicker. My bot shouldn't have any conflicts with another bot. Legoktm and I will be implementing a feature to allow them both to acknowledge {{nobots|deny=InternetArchiveBot}}. As for checking whether a link is dead or not, there seems to be agreement among us to leave that feature off for now, or indefinitely. As for spamming talk pages, we can see how that works out. If it's too much after the trial, we can turn that off too.—cyberpowerChat:Online 23:16, 26 June 2015 (UTC)[reply]
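The exclusion mechanism mentioned above can be sketched with a small parser. This is a simplified illustration: the real bot-exclusion convention has more forms (e.g. {{bots|allow=...}}), and this handles only the {{nobots}} case under discussion.

```python
import re

# Simplified sketch of the {{nobots|deny=...}} exclusion check mentioned
# above. The full bot-exclusion convention has more forms; this covers
# only a bare {{nobots}} and a deny list.

def excluded_by_nobots(wikitext, botname):
    """Return True if the page's wikitext excludes `botname` via a bare
    {{nobots}} or a {{nobots|deny=...}} listing that bot (or 'all')."""
    match = re.search(
        r"\{\{\s*nobots\s*(?:\|\s*deny\s*=\s*([^}]*))?\}\}", wikitext, re.I)
    if not match:
        return False
    denied = match.group(1)
    if denied is None:
        return True  # bare {{nobots}} excludes all bots
    names = [name.strip().lower() for name in denied.split(",")]
    return botname.lower() in names or "all" in names
```

A compliant bot would run this check against the page text before every edit and skip the page when it returns True.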
Development Status
Done Fetch appropriate articles
Done Recognize and parse various formats in references
Done Parse a template properly
Done Recognize and parse various formatted external links, and citations
Done Detect if a link is really dead
Done Submit archive requests for links that are alive but have no archive
Done Detect if the link has been marked as dead
Done Detect if the link has an archive
Done Handle the link properly
Done Scan the archive and retrieve an archive
Done Properly format new references and links
Done Fix improperly formatted templates
Done Notify on talk page
Done Log report generator
Done Refinements
- ((BAGAssistanceNeeded)) Development is finished, source code has been posted and I believe the bot is ready for a trial run.—cyberpowerChat:Online 23:08, 22 June 2015 (UTC)[reply]
- Before being approved for a trial please answer the following questions:
- Should Cyberbot scan all links on specified pages, or just references?
- Should Cyberbot scan all pages, or only those contain dead-link tags?
- Should Cyberbot modify all links, only those tagged as dead, or those tagged as dead plus those the bot sees as dead?
- Should the bot verify if a tagged link is really dead, or blindly trust dead-link tags?
- Should the bot provide the latest archived copy or those closest to the set access date of source?
- Should Cyberbot touch sources that already have archives on them?
- Should Cyberbot leave a message on the respective talk page when it edits a page?
- Can you suggest a subject line Cyberbot should use for talk page messages? You can use keywords such as {linksrescued}, {linkstagged}, {linksmodified}, and {namespacepage}.
- Can you suggest the body of the message Cyberbot should leave behind? You can use the same syntax mentioned in the previous question. Use \n for newlines.
- Should Cyberbot check if a link is dead, as in check those that aren't tagged?
- Should Cyberbot make sure an archived copy is available and ready should the live link ever go down?
- All these questions are individual configuration options for this bot. Knowing what the community wants would be a great help.
- Here is my opinion on the matter:
- Bad links are bad links, so it shouldn't make a difference if they are in references or text.
- I interpret that as all links.—cyberpowerChat:Online 01:43, 23 June 2015 (UTC)[reply]
- Yes, all links. If a link is dead, it should be made good.
- Same as above, although I would start with those that are tagged.
- This is a configuration question. There are 2 scanning methods: one scans all pages, the other populates a list of pages that contain dead-link templates and scans those. It sounds like you want all pages in the end.—cyberpowerChat:Online 01:43, 23 June 2015 (UTC)[reply]
- In that case, I would go with all pages. Going with tagged links is useful because it focuses on links known to be dead, but if the resources exist to do all pages, go for it.
- I presume the bot will do nothing to links that appear to be in fine working order. If it sees a link as dead, it should fix it, tagged or not.
- I think it makes more sense to verify. Basically, it should be agnostic about the tags, since those may be erroneous.
- I agree that verification is a must, but the process is still quite error-prone. Certain dead links do return a 200 OK, and the bot will see those as live links.
- To what extent can the process be tweaked as it goes? Can we start with clearly dead links, and then refine the process for links that are tagged as dead but do not show up as dead?
- Rules can be introduced using the rules parameter in the configuration page. Verification algorithms can be improved on demand. The bot's source code has been written for maintainability.—cyberpowerChat:Limited Access 02:36, 23 June 2015 (UTC)[reply]
- Ok - not to throw in new complications, but if a link is tagged as a dead link and the bot thinks it's a live link, perhaps the "dead link" tag should either be removed or modified to indicate that there's some question about whether it really is a dead link. Also, this raises an additional question for me. What does the bot do when it finds a dead link for which no fix exists (i.e. no archive)? Perhaps it should also note this on the talk page, so editors will know that whatever proposition the link is supposed to support will need a new source. bd2412 T 02:55, 23 June 2015 (UTC)[reply]
- The bot will simply remove the tag if the link is deemed alive; if it can't find an archive for a dead link, it will tag the link as dead. Any modification to the page results in a talk page notification. Both of these features can be turned on and off on the configuration page.—cyberpowerChat:Limited Access 03:50, 23 June 2015 (UTC)[reply]
- I would prefer the closest archive to the source date, since the contents of the page may have changed.
- I'm not sure what you mean by "sources that already have archives". If the link already purports to point to an archive I don't know how we would find an archive of that link.
- What I mean by that is, if a source contains a reference to an archive, should Cyberbot fiddle with it or leave it alone? My recommendation is to leave them alone.—cyberpowerChat:Online 01:43, 23 June 2015 (UTC)[reply]
- I have no preference with respect to talk page messages. Since the operation is a bit complicated, I guess it would be too much to describe in a tag on the page.
- Have you seen the source code yet? Compared to that, notifying on the talk page is easy. :p—cyberpowerChat:Online 01:43, 23 June 2015 (UTC)[reply]
- This is more than a tagging or modification. I would just say "Dead link(s) replaced with archived links".
- Any message should briefly describe the operation, and state that "[this] dead link was replaced with [this] link from the Internet Archive" (or whatever service is used).
- As above, the concern is the links, irrespective of the tags. Although we can start with tagged links, ultimately every link should be checked.
- Checking for archives of working links seems a bit out of scope, and a bigger task. I don't recall whether we had determined that there is a way to prompt Internet Archive or another such service to archive a link.
- Some users have asked for it, and I've been able to implement without much cost to resources. I recommend this be turned on.
- If there's a call for it, sure.
- Cheers! bd2412 T 01:34, 23 June 2015 (UTC)[reply]
- Here is my opinion on the matter:
What should we do here? -- Magioladitis (talk) 13:54, 28 June 2015 (UTC)[reply]
- Approve for a trial, obviously. :p—cyberpowerChat:Online 14:03, 28 June 2015 (UTC)[reply]
First trial (100 edits)
Approved for trial (100 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. ·addshore· talk to me! 15:52, 28 June 2015 (UTC)[reply]
- 50 article edits and 49 talk page edits already done (counting by way of the edit reasons). I'll inspect the article edits. Jo-Jo Eumerus (talk) 18:00, 28 June 2015 (UTC)[reply]
Trial complete. A random review of the edits reveals no problems.—cyberpowerChat:Online 18:24, 28 June 2015 (UTC)[reply]
- A few notes of mine:
- The bot is adding the {{wayback}} template without a space between the template and the preceding markup, leaving no space between the punctuation and the "Archived" template output in the rendered view. Is this right? (On Paul Bonner, it did add a space in one of the two replacements).
Fixed Though I should note it didn't edit Paul Bonner.—cyberpowerChat:Online 19:43, 28 June 2015 (UTC)[reply]
- Whoops. It was Peter Bonner, not Paul. Sorry!
- The bot didn't change the two broken external links on Zeba Islam Seraj. I assume that the bot noticed that the most recent archived copies are also broken?
- Archive.org only returns the closest working copy of the page, or it returns nothing. If the bot gets nothing, it does nothing with the link.—cyberpowerChat:Online 19:52, 28 June 2015 (UTC)[reply]
- Floating ecopolis had a previously archived link that was broken by the bot. Apparently it tried to archive the already archived link.
- Actually, the wayback template was being improperly used. The generated link is unusable. The bot attempted to fix the formatting, but it failed; it should have removed the 1= parameter.—cyberpowerChat:Online 19:57, 28 June 2015 (UTC)[reply]
Fixed—cyberpowerChat:Online 21:15, 28 June 2015 (UTC)[reply]
- Talysh Khanate also had an incomplete replacement, not sure what went wrong there.
- Not sure what happened there either, I'll have to look at that closely.
Fixed—cyberpowerChat:Online 21:31, 28 June 2015 (UTC)[reply]
- One Wayback archive was of an already broken page (last link on the Überlingen article). The Margin of error, Palmer's College and Gecko (software) replacement also appears to be already broken. Same for the last link on Koreatown, Los Angeles (or so it appears to me).
- The bot can't be expected to accurately determine if the archive is good or not, that's why the suggestion of human review.—cyberpowerChat:Online 20:27, 28 June 2015 (UTC)[reply]
- In a few instances, the bot replaced a working link with another working link because the original was mis-tagged as broken (Parsley Sidings, Hot Cross and Vanity (singer)).
- Don't blame the bot if someone else mistagged it. The bot can't be expected to know if the link is really dead or not when there is consensus to shut the link verification process off.—cyberpowerChat:Online 20:27, 28 June 2015 (UTC)[reply]
That's all from me - only the first four things are potentially problematic. Jo-Jo Eumerus (talk) 18:53, 28 June 2015 (UTC)[reply]
- [19] - the bot grabbed the earliest archive, why earliest and not latest? (P.S. I only checked like 10 pages, so this isn't a full review.) — HELLKNOWZ ▎TALK 19:51, 28 June 2015 (UTC)[reply]
- I'm assuming it has something to do with the blank accessdate parameter making the bot assume a unix timestamp of 0 and resulting in it trying to pull an archive as close to January 1, 1970 as possible. I'll put in a fix for that.—cyberpowerChat:Online 20:27, 28 June 2015 (UTC)[reply]
- WikiBlame could perhaps be implemented somehow? That's what I use when finding the best archived-link. (t) Josve05a (c) 20:37, 28 June 2015 (UTC)[reply]
- In all my years of being here, I never learned what WikiBlame is. Can someone enlighten me?—cyberpowerChat:Online 20:50, 28 June 2015 (UTC)[reply]
- A tool for searching in the revision history of a page, per Wikipedia:WikiBlame. Jo-Jo Eumerus (talk) 21:39, 28 June 2015 (UTC)[reply]
- How would that help?
Fixed—cyberpowerChat:Online 21:53, 28 June 2015 (UTC)[reply]
- Josve05a has brought up more issues that I missed and have addressed them.—cyberpowerChat:Online 13:50, 29 June 2015 (UTC)[reply]
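The accessdate-defaulting bug discussed above (a blank access date falling back to the Unix epoch, pulling the earliest archive) can be illustrated with the Wayback Machine's Availability API, which accepts a `timestamp` parameter and returns the closest snapshot. The helper below is an illustrative Python sketch, not the bot's actual PHP code.

```python
import time

# Sketch of the access-date fix discussed above: the Wayback Availability
# API takes a timestamp (YYYYMMDDhhmmss) and returns the closest snapshot.
# A blank access date must fall back to the current time, not the Unix
# epoch, or the earliest archive is returned instead of the latest.

WAYBACK_API = "https://archive.org/wayback/available"

def wayback_query(url, access_epoch=None):
    """Build an Availability API query for the snapshot closest to the
    cited access date, defaulting a missing date to now."""
    if not access_epoch:  # None or 0: access date unknown
        access_epoch = time.time()
    stamp = time.strftime("%Y%m%d%H%M%S", time.gmtime(access_epoch))
    return f"{WAYBACK_API}?url={url}&timestamp={stamp}"
```

Treating a zero timestamp the same as a missing one is the key point: it is a sentinel for "unknown", not a real date.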
Second trial (300 edits)
- The previous trial has concluded, and the brought up issues have been addressed. Requesting another trial of 500 this time.—cyberpowerChat:Online 22:04, 28 June 2015 (UTC)[reply]
Approved for extended trial (300 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. 500 is too much for us to check. Let's do 300 first. -- Magioladitis (talk) 08:09, 5 July 2015 (UTC)[reply]
- Notes by Josve05a
I've only done tests on a few articles to find the worst bugs. These are not all of them, only those I found when checking a selection of articles to gauge the bot's reliability. Saying "the bot can't know if it is dead on Wayback" is not a good excuse; that is a reason not to allow the bot task.
Legend for recurring errors:
Code | Error |
---|---|
(b) | The source URL was not dead. |
(c) | The archive-url is dead. |
Diff | URL | Archived | Note |
---|---|---|---|
[20] | [21] | [22] | (c) THE BOT REPEATED THE EDIT AFTER BEING REVERTED |
[23] | [24] | [25] | (c) THE BOT REPEATED THE EDIT AFTER BEING REVERTED |
[26] | [27] | [28] | (c) THE LINK WAS INLINE, NOT IN-REF OR UNDER EXTERNAL LINKS |
[29] | [30] | [31] | (c) |
[32] | [33] | [34] | (c) |
[35] | - | - | Added |dead-url=yes, even though |deadurl=yes already existed |
[36] | [37] | [38] | (c) |
^ | [39] | [40] | (b) |
^ | [41] | [42] | (c) |
[43] | [44] | [45] | (c) |
[46] | [47] | [48] | (b) |
[49] | - | - | REMOVED CONTENT FROM THE ARTICLE |
[50] | - | - | TRIED TO FIX STRAY REF IN COMMENTED TEXT, BREAKING TEMPLATE, REMOVING CONTENT |
[51] | - | - | REMOVED CONTENT FROM THE ARTICLE |
(t) Josve05a (c) 16:18, 5 July 2015 (UTC)[reply]
{{OperatorAssistanceNeeded|D}}
Magioladitis (talk) 22:46, 7 July 2015 (UTC)[reply]
Trial complete. Sorry. The bot is still waiting to receive the fixes to mentioned bugs.—cyberpowerChat:Online 22:53, 7 July 2015 (UTC)[reply]
- Rome wasn't built in a day. ;-) bd2412 T 23:15, 7 July 2015 (UTC)[reply]
- I have addressed (c). The likelihood of a bad archive being added should be greatly reduced now. A solution for items 1 and 2 is already present. I have fixed item number 6 and item number 13 so far.—cyberpowerChat:Online 13:09, 10 July 2015 (UTC)[reply]
- It took some searching but I managed to get 12 and 14 fixed and confirmed it with this edit. Also an addendum, I have instructed the bot to change the links in external links directly, if they are not inside reference tags. That way when fixing sources and links, I'm not disrupting the article with a wayback template.—cyberpowerChat:Offline 06:04, 11 July 2015 (UTC)[reply]
Third trial (500 edits)
The bot appears to be ready for one last trial before approval.—cyberpowerChat:Offline 06:04, 11 July 2015 (UTC)[reply]
- I have reviewed the bot's configuration once again (last time I did it was before the first trial), and it seems like my earlier major concern about VERIFY_DEAD being set to true is resolved. (Note to closing BAGer: bot does not seem to have consensus to run with VERIFY_DEAD set to true, since it is too prone to errors.)
- I'm still unsure about the talk page notices. I cleaned up the wording a bit, adding a {diff} label (which Cyberpower says he can implement) and removing unnecessary information, but I'm still debating the general usefulness, since it will appear on tens (possibly hundreds) of thousands of talk pages. I would like to see some more comments on this.
- My only other issue is concerning PAGE_SCAN. As I understand it, setting this to false (as Cyberpower intends to do when the bot is approved) will involve tens of millions of archival requests to the Internet Archive (Wikipedia has 81,235,194 external links at last count; some of these are already archived but many will not be). I understand this is in line with the goals of that service, but I'm not sure if this is a good idea without explicit confirmation from them. So let's hold off on setting PAGE_SCAN to 0 after approval until we get more details on this.
- Anyway:
Approved for extended trial (500 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Hopefully I will be able to do a careful review of the results of this trial after it is complete. — Earwig talk 05:00, 13 July 2015 (UTC)[reply]
Trial complete. I looked over the edits and can't find any problems with them except that some pages don't seem to be archiving correctly, based on the talk messages left behind. The source rescuing hasn't revealed any bugs this time, but I would appreciate a separate set of eyes on this too in case I missed something.—cyberpowerChat:Online 13:43, 14 July 2015 (UTC)[reply]
The Earwig, here you are! 500 pages :) Josve05a, you may also want to have a look! -- Magioladitis (talk) 08:53, 15 July 2015 (UTC)[reply]
- I'll be checking the articlespace contributions. As a note/question, I am not sure how much importance should be placed on working link-->working archive or nonworking link-->nonworking archive replacements; they appear to be fairly minor issues on their own (unlike working link-->nonworking archive replacements). Jo-Jo Eumerus (talk, contributions) 13:30, 15 July 2015 (UTC)[reply]
- Alright, from Ani DiFranco forward I see [52] where the bot fixed one link but de-{{dead link}}-ed two and [53] where a citation already using a Webarchive link got its "wayback" part removed. Jo-Jo Eumerus (talk, contributions) 14:03, 15 July 2015 (UTC)[reply]
- I somehow missed that edit, which makes me wonder how many others I missed. As for that edit, it's reasonable to assume the bot got confused as 2 sources were placed in one reference. Unless I'm mistaken, only one source should be in a reference at a time, so that should rather be fixed on the article. True?—cyberpowerChat:Online 14:14, 15 July 2015 (UTC)[reply]
- @Jo-Jo Eumerus: Is there anything wrong with that second case? It doesn't seem to be an explicit part of the task description, but the end result is better formatted since it does make the original source visible. @Cyberpower678: It is odd, yes, but I don't think it's technically disallowed – either way, the bot shouldn't be doing that, even though it's understandable why the bug would arise in the first place. — Earwig talk 05:20, 17 July 2015 (UTC)[reply]
- Mmm. Yeah, with your argument I think that can be done. I'll review some more edits from Assembly line forward. Jo-Jo Eumerus (talk, contributions) 10:58, 17 July 2015 (UTC)[reply]
- Alrighty, aside from the usual Austin, Texas had a mistagged link (because it still works) changed to a broken Wayback link, but it's clearly noted on the talk page so I guess it's not a major issue. Nothing else serious to see. Jo-Jo Eumerus (talk, contributions) 11:49, 17 July 2015 (UTC)[reply]
- Then this might be a problem. The way the bot is coded, it's designed to look for reference tags, external links, and citation templates. If it finds the reference tag, it looks for the source inside it. It can't see 2 sources the way it's coded, and updating that would require major rewrites of the bot. While I can adjust the regex to absorb the 2 links no problem, feeding them into the parser might be a problem as it only takes in one link. Ideally, I would rather simply fix this issue on the article since this seems to have occurred only once in all the trials.—cyberpowerChat:Online 14:10, 17 July 2015 (UTC)[reply]
- Just thinking outside the box here, but why don't we make a separate bot to find and fix instances of multiple references in a single tag, run that on all of Wikipedia, and then run this one when that one is done. bd2412 T 14:29, 17 July 2015 (UTC)[reply]
- Unfortunately, you're thinking outside of our galaxy here. Such a bot would be extremely difficult to program. How would it know what text to put where? In the case here, this reference has 2 external links and text mentioning both of them. Your bot would need to master the English language first. I do like the idea though.—cyberpowerChat:Online 15:16, 17 July 2015 (UTC)[reply]
- My suggestion would be to correct the reference manually and move on. I have a sneaking suspicion that this issue will come up so rarely, that any human could easily fix it. And the bot won't come back to it once it has an archive link, or an ignore tag.— Preceding unsigned comment added by cyberpower678 (talk • contribs)
- Is there a way to tell these problem refs? Maybe the bot can simply list them somewhere (or tag them) and have a human repair them before doing the botwork. Jo-Jo Eumerus (talk, contributions) 15:46, 17 July 2015 (UTC)[reply]
- Yes. That can easily be done. But to do all 12 million articles may take some time.—cyberpowerChat:Online 15:53, 17 July 2015 (UTC)[reply]
- OK. I believe this is a problem that can be actively dealt with while the bot is working. Were there any other problems? -- Magioladitis (talk) 10:36, 24 July 2015 (UTC)[reply]
- Not that I am aware of.—cyberpowerChat:Online 13:54, 24 July 2015 (UTC)[reply]
Josve05a did you have the chance to check (some of) the 500 edits? -- Magioladitis (talk) 14:32, 24 July 2015 (UTC)[reply]
- I did do some spot tests and checks and the rate of error (dead archives etc.) is within my acceptable parameters, in my opinion. However, I would suggest a new maintenance template/category be added next to the links/on the talk page, so a human can do a second review of all bot edits if wanted, instead of having to look at edit logs. Like "Template:Bot link-archivation" or something, in monthly categories. Just a suggestion, to catch those which may be in error. (t) Josve05a (c) 16:50, 24 July 2015 (UTC)[reply]
- Would the talk page notifiers serve that scope? Jo-Jo Eumerus (talk, contributions) 16:55, 24 July 2015 (UTC)[reply]
- Not unless they all got collected at one page, like if they had a category/template in them. It is one thing to "see" the talk page notifiers while on the article, another to systematically manually review them afterwards. The notifiers are to "let you know" that it happened; a category where these could be collected would be to "allow manual reviews"...I'm just mumbling right now... (t) Josve05a (c) 17:10, 24 July 2015 (UTC)[reply]
- How about fashioning a template to go in the talkpage message? The template has a switch, resolved=no, which places the page in a category, and resolved=yes, which removes the page from the category.—cyberpowerChat:Online 14:28, 6 August 2015 (UTC)[reply]
- "Resolved" makes it sound like it inherently has a problem, which it should not. I think {{{manually_checked}}} or something is more "accurate", but it sounds like a plan. Has my 'vote'. (t) Josve05a (c) 19:16, 6 August 2015 (UTC)[reply]
- How about {{{checked}}}?—cyberpowerChat:Online 19:53, 6 August 2015 (UTC)[reply]
- Done. Also the web archive doesn't seem to have a problem with the bot archiving.—cyberpowerChat:Limited Access 02:59, 7 August 2015 (UTC)[reply]
- Example
Here's an example. {{sourcecheck}}
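A minimal sketch of how the checked-switch template discussed above could work in wikitext. This is not the actual {{sourcecheck}} source; the parameter name and category follow the discussion above, and the wording is illustrative:

```wikitext
{{#ifeq: {{{checked|false}}} | true
| The bot-modified external links on this page have been reviewed by a human.
| [[Category:Articles with unchecked bot-modified external links]] The bot-modified external links on this page have not yet been reviewed.
}}
```

Placing the template with checked=false in the talk page message would then list the article in the review category until an editor flips the parameter to true.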
Outcome
{{BAGAssistanceNeeded}} I recommend that this bot task be approved, on the condition that the template above is implemented. In case a bug arises which breaks a page or changes page layout in any way, the bot shall be turned off and not be turned on again until the bug has been fixed, in order not to break more pages. This should not be conditional. (t) Josve05a (c) 03:23, 8 August 2015 (UTC)[reply]
- The bot has a runpage and the changes have been implemented.—cyberpowerChat:Offline 04:56, 8 August 2015 (UTC)[reply]
- Three things:
- What's going on here?
- Talk pages that are automatically bot-archived are going to lose these notifications, even when they are still marked with |checked=false. This might be a problem given the categorization. Also, I'm not sure if requiring (or recommending, at the very least) manual intervention on over a hundred thousand talk pages is a good idea.
- I made a minor tweak to the talk page message and changed the name of {{sourcecheck}}'s category to Category:Articles with unchecked bot-modified external links. Willing to change again if people don't like it. Let's leave it red until approval.
- Thanks. — Earwig talk 01:50, 10 August 2015 (UTC)[reply]
- What do you mean?
- How? It'll simply relocate the category link to the archive, meaning one can still put 2+2 together in figuring out which article that archive belongs to.
- Agreed.
- Cheers.—cyberpowerChat:Limited Access 18:42, 11 August 2015 (UTC)[reply]
- Re #1, I do not understand what that first message is about. Is it part of this task? What's the real point of it? I suspect it will show up a lot for similar pages. Why isn't it combined with the main message? Re #2, I realize the meaning will be clear, but we are then suggesting that users edit talk archives. I suppose this is not a dealbreaker, but I'm not fully satisfied with it either. — Earwig talk 09:02, 12 August 2015 (UTC)[reply]
- It simply means that the bot received a bad response from the archive while attempting to archive non-dead pages. It's doing that to alert to the possibility that link may be dead, a redirect, or the site does not allow for archiving, and if possible if the site is prone to dying that it should be manually archived somehow.—cyberpowerChat:Online 12:45, 12 August 2015 (UTC)[reply]
- Can you combine that with the main message? — Earwig talk 03:15, 15 August 2015 (UTC)[reply]
- I can possibly put in a patch to combine the messages. But what about the edit summaries? Also, it would seem the WMF has taken an interest in this bot and is offering to use their name in talks with IA, to better improve the service. So now I am also waiting to hear from them.—cyberpowerChat:Limited Access 16:08, 21 August 2015 (UTC)[reply]
Approved. Cyberpower has removed the message regarding un-archivable links. To the best of my knowledge, that was the only remaining issue. Future feature requests, such as detecting unmarked dead links, should be made under a subsequent BRFA. — Earwig talk 00:24, 25 August 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: BD2412 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 18:15, Thursday, May 14, 2015 (UTC)
Automatic, Supervised, or Manual: Supervised,
Programming language(s): AutoWikiBrowser.
Source code available: AWB.
Function overview: I frequently clean up links left from disambiguation page moves. For example, the page Epping previously was an article on a town in England. This page was moved to Epping, Essex, and Epping became a disambiguation page with several hundred incoming links. As is commonly found in such cases, most of the links intended the town in England, and many were found in formulations like "[[Epping]], Essex", or "[[Epping]], [[Essex]]". A similar issue is the recurring creation of common patterns of disambiguation links to heavily linked articles; for example editors will often make edits creating disambiguation links like "[[heavy metal]] music" and "the [[French]] language", which can easily be resolved as "[[heavy metal music]]" and "the [[French language]]". Over time, large numbers of these links may build up. I would like permission to run AWB as a bot so that when page moves are made or common disambiguation targets become heavily linked, obvious formulations like these can be changed with less of a direct investment of my time.
Links to relevant discussions (where appropriate): Wikipedia:Disambiguation pages with links generally contains the protocol for repairing links to disambiguation pages.
Edit period(s): Intermittent; I intend to run this when a page move creates a large number of disambiguation links, for which obvious formulations for a large number of fixes can be seen.
Estimated number of pages affected: New disambiguation pages are created frequently. I would guess that between a few dozen pages and a few hundred pages might require this kind of attention on any given day, although there are likely to be days where no pages require such attention.
Exclusion compliant (Yes/No): Yes, as AWB does this automatically.
Already has a bot flag (Yes/No):
Function details: When large numbers of links to new disambiguation pages are created from existing pages having been moved to disambiguated titles, or from the buildup of common patterns of editing behavior over time, I will determine if there are obvious patterns of links to be fixed, for example changing instances of "[[Epping]], Essex" or "[[Epping]], [[Essex]]" to "[[Epping, Essex|Epping]], Essex", or "[[Epping, Essex|Epping]], [[Essex]]". I will then run AWB in bot mode to make these changes, and review the changes once made.
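The fixes described above amount to a small set of pattern substitutions. A rough sketch of such rules, written here in Ruby for illustration only (the actual task runs through AWB's own find-and-replace interface, and `fix_dab_links` is an invented name):

```ruby
# Illustrative find-and-replace rules for repairing disambiguation links
# of the kinds named in the request. Each regex captures an optional
# following "[[" so both "[[Epping]], Essex" and "[[Epping]], [[Essex]]"
# are handled by one rule.
DAB_RULES = {
  /\[\[Epping\]\], (\[\[)?Essex/ => '[[Epping, Essex|Epping]], \1Essex',
  /\[\[heavy metal\]\] music/    => '[[heavy metal music]]',
  /the \[\[French\]\] language/  => 'the [[French language]]'
}.freeze

# Apply every rule to a page's wikitext and return the repaired text.
def fix_dab_links(wikitext)
  DAB_RULES.reduce(wikitext) do |text, (pattern, replacement)|
    text.gsub(pattern, replacement)
  end
end
```

Rules like these only fire on the exact formulations listed, which is what keeps the task inside the "obvious fixes" boundary the request describes.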
Discussion
BD2412 I like the idea of this bot but I think similar proposals have been rejected in the past as WP:CONTEXTBOT. Could you please raise a discussion at WP:VILLAGEPUMP so that we check whether there is consensus for these changes or not? There might be traps I can't think of right now. -- Magioladitis (talk) 12:57, 16 May 2015 (UTC)[reply]
- Which Village Pump page would that go to? bd2412 T 15:12, 16 May 2015 (UTC)[reply]
- BD2412 Let's start from Wikipedia:Village pump (miscellaneous). -- Magioladitis (talk) 21:50, 16 May 2015 (UTC)[reply]
Wikipedia:Village_pump_(miscellaneous)#Bot_request_for_disambiguation_link_fixing_issue. -- Magioladitis (talk) 11:11, 21 May 2015 (UTC)[reply]
As I was afraid... Wikipedia:Village_pump_(miscellaneous)/Archive_49#Bot_request_for_disambiguation_link_fixing_issue. I see no actual consensus there. -- Magioladitis (talk) 23:19, 27 May 2015 (UTC)[reply]
BD2412 can you provide me a list of 50 manual edits doing this task? I would like to judge reactions. I do not guarantee approval. In fact, while I like this task a lot, I think it will get a lot of reactions. Still I think you can try to make 50 edits so we can really see reactions. Take it an unofficial bot trial. -- Magioladitis (talk) 23:22, 27 May 2015 (UTC)[reply]
- I recently did a run of about 10,000 fixes to links to Striker (which is soon to be turned into a disambiguation page). Not all of these fall into the pattern that I have discussed here, but those that changed [[Midfielder]]/[[Striker]] to [[Midfielder]]/[[Striker (association football)|Striker]] would. There were probably a few hundred of those in the mix. This run of my contributions was in the thick of this run. bd2412 T 23:40, 27 May 2015 (UTC)[reply]
BD2412 My experience shows that there will be a lot of reaction. I'll reject the bot request and I encourage you to keep doing this kind of changes supervised by your normal account using AWB. Unless, of course, there is at some point clear consensus to do this kind of stuff. Some editors in the past even complained about orphaning a link before an XfD closes. Just a general remark for other editors that may be reading this: BRFA is not the place to gain consensus but a place to request based on consensus. -- Magioladitis (talk) 23:14, 30 May 2015 (UTC)[reply]
- I am not proposing to orphan links prior to an XfD closing - I generally don't, in fact. Striker was an exceptional case based on the volume of links, and the fact that the RM time has run with multiple votes of support and no objections. My proposal is directed solely to link fixes needing to be made after a consensus-based page move has been carried out. I have had very few reactions to runs of thousands of fixes made using AWB, and I have never had a reaction when making obvious fixes of the type I propose. I would be glad to keep doing it this way, but I have actually physically burned out computer mice and had wrist aches that lasted for days! bd2412 T 00:37, 31 May 2015 (UTC)[reply]
BD2412 Any ideas of how we can ensure there is consensus for this task? I hope you understand my position. -- Magioladitis (talk) 18:51, 31 May 2015 (UTC)[reply]
- There is a longstanding consensus for fixing disambiguation links, which is the foundation of Wikipedia:WikiProject Disambiguation. bd2412 T 19:02, 31 May 2015 (UTC)[reply]
I need Anomie's opinion on this one... -- Magioladitis (talk) 22:17, 11 June 2015 (UTC)[reply]
Approved for trial (100 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. -- Magioladitis (talk) 13:44, 28 June 2015 (UTC)[reply]
- I'll give it a trial run this weekend. Thanks. bd2412 T 15:16, 30 June 2015 (UTC)[reply]
- @BD2412: Gentle poke given that it's been two weeks. What's the status? — Earwig talk 20:20, 13 July 2015 (UTC)[reply]
- I've been busy with things that keep me logged in to my regular account - I can't run AWB as a bot unless I log into the bot-authorized account. I'll give it a run-through tonight. I'll need to find a set of links with applicable fixes first. bd2412 T 20:44, 13 July 2015 (UTC)[reply]
- I ran a few tests, but at the moment there are no disambiguation pages with large numbers of links requiring the same solution. These come up sporadically. I tested the bot function on links containing "layout = Longitudinal" (for which the only answer will be Longitudinal engine), links to San Vicente, El Salvador (changed to San Vicente, El Salvador) and Fukushima, Japan (changed to Fukushima, Japan). I made a typo on one variation of the "Longitudinal" fix, and fixed that manually. Otherwise, everything went smoothly. bd2412 T 01:28, 14 July 2015 (UTC)[reply]
@BD2412: Now I understand better what the bot does and I like it more. Should you try find more examples in order to complete the 100 edits test period? -- Magioladitis (talk) 10:14, 21 July 2015 (UTC)[reply]
- I intend to test more - it's a matter of time before an existing article with a few dozen incoming links is moved in favor of a disambiguation page, leaving a certain number of obvious repetitive solutions to be enacted. bd2412 T 12:24, 21 July 2015 (UTC)[reply]
- - gentle poke - any updates? :) ·addshore· talk to me! 12:53, 28 July 2015 (UTC)[reply]
- Something will come up. It always does. bd2412 T 15:06, 28 July 2015 (UTC)[reply]
- I just did a half dozen more fixes with the bot. It's not a lot, but it is exactly the kind of thing I intend to have it do. bd2412 T 18:52, 31 July 2015 (UTC)[reply]
- Ok, at last there was a big disambiguation task where I could run this for the purpose for which it is really needed; African has been made into a disambiguation page with over a thousand incoming links. I found a number of obvious fix formulations and ran AWB with those, and made 119 fixes from that group. bd2412 T 01:34, 6 August 2015 (UTC)[reply]
- I have also used the bot to fix about 700 links to IDG, which had been turned into a disambiguation page. The links that I fixed were formatted as "publisher=[[IDG]]", and there is only one "IDG" that is a publisher. bd2412 T 13:40, 10 August 2015 (UTC)[reply]
Trial complete. To update the status. @Magioladitis and Addshore:. I'll loop back around if nobody's watching. :P --slakr\ talk / 04:04, 27 August 2015 (UTC)[reply]
- @Magioladitis, Addshore, and Slakr: What's the status of this? There is a prime candidate for the bot to work on, as there should be some clear fixes that the bot can do for links to the new Palestine disambiguation page. -Niceguyedc Go Huskies! 00:56, 6 September 2015 (UTC)[reply]
- @BD2412: To clarify, this task is manually initiated and supervised, right? In that case, WP:CONTEXTBOT never applied. I can't see where opposition to this would originate from. — Earwig talk 05:05, 11 September 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: MusikAnimal (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 06:23, Wednesday, April 22, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Source code available: GitHub
Function overview: Bot clerking at WP:PERM pages.
Links to relevant discussions (where appropriate): Special:PermaLink/655854110
Edit period(s): Continuous
Estimated number of pages affected: Up to six during one run (one for each PERM page, except Confirmed and AWB requests)
Exclusion compliant (Yes/No): No
Already has a bot flag (Yes/No): No
Function details: This bot works very much like Cyberbot I does at WP:RPP. It monitors all the Request for Permissions pages for new requests, and checks if there were previously declined requests for that user and permission. If matches are found, an automated comment is left linking to those declined requests. Eventually it may also ping the declining admin, but I've side stepped that for now. There are two exceptions: The AWB checkpage which does not have the same structure as the other request for permissions pages, though I might implement special case handling for this at some point. The other is requests for confirmed, where it's very unlikely we'll see multiple requests by the same user, so the bot clerking is not that helpful there. A few notes:
- It works by using regex to parse out all the necessary info, and constructs the automated comment(s) to be saved. As long as Template:Request for permission generates a level 4 heading and Template:Rfplinks is used, it shouldn't flake out.
- Thoroughly tested on test-wiki, see testwiki:Wikipedia:Requests for permissions/Rollback (and here).
- Operates on wmflabs, with a crontab running the script every 10 minutes or so, or whatever we decide on.
- The perm clerking task can be turned off by changing User:MusikBot/PermClerk/Run to anything other than true.
- For all six permission pages, it should take less than a minute to complete, with a 2 second pause between processing each page, and it will edit no more than 6 times total. However, given the nature of the task you probably won't see but a few edits every day at most.
- Checks for edit conflicts. If one is detected it will re-attempt to process that permission page for a total of three times, waiting progressively longer each time. So after attempt #1 it will wait 1 second before trying again, after attempt #2 two seconds, etc.
- Caching is in place where appropriate, such as fetching the declined pages and any declined permalinks for a user.
- There is verbose logging that I can make publicly accessible.
- Full exception error handling. If a critical error is encountered (e.g. more than 3 failed attempts to edit a page), the script will proceed to process the next permission page rather than abort the task altogether. Fatal errors, such as when the API is down, will result in a full abort of the task until it is run again by the cron job.
- To be clear, the "cron" jobs are actually submitted to the grid, which helps allocate resources so the bot doesn't get in the way of other jobs on tool labs.
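The edit-conflict handling in the list above (up to three attempts, waiting progressively longer each time) can be sketched roughly as follows. This is not MusikBot's actual source, which is on GitHub; `save_page` and `EditConflictError` are assumed names standing in for the real API call and its failure:

```ruby
# Sketch of the progressive retry described above: after attempt 1 wait
# 1 second, after attempt 2 wait 2 seconds, and give up after 3 attempts
# so the run can move on to the next permission page.
class EditConflictError < StandardError; end

MAX_ATTEMPTS = 3

# Placeholder for the bot's actual save call (assumed name).
def save_page(page, text)
  :saved
end

def save_with_retry(page, text)
  attempts = 0
  begin
    attempts += 1
    save_page(page, text)
  rescue EditConflictError
    raise if attempts >= MAX_ATTEMPTS  # three failures: give up on this page
    sleep(attempts)                    # progressive backoff: 1s, then 2s
    retry
  end
end
```

Keeping the backoff bounded matters here because the whole run is expected to finish in under a minute.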
Thank you! — MusikAnimal talk 06:23, 22 April 2015 (UTC)[reply]
Discussion
Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Looks sane; has support from the target audience; reasonable logic; trusted user. The thing I was actually going to ask about (i.e., pointless edits on already-handled entries) looks like it's already covered:
if section.match(/{{(?:template\:)?(done|not done|already done)}}/i)
--slakr\ talk / 07:27, 29 April 2015 (UTC)[reply]
- Thank you! It is now running, processing the PERM pages once every 10 minutes. 50 edits could take a while, but I'm in no hurry. In the meantime allow me to note that I am implementing another clerking feature, where it will remove extraneous headers (e.g. see bottom request at testwiki:Wikipedia:Requests for permissions/Rollback). This happens a fair amount with new users, who do not read the instructions stating not to put anything in the heading field. This development is happening completely on my local environment and will not interfere with the currently running bot, which is running off of code on tool labs. — MusikAnimal talk 16:14, 29 April 2015 (UTC)[reply]
- Just letting you know I've updated the bot to remove extraneous headers when present. This requires no additional edits should there also be previously declined requests for a user – the bot will simply make all changes to the page at once. Thanks — MusikAnimal talk 15:35, 30 April 2015 (UTC)[reply]
- @MusikAnimal: This message is however totally misplaced, see this edit. It's also incorrectly indented. Armbrust The Homunculus 05:34, 1 May 2015 (UTC)[reply]
- @Armbrust: The bot acted exactly as programmed, only removing the level 2 header. The rest of the text was left as is. Here the user also ignored the 2= parameter of {{rfp}} and instead wrote the request body on the line below it. I am working on a more intelligent regex solution that can fix this common scenario in full. The incorrectly added level 2 heading is more common, however, so the bot is at least addressing that. Anyway, there's clearly discussion needed so I've disabled that feature for now. Let's talk more at WT:PERM#Bot clerking so others can chime in. — MusikAnimal talk 06:03, 1 May 2015 (UTC)[reply]
Approved for extended trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. With the updated regex please. Thanks, Magioladitis (talk) 11:42, 8 May 2015 (UTC)[reply]
- @Magioladitis: Thank you for the endorsement. Just to be sure, has MusikBot been approved for a total of 100 edits? The new regex is now in place and working nicely. An important note: I will be on holiday starting this Friday through the rest of the month. I am programming the bot to automatically shut off when it reaches 50 edits, or 100, as advised. I will still be able to occasionally check its activity for accuracy and act accordingly. Thanks — MusikAnimal talk 16:41, 11 May 2015 (UTC)[reply]
- MusikAnimal please make 50 additional edits. -- Magioladitis (talk) 21:01, 13 May 2015 (UTC)[reply]
- Thank you, will do — MusikAnimal talk 21:03, 13 May 2015 (UTC)[reply]
{{OperatorAssistanceNeeded|D}}
MusikAnimal Is the bot trial done? -- Magioladitis (talk) 23:17, 27 May 2015 (UTC)[reply]
- @Magioladitis: No, and far from it, actually. I was going to bring a few ideas to your attention... There is now consensus (multiple supports, no opposition) for MusikBot to take on a new task which it is already capable of doing. That is, commenting on requests for permissions where the candidate does not meet some configured requirements. Please see User:MusikBot/PermClerk/Prerequisites for more information. If this task is approved for trial, it could be coupled in with the current allotted 100 edits and we'd be able to meet that threshold quicker. Together those 100 edits should provide ample data to evaluate the bot's overall performance for all of its tasks. What do you think? Finally, I believe MusikBot may be destined to take over KingpinBot's task of archiving requests. This is both because of the operator's inactivity and because MusikBot is already parsing the same pages. This has not been developed yet, however, and whether it should be a separate BRFA altogether is up to you. At the rate we're going, I'll have the archiving functionality ready for trial before the bot has made 100 edits. — MusikAnimal talk 20:56, 28 May 2015 (UTC)[reply]
- Assuming there will continue to be frequent malformed requests made at requests for confirmed, I believe a mere 50 edits will be able to show the bot is capable and proficient at its current tasks. The one task, "FetchDeclined" as I call it, has been on point since the first edit. The other currently enabled task, "Autoformat", has undergone a few transformations. That is simply because one cannot predict how users will construct their requests, but I believe the logic is at a point where it can handle most scenarios and do so correctly. This edit on testwiki demonstrates the bot's ability to handle a wide range of scenarios. That being said, my proposal is to terminate the trial at 50 edits, and if all is well, start a new 50-edit trial for MusikBot's archiving functionality. If allowed, I'd like to also include the aforementioned "Prerequisites" task, which is ready to go. Let me know what you think! — MusikAnimal talk 15:17, 5 June 2015 (UTC)[reply]
- This seems fine. I dunno about auto-rejecting (something someone mentioned in the discussion), as that tends to put bots in the express lane for accidental warring, frustration, and general WP:BITE complaints, but simple, informational notices that don't explicitly reject the user should be fine, and I think it's probably fine to roll them into this one without having to worry about an additional task. Not sure as far as the archiving portion goes; I dunno about the status of KingpinBot or if it's actually truly necessary to replace it (or if that functionality would even be ready to test), so I'd probably recommend submitting a separate task once those details are clearer. --slakr\ talk / 03:37, 10 June 2015 (UTC)[reply]
- Thank you! Prerequisites just comments with relevant data, and Autorespond only marks requests {{already done}} when the user already has the permission; it never declines. The archiving task is rather low-priority, for now anyway. The issue is the operator is becoming less active and sometimes we end up having to manually archive. Since MusikBot is already parsing and looping through the same pages, it shouldn't be terribly difficult to have it take over archiving as well, which is why I've proposed it. The operator is in support and we are working together on this. It's still in development, though, and I'll start up a new BRFA when it comes time. Also, thank you for showing me how to deactivate the template :) Cheers — MusikAnimal talk 16:18, 10 June 2015 (UTC)[reply]
Trial complete. @Slakr and Magioladitis: Alright! The bot has finally completed the 100-edit trial. I will go through each task and provide relevant diffs so that you can judge the bot's efficacy.
- FetchDeclined
- This is the task that inspired the bot. When a request is made, the bot checks the archives and comments if the user has had a request declined in the past 90 days. This is fairly straightforward and has been spot-on accurate since day one: [54] [55] [56]
- Autoformat
- Adding requests using the preloaded template isn't exactly user-friendly, so the bot attempts to fix common mistakes. Since it was impossible to predict what mistakes users would make, I had to continually refine the logic to handle the scenarios I observed. It is somewhat complex, so I'm just going to provide a few diffs exemplifying how the bot handled various scenarios: [57] [58] [59] To be transparent, the bot didn't always do the right thing, but rest assured I quickly deployed bug fixes. Since around mid-June the autoformat task has been in a stable state and is able to repair most malformed requests, ignoring them when it can't figure out what to do. See on testwiki how the bot correctly handled a wealth of scenarios in a single edit: [60]
- Prerequisites
- This task checks predefined qualifications for a given permission, and comments with relevant data if the user does not meet them. Sometimes the external edit counters go down, but the bot uses its own API calls, so it will continue to work. It also updates the prerequisite data every 90 minutes, as necessary, until the request has been responded to. Examples: (rollback) (AWB, 2 requests) (updating prereq data)
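The real prerequisites are driven by the on-wiki configuration at User:MusikBot/PermClerk/Prerequisites; the sketch below only illustrates the shape of such a check, with invented permission names and thresholds:

```python
# Hypothetical prerequisite configuration, in the spirit of the bot's
# on-wiki config page (the names and thresholds here are invented).
PREREQUISITES = {
    "Rollback": {"edit_count": 200, "account_age_days": 15},
}

def unmet_prerequisites(permission, user_data):
    """Return the configured prerequisites the candidate does not meet."""
    required = PREREQUISITES.get(permission, {})
    return {k: v for k, v in required.items() if user_data.get(k, 0) < v}

# A candidate with 120 edits falls short of the (invented) 200-edit bar:
print(unmet_prerequisites("Rollback",
                          {"edit_count": 120, "account_age_days": 40}))
# {'edit_count': 200}
```

The bot would then post a comment listing any returned key/value pairs rather than declining the request itself.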
You'll also notice the improvement of the edit summaries over time. The bot now leaves a detailed summary of what tasks were performed, and how many requests were affected. Some examples of multiple tasks performed in the same edit: [61], [62]
Let me know if you need a more thorough explanation of any of the tasks and the logic behind them. Thank you! — MusikAnimal talk 05:27, 6 July 2015 (UTC)[reply]
- {{BAGAssistanceNeeded}} — MusikAnimal talk 03:16, 13 July 2015 (UTC)[reply]
Approved. I read over the discussion, reviewed the edits, and everything looks good. — Earwig talk 05:37, 13 July 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: Jheald (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 23:36, Monday December 8, 2014 (UTC)
Automatic, Supervised, or Manual: Supervised
Programming language(s): Perl
Source code available: Still under development.
Function overview: Maintenance of subpages of Wikipedia:GLAM/Your_paintings, in particular the subpages listed at Wikipedia:GLAM/Your_paintings#Artists_by_birth_period. There is currently a drive to identify Wikidata entries for the entries on this list not yet matched. I seek approval to keep these corresponding pages on Wikipedia up to date.
Initially I would just use the bot as an uploader, to transfer wikipages edited off-line into these pages (including fixing some anomalies in the present pages -- which I would probably do sequentially, through more than one stage, reviewing each fix stage before moving on to the next).
Once the off-line code is proven, I would then propose to move to a semi-automated mode, automatically updating the pages to reflect new instances of items with d:Property:P1367 and/or corresponding Wikipedia and Commons pages.
Links to relevant discussions (where appropriate):
Edit period(s): Occasional (perhaps once a fortnight), once the initial updating has been completed. And on request.
Estimated number of pages affected: 17
Exclusion compliant (Yes/No): No. These are purely project tracking pages. No reason to expect a {{bots}} template. If anyone has any issues with what the bot does, they should talk to me directly and I'll either change it or stop running it.
Already has a bot flag (Yes/No): No. I have one on Commons, but not yet here.
Function details:
- Initially: simple multiple uploader bot -- take updated versions of the 17 pages prepared and reviewed offline, and upload them here.
- Subsequently: obtain a list of all Wikidata items with property P1367. Use the list to regenerate the "Wikidata" column of the tables, plus corresponding sitelinked Wikipedia and Commons pages.
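As an illustrative sketch of the second step, regenerating the "Wikidata" column from a P1367-to-item mapping (the real scripts are Perl and draw on a live query; the mapping, IDs, and link format below are invented):

```python
# Stand-in for the result of a P1367 ("Your Paintings identifier") query:
# maps a painter's Your Paintings ID to a Wikidata item, when matched.
P1367_TO_ITEM = {"yp-0001": "Q1234567"}

def wikidata_cell(yp_id):
    """Wikidata column cell: an item link if matched, else a search link."""
    item = P1367_TO_ITEM.get(yp_id)
    if item:
        return f"[[d:{item}|{item}]]"
    return f"[https://www.wikidata.org/w/index.php?search={yp_id} search]"

print(wikidata_cell("yp-0001"))  # [[d:Q1234567|Q1234567]]
print(wikidata_cell("yp-9999"))  # falls back to a search link
```

The same pattern (explicit link when matched, search link when not) is what the tables described above use for the VIAF and RKDartists columns.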
Discussion
- Regarding uploading offline edits: Are these being made by anyone besides the operator? What license are they being made under? — xaosflux Talk 23:44, 18 December 2014 (UTC)[reply]
- @Xaosflux: The pages have been prepared by me using Perl scripts, drawing from Wikidata.
- I've slowly been making the scripts more sophisticated -- so I've recently added columns for VIAF and RKDartists links, both taken from Wikidata, defaulting to searches if there's no link, or no Wikidata item yet identified. Content not drawn from Wikidata (typically legacy entries from the pages as I first found them) I have prefixed with a question mark in the pages, meaning to be confirmed. For the most part these are blue links, which may go to completely the wrong people.
- So at the moment I'm running a WDQ search to pull out all Wikidata entries with one (or more) values for the P1367 "BBC Your Paintings identifier" property, along with the properties for Commons category name (P373), VIAF (P214) and RKDartists (P650). I'm also running an Autolist search to get en-wiki article names for all Wikidata items with a P1367. Plus I have run a look-up to get Wikidata item numbers for all other en-wiki bluelinks on the page (this gives the Q-numbers marked with question marks). But the latter was quite slow, so I have only run it once. At the moment I'm still launching these searches by hand, and making sure they've come back properly, before updating & re-uploading the pages.
- As to the licensing -- Wikidata is licensed CC0. My uploads here are licensed CC BY-SA like any other upload to the site (though in reality there is very little originality, creativity or expression, apart from the choice of design of the page overall, so, under U.S. law at least, there is quite possibly no new copyrightable content in the diffs). Various people of course are updating Wikidata -- I've been slowly working down this list (well, so far only to the middle of the 1600s page), though unfortunately not all of the Wikidata updates seem to be being picked up by WDQ at the moment; the Your Paintings list is also on Magnus's Mix-and-Match tool; and various others are working at the moment, particularly to add RKD entries to painters with works in the Rijksmuseum in Amsterdam. But Wikidata is all CC0, so that all ought to be fine.
- What would help though, would be having the permission for a (limited) multiple uploader, so I could then upload the updates to all 17 pages just by launching a script, rather than laboriously having to upload all 17 by hand each time I want to refresh them, or slightly improve the treatment of one of the columns.
- I'm not sure if that entirely answers your question, but I hope does make clearer what I've been doing. All best, Jheald (talk) 00:45, 19 December 2014 (UTC)[reply]
Approved for trial (25 edits or 10 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Please post your results here after the trial. — xaosflux Talk 01:48, 19 December 2014 (UTC)[reply]
- @Xaosflux: First run of 16 edits made successfully -- see contribs for 19 December, from 15:59 to 16:55.
- (Links to RKD streamlined + data updated; one page unaffected).
- All the Captchas were a bit of a pain to have to deal with; but they will go away. Otherwise, all fine. Jheald (talk) 17:31, 19 December 2014 (UTC)[reply]
- Sorry about that, I added the confirmed flag to avoid this for now. — xaosflux Talk 17:34, 19 December 2014 (UTC)[reply]
- New trial run carried out smoothly (see this related changes page).
- Update still prepared by executing several scripts manually, before a final uploader script; but I should have these all rolled together into a single process for the next test. Jheald (talk) 09:11, 11 January 2015 (UTC)[reply]
Have you completed the trial? Josh Parris 10:20, 4 March 2015 (UTC)[reply]
- I was going to go on running it once a month or so, the next one probably in a day or two, until someone progresses this further, possibly making tweaks to my offline processing scripts as I went along. Obviously I'm open to suggestions as to anything I can improve or do better; though the actual unsupervised bit itself is just an upload script, refreshing a dozen or so pages, so nothing very complicated. (The off-line preprocessing is a bit more involved, but still pretty trivial.) Jheald (talk) 00:33, 5 March 2015 (UTC)[reply]
- I note that further edits have been made. Out of interest, why do http://viaf.org IDs change? The painter's been dead for centuries. Are they merges of duplicates? Also, is the trial finished now? Josh Parris 14:54, 9 March 2015 (UTC)[reply]
- @Josh Parris: Clearly there has been a significant update of VIAF ids on Wikidata in the last three weeks, with a lot of new VIAF ids added -- I think by one of Magnus Manske's bots. This is why there are significant reductions in length for a lot of pages, with VIAF searches being replaced by explicit VIAF links.
- I imagine that this may be catch-up resynchronisation for several months of updates at VIAF; but it may also be that now VIAF is explicitly targeting Wikidata items rather than just en-wiki articles, and is actively doing matching at the VIAF end, that may be why there now seems to be a sudden rush of new VIAF <--> Wikidata matches.
- You're right that there are a few VIAF matches that have changed. I haven't looked in to any in detail, but two strong possibilities would be either erroneous matches that have been corrected (ie we used to point to the VIAF for somebody quite different); or alternatively that a group of duplicate entries on VIAF may have been merged -- eg if there had been a VIAF for the Library of Congress id, and another for the Getty ULAN id, and the two had not previously been connected.
- As to where we're at, matching of the Your Paintings painter identifiers continues to move forwards using mix-n-match. About 80% of the YP identifiers have now been triaged into has / doesn't have / shouldn't have a Wikidata item, with progress ongoing; plus I've now got as far as painters born before 1825, using mix-n-match search to match to RKDartists and other databases. Then there will also be a stage where new Wikidata items are created for YP ids that currently don't have them but should; and these new ids in turn will also have RKDartists (etc) entries that they match. So there's still a lot to do going forward, and the tracking pages will continue to need updates if they are to reflect that.
- At the moment it's still done using about four scripts that I sequentially run by hand on an occasional basis. The one I'd have to write a bit more code to integrate is the one that merges in the article names on en-wiki for the Wikidata items, because these are currently got using an Autolist query which is then saved manually. I'd need to look into how to replace that batch look-up with an API call if I was to make the whole thing more integrated and run on a regular basis (weekly?). I'm happy to do that work if anybody wants it, but for the time being it's also as easy just to go on doing what I've been doing, generating the updates in a partially manual way. So I'm happy to be open to views, if anybody has got any strong preferences either way. Jheald (talk) 23:27, 4 May 2015 (UTC)[reply]
Jheald what is to be done here? I have not followed the entire discussion, to be honest. -- Magioladitis (talk) 08:49, 15 July 2015 (UTC)[reply]
- ping Magioladitis (talk) 17:59, 14 August 2015 (UTC)[reply]
- Hi @Magioladitis:. What I am looking for is permission to go on making script-driven updates to the 17 pages linked from Wikipedia:GLAM/Your paintings/header, as the data continues to develop on Wikidata. Thanks, Jheald (talk) 17:42, 17 August 2015 (UTC)[reply]
Approved for extended trial (50 edits or 5 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Jheald: as I noticed, the bot has run an updated script already? -- Magioladitis (talk) 18:21, 17 August 2015 (UTC)[reply]
- Hi @Magioladitis:. Thanks for the extended trial. You're right, I ran another update earlier this afternoon. But what I'm really looking for is for permission now to be extended indefinitely. I have run the scripts on and off for over 9 months now; and it's a very small set of pages affected, in project space rather than main space, pages primarily used by myself. Can we not just sign off the permission permanently now? Jheald (talk) 18:55, 17 August 2015 (UTC)[reply]
Jheald I'll do it in 5 days from now if this is not a problem. Just ping me in 5 days and I'll immediately approve it. -- Magioladitis (talk) 18:59, 17 August 2015 (UTC)[reply]
Approved. -- Magioladitis (talk) 10:07, 26 August 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Bots that have completed the trial period
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was
Approved.
Operator: Magnus Manske (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 08:30, Thursday, June 25, 2015 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): PHP
Source code available: https://bitbucket.org/magnusmanske/listeria
Function overview: Generates wikitext lists based on Wikidata queries, and updates pages if there is a change since last time.
Links to relevant discussions (where appropriate):
Edit period(s): Daily
Estimated number of pages affected: As of this moment, a handful of talk pages.
Exclusion compliant (Yes/No): No.
Already has a bot flag (Yes/No): No
Function details: Detailed explanation in my blog entry. Basically, edits pages containing two specific templates. Runs a Wikidata query daily or on manual request, generates a wikitext list, and updates the page if the list is different than the current one. Example.
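As a rough sketch of the update logic described (regenerate the list, and edit only if it differs from what is already on the page), using Listeria's {{Wikidata list}} / {{Wikidata list end}} markers; this is a simplification for illustration, not the bot's actual PHP code:

```python
START, END = "{{Wikidata list", "{{Wikidata list end}}"

def replace_list(page_text, new_list):
    """Swap the bot-maintained section; return None when no edit is needed."""
    # Skip past the opening template (naive: assumes no nested "}}").
    start = page_text.index("}}", page_text.index(START)) + 2
    end = page_text.index(END)
    if page_text[start:end].strip() == new_list.strip():
        return None  # list unchanged: skip the edit entirely
    return page_text[:start] + "\n" + new_list + "\n" + page_text[end:]

page = "intro\n{{Wikidata list|sparql=...}}\nold row\n{{Wikidata list end}}\nfooter"
page = replace_list(page, "new row")
print(page)  # the old row is replaced
print(replace_list(page, "new row"))  # None -- a second run makes no edit
```

Only the text between the two templates is ever touched, so hand-written content elsewhere on the page is preserved.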
Discussion
Note: This bot appears to have edited since this BRFA was filed. Bots may not edit outside their own or their operator's userspace unless approved or approved for trial. AnomieBOT⚡ 08:37, 25 June 2015 (UTC)[reply]
- Dear NagBot, only on talk and user talk pages, AFAICT. Does not operate unless specific templates are set on those pages. --Magnus Manske (talk) 08:40, 25 June 2015 (UTC)[reply]
- Your bot is not allowed to run at all outside of your or its userspace. Please shut it down.—cyberpowerChat:Online 13:05, 25 June 2015 (UTC)[reply]
- Dear NagBot, only on talk and user talk pages, AFAICT. Does not operate unless specific templates are set on those pages. --Magnus Manske (talk) 08:40, 25 June 2015 (UTC)[reply]
Magnus Manske I blocked the bot until we set this out and the bot gets approval. I did not check the request yet, I only checked the fact that the bot kept editing without any formal approval and without being in test phase. -- Magioladitis (talk) 13:43, 28 June 2015 (UTC)[reply]
- This is a minor bot task and will not break anything, and it is used on a global scale. I suggest BAG do a {{BotSpeedy}}. The bot has already done a trial (without approval), which has been fine. Ping Magioladitis. (t) Josve05a (c) 17:11, 29 June 2015 (UTC)[reply]
- Sorry, I really don't understand. I discovered the use of this bot last week with wonder in my eyes. It's very useful for creating lists from Wikidata very easily (for example, yesterday: Drawings by Nicolas Poussin and Mythological paintings by Pompeo Batoni). Everybody agrees we should have guidelines for bots, but if nobody complains about an activity that is exposed, explained, approved after trial, and considered very useful by some people, why block it? Best regards --Shonagon (talk) 18:14, 29 June 2015 (UTC)[reply]
Magnus Manske How and why did these users create the initial list? Are they aware that the bot will edit their subpages? -- Magioladitis (talk) 17:26, 29 June 2015 (UTC)[reply]
- Yes, that is the whole point after all. Blog post explaining this. --Magnus Manske (talk) 18:06, 29 June 2015 (UTC)[reply]
Approved for trial (5 days). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Ping me in 5 days if there are no problems and no complains to approve this. Thanks, Magioladitis (talk) 20:01, 29 June 2015 (UTC)[reply]
- @Magioladitis: Seems to have gone well. --Magnus Manske (talk) 11:32, 4 July 2015 (UTC)[reply]
Trial complete. I do not understand why the bot kept running outside the bot trial period. -- Magioladitis (talk) 08:11, 5 July 2015 (UTC)[reply]
Approved. -- Magioladitis (talk) 07:07, 6 July 2015 (UTC)[reply]
Pinging Anomie and MBisanz in case they want to comment on the fact that the bot kept editing outside the bot trial period. -- Magioladitis (talk) 07:08, 6 July 2015 (UTC)[reply]
- Sigh. Is it really that hard to just follow the rules and not have the bot edit while the paperwork is being done? Anomie⚔ 13:01, 6 July 2015 (UTC)[reply]
- I am not here for rules. I am not here for paperwork. I am here to improve the encyclopedia. Sorry if you are not. --Magnus Manske (talk) 09:02, 7 July 2015 (UTC)[reply]
- Magnus Manske Please do not assume bad faith. -- Magioladitis (talk) 22:17, 7 July 2015 (UTC)[reply]
- I am not here for rules. I am not here for paperwork. I am here to improve the encyclopedia. Sorry if you are not. --Magnus Manske (talk) 09:02, 7 July 2015 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.
Approved requests
Bots that have been approved for operations after a successful BRFA will be listed here for informational purposes. No other approval action is required for these bots. Recently approved requests can be found here (edit), while old requests can be found in the archives.
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 45) Approved 13:47, 29 May 2024 (UTC) (bot has flag)
- Numberguy6Bot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 13:18, 26 May 2024 (UTC) (bot has flag)
- BsoykaBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 13:18, 26 May 2024 (UTC) (bot has flag)
- SDZeroBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 13) Approved 13:08, 26 May 2024 (UTC) (bot has flag)
- CopyPatrolBot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 12:59, 18 May 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 29) Approved 11:35, 3 May 2024 (UTC) (bot has flag)
- ButlerBlogBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 4) Approved 11:35, 3 May 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 42) Approved 11:39, 1 April 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 43b) Approved 20:42, 31 March 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 44) Approved 20:26, 31 March 2024 (UTC) (bot has flag)
- BattyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 82) Approved 12:48, 30 March 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 28) Approved 07:32, 29 March 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 25) Approved 12:45, 13 March 2024 (UTC) (bot has flag)
- AnomieBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 81) Approved 21:04, 10 March 2024 (UTC) (bot has flag)
- FrostlySnowman (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 11) Approved 09:40, 17 February 2024 (UTC) (bot has flag)
- AnomieBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 82) Approved 19:58, 4 February 2024 (UTC) (bot has flag)
- PrimeBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 43) Approved 13:45, 4 February 2024 (UTC) (bot has flag)
- SDZeroBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 12) Approved 16:43, 1 February 2024 (UTC) (bot has flag)
- The Sky Bot (BRFA · contribs · actions log · block log · flag log · user rights) Approved 13:02, 25 January 2024 (UTC) (bot has flag)
- DeadbeefBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 19:46, 23 January 2024 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 26) Approved 14:24, 6 January 2024 (UTC) (bot has flag)
- BsoykaBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 13:35, 1 January 2024 (UTC) (bot has flag)
- ButlerBlogBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Approved 13:35, 1 January 2024 (UTC) (bot has flag)
- Cewbot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 12) Approved 13:38, 31 December 2023 (UTC) (bot has flag)
- BattyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 79) Approved 08:41, 31 December 2023 (UTC) (bot has flag)
- KiranBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 8) Approved 16:54, 28 December 2023 (UTC) (bot has flag)
- BattyBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 80) Approved 14:13, 17 December 2023 (UTC) (bot has flag)
- ButlerBlogBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Approved 14:13, 17 December 2023 (UTC) (bot has flag)
- Qwerfjkl (bot) (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 27) Approved 19:35, 14 December 2023 (UTC) (bot has flag)
- BaranBOT (BRFA · contribs · actions log · block log · flag log · user rights) Approved 13:06, 14 December 2023 (UTC) (bot has flag)
Denied requests
Bots that have been denied for operations will be listed here for informational purposes for at least 7 days before being archived. No other action is required for these bots. Older requests can be found in the Archive.
- Tulsibot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 13:56, 28 June 2015 (UTC)
- BioLinkBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 11:33, 19 June 2015 (UTC)
- Humbot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 18:49, 12 June 2015 (UTC)
- SamoaBot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Bot denied 22:03, 3 May 2015 (UTC)
- TheMagikBOT (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 14:05, 26 April 2015 (UTC)
- PavloChemBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 09:40, 25 April 2015 (UTC)
- BRPbot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 12:22, 18 March 2015 (UTC)
- StanfordLinkBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 13:25, 9 March 2015 (UTC)
- AlicornBot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 14:58, 17 October 2014 (UTC)
- BatreeqbotPro (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 21:59, 15 October 2014 (UTC)
- Archive 'o' matic (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 21:59, 15 October 2014 (UTC)
- Tymon the Bot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 22:28, 8 October 2014 (UTC)
- Page correction BOT (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 11:13, 29 September 2014 (UTC)
- Page correction BOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Bot denied 11:13, 29 September 2014 (UTC)
- Helsabot (BRFA · contribs · actions log · block log · flag log · user rights) Bot denied 14:25, 29 June 2014 (UTC)
Expired/withdrawn requests
These requests have either expired, because information required of the operator was not provided, or been withdrawn. These tasks are not authorized to run, but the lack of authorization does not necessarily reflect a judgment on their merit. For example, a bot that was approved for testing but never tested, or one whose test results were never posted, would appear here. Bot requests should not be placed here if there is an active discussion ongoing above. Operators whose requests have expired may reactivate their requests at any time. The following list shows recent requests (if any) that have expired, listed here for informational purposes for at least 7 days before being archived. Older requests can be found in the respective archives: Expired, Withdrawn.
- Commons fair use upload bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Expired 17:05, 27 June 2015 (UTC)
- Mdann52 bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 8) Withdrawn by operator 05:34, 4 June 2015 (UTC)
- Demibot (BRFA · contribs · actions log · block log · flag log · user rights) Expired 22:38, 24 March 2015 (UTC)
- SocietyBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 21:32, 22 March 2015 (UTC)
- Bot1058 (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 2) Withdrawn by operator 22:57, 15 February 2015 (UTC)
- Cerabot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 3) Withdrawn by operator 13:58, 30 December 2014 (UTC)
- MoohanBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 7) Withdrawn by operator 02:09, 30 December 2014 (UTC)
- Revibot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 15:04, 22 November 2014 (UTC)
- Jimmy the Bot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 11:12, 2 October 2014 (UTC)
- Faebot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 13:40, 11 July 2014 (UTC)
- Legobot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 36) Withdrawn by operator 05:35, 9 July 2014 (UTC)
- Mdann52 bot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 5) Withdrawn by operator 16:32, 3 July 2014 (UTC)
- CleanupWorklistBot (BRFA · contribs · actions log · block log · flag log · user rights) Withdrawn by operator 22:38, 30 June 2014 (UTC)
- Josvebot (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 11) Withdrawn by operator 21:03, 23 June 2014 (UTC)
- MoohanBOT (BRFA · contribs · actions log · block log · flag log · user rights) (Task: 6) Withdrawn by operator 10:31, 18 June 2014 (UTC)