Wikipedia:Bots/Requests for approval/VoxelBot 2

Operator: Vacation9 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 02:56, Thursday January 31, 2013 (UTC)

Automatic, Supervised, or Manual: Supervised

Programming language(s): AutoWikiBrowser, Python

Source code available: AWB, Standard pywikipedia

Function overview: Replace the substitutes used for Romanian letters before Unicode 3 was released (Ş, ş, Ţ, and ţ) with the proper letters of the Romanian alphabet (Ș, ș, Ț, and ț). It will replace everywhere except in image links and interwiki/external links.
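(For reference, a minimal sketch of the substitution itself in Python, one of the languages listed above; the escape sequences are the standard Unicode code points for these four letters.)

<syntaxhighlight lang="python">
# Cedilla-form substitutes -> correct comma-below Romanian letters.
CEDILLA_TO_COMMA = {
    "\u015E": "\u0218",  # Ş -> Ș
    "\u015F": "\u0219",  # ş -> ș
    "\u0162": "\u021A",  # Ţ -> Ț
    "\u0163": "\u021B",  # ţ -> ț
}

def fix_diacritics(text: str) -> str:
    """Return text with every cedilla form replaced by its comma-below letter."""
    return text.translate(str.maketrans(CEDILLA_TO_COMMA))
</syntaxhighlight>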

Links to relevant discussions (where appropriate): Wikipedia:Bot requests/Archive 52#Romanian_orthography

Edit period(s): One time run

Estimated number of pages affected: Hundreds of thousands, working from the Geography of Romania category.

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): Yes

Function details: From a database scan of pages with the affected characters in their titles, together with a list of pages in categories related to Romanian geography, move pages to their correct names if their titles contain incorrect characters. Then, replace the characters defined above with their correct letters in the Romanian alphabet and fix the double redirects created. The input pages will be strictly Romanian-related, since the current letters are correct in non-Romanian languages. The input pages won't just be taken from the base category (Geography of Romania) but from sub-categories recursed by AWB and manually checked.
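(A rough sketch of the move-and-fix-redirects step described above. It is written against the current pywikibot API rather than the 2013 pywikipedia scripts named in this request, and the edit summaries and the source of the title list are illustrative assumptions only.)

<syntaxhighlight lang="python">
import pywikibot

# Same cedilla -> comma-below mapping as in the overview sketch above.
FIX = str.maketrans("\u015E\u015F\u0162\u0163", "\u0218\u0219\u021A\u021B")

site = pywikibot.Site("en", "wikipedia")

def move_if_needed(title: str) -> None:
    """Move a page whose title uses the cedilla forms, then retarget the
    redirects that would otherwise become double redirects."""
    fixed = title.translate(FIX)
    if fixed == title:
        return
    page = pywikibot.Page(site, title)
    # Leave a redirect behind so existing links keep working until the
    # AWB pass rewrites them in article text.
    page.move(fixed, reason="Correct Romanian diacritics (comma below)")
    # Anything that redirected to the old title is now a double redirect;
    # point those redirects straight at the new title.
    for redirect in pywikibot.Page(site, title).backlinks(filter_redirects=True):
        redirect.text = "#REDIRECT [[%s]]" % fixed
        redirect.save(summary="Fix double redirect after diacritics move")
</syntaxhighlight>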

Discussion

  • Will it do any other kinds of gen-fixes when making these edits? Will most users be able to see the difference in the letters or will it be purely an underlying language code change? MBisanz talk 03:23, 31 January 2013 (UTC)
    • In some manual test edits ([1] for example) you can see that it does make a visible change. It is set up to also do genfixes (of course only if it matches the regex as well), but this can be disabled as you know. Since it's already editing the page, genfixes (which many of these pages need) are quite worthwhile in this case. Vacation9 03:42, 31 January 2013 (UTC)

I believe this must be supervised, not automatic. --MZMcBride (talk) 03:52, 31 January 2013 (UTC)

What exactly do you mean by supervised? If you mean checking the edits as they are made, this is completely possible. Vacation9 04:08, 31 January 2013 (UTC)
Yes, I mean checking the edits. "Supervised" instead of "automatic" (which is the equivalent of "unsupervised"). I think human review will be needed for each of these edits, as find and replace is notoriously tricky on any large body of work. The number of edge cases is simply overwhelming. --MZMcBride (talk) 18:54, 31 January 2013 (UTC)
That's completely possible, and I've noted the bot as Supervised instead of Automatic. Vacation9 22:28, 31 January 2013 (UTC)
  • Another problem has arisen: we have to move pages with incorrect characters in their titles as well. Thus, I came up with a three-step plan. First, a database scan for articles with incorrect characters in them, which outputs to a file. Then, after review of the articles, I mass-move the pages using Pywikipediabot or a similar framework. Finally, when doing the AWB scan, we can both correct the redlinks and correct everything else in the page. This will need some interesting code but I think it would be worth it. Thoughts? Vacation9 13:34, 31 January 2013 (UTC)
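(A sketch of what step one of that plan might look like, assuming the scan runs over a plain-text list of article titles such as the all-titles dump rather than a live database query; the file names are illustrative.)

<syntaxhighlight lang="python">
BAD_CHARS = set("\u015E\u015F\u0162\u0163")  # Ş, ş, Ţ, ţ

def scan_titles(all_titles_path: str, output_path: str) -> None:
    """Write every title containing one of the cedilla forms to output_path."""
    with open(all_titles_path, encoding="utf-8") as src, \
         open(output_path, "w", encoding="utf-8") as dst:
        for line in src:
            title = line.rstrip("\n")
            if any(ch in BAD_CHARS for ch in title):
                dst.write(title + "\n")

# Example (hypothetical file names):
# scan_titles("enwiki-all-titles-in-ns0", "titles_to_move.txt")
</syntaxhighlight>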
Approved for trial (50 edits). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Sure thing. Haha, it's what I signed up for. MBisanz talk 17:22, 9 February 2013 (UTC)
Alright, I've got the page mover set up (I just need to implement the WhatLinksHere checker, which will change the links coming from a double redirect). I can't test the AWB part until you or another admin adds VoxelBot to the AWB Bots approval list. Still trying to figure out the regular expressions here; for some reason, even if I select the "Skip external/internal links, etc" option in the RegEx menu, it still edits links, which is obviously not the behavior needed here. Any ideas? I saw somewhere about an option for this, but I've searched AWB far and wide and found nothing. Vacation9 13:41, 10 February 2013 (UTC)
I have added VoxelBot to the awblist here. What regular expression are you using and what do you desire it to do? :) ·Add§hore· Talk To Me! 13:44, 10 February 2013 (UTC)
It's just a simple find and replace; not even a regular expression, per se. It's replacing the incorrect characters with the correct ones. But of course this shouldn't apply inside links, as it would break them. Any ideas? I don't know why the ignore comments, links, etc. option is not working. Vacationnine Public 13:49, 10 February 2013 (UTC)
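(Not a fix for the AWB option itself, but a sketch of how the intended skip-links behaviour could be approximated in a separate Python pass: split out wikilinks and bare external URLs, and run the replacement only on the remaining prose. The patterns are deliberately rough and purely illustrative.)

<syntaxhighlight lang="python">
import re

# Cedilla -> comma-below mapping for Ş ş Ţ ţ.
FIX = str.maketrans("\u015E\u015F\u0162\u0163", "\u0218\u0219\u021A\u021B")

# Spans to leave untouched: [[wikilinks]] (including file and interwiki
# links) and bare external URLs. AWB's own "ignore links" logic is more
# thorough; this is only an approximation.
PROTECTED = re.compile(r"(\[\[[^\]]*\]\]|https?://\S+)")

def replace_outside_links(wikitext: str) -> str:
    parts = PROTECTED.split(wikitext)
    # re.split with one capturing group alternates prose / protected span,
    # so even indices are plain prose and odd indices are links to skip.
    return "".join(
        part if i % 2 else part.translate(FIX)
        for i, part in enumerate(parts)
    )
</syntaxhighlight>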