Wikipedia talk:Bots/Requests for approval: Difference between revisions


Revision as of 01:19, 24 October 2009

Is approval scheme working?

It was recently agreed on Wikipedia:Village pump that any large-scale automated or semi-automated article creation task should require BRFA. One concern was that it would be impossible to follow up; has this been the case? Take for instance Sasata's recent large creation of fungi articles, or Fergananim's very short articles on medieval Irish abbots, like Gillabhrenainn Ua hAnradhain. According to the new regulations these should both require approval, but I can't see that this has been done? Lampman (talk) 14:57, 28 September 2009 (UTC)

With the exception of the content creation bot BRfA which is open atm, there haven't been many (if any) recent BRfAs for page creation. But then, this proposal hasn't been widely "advertised", and I'm willing to bet that the users mentioned above aren't even aware of it. - Kingpin13 (talk) 15:08, 28 September 2009 (UTC)
No, this was an issue brought up at the discussion: that it would be just another layer of guidelines and regulations that nobody would care about or even know about. I should perhaps bring it up at the Village Pump to figure out how it can be better advertised and implemented, otherwise it's rather pointless to have such a rule at all. Lampman (talk) 15:22, 28 September 2009 (UTC)
Since the community has decided that mass content creation must be done by an authorized bot, it can be enforced as part of the existing "no unauthorized bots" rule. Although it would probably be better to just warn the user first for the moment. Anomie 20:31, 28 September 2009 (UTC)

Most likely approve

Why is it bot policy that a bot will most likely be approved after a community discussion? Isn't it that a decision to approve a trial will be made after discussion?

After a reasonable amount of time has passed for community input, an approvals group member will most likely approve a trial for your bot and move the request to this section.

What? --69.225.5.4 (talk) 18:23, 29 September 2009 (UTC)

The statement is accurate from a historical standpoint, mostly because people rarely ask for approval of a controversial task. --ThaddeusB (talk) 00:15, 30 September 2009 (UTC)
It's not a statement about the history of the policy; and the policy isn't just about the type of tasks, it's about the code, the feasibility, and what the community desires. There are plenty of bots that are not approved, so it's inaccurate.
To say the task "will most likely be approved" smacks of a lack of regard for community input.
BAG members should approve trials only if there is support for the bot and the bot is in accordance with policy.
I'm going to reword it, barring input from BAG saying that, indeed, they "most likely" approve trials for bots. --69.225.5.4 (talk) 18:09, 1 October 2009 (UTC)
Since August 1, there have been 4 bots denied, 3 requests expired, and 6 withdrawn by the operator. A good deal of the expired/withdrawn ones were approved for trial at the time of closing. In that time, there have been 33 approved bots. So "most likely" does seem accurate. Mr.Z-man 18:22, 1 October 2009 (UTC)
It should outline policy, not predict the results of a request. --69.225.5.4 (talk) 20:12, 1 October 2009 (UTC)

Hypotheticals.

So, suppose someone wanted to run a bot that did admin activities, but was not themselves an admin. (I can't program, so no, this isn't asking about me.) Is this allowed? If it's not been considered, should it be allowed? Also, what about someone who is an admin, runs a useful bot, and is desysopped? (Or, what in the case where a regular user who runs a bot is banned.) I'm just curious how such issues are approached, so I can better know policy. Irbisgreif (talk) 08:10, 15 October 2009 (UTC)

It's not allowed, as it would allow someone access to the admin tools who didn't go through RFA. If an admin is desysopped, their bots would be desysopped too; if a user is banned, their bots would be banned (and deflagged) too. If a user is just blocked for a short time (e.g. a 31-hour block), I don't know whether their bots would be blocked too or if they would be left alone as long as they weren't used for block evasion. A bot's activities (or the operator's activities relating to being a bot operator, e.g. communication) can also be brought to this page for review, which could result in the bot being deapproved. Anomie 11:15, 15 October 2009 (UTC)

Quick approvals without community consensus or discussion

There was a remark that this bot was approved without community consensus.[1] ("Bots seem to get approved based on a technical evaluation rather than on whether they conform to bot policy by only making edits that have consensus, as happened here.")

The bot was approved in two days with no input from anyone in the BRFA process other than the bot operator and the single BAG member who approved it. That member approved the bot to edit thousands of mainspace articles after examining some trial edits the bot had made before it was even approved for a trial, thereby also eliminating the opportunity for community input on the trial run.[2]

This bot only edits pages where the parameter is already blank, but a blank parameter does not show up in the article (| oclc = | dewey = | congress = ), and whether it should be filled in by a bot should be discussed with the wider community to gain consensus for the bot task.
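To make the blank-parameter condition concrete, here is a hypothetical sketch of what a bot like this might do (this is not CobraBot's actual code; the function name and regex are illustrative assumptions): it fills in the oclc parameter only when the parameter is present in the template but empty, and leaves already-filled parameters alone.

```python
import re

# Matches "| oclc =" when nothing but whitespace follows before the next
# pipe or closing brace, i.e. the parameter exists but is blank.
BLANK_OCLC = re.compile(r"(\|\s*oclc\s*=)\s*(?=\||\})")

def fill_blank_oclc(wikitext: str, oclc: str) -> str:
    """Fill the oclc parameter only when it is present but blank."""
    return BLANK_OCLC.sub(lambda m: f"{m.group(1)} {oclc} ", wikitext, count=1)
```

Under this sketch, a page containing `| oclc = |` would be edited, while a page with `| oclc = 111 |` or no oclc parameter at all would be left untouched, which matches the behavior described above.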

I would like to see some time pass before bots that impact article space widely are approved.

Bot policy's requirement that a bot "performs only tasks for which there is consensus" means that bots should not be approved for mainspace tasks without community input. One BAG member does not constitute community input, imo.

Link to a discussion calling OCLC linking a linkspam controversy.

[3]

[4] This link is not about CobraBot. I include it because a quick search shows that OCLCs are something that generates a lot of discussion on en.wiki. This discussion mentions, for instance, that consensus shows "OCLCs are considered superfluous when ISBNs are present." This discussion shows that, contrary to being approved, the CobraBot task maybe should have been denied, as there might not be community consensus for the task at all. Consensus is required by bot policy. None was asked for in this approval. No time for community input was allowed before approval. A prior bot was stopped from doing this task by the community. Maybe this bot task should not have been approved against community consensus.

--69.225.5.183 (talk) 07:33, 18 October 2009 (UTC)[reply]

I assume that Kingpin was working under the assumption that this was an uncontroversial task, relatively easily undone, that wouldn't be irritating to the community. The question of gaining community consensus for a bot task is tricky - do we need to actively seek it out for every single task? If it affects a specific set of articles, it's a good idea to go to the relevant Wikiprojects to ask for input first if it's going to cause a major change. But if we went and asked for every single little cleanup bot then we'd get no input at all - the community is strangely uninterested in bot operations and would quickly tire of our requests. So, not to dwell on the specific case as you request, the general case is that many tasks will, in practice, be approved when there is no consensus against the task rather than a positive consensus in its favour. Fritzpoll (talk) 09:27, 20 October 2009 (UTC)
Unfortunately, in this specific case the task is a controversial one. If the time on the BRFA board had been longer than two days, I might have been able to point this out, because I've seen discussions on wikipedia about OCLC.
So, if the task is going to impact article space, impact a lot of articles, and add something new, this was not the time, imo, to make a quick solo decision that the task was not controversial.
Just because a parameter is available in an article infobox doesn't mean adding it via a bot is non-controversial. The organism article boxes have dozens of parameters that could be filled in by a bot; if you started filling all of them, or even some of the hierarchies in all organisms, the bot would be blocked immediately by one of the biology editors with admin privileges.
Yes, there is a lot of community disinterest in bots. But I don't think this was a good situation for assuming that the bot would be uncontroversial and could be quickly approved: a search by the BAG member in non-mainspace would have revealed discussions about the issue, the bot is adding information to an infobox, the bot is working in article space, and the bot is impacting thousands of articles.
In addition, the bot owner was notified that his task was controversial, and should have stopped the bot at that point and revisited its operating parameters, since the flag was given without any community input other than rapid approval by a single BAG member.
I think a default position, when dealing with editing thousands of mainspace articles, that no word against something in less than two days amounts to tacit approval is a poor operating practice in a community that works on consensus. --69.225.5.183 (talk) 16:05, 20 October 2009 (UTC)
Well, I'm going to follow your initial suggestion that this was not specifically about CobraBot and not comment on it, since I have no insider knowledge about what Kingpin's reasoning was. As a BAG member myself, I'm also not going to revisit it and try to second-guess what I would have done. The question of community consensus is a tricky one in all walks of Wikipedia - what constitutes a consensus, etc. That clause exists not to force us to race out and gather consensus, but to avoid approving things where there is doubt. Sometimes that goes wrong, as may have happened here: perhaps we need to be more thorough and that's certainly a comment I'll take on board. I'm looking at the dates in this particular case, however, and not seeing how we'd have found the main discussion, which occurred after approval. Will re-examine your links to check what I've probably missed. Fritzpoll (talk) 16:40, 20 October 2009 (UTC)
I'd have to say there was little to indicate that this task would be controversial: The only discussion linked above that dates to before the BRFA was Template talk:Citation/Archive 3#Why OCLC?, which didn't actually generate a lot of discussion and was located on an unrelated template's talk page. The documentation for the template in question does mention "use OCLC when the book has no ISBN", but I missed it at first when checking just now because I paged straight to the documentation for the oclc parameter itself rather than reading through the introductory text. And Cybercobra did stop the bot once it became clear that the task was in fact controversial (although it does seem most of the controversy is due to just one very vocal editor on a crusade).
Sure, this might have been avoided by insisting on arbitrary waiting periods before bot approval and other bureaucracy. But is more bureaucracy really either necessary or desired by the community as a whole? Anomie 17:08, 20 October 2009 (UTC)
Exactly - a system like this will always allow some number of mistakes to occur - and all that tightening the rules too strongly will do is to make the 99.99% of never-controversial bots that BAG handle much slower to process. Fritzpoll (talk) 17:12, 20 October 2009 (UTC)
If anything is to come of this, I think it should be that unless the task has positive consensus, the bot operator should be willing to stop it at the first sign of opposition and take part in the discussion. Too often it seems that operators refuse to stop their bot until someone really starts screaming. Franamax (talk) 17:56, 20 October 2009 (UTC)
Whatever lack of indicators there were beforehand about the controversial nature of this task, I think it should be considered that the task added a lot of edits to article space, added a visible parameter to articles, and did so without community input. What's important to me is that a bot not be approved so quickly when it is editing main space, and when it is editing it in a way that human editors haven't (it's not changing a template that has been deprecated, for example, but adding something that meat editors didn't put in the articles).
In this instance, and in future instances, where the bot is adding something to articles that will appear in mainspace, adding it to a lot of articles, and, most particularly, when the addition is a link to an external site, the task should not be considered non-controversial as a default. --69.225.5.183 (talk) 02:57, 21 October 2009 (UTC)
That (like 1RR) is probably a good rule to follow in general, although it needs to be balanced against the ignorance and pointiness factors. Anomie 18:35, 20 October 2009 (UTC)


1RR is a good rule to follow with bots.

Adding thousands of links without community input is a major concern. In the case of mainspace edits that add to or change thousands of articles, I would like to see community input at least given a chance in the future, and anything that makes this explicit to BAG members would be a way of addressing the situation.

At this point, however, I would also like community input about rolling back the bot edits, since they were made without community input, and they link externally. This should not have been done without major community input. And, in future cases of a bot editing mainspace to add external links, I think the default should be to not do so if the community has not positively spoken for adding the link. --69.225.5.183 (talk) 02:57, 21 October 2009 (UTC)

I actually meant 1RR is generally a good rule for human editors to follow. Most bots should be following something akin to 0RR, although there are a number of exceptions (e.g. bots implementing CFD decisions). As for CobraBot supposedly adding external links, you're making the same mistaken assumption User:Gavin.collins insisted on repeatedly making. Anomie 03:03, 21 October 2009 (UTC)
Oh, on the 1RR. I checked, and there was no link before the bot added the OCLC, and there is one afterwards. So the bot added a link - through a template, I suppose, but they are still links. If the links aren't there, maybe you could link to a CobraBot edit where a link is present both before and after, or absent both before and after. --69.225.5.183 (talk) 03:51, 21 October 2009 (UTC)
The bot added the oclc parameter to the template; what the template decides to do with it is beside the point. As mentioned elsewhere, if the community decides that external links to WorldCat are inappropriate the template would be changed with no alteration to what the bot did. Anomie 03:53, 21 October 2009 (UTC)
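The division of labor described here can be sketched as a hypothetical rendering function (illustrative only; this is not the actual Citation template's logic, and the function name and flag are assumptions): the bot supplies only a parameter value, and the template alone decides whether that value becomes an external link.

```python
# Hypothetical sketch: the bot writes only a parameter value into the page;
# whether that value is rendered as an external link is decided entirely by
# the template. Changing one flag here (i.e. editing the template) would
# remove every WorldCat link at once, without touching any bot edit.
def render_oclc(params: dict, link_to_worldcat: bool = True) -> str:
    oclc = params.get("oclc", "").strip()
    if not oclc:
        return ""  # blank parameter: nothing is shown in the article
    if link_to_worldcat:
        return f"[https://www.worldcat.org/oclc/{oclc} OCLC {oclc}]"
    return f"OCLC {oclc}"  # template changed: same bot edit, no external link
```

On this view, a rollback of the bot's edits and a change to the template's rendering are two very different remedies for the same visible links.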
It's fine to put functionality in a template that may or may not be used at the time the template is created. Again, this is the case with taxoboxes, for example. There are multiple parameters that are not necessarily used. It's the community, or the individual editors, that decides to use that parameter in the template. In this case, because the community was not consulted, the decision to activate all these links was made without a single mention of it in the BRFA or anywhere. Neither the bot creator nor Kingpin mentioned that this bot would be creating active links where none existed before by editing this parameter. So, it's another point, I guess, for bot instructions: when the edit being made will create links, this should be explicitly stated in the BRFA. And it's another good reason for requiring proactive community support rather than just a lack of disapproval for that particular bot. --69.225.5.183 (talk) 04:20, 21 October 2009 (UTC)
Nearly every bot contributes to thousands of articles. What is or is not trivial/non-controversial is always a judgment call, and therefore impossible to ensure 100% accuracy. In this case there was no reason to believe filling in an existing template parameter would be controversial. (And in all reality most controversial requests are unlikely to draw objections until they go live, as 99.9% of Wikipedia pays zero attention to BRFAs.) --ThaddeusB (talk) 03:55, 21 October 2009 (UTC)
Yes, we've kinda moved on to how to handle it in the future though. --69.225.5.183 (talk) 04:20, 21 October 2009 (UTC)
Even if the application is filed and quickly approved, bots still have more oversight than most users. They have a bot owner who (almost always) takes responsibility for errors, and while many edits may be made, they can be very easily stopped with no repercussions (through blocking or shutoff buttons). More analysis on "pre-editing" in the human wiki world would lead to claims of WP:CREEP. tedder (talk) 04:17, 21 October 2009 (UTC)

I think that "say no to linkspam" says it all, no matter what the age. There was no consensus to actively link to this site, the bot move forward without gaining any community consensus, making en.wiki the "feeder site" to thousands of links to worldcat. The community should decide whether or not the infoboxes provide links to this particular website, not BAG, particularly since BAG's fallback is to generally approve a trial, then approve the bot for flagging based only on technical issues.

BAG itself seems to indicate there is no design for community input: a trial is "most likely" approved, without regard to community input, then the bot is approved solely on technical issues. Linking thousands of wikipedia pages to worldcat required community consensus, not rapid approval. If this is done here it could be an easy avenue for vandalism. --69.226.111.130 (talk) 21:05, 23 October 2009 (UTC)

I'm afraid I'm in the camp that says that the bot is not adding the link to WorldCat - the template is. If there's no consensus to link to WorldCat, then the template needs to be changed - if Cobra went through and did it all manually, it would still be overlinked. The bot may have highlighted a controversial template setting, but it didn't create it. Fritzpoll (talk) 21:39, 23 October 2009 (UTC)
The bot used a functionality of the template without community input. As I said above, there are dozens of fields in wikipedia taxoboxes. If you assign a bot to fill in every one of them, the bot will simply be blocked by a responsible admin. Even if you approved a bot to fill in only a specific field on every single taxobox the bot would be blocked unless it went forward with community consensus.
The spirit of the policy for community consensus at wikipedia is that the community gets some say in the matter. You can find all the technicalities you like ("the field exists, therefore it should be filled in"), but they don't amount to community consensus. BAG approved a bot to do a task without gaining community consensus.
It's for the community to decide whether or not that field should be filled in on every article, not for an individual BAG member. I've already linked to a discussion showing there appears to be no consensus for that field to be filled in. Bots policy doesn't say "if there's no consensus against something a bot can do it."
What it says is, "In order for a bot to be approved, its operator should demonstrate that it: performs only tasks for which there is consensus." So, let's run with bots policy. There is no link to community consensus for this task, because there was no attempt to establish community consensus for the task.
Other templates have blank fields. If there is consensus by the community to have a bot fill it in, fine. But there isn't. So, rollback of an unapproved bot task is a simple and straightforward means to take care of the issue. --69.226.111.130 (talk) 01:19, 24 October 2009 (UTC)

Another speedy approval, and speedy approvals and community input in general

Can we give more than 3 minutes for interested users to examine trial runs? [5] There seem to be many excuses for why community consensus is not needed, not given, or not allowed time. In this particular bot case, the task is straightforward, the bot owner responsible and responsive, it deals with deprecated code, etc., etc. But sometimes I want to examine the trial runs after they have been run, but before the final approval, to see if there are problems that showed up during the trial. A good reason for doing trials in the first place is to examine the results.

3 minutes is not enough time, and I don't see the urgency in approving this bot in 3 minutes. A couple of days for interested users to examine the trial run is not unreasonable, imo, no matter what the task.

One reason for instruction creep, by the way, is that editors seem to other editors to be overlooking common courtesies and common sense. I don't see why the instructions should say wait 2 days or wait more than 3 minutes, except that it is apparently not obvious that waiting more than 3 minutes gives time for community input.

There was no urgency in approving this bot, so allowing more than 3 minutes for the trial run to be examined by interested parties would have been a simple courtesy. --IP69.226.103.13 (talk) 20:58, 22 October 2009 (UTC)

The trial is a technical check. Mr Z-Man allowed time prior to the trial for community input on the task itself. Once the trial was proven to be technically functional - which is the reason BAG members are chosen, because of technical proficiency - there was no reason for further delay. Fritzpoll (talk) 23:04, 22 October 2009 (UTC)
I would also note that in this case, there was also a bot request open for several days. (Oddly enough, WP:BOTREQ tends to get significantly higher traffic than BRFA). Mr.Z-man 23:36, 22 October 2009 (UTC)
Lots of people want things done for them, but few probably care about unrelated tasks that aren't theirs; there's also probably a discouragement factor for non-programmers, who might be uncertain if their non-technical input is wanted/needed/helpful/relevant (Answer: Yes, it is!). --Cybercobra (talk) 00:39, 23 October 2009 (UTC)

So, it boils down to: after community input, a BAG member "will most likely approve a trial for your bot" (without any reference to the community input), then, based entirely on technical functionality, the bot will be quickly approved after the trial. The bot owner is solely responsible for the actions of the bot.

So, BAG does nothing but allow for a community input board, then fast-forward bots to be flagged by bureaucrats, or whoever flags bots... Interesting.

I will then move forward with this understanding of BAG's role on en.wiki. --69.226.111.130 (talk) 20:49, 23 October 2009 (UTC)

If you do so, you will be moving forward with an incorrect understanding. BAG checks if a task seems to have sufficient consensus for how controversial it seems to be (and yes, sometimes something turns out to be controversial that didn't seem like it would be); at times, and increasingly often lately, BAG will insist that wider community consensus must be sought before the request may proceed. BAG also considers the technical aspects of the proposed task; among other things, this includes reviewing the code (when available) for errors, making suggestions on how to do things more efficiently, and pointing out situations to watch out for. If both of those seem good, a BAG member will approve a trial. It turns out that this has historically happened more often than not, hence the wording "will most likely approve a trial for your bot" that you continue to insist on misinterpreting. If the trial is completed without issues and it seems unlikely that there will be further community reaction as a result of the trial edits, BAG approves the bot. Then a bureaucrat comes along, checks the BRFA again for consensus and such, and actually grants the bot flag. Anomie 21:05, 23 October 2009 (UTC)
"Insist on misinterpreting" because the history isn't included in the policy? How should someone interpret "most likely" to be approved without knowing the history? It's the bots policy, not a predictor of the outcome of a request, and it should be written clearly and cleanly as a policy that invites both the casual and familiar editor of wikipedia to learn about how bots work on wikipedia. But if you're going to insist upon describing the history of occurrences here instead of giving a clean and clear statement of policy you're going to wind up with people misinterpreting what you mean.
It also doesn't seem that BAG's ability to predict community response is particularly good, since recent incidents of community consensus were wrong, and, again, there's no point in a group wherein individuals are expected to predict the community response. The community can simply give their response. That makes it easy on everyone. BAG members aren't expected to be mind readers. When they're unwilling to do a search and find out the community consensus, they can just wait a reasonable amount of time. Again, community inclusiveness would rule decision making rather than mind-reading skills or some other BAG ability to know what the community wants without asking. There's no other place on wikipedia where editors are expected to individually gauge the community consensus without input from the community.
Asking for sufficient time greater than 3 minutes to be able to look over a trial and input a response is not unreasonable.
I'm not insisting on misinterpreting, I'm simply not making the effort to read the entire history of bag to learn what a single paragraph actually means, when it's meaning is not obvious. Policies in a dynamic community should be written in a way that allows someone to gather the correct policy, not the history of the policy, from reading the page.
Why not be courteous to the spirit of the community of wikipedia as it actually is: people come and go, and the policy could be written out for interested editors who drop by and don't know and don't want to read the history. Then it's clear, also, to others how it is intended. That I'm misinterpreting is also not so obvious without reading the history, by the way. That's what your policy says. --69.226.111.130 (talk) 01:09, 24 October 2009 (UTC)[reply]