Wikipedia:Bots/Requests for approval/CwraschkeDataBot
- The following discussion is an archived debate. Please do not modify it. Subsequent comments should be made in a new section. The result of the discussion was Approved.
Automatic or Manually Assisted: Supervised. Also, there will be NO EDITS. This request is for a read-only Bot.
Programming Language(s): Python
Function Summary: Collect data on interwiki links to construct a network. This is done for an end-of-semester research project.
Edit period(s) (e.g. Continuous, daily, one time run): One time run
Already has a bot flag (Y/N): NA
Function Details: This bot will extract all interwiki links contained within the articles of a category (e.g. Category:Southern Association of Colleges and Schools) to construct a snapshot of a network of interwiki links for a research project. Doing this job manually is very tedious, so I am requesting approval for this bot to do the task for me. This bot is read-only!
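For illustration only, here is a minimal read-only sketch of the task described above, written against the standard MediaWiki query API using the Python requests library. This is not the bot's actual code; the helper names are placeholders.

<syntaxhighlight lang="python">
import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "CwraschkeDataBot/0.1 (read-only research bot)"}


def api_query(params):
    """Run one action=query request and follow API continuation."""
    params = dict(params, action="query", format="json")
    while True:
        data = requests.get(API, params=params, headers=HEADERS).json()
        yield data
        if "continue" not in data:
            break
        params.update(data["continue"])


def category_members(category):
    """Yield the titles of the articles (namespace 0) in a category."""
    for data in api_query({"list": "categorymembers", "cmtitle": category,
                           "cmnamespace": 0, "cmlimit": "max"}):
        for page in data["query"]["categorymembers"]:
            yield page["title"]


def interwiki_links(title):
    """Yield (prefix, target) pairs for the interwiki links in one article."""
    for data in api_query({"prop": "iwlinks", "titles": title, "iwlimit": "max"}):
        for page in data["query"]["pages"].values():
            for link in page.get("iwlinks", []):
                yield link["prefix"], link["*"]


# Build an edge list for the network snapshot: (article, prefix, target).
edges = [(title, prefix, target)
         for title in category_members("Category:Southern Association of Colleges and Schools")
         for prefix, target in interwiki_links(title)]
</syntaxhighlight>

Both list=categorymembers and prop=iwlinks are read-only query modules, so nothing in a sketch like this can make edits.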
Discussion
How many requests is the bot going to be making? Does it need apihighlimits? BJTalk 20:28, 13 November 2008 (UTC)
- I plan to use information from about 2,000 articles. While high limits are nice, I would be able to time the requests to whatever limit you allow me. What is the standard limit you place on requests? I tried researching exactly what apihighlimits or the lack thereof implies, but obtained conflicting information. Could you clarify? --Cwraschke (talk) 23:27, 19 November 2008 (UTC)
- Approved. Read only, decent task. Please honor maxlag=5 (read about it here). --uǝʌǝsʎʇɹoɟʇs(st47) 23:23, 24 November 2008 (UTC)
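For reference, one way a read-only client can honour the maxlag=5 condition is to send maxlag=5 with every request and back off whenever the API reports server lag. A minimal sketch, again assuming the Python requests library (the polite_get() helper is hypothetical, not part of the bot's code):

<syntaxhighlight lang="python">
import time
import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "CwraschkeDataBot/0.1 (read-only research bot)"}


def polite_get(params, max_retries=5):
    """GET from the API with maxlag=5, pausing and retrying when lagged."""
    params = dict(params, maxlag=5, format="json")
    for _ in range(max_retries):
        resp = requests.get(API, params=params, headers=HEADERS)
        data = resp.json()
        if data.get("error", {}).get("code") == "maxlag":
            # Servers are lagged: wait (honouring Retry-After if sent) and retry.
            time.sleep(int(resp.headers.get("Retry-After", 5)))
            continue
        return data
    raise RuntimeError("API still lagged after %d attempts" % max_retries)
</syntaxhighlight>

Requests rejected for lag are cheap for the servers, so waiting out the suggested Retry-After interval before retrying keeps the bot within the condition above.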
- The above discussion is preserved as an archive of the debate. Please do not modify it. Subsequent comments should be made in a new section.