Wikipedia:Bots/Requests for approval/hypejar bot
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Denied.
Operator: hypejar (talk · contribs)
Time filed: 15:02, Tuesday December 20, 2011 (UTC)
Automatic or Manual: initially manual
Programming language(s): Python
Source code available: standard pywikipedia
Function overview: loading articles from the Upcoming Products category to be displayed on our soon-to-launch website.
Links to relevant discussions (where appropriate):
Edit period(s): none
Estimated number of pages affected: none
Exclusion compliant (Y/N):
Already has a bot flag (Y/N):
Function details: Our site would like to display information about upcoming products by showing each product's corresponding wikipedia article. We would not be editing any pages; instead (similar to the facebook community pages bot) we would store the necessary wikipedia articles and update this information weekly (at first)
Discussion
- If you are not going to edit the encyclopedia, you do not need the bot flag to access the API. — HELLKNOWZ ▎TALK 19:36, 27 December 2011 (UTC)[reply]
- So speedy fail then? Rcsprinter (converse) 12:41, 1 January 2012 (UTC)[reply]
- Well, there are a few cases where a bot flag may be needed for highapilimits and the account can be indeffed to be "read only".
- {{OperatorAssistanceNeeded}} — HELLKNOWZ ▎TALK 13:03, 1 January 2012 (UTC)[reply]
- Yes we were hoping to have the bot flag in order to query over 50 pages per second --Hypejar (talk) 21:02, 1 January 2012 (UTC)[reply]
- Why would you need to query so many pages individually? The API supports requests for multiple pages at once. — HELLKNOWZ ▎TALK 21:05, 1 January 2012 (UTC)[reply]
- We plan on storing wikipedia articles on our server to give descriptions of certain products. Periodically we would therefore have to check the original wikipedia article to see if it has changed (very similar to the facebook community pages, for which there is a bot). We want to query just the revision number of as many pages as possible to compare with the revision on our server, in order to know which articles to update. If there is a better way of doing this, please let me know. --Hypejar (talk) 20:01, 2 January 2012 (UTC)[reply]
- You can use the API to get multiple pages/revisions, e.g. (see the documentation for more details); I'm not sure what the limit is per request (I would expect it to be about 50 pages, or possibly 500). You might find it easier to send a POST request, so that you can fit in all of the titles without creating an excessively long URL. It may be worth getting a database dump and just using that (depending on how up to date you need the data). Really, it depends on how many pages you need to retrieve. It may be helpful to email wikitech-l@lists.wikimedia.org --Chris 14:13, 3 January 2012 (UTC)[reply]
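The workflow discussed above — asking the API for just the latest revision IDs of many titles in one POST request, then comparing them against locally stored IDs — can be sketched roughly as follows. This is not the operator's actual code; it assumes only the Python standard library, and the function names (`build_revid_query`, `stale_titles`, `fetch`) are illustrative:

```python
import json
import urllib.parse
import urllib.request

API_URL = "https://en.wikipedia.org/w/api.php"

def build_revid_query(titles):
    """Build API parameters requesting only the latest revision ID
    for a batch of titles (up to 50 per request for normal accounts,
    500 with the highapilimits right)."""
    return {
        "action": "query",
        "prop": "revisions",
        "rvprop": "ids",
        "titles": "|".join(titles),
        "format": "json",
    }

def fetch(params):
    """Send the query as a POST request, as suggested above, so a
    long list of titles does not overflow the URL."""
    data = urllib.parse.urlencode(params).encode()
    with urllib.request.urlopen(API_URL, data=data) as resp:
        return json.load(resp)

def stale_titles(local_revids, remote_pages):
    """Compare stored revision IDs against the page entries from the
    API response (response["query"]["pages"].values()) and return the
    titles whose articles have changed since they were cached."""
    stale = []
    for page in remote_pages:
        title = page["title"]
        latest = page["revisions"][0]["revid"]
        if local_revids.get(title) != latest:
            stale.append(title)
    return stale
```

Only the titles returned by `stale_titles` would then need their full article text re-fetched, which keeps the weekly update far below the per-page query volume mentioned earlier.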
Denied. As per above, does not need approval. If you have any more questions about the API/technical implementation, feel free to drop a line on my talkpage or the Village Pump --Chris 12:19, 6 January 2012 (UTC)[reply]
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.