|Slogan||ScraperWiki is a platform for doing data science on the web|
|License||Affero General Public License|
|Funding||Sponsored by 4iP|
|Alexa rank||133,089 (April 2014)|
ScraperWiki is a web-based platform for collaboratively building programs to extract and analyze public (online) data, in a wiki-like fashion. "Scraper" refers to screen scrapers, programs that extract data from websites. "Wiki" means that any user with programming experience can create or edit such programs to extract new data, or to analyze existing datasets. The site's main use is to provide a place where programmers and journalists collaborate on analyzing public data.
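As a minimal illustration of what a screen scraper does (not ScraperWiki's actual API), the following sketch uses Python's standard-library `html.parser` to pull table cells out of an HTML page; the sample HTML and its values are hypothetical:

```python
# A minimal screen-scraper sketch: extract table cells from HTML.
# The HTML snippet below is a hypothetical example, not ScraperWiki data.
from html.parser import HTMLParser

SAMPLE_HTML = """
<table>
  <tr><td>Liverpool</td><td>552,267</td></tr>
  <tr><td>Manchester</td><td>547,627</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collects the text of each <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []       # completed rows of cell text
        self._row = None     # cells of the row currently being parsed
        self._in_td = False  # whether we are inside a <td> element

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and self._row is not None:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE_HTML)
print(scraper.rows)  # [['Liverpool', '552,267'], ['Manchester', '547,627']]
```

A real scraper would fetch the HTML over the network and store the extracted rows in a dataset for later analysis, which is the workflow ScraperWiki hosted.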
ScraperWiki was founded in 2009 by Julian Todd and Aidan McGuire. It was initially funded by 4iP, the venture capital arm of TV station Channel 4, and has since attracted a further £1 million round of funding from Enterprise Ventures.
- "ScraperWiki Terms and Conditions".
- Jamie Arnold (2009-12-01). "4iP invests in ScraperWiki". 4iP.
- "Scraperwiki.com Site Info". Alexa Internet. Retrieved 2014-04-01.
- Cian Ginty (2010-11-19). "Hacks and hackers unite to get solid stories from difficult data". The Irish Times.
- Paul Bradshaw (2010-07-07). "An introduction to data scraping with Scraperwiki". Online Journalism Blog.
- Charles Arthur (2010-11-22). "Analysing data is the future for journalists, says Tim Berners-Lee". The Guardian.
- Deirdre McArdle (2010-11-19). "In The Papers 19 November". ENN.
- "Journalists and developers join forces for Lichfield ‘hack day’". The Lichfield Blog. 2010-11-15.
- Alison Spillane (2010-11-17). "Online tool helps to create greater public data transparency". Politico.
- From CMS to DMS: C is for Content, D is for Data from the ScraperWiki Data Blog