QuickCode

QuickCode (formerly ScraperWiki) was a web-based platform for collaboratively building programs to extract and analyze public (online) data, in a wiki-like fashion. "Scraper" refers to screen scrapers, programs that extract data from websites. "Wiki" means that any user with programming experience could create or edit such programs for extracting new data, or for analyzing existing datasets. The main use of the website was to provide a place for programmers and journalists to collaborate on analyzing public data. The service was renamed circa 2016, as "it isn't a wiki or just for scraping any more". At the same time, the eponymous parent company was renamed 'The Sensible Code Company'.


Scrapers

Scrapers were created using a browser-based IDE or by connecting via SSH to a server running Linux. They could be written in a variety of programming languages, including Perl, Python, Ruby, JavaScript and R.
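
The following is a minimal, self-contained sketch of the kind of screen scraper such a platform hosted, written in plain Python using only the standard library. It does not use ScraperWiki's own helper API; the target URL, the heading tag it looks for, and the local SQLite table are assumptions chosen purely for illustration.

import sqlite3
import urllib.request
from html.parser import HTMLParser


class HeadingScraper(HTMLParser):
    """Collect the text of every <h1> heading on a page."""

    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_heading = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading and data.strip():
            self.headings.append(data.strip())


def main():
    # Hypothetical target page; any publicly reachable HTML page would do.
    url = "https://example.com/"
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")

    parser = HeadingScraper()
    parser.feed(html)

    # Save the scraped rows to a local SQLite table; this stands in for the
    # hosted, SQLite-backed datastore that the platform provided to scrapers.
    connection = sqlite3.connect("scrape.db")
    connection.execute("CREATE TABLE IF NOT EXISTS headings (title TEXT)")
    connection.executemany(
        "INSERT INTO headings (title) VALUES (?)",
        [(h,) for h in parser.headings],
    )
    connection.commit()
    connection.close()


if __name__ == "__main__":
    main()

A local SQLite file is used here only for simplicity; on the platform itself, scrapers wrote their results to a shared datastore where other users could query and reuse them.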


History

ScraperWiki was founded in 2009 by Julian Todd and Aidan McGuire. It was initially funded by 4iP, the venture capital arm of TV station Channel 4. Since then, it has attracted an additional £1 million round of funding from Enterprise Ventures. Aidan McGuire is the chief executive officer of The Sensible Code Company.


See also

* Data-driven journalism
* Web scraping


References


External links

* GitHub repository of custard
Categories: Collaborative projects · Wikis · Social information processing · Web analytics · Mashup (web application hybrid) · Web scraping · Software using the GNU AGPL license