Crowdsourcing is a sourcing model in which individuals or organizations obtain goods and services, including ideas and finances, from a large, relatively open, and often rapidly evolving group of internet users; it divides work between participants to achieve a cumulative result. The word crowdsourcing itself is a portmanteau of crowd and outsourcing, and was coined in 2005. As a mode of sourcing, crowdsourcing existed prior to the digital age (i.e., "offline"). Major differences between crowdsourcing and outsourcing include that crowdsourcing comes from a less-specific, more public group (whereas outsourcing is commissioned from a specific, named group) and that crowdsourcing includes a mix of bottom-up and top-down processes. Advantages of using crowdsourcing may include improved costs, speed, quality, flexibility, scalability, or diversity.
Some forms of crowdsourcing, such as "idea competitions" or "innovation contests", provide ways for organizations to learn beyond the "base of minds" provided by their employees (e.g. LEGO Ideas). Tedious "microtasks" performed in parallel by large, paid crowds (e.g. Amazon Mechanical Turk) are another form of crowdsourcing. It has also been used by not-for-profit organisations and to create common goods (e.g. Wikipedia). The effect of user communication and the platform presentation should be taken into account when evaluating the performance of ideas in crowdsourcing contexts.
The term "crowdsourcing" was coined in 2005 by Jeff Howe and Mark
Robinson, editors at Wired, to describe how businesses were using the
Internet to "outsource work to the crowd", which quickly led to the
portmanteau "crowdsourcing." Howe first published a definition for the
term crowdsourcing in a companion blog post to his June 2006 Wired
article, "The Rise of Crowdsourcing", which came out in print just
"Simply defined, crowdsourcing represents the act of a company or
institution taking a function once performed by employees and
outsourcing it to an undefined (and generally large) network of people
in the form of an open call. This can take the form of peer-production
(when the job is performed collaboratively), but is also often
undertaken by sole individuals. The crucial prerequisite is the use of
the open call format and the large network of potential laborers."
In a February 1, 2008, article, Daren C. Brabham, "the first [person] to publish scholarly research using the word crowdsourcing" and author of the 2013 book Crowdsourcing, defined it as an "online, distributed problem-solving and production model." Kristen L. Guth and Brabham found that the performance of ideas offered in crowdsourcing platforms is affected not only by their quality, but also by the communication among users about the ideas, and by their presentation in the platform.
After studying more than 40 definitions of crowdsourcing in the
scientific and popular literature, Enrique Estellés-Arolas and
Fernando González Ladrón-de-Guevara, researchers at the Technical
University of Valencia, developed a new integrating definition:
Crowdsourcing is a type of participative online activity in which an
individual, an institution, a nonprofit organization, or company
proposes to a group of individuals of varying knowledge,
heterogeneity, and number, via a flexible open call, the voluntary
undertaking of a task. The undertaking of the task, of variable complexity and modularity, and in which the crowd should participate by bringing their work, money, knowledge and/or experience, always entails mutual benefit. The user will receive the satisfaction of a given type of need, be it economic, social recognition, self-esteem, or the development of individual skills, while the crowdsourcer will obtain and use to their advantage that which the user has brought to the venture, whose form will depend on the type of activity undertaken.
As noted in the definitions of Brabham and of Estellés-Arolas and Ladrón-de-Guevara above, crowdsourcing in the modern conception is an IT-mediated phenomenon, meaning that a form of IT is always used to create and access crowds of people. In this respect, crowdsourcing has been considered to encompass three separate but stable techniques: competition crowdsourcing, virtual labor market crowdsourcing, and open collaboration crowdsourcing.
Henk van Ess, a college lecturer in online communications, emphasizes
the need to "give back" the crowdsourced results to the public on
ethical grounds. His nonscientific, noncommercial definition is widely
cited in the popular press:
"Crowdsourcing is channeling the experts' desire to solve a problem and then freely sharing the answer with everyone."
Despite the multiplicity of definitions for crowdsourcing, one constant has been the broadcasting of problems to the public, and an open call for contributions to help solve the problem. Members of the public submit solutions that are then owned by the entity that originally broadcast the problem. In some cases, the contributor of the solution is compensated monetarily with prizes or with recognition. In other cases, the only rewards may be kudos or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time or from experts or small businesses that were previously unknown to the initiating organization.
Another consequence of the multiple definitions is the controversy surrounding what kinds of activities may be considered crowdsourcing.
Historical examples
While the term "crowdsourcing" was popularized online to describe Internet-based activities, some historical examples of projects can, in retrospect, be described as crowdsourcing.
Timeline of major events
1714 – The Longitude Prize: When the British government was trying to find a way to measure a ship's longitudinal position, it offered the public a monetary prize to whoever came up with the best solution.
1783 – King Louis XVI offered an award to the person who could 'make the alkali' by decomposing sea salt by the 'simplest and most economic method.'
1848 – Matthew Fontaine Maury distributed 5000 copies of his Wind and Current Charts free of charge on the condition that sailors returned a standardized log of their voyage to the U.S. Naval Observatory. By 1861, he had distributed 200,000 copies free of charge, on the same conditions.
1884 – Publication of the Oxford English Dictionary: 800 volunteers catalogued words to create the first fascicle of the OED.
1916 – Planters Peanuts contest: The Mr. Peanut logo was designed by a 14-year-old boy who won the Planters Peanuts logo contest.
1957 – Jørn Utzon named winner of the design competition for the Sydney Opera House.
1970 – French amateur photo contest 'C'était Paris en 1970' ('This Was Paris in 1970'), sponsored by the city of Paris, France-Inter radio, and the Fnac: 14,000 photographers produced 70,000 black-and-white prints and 30,000 color slides of the French capital to document the architectural changes of Paris. Photographs were donated to the Bibliothèque historique de la ville de Paris.
1996 – The Hollywood Stock Exchange was founded: Allowed for the buying and selling of virtual shares in films and celebrities.
1997 – British rock band
Marillion raised $60,000 from their fans to
help finance their U.S. tour.
1999 – SETI@home was launched by the University of California, Berkeley: Volunteers can contribute to searching for signals that might come from extraterrestrial intelligence by installing a program that uses idle computer time for analyzing chunks of data recorded by radio telescopes involved in the SETI project.
2000 – JustGiving established: This online platform allows the public to help raise money for charities.
2000 – UNV Online Volunteering service launched: Connecting people who commit their time and skills over the Internet to help organizations address development challenges.
2000 – iStockPhoto was founded: The free stock imagery website allows the public to contribute to and receive commission for their work.
2001 – Launch of Wikipedia: a free-access, free-content Internet encyclopedia.
2004 – Toyota’s first "Dream car art" contest: Children were asked
globally to draw their ‘dream car of the future.’
2005 – Kodak’s "Go for the Gold" contest:
Kodak asked anyone to
submit a picture of a personal victory.
2006 – Jeff Howe's Wired article "The Rise of Crowdsourcing" popularized the term.
2009 – Waze, a community-oriented GPS app, allows users to submit road info and route data based on location, such as reports of car accidents or traffic, and integrates that data into its routing algorithms for all users of the app.
Early competitions
Crowdsourcing has often been used in the past as a competition to discover a solution. The French government proposed several of these competitions, often rewarded with Montyon Prizes, created for poor Frenchmen who had done virtuous acts. These included the Leblanc process (the Alkali prize), where a reward was provided for producing alkali from sea salt, and the Fourneyron's turbine, when the first commercial hydraulic turbine was developed.
In response to a challenge from the French government, Nicolas Appert
won a prize for inventing a new way of food preservation that involved
sealing food in air-tight jars. The British government provided a similar reward to find an easy way to determine a ship's longitude in the Longitude Prize. During the Great Depression, out-of-work clerks
tabulated higher mathematical functions in the Mathematical Tables
Project as an outreach project. One of the biggest crowdsourcing
campaigns was a public design contest in 2010 hosted by the Indian
government's finance ministry to create a symbol for the Indian rupee.
Thousands of people sent in entries before the government zeroed in on the final symbol, based on the Devanagari script using the letter "Ra".
In astronomy
Crowdsourcing in astronomy was used in the early 19th century by astronomer Denison Olmsted. After being awakened late one November night by a meteor shower taking place, Olmsted noticed a pattern
in the shooting stars. Olmsted wrote a brief report of this meteor
shower in the local newspaper. “As the cause of ‘Falling Stars’
is not understood by meteorologists, it is desirable to collect all
the facts attending this phenomenon, stated with as much precision as
possible,” Olmsted wrote to readers, in a report subsequently picked
up and pooled to newspapers nationwide. Responses came pouring in from
many states, along with scientists’ observations sent to the
American Journal of Science and Arts. These responses helped him
make a series of scientific breakthroughs, the major discovery being
that meteor showers are seen nationwide, and fall from space under the
influence of gravity. Also, they demonstrated that the showers
appeared in yearly cycles, a fact that often eluded scientists. The
responses allowed him to suggest a velocity for the meteors, although
his estimate turned out to be too conservative. If he had just taken
the responses as presented, his conjecture on the meteors' velocity
would have been closer to their actual speed.
A more recent version of crowdsourcing in astronomy is NASA's photo organizing project, which asks internet users to browse photos taken from space and try to identify the location each picture depicts.
In energy system research
Energy system models require large and diverse datasets, increasingly
so given the trend towards greater temporal and spatial
resolution. In response, there have been several initiatives to
crowdsource this data. Launched in December 2009,
OpenEI is a
collaborative website, run by the US government, providing open energy
data. While much of its information is from US government
sources, the platform also seeks crowdsourced input from around the
world. The semantic wiki and database Enipedia also publishes energy systems data using the concept of crowdsourced open information. Enipedia went live in March 2011.
In genealogy research
Genealogical research was using crowdsourcing techniques long before
personal computers were common. Beginning in 1942, The Church of Jesus Christ of Latter-day Saints encouraged its members to submit information about their ancestors. The submitted information was gathered together into a single collection. In 1969, to encourage
more people to participate in gathering genealogical information about
their ancestors, the church started the three-generation program. In
this program, church members were asked to prepare documented family
group record forms for the first three generations. The program was
later expanded to encourage members to research at least four
generations and became known as the four-generation program.
Institutes that have records of interest to genealogical research have
used crowds of volunteers to create catalogs and indices to records.
In genetic genealogy research
Genetic genealogy is a combination of traditional genealogy with
genetics. The rise of personal
DNA testing, after the turn of the
century, by companies such as Gene by Gene, FTDNA, GeneTree, 23andMe,
and Ancestry.com, has led to public and semipublic databases of DNA test results that use crowdsourcing techniques. In recent years, citizen science projects have become increasingly focused on providing benefits to scientific research. This includes support,
organization, and dissemination of personal
DNA (genetic) testing.
Similar to amateur astronomy, citizen scientists encouraged by
volunteer organizations like the International Society of Genetic
Genealogy, have provided valuable information and research to the
professional scientific community.
Spencer Wells, director of the Genographic Project, wrote:
Since 2005, the Genographic Project has used the latest genetic technology to expand our knowledge of the human story, and its pioneering use of DNA testing to engage and involve the public in the research effort has helped to create a new breed of "citizen scientist." Geno 2.0 expands the scope for citizen science, harnessing the power of the crowd to discover new details of human population history.
In journalism
Main articles: Collaborative journalism and Citizen journalism
Crowdsourcing is increasingly used in professional journalism.
Journalists crowdsource information from the crowd, typically fact
check the information and then use it in their articles as they see
fit. The leading daily newspaper in Sweden successfully used crowdsourcing to investigate home loan interest rates in the country in 2013–2014, resulting in over 50,000 submissions. The leading daily newspaper in Finland crowdsourced an investigation into stock short selling in 2011–2012, and the crowdsourced information led to the revelation of a questionable tax evasion scheme at a Finnish bank. The bank
executive was fired and policy changes followed. TalkingPointsMemo
in the United States asked its readers to examine 3000 emails
concerning the firing of federal prosecutors in 2008. The British
newspaper the Guardian crowdsourced the examination of hundreds of
thousands of documents in 2009.
In linguistics
Crowdsourcing strategies have been applied to estimate word knowledge and vocabulary size.
In ornithology
Another early example of crowdsourcing occurred in the field of ornithology. On December 25, 1900, Frank Chapman, an early officer of the National Audubon Society, initiated a tradition dubbed the "Christmas Day Bird Census". The project called on birders from across
North America to count and record the number of birds in each species
they witnessed on Christmas Day. The project was successful, and the
records from 27 different contributors were compiled into one bird
census, which tallied around 90 species of birds. This large-scale
collection of data constituted an early form of citizen science, the
premise upon which crowdsourcing is based. In the 2012 census, more
than 70,000 individuals participated across 2,369 bird count
circles. Christmas 2014 marked the National Audubon Society's
115th annual Christmas Bird Count.
In public policy
Crowdsourcing public policy and the production of public services is
also referred to as citizen sourcing.
The first conference focusing on
Crowdsourcing for Politics and Policy
took place at Oxford University, under the auspices of the Oxford
Internet Institute in 2014. Research has emerged since 2012 that
focuses on the use of crowdsourcing for policy purposes. These
include the experimental investigation of the use of Virtual Labor
Markets for policy assessment, and an assessment of the potential
for citizen involvement in process innovation for public administration.
Governments across the world are increasingly using crowdsourcing for
knowledge discovery and civic engagement. Iceland crowdsourced its constitution reform process in 2011, and Finland has crowdsourced several law reform processes to address its off-road traffic laws.
The Finnish government allowed citizens to go on an online forum to
discuss problems and possible resolutions regarding some off-road
traffic laws. The crowdsourced information and resolutions would then
be passed on to legislators for them to refer to when making a
decision, letting citizens more directly contribute to public
policy. The City of Palo Alto is crowdsourcing people's feedback for its Comprehensive City Plan update in a process that started in 2015. The House of Representatives in Brazil has used
crowdsourcing in policy-reforms, and federal agencies in the United
States have used crowdsourcing for several years.
In seismology
The European-Mediterranean Seismological Centre (EMSC) has developed a seismic detection system by monitoring traffic peaks on its website and by analyzing keywords used on Twitter.
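A toy sketch of the traffic-peak idea is flagging a minute whose hit count sits far above the recent baseline; the window, threshold, and numbers below are illustrative assumptions, not EMSC's actual parameters:

```python
# Toy illustration: flag a sudden surge in per-minute page hits as a
# possible felt event. Window and threshold are illustrative, not EMSC's.
from statistics import mean, stdev

def spike_detected(hits_per_minute: list[int], threshold: float = 4.0) -> bool:
    """Flag the latest minute if it sits far above the recent baseline."""
    baseline, latest = hits_per_minute[:-1], hits_per_minute[-1]
    mu = mean(baseline)
    sigma = max(stdev(baseline), 1.0)  # floor so quiet periods are not oversensitive
    return latest > mu + threshold * sigma

# Quiet traffic, then a surge of visitors right after shaking is felt.
print(spike_detected([40, 38, 42, 41, 39, 40, 43, 180]))  # -> True
```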
In libraries
Crowdsourcing is used in libraries for OCR corrections on digitized texts, for tagging, and for funding.
Modern methods
Currently, crowdsourcing has moved mainly to the Internet, which provides a particularly beneficial venue, since individuals tend to be more open in web-based projects where they are not being physically judged or scrutinized, and thus can feel more comfortable sharing. This approach ultimately allows for well-designed artistic projects because individuals are less conscious, or perhaps even less aware, of scrutiny towards their work. In an online atmosphere, more attention can be given to the specific needs of a project, rather than spending as much time in communication with other participants.
According to a definition by Henk van Ess:
"The crowdsourced problem can be huge (epic tasks like finding alien
life or mapping earthquake zones) or very small ('where can I skate
safely?'). Some examples of successful crowdsourcing themes are
problems that bug people, things that make people feel good about
themselves, projects that tap into niche knowledge of proud experts,
subjects that people find sympathetic or any form of injustice."
Crowdsourcing can take either an explicit or an implicit route. Explicit crowdsourcing lets users work together to evaluate, share, and build different specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing.
With explicit crowdsourcing, users can evaluate particular items like books or webpages, or share by posting products or items. Users can also build artifacts by providing information and editing others' contributions.
Implicit crowdsourcing can take two forms: standalone and piggyback. Standalone allows people to solve problems as a side effect of the task they are actually doing, whereas piggyback takes users' information from a third-party website to gather information.
In his 2013 book, Crowdsourcing, Daren C. Brabham puts forth a
problem-based typology of crowdsourcing approaches:
Knowledge discovery and management is used for information management
problems where an organization mobilizes a crowd to find and assemble
information. It is ideal for creating collective resources.
Distributed human intelligence tasking is used for information management problems where an organization has a set of information in hand and mobilizes a crowd to process or analyze the information. It is ideal for processing large data sets that computers cannot easily handle.
Broadcast search is used for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem that has an objective, provable right answer. It is ideal for scientific problem solving.
Peer-vetted creative production is used for ideation problems, where
an organization mobilizes a crowd to come up with a solution to a
problem which has an answer that is subjective or dependent on public
support. It is ideal for design, aesthetic, or policy problems.
Crowdsourcing often allows participants to rank each other's
contributions, e.g. in answer to the question "What is one thing we
can do to make Acme a great company?" One common method for ranking is
"like" counting, where the contribution with the most likes ranks
first. This method is simple and easy to understand, but it privileges
early contributions, which have more time to accumulate likes. In
recent years several crowdsourcing companies have begun to use
pairwise comparisons, backed by ranking algorithms. Ranking algorithms
do not penalize late contributions. They also produce results faster.
Ranking algorithms have proven to be at least 10 times faster than manual stack ranking. One drawback, however, is that ranking algorithms are more difficult to understand than like counting.
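Pairwise ranking of this kind can be illustrated with an Elo-style update, as in the following minimal sketch; the K-factor, initial rating, and idea names are illustrative assumptions rather than any particular platform's implementation:

```python
# Elo-style rating of crowd contributions from pairwise votes.
# K-factor and initial rating are illustrative assumptions.
K = 32            # how strongly one comparison moves a rating
INITIAL = 1000.0  # rating assigned to every new contribution

def expected(r_a: float, r_b: float) -> float:
    """Probability that A beats B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

def update(ratings: dict, winner: str, loser: str) -> None:
    """Apply one pairwise vote: 'winner' was preferred over 'loser'."""
    ra = ratings.setdefault(winner, INITIAL)
    rb = ratings.setdefault(loser, INITIAL)
    gain = K * (1 - expected(ra, rb))
    ratings[winner] = ra + gain  # winner gains
    ratings[loser] = rb - gain   # loser loses the same amount

# Hypothetical example: three ideas, four votes.
ratings: dict = {}
for w, l in [("idea_B", "idea_A"), ("idea_B", "idea_C"),
             ("idea_A", "idea_C"), ("idea_B", "idea_A")]:
    update(ratings, w, l)
print(sorted(ratings.items(), key=lambda kv: -kv[1]))
```

Because a rating depends only on the comparisons a contribution has taken part in, not on how long it has been visible, a strong late entry can overtake earlier ones, which is the property noted above.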
Some common categories of crowdsourcing can be used effectively in the
commercial world, including crowdvoting, crowdsolving, crowdfunding,
microwork, creative crowdsourcing, crowdsource workforce management,
and inducement prize contests. Although this may not be an exhaustive
list, the items cover the current major ways in which people use
crowds to perform tasks.
Crowdvoting occurs when a website gathers a large group's opinions and
judgments on a certain topic. The Iowa Electronic Market is a
prediction market that gathers crowds' views on politics and tries to
ensure accuracy by having participants pay money to buy and sell
contracts based on political outcomes.
Some of the most famous examples have made use of social media
channels: Domino's Pizza, Coca-Cola, Heineken, and Sam Adams have thus
crowdsourced a new pizza, bottle design, beer, and song,
respectively. Threadless.com selects the T-shirts it sells by
having users provide designs and vote on the ones they like, which are
then printed and available for purchase.
The California Report Card (CRC), a program jointly launched in
January 2014 by the Center for Information Technology Research in the
Interest of Society and Lt. Governor Gavin Newsom, is an example
of modern-day crowd voting. Participants access the CRC online and
vote on six timely issues. Through principal component analysis, the
users are then placed into an online "café" in which they can present
their own political opinions and grade the suggestions of other
participants. This system aims to effectively involve the greater
public in relevant political discussions and highlight the specific
topics with which Californians are most concerned.
Crowdvoting's value in the movie industry was shown in 2009, when a crowd accurately predicted the success or failure of a movie based on its trailer, a feat that was replicated in 2013 by Google.
On reddit, users collectively rate web content, discussions and
comments as well as questions posed to persons of interest in "AMA"
and AskScience online interviews.
In 2017, Project Fanchise purchased a team in the Indoor Football League and created the Salt Lake Screaming Eagles, a fan-run team. Using a mobile app, fans voted on the day-to-day operations of the team, the mascot name, the signing of players, and even the offensive play-calling during games.
Crowdsourcing creative work
Creative crowdsourcing spans sourcing creative projects such as graphic design, architecture, apparel design, movies, writing, company naming, and illustration.
While crowdsourcing competitions have been used for decades in some
creative fields (such as architecture), creative crowdsourcing has
proliferated with the recent development of web-based platforms where
clients can solicit a wide variety of creative work at lower cost than
by traditional means.
Crowdsourcing language-related data collection
Crowdsourcing has also been used for gathering language-related data. For dictionary work, as was mentioned above, it was applied over a hundred years ago by the Oxford English Dictionary editors, using paper and postage. Much later, a call for collecting examples of proverbs on a specific topic (religious pluralism) was printed in a journal. Today, as "crowdsourcing" carries an inherent connotation of being web-based, such language-related data gathering is increasingly conducted on the web. Currently,
a number of dictionary compilation projects are being conducted on the
web, particularly for languages that are not highly academically
documented, such as for the Oromo language. Software programs have
been developed for crowdsourced dictionaries, such as WeSay. A
slightly different form of crowdsourcing for language data has been
the online creation of scientific and mathematical terminology for
American Sign Language. Proverb collection is also being done via
crowdsourcing on the Web, most innovatively for the
Pashto language of
Afghanistan and Pakistan.
Crowdsourcing has been extensively used to collect high-quality gold-standard data for building automatic systems in natural language processing (e.g. named entity recognition, entity linking).
Crowdsolving
Main article: Crowdsolving
Crowdsolving is a collaborative, yet holistic, way of solving a problem using many people, communities, groups, or resources. It is a type of crowdsourcing with a focus on complex and intellectually demanding problems requiring considerable effort and quality or uniqueness of contribution.
Chicago-based startup Crowdfind, formerly "crowdfynd", uses a version of crowdsourcing best termed crowdsearching, which differs from microwork in that no payment is made for taking part in the search. Their platform, through geographic location anchoring, builds a virtual search party of smartphone and Internet users to find and return lost items, pets, or persons.
TrackR uses a system they call "crowd GPS" to load Bluetooth
identities to a central server to track lost or stolen items.
Crowdfunding
Main article: Crowdfunding
Crowdfunding is the process of funding projects by a multitude of
people contributing a small amount to attain a certain monetary goal,
typically via the Internet.
Crowdfunding has been used for both commercial and charitable purposes. The crowdfunding model that has been around the longest is rewards-based crowdfunding. In this model, people can prepurchase products, buy experiences, or simply donate. While this funding may in some cases go towards helping a business, funders are not allowed to invest and become shareholders via rewards-based crowdfunding.
Individuals, businesses, and entrepreneurs can showcase their
businesses and projects to the entire world by creating a profile,
which typically includes a short video introducing their project, a
list of rewards per donation, and illustrations through images. The
goal is to create a compelling message towards which readers will be
drawn. Funders make monetary contributions for numerous reasons:
They connect to the greater purpose of the campaign, such as being part of an entrepreneurial community and supporting an innovative idea.
They connect to a physical aspect of the campaign, like rewards and gains from investment.
They connect to the creative display of the campaign's presentation.
They want to see new products before the public.
The dilemma for equity crowdfunding in the US as of 2012 was how the
Securities and Exchange Commission (SEC) is going to regulate the
entire process. At the time, rules and regulations were being refined
by the SEC, which had until January 1, 2013, to tweak the fundraising
methods. The regulators were overwhelmed trying to regulate Dodd–Frank and all the other rules and regulations involving public
companies and the way they trade. Advocates of regulation claimed that
crowdfunding would open up the flood gates for fraud, called it the
"wild west" of fundraising, and compared it to the 1980s days of penny
stock "cold-call cowboys". The process allows for up to $1 million to
be raised without some of the regulations being involved. Companies
under the then-current proposal would have exemptions available and be
able to raise capital from a larger pool of persons, which can include
lower thresholds for investor criteria, whereas the old rules required
that the person be an "accredited" investor. These people are often
recruited from social networks, where the funds can be acquired from
an equity purchase, loan, donation, or ordering. The amounts collected
have become quite high, with requests that are over a million dollars
for software such as Trampoline Systems, which used it to finance the
commercialization of their new software.
Mobile crowdsourcing
Mobile crowdsourcing involves activities that take place on
smartphones or mobile platforms, frequently characterized by GPS
technology. This allows for real-time data gathering and gives
projects greater reach and accessibility. However, mobile crowdsourcing can lead to an urban bias, as well as safety and privacy concerns.
Macrowork
Macrowork tasks typically have these characteristics: they can be done
independently, they take a fixed amount of time, and they require
special skills. Macrotasks could be part of specialized projects or
could be part of a large, visible project where workers pitch in
wherever they have the required skills. The key distinguishing factors
are that macrowork requires specialized skills and typically takes
longer, while microwork requires no specialized skills.
Microwork
Microwork is a form of crowdsourcing where users do small tasks, for which computers lack aptitude, for low amounts of money. Amazon's
popular Mechanical Turk has created many different projects for users
to participate in, where each task requires very little time and
offers a very small amount in payment. The Chinese versions of
this, commonly called Witkey, are similar and include such sites as
Taskcn.com and k68.cn. When choosing tasks, since only certain users
“win”, users learn to submit later and pick less popular tasks to
increase the likelihood of getting their work chosen. An example
of a Mechanical Turk project is when users searched satellite images
for a boat to find lost researcher Jim Gray. Based on an elaborate survey of participants in a microtask crowdsourcing platform, Gadiraju et al. have proposed a taxonomy of the different types of microtasks found on these platforms.
Simple projects
Simple projects are those that require a large amount of time and
skills compared to micro and macrowork. While an example of macrowork
would be writing survey feedback, simple projects rather include
activities like writing a basic line of code or programming a
database, which both require a larger time commitment and skill level.
These projects are usually not found on sites like Amazon Mechanical Turk, and are instead posted on platforms like Upwork that call for freelancers with specific skills.
Complex projects
Complex projects generally take the most time, have higher stakes, and
call for people with very specific skills. These are generally
“one-off” projects that are difficult to accomplish and can
include projects like designing a new product that a company hopes to
patent. Tasks like that would be “complex” because design is a
meticulous process that requires a large amount of time to perfect,
and also people doing these projects must have specialized training in
design to effectively complete the project. These projects usually pay
the highest, yet are rarely offered.
Inducement prize contests
Web-based idea competitions or inducement prize contests often consist
of generic ideas, cash prizes, and an Internet-based platform to
facilitate easy idea generation and discussion. An example of these
competitions includes an event like IBM's 2006 "Innovation Jam",
attended by over 140,000 international participants and yielding
around 46,000 ideas. Another example is the
Netflix Prize in
2009. The idea was to ask the crowd to come up with a recommendation
algorithm more accurate than Netflix's own algorithm. It had a grand prize of US$1,000,000, which was given to the BellKor's Pragmatic Chaos team, which bested Netflix's own algorithm for predicting ratings by 10.06%.
Another example of competition-based crowdsourcing is the 2009 DARPA
balloon experiment, where DARPA placed 10 balloon markers across the
United States and challenged teams to compete to be the first to
report the location of all the balloons. A collaboration of efforts
was required to complete the challenge quickly and in addition to the
competitive motivation of the contest as a whole, the winning team
(MIT, in less than nine hours) established its own "collaborapetitive"
environment to generate participation in their team. A similar
challenge was the Tag Challenge, funded by the US State Department,
which required locating and photographing individuals in five cities
in the US and Europe within 12 hours based only on a single
photograph. The winning team managed to locate three suspects by
mobilizing volunteers worldwide using a similar incentive scheme to
the one used in the balloon challenge.
Open innovation platforms are a very effective way of crowdsourcing people's thoughts and ideas for research and development. InnoCentive is a crowdsourcing platform for corporate research
and development where difficult scientific problems are posted for
crowds of solvers to discover the answer and win a cash prize, which
can range from $10,000 to $100,000 per challenge. InnoCentive, of
Waltham, MA and London, England provides access to millions of
scientific and technical experts from around the world. The company
claims a success rate of 50% in providing successful solutions to
previously unsolved scientific and technical problems.
IdeaConnection.com challenges people to come up with new inventions
and innovations and Ninesigma.com connects clients with experts in
various fields. The X Prize Foundation creates and runs incentive competitions offering between $1 million and $30 million for solving challenges.
Local Motors is another example of crowdsourcing. A
community of 20,000 automotive engineers, designers, and enthusiasts
competes to build off-road rally trucks.
Implicit crowdsourcing
Implicit crowdsourcing is less obvious because users do not
necessarily know they are contributing, yet can still be very
effective in completing certain tasks. Rather than users actively
participating in solving a problem or providing information, implicit
crowdsourcing involves users doing another task entirely, where a third party gains information on another topic based on the user's actions.
A good example of implicit crowdsourcing is the ESP game, where users
guess what images are and then these labels are used to tag Google
images. Another popular use of implicit crowdsourcing is through
reCAPTCHA, which asks people to solve CAPTCHAs to prove they are
human, and then provides CAPTCHAs from old books that cannot be
deciphered by computers, to digitize them for the web. Like many tasks
solved using the Mechanical Turk, CAPTCHAs are simple for humans, but
often very difficult for computers.
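A minimal sketch of the control-word idea behind this kind of implicit digitization follows; the agreement threshold, identifiers, and function names are illustrative assumptions, not reCAPTCHA's actual implementation:

```python
from collections import Counter

# Each challenge pairs a word with a known answer (the control) with a
# word OCR could not read. Getting the control right is the humanity
# test; the answer for the unknown word becomes a transcription vote.
AGREEMENT = 3      # hypothetical number of votes needed to accept a reading
votes = Counter()  # (word_id, transcription) -> vote count

def submit(control_answer: str, control_truth: str,
           unknown_id: str, unknown_answer: str) -> bool:
    """Return True if the user passed the control; record their vote."""
    if control_answer.strip().lower() != control_truth.strip().lower():
        return False  # failed the known word; discard the vote
    votes[(unknown_id, unknown_answer.strip().lower())] += 1
    return True

def accepted(unknown_id: str):
    """Return the transcription once enough independent users agree."""
    candidates = [(t, n) for (wid, t), n in votes.items() if wid == unknown_id]
    if not candidates:
        return None
    text, count = max(candidates, key=lambda tn: tn[1])
    return text if count >= AGREEMENT else None

# Example: three users pass the control and agree on the unknown word.
for _ in range(3):
    submit("quartz", "quartz", "scan_17_word_4", "meridian")
print(accepted("scan_17_word_4"))  # -> "meridian"
```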
Piggyback crowdsourcing can be seen most frequently in websites such as Google that data-mine a user's search history and websites to discover keywords for ads, spelling corrections, and synonyms.
In this way, users are unintentionally helping to modify existing
systems, such as Google's AdWords.
Health-care crowdsourcing
Research has emerged that outlines the use of crowdsourcing techniques in the public health domain. The collective intelligence outcomes from crowdsourcing are being generated in three broad categories of public health care: health promotion, health research, and health maintenance.
Crowdsourcing also enables researchers to move from small homogeneous groups of participants to large heterogeneous groups, beyond convenience samples such as students or highly educated people. The
SESH group focuses on using
crowdsourcing to improve health.
Crowdsourcing in agriculture
Crowdsourced research also reaches into the field of agriculture, mainly to help farmers and experts identify different types of weeds in the fields and to give them the best ways to remove the weeds.
Crowdsourcing in cheating in bridge
Main articles: Cheating in bridge and Boye Brogeland
Boye Brogeland initiated a crowdsourcing investigation of cheating by
top-level bridge players that showed several players were guilty,
which led to their suspension.
A number of motivations exist for businesses to use crowdsourcing to
accomplish their tasks, find solutions for problems, or to gather
information. These include the ability to offload peak demand, access
cheap labor and information, generate better results, access a wider
array of talent than might be present in one organization, and
undertake problems that would have been too difficult to solve internally. Crowdsourcing allows businesses to submit problems on
which contributors can work, on topics such as science, manufacturing,
biotech, and medicine, with monetary rewards for successful solutions.
Although crowdsourcing complicated tasks can be difficult, simple work
tasks can be crowdsourced cheaply and effectively.
Crowdsourcing also has the potential to be a problem-solving mechanism
for government and nonprofit use. Urban and transit planning are
prime areas for crowdsourcing. One project to test crowdsourcing's
public participation process for transit planning in Salt Lake City
was carried out from 2008 to 2009, funded by a U.S. Federal Transit
Administration grant. Another notable application of crowdsourcing to government problem solving is the Peer to Patent Community Patent Review project for the U.S. Patent and Trademark Office.
Researchers have used crowdsourcing systems like the Mechanical Turk
to aid their research projects by crowdsourcing some aspects of the
research process, such as data collection, parsing, and evaluation.
Notable examples include using the crowd to create speech and language
databases, and using the crowd to conduct user studies.
Crowdsourcing systems provide these researchers with the ability to
gather large amounts of data. Additionally, using crowdsourcing,
researchers can collect data from populations and demographics they
may not have had access to locally, but that improve the validity and
value of their work.
Artists have also used crowdsourcing systems. In his project called
the Sheep Market,
Aaron Koblin used Mechanical Turk to collect 10,000
drawings of sheep from contributors around the world. Artist Sam Brown leverages the crowd by asking visitors of his website explodingdog to send him sentences that he uses as inspirations for paintings. Art curator Andrea Grover argues that individuals tend
to be more open in crowdsourced projects because they are not being
physically judged or scrutinized. As with other crowdsourcers,
artists use crowdsourcing systems to generate and collect data. The
crowd also can be used to provide inspiration and to collect financial
support for an artist's work.
Additionally, crowdsourcing from 100 million drivers is being used by
INRIX to collect users' driving times to provide better
and real-time traffic updates.
The crowd is an umbrella term for the people who contribute to crowdsourcing efforts. Though it is sometimes difficult to gather data about the demographics of the crowd, a study by Ross et al. surveyed the demographics of a sample of the more than 400,000 registered workers using Amazon Mechanical Turk to complete tasks for pay. A
previous study in 2008 by Ipeirotis found that users at that time were
primarily American, young, female, and well-educated, with 40% earning
more than $40,000 per year. In November 2009, Ross found a very
different Mechanical Turk population, 36% of which was Indian.
Two-thirds of Indian workers were male, and 66% had at least a
bachelor's degree. Two-thirds had annual incomes less than $10,000,
with 27% sometimes or always depending on income from Mechanical Turk
to make ends meet.
The average US user of Mechanical Turk earned $2.30 per hour for tasks
in 2009, versus $1.58 for the average Indian worker.
While the majority of users worked less than five hours per week, 18% worked 15 hours per week or more. These wages are less than minimum wage in the United States (but not in India), which Ross suggests raises ethical questions for researchers who use crowdsourcing.
The demographics of Microworkers.com differ from Mechanical Turk in
that the US and India together account for only 25% of workers; 197
countries are represented among users, with Indonesia (18%) and
Bangladesh (17%) contributing the largest share. However, 28% of
employers are from the US.
Another study of the demographics of the crowd at iStockphoto found a crowd that was largely white, middle- to upper-class, and higher educated, worked in so-called "white-collar jobs", and had high-speed Internet connections at home. In a 30-day crowdsourcing diary study in Europe, the participants were predominantly higher educated women.
Studies have also found that crowds are not simply collections of
amateurs or hobbyists. Rather, crowds are often professionally trained
in a discipline relevant to a given crowdsourcing task and sometimes
hold advanced degrees and many years of experience in the
profession. Claiming that crowds are amateurs,
rather than professionals, is both factually untrue and may lead to
marginalization of crowd labor rights.
G. D. Saxton et al. (2013) studied the role of community users, among other elements, in a content analysis of 103 crowdsourcing organizations. Saxton et al. developed a taxonomy of nine
crowdsourcing models (intermediary model, citizen media production,
collaborative software development, digital goods sales, product
design, peer-to-peer social financing, consumer report model,
knowledge base building model, and collaborative science project
model) in which to categorize the roles of community users, such as
researcher, engineer, programmer, journalist, graphic designer, etc.,
and the products and services developed.
Further information: Online participation § Motivations
Many scholars of crowdsourcing suggest that both intrinsic and
extrinsic motivations cause people to contribute to crowdsourced tasks
and these factors influence different types of
contributors. For example,
students and people employed full-time rate human capital advancement
as less important than part-time workers do, while women rate social
contact as more important than men do.
Intrinsic motivations are broken down into two categories:
enjoyment-based and community-based motivations. Enjoyment-based
motivations refer to motivations related to the fun and enjoyment that
contributors experience through their participation. These motivations
include: skill variety, task identity, task autonomy, direct feedback
from the job, and pastime. Community-based motivations refer to
motivations related to community participation, and include community
identification and social contact. In crowdsourced journalism, the motivation factors are intrinsic: the crowd is driven by the possibility to make a social impact, contribute to social change, and help their community.
Extrinsic motivations are broken down into three categories: immediate
payoffs, delayed payoffs, and social motivations. Immediate payoffs,
through monetary payment, are the immediately received compensations
given to those who complete tasks. Delayed payoffs are benefits that
can be used to generate future advantages, such as training skills and
being noticed by potential employers. Social motivations are the
rewards of behaving pro-socially, such as the altruistic
motivations of online volunteers. Chandler and Kapelner found that US
users of the
Amazon Mechanical Turk were more likely to complete a
task when told they were going to “help researchers identify tumor
cells,” than when they were not told the purpose of their task.
However, of those who completed the task, quality of output did not
depend on the framing of the task.
Motivation factors in crowdsourcing are often a mix of intrinsic and
extrinsic factors. In a crowdsourced law-making project, the
crowd was motivated by a mix of intrinsic and extrinsic factors.
Intrinsic motivations included fulfilling civic duty, affecting the law for sociotropic reasons, and deliberating with and learning from peers.
Extrinsic motivations included changing the law for financial gain or
other benefits. Participation in crowdsourced policy-making was an act
of grassroots advocacy, whether to pursue one’s own interest or more
altruistic goals, such as protecting nature.
Another form of social motivation is prestige or status. The
International Children's Digital Library recruits volunteers to
translate and review books. Because all translators receive public
acknowledgment for their contributions, Kaufman and Schulz cite this
as a reputation-based strategy to motivate individuals who want to be
associated with institutions that have prestige. The Mechanical Turk
uses reputation as a motivator in a different sense, as a form of
quality control. Crowdworkers who frequently complete tasks in ways
judged to be inadequate can be denied access to future tasks,
providing motivation to produce high-quality work.
Using crowdsourcing through means such as
Amazon Mechanical Turk can
help provide researchers and requesters with an already established
infrastructure for their projects, allowing them to easily use a crowd
and access participants from diverse cultural backgrounds. Using
crowdsourcing can also help complete the work for projects that would
normally have geographical and population size limitations.
Participation in crowdsourcing
Despite the potential global reach of IT applications online, recent research illustrates that differences in location affect participation outcomes in IT-mediated crowds.
Limitations and controversies
At least five major topics cover the limitations and controversies about crowdsourcing:
Impact of crowdsourcing on product quality
Entrepreneurs contribute less capital themselves
Increased number of funded ideas
The value and impact of the work received from the crowd
The ethical implications of low wages paid to crowdworkers
Impact of crowdsourcing on product quality
Crowdsourcing allows anyone to participate, allowing for many
unqualified participants and resulting in large quantities of unusable
contributions. Companies, or additional crowdworkers, then have to
sort through all of these low-quality contributions. The task of
sorting through crowdworkers’ contributions, along with the
necessary job of managing the crowd, requires companies to hire actual
employees, thereby increasing management overhead. For example, results can be compromised by targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task,
often a financial incentive causes workers to complete tasks quickly
rather than well. Verifying responses is time-consuming, so requesters
often depend on having multiple workers complete the same task to
correct errors. However, having each task completed multiple times
increases time and monetary costs.
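A minimal sketch of this redundancy-based error correction, assuming simple majority voting over repeated assignments of a single task:

```python
from collections import Counter

def majority_answer(responses):
    """Aggregate repeated completions of one task by majority vote.

    Returns the winning answer, or None on a tie, which a requester
    might resolve by assigning the task to additional workers.
    """
    if not responses:
        return None
    ranked = Counter(responses).most_common()
    if len(ranked) > 1 and ranked[0][1] == ranked[1][1]:
        return None  # tie: no clear majority
    return ranked[0][0]

# The same labeling task given to five workers: one hasty or malicious
# answer is outvoted, at five times the single-assignment cost.
print(majority_answer(["cat", "cat", "dog", "cat", "cat"]))  # -> "cat"
```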
Crowdsourcing quality is also impacted by task design. Lukyanenko et al. argue that the prevailing practice of modeling crowdsourcing data collection tasks in terms of fixed classes (options) unnecessarily restricts quality. Results demonstrate that information
accuracy depends on the classes used to model domains, with
participants providing more accurate information when classifying
phenomena at a more general level (which is typically less useful to
sponsor organizations, hence less common). Further, greater overall accuracy is expected when participants can provide free-form data compared to tasks in which they select from constrained choices.
Just as limiting, oftentimes the scenario is that just not enough
skills or expertise exist in the crowd to successfully accomplish the
desired task. While this scenario does not affect "simple" tasks such
as image labeling, it is particularly problematic for more complex
tasks, such as engineering design or product validation. In these
cases, it may be difficult or even impossible to find the qualified
people in the crowd, as their voices may be drowned out by consistent,
but incorrect crowd members. However, if the task is even of "intermediate" difficulty, estimating crowdworkers' skills and intentions and leveraging them to infer true responses works well, albeit with an additional computational cost.
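A minimal sketch of such skill estimation, loosely in the spirit of Dawid and Skene's expectation-maximization approach but heavily simplified; the data layout, skill floor, and round count are illustrative assumptions:

```python
# Iterative skill-weighted aggregation, loosely inspired by Dawid-Skene.
# Data layout, skill floor, and round count are illustrative.

def infer_answers(labels, n_rounds=10):
    """labels: {task: {worker: answer}}. Returns (answers, skills)."""
    skills = {w: 1.0 for task_votes in labels.values() for w in task_votes}
    answers = {}
    for _ in range(n_rounds):
        # Pick each task's answer by skill-weighted vote.
        for task, task_votes in labels.items():
            weight = {}
            for worker, ans in task_votes.items():
                weight[ans] = weight.get(ans, 0.0) + skills[worker]
            answers[task] = max(weight, key=weight.get)
        # Re-estimate each worker's skill as their agreement rate with
        # the current consensus, floored so no vote is zeroed out.
        for worker in skills:
            graded = [task_votes[worker] == answers[task]
                      for task, task_votes in labels.items()
                      if worker in task_votes]
            skills[worker] = max(sum(graded) / len(graded), 0.05)
    return answers, skills

# Hypothetical example: worker w3 disagrees often and is down-weighted.
data = {
    "t1": {"w1": "A", "w2": "A", "w3": "B"},
    "t2": {"w1": "B", "w2": "B", "w3": "A"},
    "t3": {"w1": "A", "w2": "A", "w3": "B"},
}
answers, skills = infer_answers(data)
print(answers)  # consensus answers per task
print(skills)   # w3's weight falls toward the floor
```

The repeated passes over all tasks and workers are the additional computational cost mentioned above, relative to a single majority vote.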
Crowdworkers are a nonrandom sample of the population. Many
researchers use crowdsourcing to quickly and cheaply conduct studies
with larger sample sizes than would be otherwise achievable. However,
due to limited access to the Internet, participation in less developed countries is relatively low. Participation in highly developed
countries is similarly low, largely because the low amount of pay is
not a strong motivation for most users in these countries. These
factors lead to a bias in the population pool towards users in medium
developed countries, as deemed by the human development index.
The likelihood that a crowdsourced project will fail due to lack of
monetary motivation or too few participants increases over the course
of the project.
Crowdsourcing markets are not a first-in, first-out
queue. Tasks that are not completed quickly may be forgotten, buried
by filters and search procedures so that workers do not see them. This
results in a long-tail power law distribution of completion
times. Additionally, low-paying research studies online have
higher rates of attrition, with participants not completing the study
once started. Even when tasks are completed, crowdsourcing does
not always produce quality results. When
Facebook began its
localization program in 2008, it encountered some criticism for the
low quality of its crowdsourced translations.
One of the problems of crowdsourcing products is the lack of
interaction between the crowd and the client. Usually little
information is known about the final desired product, and often very
limited interaction with the final client occurs. This can decrease
the quality of product because client interaction is a vital part of
the design process.
An additional cause of the decrease in product quality that can result
from crowdsourcing is the lack of collaboration tools. In a typical
workplace, coworkers are organized in such a way that they can work
together and build upon each other’s knowledge and ideas.
Furthermore, the company often provides employees with the necessary
information, procedures, and tools to fulfill their responsibilities.
However, in crowdsourcing, crowdworkers are left to depend on their
own knowledge and means to complete tasks.
A crowdsourced project is usually expected to be unbiased by
incorporating a large population of participants with a diverse
background. However, most crowdsourcing work is done by people who are paid or directly benefit from the outcome (e.g. most open source projects working on Linux). In many other cases, the end product is the outcome of a single person's endeavour, who creates the majority of the product, while the crowd participates only in minor details.
Entrepreneurs contribute less capital themselves
To make an idea turn into a reality, the first component needed is
capital. Depending on the scope and complexity of the crowdsourced
project, the amount of necessary capital can range from a few thousand
dollars to hundreds of thousands, if not more. The capital-raising
process can take from days to months depending on different variables, including the entrepreneur's network and the amount of initial capital raised. The crowdsourcing process allows entrepreneurs to access a wide range of investors who can take different stakes in the project.
In effect, crowdsourcing simplifies the capital-raising process and
allows entrepreneurs to spend more time on the project itself and
reaching milestones rather than dedicating time to get it started.
Overall, the simplified access to capital can save time to start
projects and potentially increase efficiency of projects.
Opponents of this issue argue easier access to capital through a large
number of smaller investors can hurt the project and its creators.
With a simplified capital-raising process involving more investors
with smaller stakes, investors are more risk-seeking because they can
take on an investment size with which they are comfortable. This
leads to entrepreneurs losing possible experience convincing investors
who are wary of potential risks in investing because they do not
depend on one single investor for the survival of their project.
Instead of being forced to assess risks and convince large institutional investors why their project can be successful, wary investors can be replaced by others who are willing to take on the risk.
There are translation companies and several users of translations who pretend to use crowdsourcing as a means for drastically cutting costs, instead of hiring professional translators. This situation has been systematically denounced by IAPTI and other translator organizations.
Increased number of funded ideas
The raw number of ideas that get funded, and the quality of those ideas, is a major controversy in crowdsourcing.
Proponents argue that crowdsourcing is beneficial because it allows
niche ideas that would not survive venture capitalist or angel
funding, many times the primary investors in startups, to be started.
Many ideas are killed in their infancy due to insufficient support and
lack of capital, but crowdsourcing allows these ideas to be started if
an entrepreneur can find a community to take interest in the
Crowdsourcing allows those who would benefit from the project to fund and become a part of it, which is one way for small niche ideas to get started. However, when the raw number of projects grows, the number of possible failures can also increase. Crowdsourcing assists niche and high-risk projects to start because of a perceived need from a select few who seek the product. With high risk and small target markets, the pool of crowdsourced projects faces a greater possible loss of capital, lower return, and lower levels of success.
Because crowdworkers are considered independent contractors rather
than employees, they are not guaranteed minimum wage. In practice,
workers using the
Amazon Mechanical Turk generally earn less than the
minimum wage, with US users earning an average of $2.30 per hour for
tasks in 2009, and users in India earning an average of $1.58 per
hour, which is below minimum wage in the United States (but not in
India). Some researchers who have considered using
Mechanical Turk to get participants for research studies have argued
that the wage conditions might be unethical. However,
according to other research, workers on
Amazon Mechanical Turk do not
feel that they are exploited and are ready to participate in
crowdsourcing activities in the future. When
Facebook began its
localization program in 2008, it received criticism for using free
labor in crowdsourcing the translation of site guidelines.
Typically, no written contracts, nondisclosure agreements, or employee
agreements are made with crowdworkers. For users of the Amazon Mechanical Turk, this means that requesters decide whether users' work is acceptable and reserve the right to withhold pay if it does not
meet their standards. Critics say that crowdsourcing arrangements
exploit individuals in the crowd, and a call has been made for crowds
to organize for their labor rights.
Collaboration between crowd members can also be difficult or even discouraged, especially in the context of competitive crowdsourcing.
InnoCentive allows organizations to solicit
solutions to scientific and technological problems; only 10.6% of
respondents report working in a team on their submission. Amazon
Mechanical Turk workers collaborated with academics to create a
platform, WeAreDynamo.org, that allows them to organize and create
campaigns to better their work situation.
The popular forum website Reddit came under the spotlight during the
first few days after the Boston Marathon bombing, as the episode
showed how powerful social media and crowdsourcing could be. Reddit
was able to help many victims of the bombing, with users sending
relief and some even opening up their homes, all coordinated very
efficiently on the site. However, Reddit soon came under fire after
users started to crowdsource information on the possible perpetrators
of the bombing. While the FBI received thousands of photos from
ordinary citizens, the website also began to focus on crowdsourcing
its own investigation using the information it was gathering. Reddit
members claimed to have found four bombers, but all were innocent,
including a college student who had committed suicide a few days
before the bombing. The problem was exacerbated when the media also
started to rely on Reddit as a source of information, allowing the
misinformation to spread almost nationwide. The FBI has since warned
the media to be more careful about where they get their information,
but Reddit's investigation and its false accusations raised questions
about what should be crowdsourced and about the unintended
consequences of crowdsourcing.
See also
Collaborative innovation network
Collective problem solving
Commons-based peer production
Crowdsourcing software development
List of crowdsourcing projects
Virtual Collective Consciousness
Wisdom of the crowd
References
^ a b Safire, William (February 5, 2009). "On Language". New York
Times Magazine. Retrieved May 19, 2013.
^ Schenk, Eric; Guittard, Claude (2009). Crowdsourcing: What can be
Outsourced to the Crowd, and Why?
^ Hirth, Matthias; Hoßfeld, Tobias; Tran-Gia, Phuoc. Anatomy of a
Crowdsourcing Platform - Using the Example of Microworkers.com
Archived 2015-11-22 at the Wayback Machine. 5th IEEE International
Conference on Innovative Mobile and
Internet Services in Ubiquitous
Computing (IMIS 2011), June 2011, doi:10.1109/IMIS.2011.89
^ a b Estellés-Arolas, Enrique; González-Ladrón-de-Guevara,
Fernando (2012), "Towards an Integrated Crowdsourcing Definition"
(PDF), Journal of Information Science, 38 (2): 189–200.
^ a b c Howe, Jeff (2006). "The Rise of Crowdsourcing". Wired.
^ Brabham, D. C. (2013). Crowdsourcing. Cambridge, Massachusetts;
London, England: The MIT Press.
^ Brabham, D. C. (2008). "Crowdsourcing as a Model for Problem
Solving: An Introduction and Cases". Convergence: The International Journal of
Research into New Media Technologies. 14 (1): 75–90.
^ Prpić, J., & Shukla, P. (2016).
Crowd Science: Measurements,
Models, and Methods. In Proceedings of the 49th Annual Hawaii
International Conference on System Sciences, Kauai, Hawaii: IEEE
^ Buettner, Ricardo (2015). A Systematic Literature Review of
Crowdsourcing Research from a Human Resource Management Perspective.
48th Annual Hawaii International Conference on System Sciences. Kauai,
Hawaii: IEEE. pp. 4609–4618. doi:10.13140/2.1.2061.1845.
^ a b Prpić, John; Taeihagh, Araz; Melton, James (September 2015).
"The Fundamentals of Policy Crowdsourcing". Policy & Internet. 7
(3): 340–361. doi:10.1002/poi3.102.
^ Schlagwein, Daniel; Bjørn-Andersen, Niels (2014), "Organizational
Learning with Crowdsourcing: The Revelatory Case of LEGO" (PDF),
Journal of the Association for Information Systems, 15 (11)
^ Taeihagh, Araz (2017-06-19). "Crowdsourcing, Sharing Economies, and
Development". Journal of Developing Societies: 0169796X1771007.
^ a b Guth, Kristen L.; Brabham, Daren C. (2017-08-04). "Finding the
diamond in the rough: Exploring communication and platform in
crowdsourcing performance". Communication Monographs. 0 (4): 1–24.
doi:10.1080/03637751.2017.1359748. ISSN 0363-7751.
^ Howe, Jeff (June 2, 2006). "Crowdsourcing: A Definition".
Crowdsourcing Blog. Retrieved January 2, 2013.
^ "Daren C. Brabham". USC Annenberg. University of Southern
California. Retrieved 17 September 2014.
^ a b c d e Brabham, Daren (2008), "Crowdsourcing as a Model for
Problem Solving: An Introduction and Cases" (PDF), Convergence: The
International Journal of Research into New Media Technologies, 14 (1):
75–90, doi:10.1177/1354856507084420, archived from the original
(PDF) on 2012-04-25
^ a b Afuah, A.; Tucci, C. L. (2012). "Crowdsourcing as a Solution to
Distant Search". Academy of Management Review. 37 (3): 355–375.
^ Vuković, M. (2009).
Crowdsourcing for enterprises. In Services-I,
2009 World Conference on (pp. 686-692). IEEE.
^ de Vreede, T., Nguyen, C., de Vreede, G. J., Boughzala, I., Oh, O.,
& Reiter-Palmon, R. (2013). A Theoretical Model of User Engagement
in Crowdsourcing. In Collaboration and Technology (pp. 94-109).
Springer Berlin Heidelberg
^ Claypole, Maurice (February 14, 2012). "Learning through
crowdsourcing is deaf to the language challenge". The Guardian.
^ a b c d e f g h "A Brief History of Crowdsourcing".
Crowdsourcing.org. 2012-03-18. Retrieved 2015-07-02.
^ Hearn, Chester G. (2002). Tracks in the Sea, pp. 123 & 246. McGraw
Hill. ISBN 0-07-136826-4.
^ "Paris en 1970". Etudesphotographiques.revues.org.
1970-04-25. Retrieved 2015-07-02.
^ "UNV Online
Volunteering Service History". Onlinevolunteering.org.
^ a b "Wired 14.06: The Rise of Crowdsourcing". Archive.wired.com.
2009-01-04. Retrieved 2015-07-02.
^ Lih, Andrew (2009). The Wikipedia Revolution: how a bunch of
nobodies created the world's greatest encyclopedia (1st ed.). New
York: Hyperion. ISBN 1401303714.
^ a b  Archived November 29, 2014, at the Wayback Machine.
^ "Antoine-Jean-Baptiste-Robert Auget, Baron de Montyon". New Advent.
Retrieved February 25, 2012.
^ "It Was All About Alkali". Chemistry Chronicles. Retrieved February
^ "Nicolas Appert". John Blamire. Retrieved February 25, 2012.
^ "9 Examples of Crowdsourcing, Before 'Crowdsourcing' Existed".
MemeBurn. Retrieved February 25, 2012.
^ Pande, Shamni. "The People Know Best". Business Today. India: Living
Media India Limited.
^ Vergano, Dan. "1833 Meteor Storm Started Citizen Science". National
Geographic. StarStruck. Retrieved 18 September 2014.
^ "Gateway to Astronaut Photography of Earth". NASA.
^ McLaughlin, Elliot. "Image Overload:
Help us sort it all out, NASA
requests". Cnn.com. CNN. Retrieved 18 September 2014.
^ Després, Jacques; Hadjsaid, Nouredine; Criqui, Patrick; Noirot,
Isabelle (1 February 2015). "Modelling the impacts of variable
renewable sources on the power sector: reconsidering the typology of
energy modelling tools". Energy. 80: 486–495.
doi:10.1016/j.energy.2014.12.005. ISSN 0360-5442.
^ "OpenEI — Energy Information, Data, and other Resources". OpenEI.
^ Garvin, Peggy (12 December 2009). "New Gateway: Open Energy Info".
SLA Government Information Division. Dayton, OH, USA. Retrieved
^ Brodt-Giles, Debbie (2012). WREF 2012:
OpenEI — an open energy
data and information exchange for international audiences (PDF).
Golden, CO, USA: National Renewable Energy Laboratory (NREL).
^ Davis, Chris; Chmieliauskas, Alfredas; Dijkema, Gerard; Nikolic,
Igor. "Enipedia". Delft, The Netherlands: Energy and Industry group,
Faculty of Technology, Policy and Management, TU Delft. Retrieved
^ Davis, Chris (2012). Making sense of open data: from raw data to
actionable insight — PhD thesis (PDF). Delft, The Netherlands: Delft
University of Technology. Retrieved 2016-07-21. Chapter 9
discusses in depth the initial development of Enipedia.
^ "What Is the Four-Generation Program?". The Church of Jesus Christ
of Latter-day Saints. Retrieved January 30, 2012.
^ Bonney, R. and LaBranche, M. (2004). Citizen Science: Involving the
Public in Research. ASTC Dimensions. May/June 2004, p. 13.
^ Baretto, C.; Fastovsky, D.; Sheehan, P. (2003). "A Model for
Integrating the Public into Scientific Research". Journal of
Geoscience Education. 50 (1): 71–75.
^ McCaffrey, R.E. (2005). "Using Citizen Science in Urban Bird
Studies". Urban Habitats. 3 (1): 70–86.
^ King, Turi E.; Jobling, Mark A. (2009). "What's in a name? Y
chromosomes, surnames and the genetic genealogy revolution". Trends in
Genetics. 25 (8): 351–60. doi:10.1016/j.tig.2009.06.003.
PMID 19665817. The International Society of Genetic Genealogy
(www.isogg.org) advocates the use of genetics as a tool for
genealogical research, and provides a support network for genetic
genealogists. It hosts the ISOGG Y-haplogroup tree, which has the
virtue of being regularly updated.
^ Mendez, Fernando; et al. (28 February 2013). "An African American
Paternal Lineage Adds an Extremely Ancient Root to the Human Y
Chromosome Phylogenetic Tree". The American Journal of Human Genetics.
The American Society of Human Genetics. 92: 454–459.
doi:10.1016/j.ajhg.2013.02.002. PMC 3591855.
PMID 23453668. Retrieved 10 July 2013.
^ Wells, Spencer (2013). "The
Genographic Project and the Rise of
Citizen Science". Southern California
Genealogical Society (SCGS).
Archived from the original on 2013-07-10. Retrieved July 10, 2013.
^ Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced
Journalism: Social Impact, Social Change and Peer-Learning".
International Journal of Communication. 9: 3523–3543.
^ Aitamurto, Tanja (2016). "Crowdsourcing as a Knowledge-Search Method
in Digital Journalism: Ruptured Ideals and Blended Responsibility".
Digital Journalism. 4: 280–297.
^ Aitamurto, Tanja (2013). "Balancing between open and closed:
co-creation in magazine journalism". Digital Journalism. 1 (2).
^ Keuleers; et al. (Feb 2015). "Word knowledge in the crowd: Measuring
vocabulary size and word prevalence in a massive online experiment".
Quarterly Journal of Experimental Psychology. 68: 1665–1692.
^ "History of the
Christmas Bird Count Audubon". Birds.audubon.org.
^  Archived August 24, 2014, at the Wayback Machine.
^ Aitamurto, Tanja (2012).
Crowdsourcing for Democracy: New Era In
Policy–Making. Committee for the Future, Parliament of Finland.
pp. 10–30. ISBN 978-951-53-3459-6.
^ Prpić, J.; Taeihagh, A.; Melton, J. (2014). "Crowdsourcing the
Policy Cycle. Collective Intelligence 2014, MIT Center for Collective
Intelligence" (PDF). Humancomputation.com. Retrieved 2015-07-02.
^ Prpić, J.; Taeihagh, A.; Melton, J. (2014). "A Framework for Policy
Crowdsourcing. Oxford Internet Institute, University of Oxford - IPP
2014 - Crowdsourcing for Politics and Policy" (PDF).
Ipp.oii.ox.ac.uk. Retrieved 2015-07-02.
^ Prpić, J.; Taeihagh, A.; Melton, J. (2014). "Experiments on
Crowdsourcing Policy Assessment. Oxford
Internet Institute, University
of Oxford - IPP 2014 -
Crowdsourcing for Politics and Policy" (PDF).
Ipp.oii.ox.ac.uk. Retrieved 2015-07-02.
^ Thapa, B.; Niehaves, B.; Seidel, C.; Plattfaut, R. (2015). "Citizen
involvement in public sector innovation: Government and citizen
perspectives". Information Polity. pp. 3–17.
^ Aitamurto and Landemore. "Five design principles for crowdsourced
policymaking: Assessing the case of crowdsourced off-road traffic law
reform in Finland". Journal of Social Media for Organizations (1).
^ a b Aitamurto, Landemore, Galli (2016). "Unmasking the Crowd:
Participants' Motivation Factors, Profile and Expectations for
Participation in Crowdsourced Policymaking". Information,
Communication & Society. 20: 1239–1260.
doi:10.1080/1369118x.2016.1228993 – via Routledge.
^ Aitamurto, Chen, Cherif, Galli and Santana (2016). "Making Sense of
Crowdsourced Civic Data with Big Data Tools". ACM Digital Archive:
Academic Mindtrek.
^ Aitamurto, Tanja.
Crowdsourcing for Democracy: New Era in
Policymaking. Committee for the Future, Parliament of Finland.
^ "Home - ISCRAM2015 - University of Agder" (PDF).
^ Andro, M. (2018). Digital libraries and crowdsourcing, Wiley / ISTE.
^ a b DeVun, Leah (November 19, 2009). "Looking at how crowds produce
and present art". Wired News. Archived from the original on
2012-10-24. Retrieved February 26, 2012.
^ Ess, Henk van "Crowdsourcing: how to find a crowd", ARD ZDF Akademie
2010, Berlin, p. 99.
^ a b c Doan, A.; Ramarkrishnan, R.; Halevy, A. (2011), "Crowdsourcing
Systems on the World Wide Web" (PDF), Communications of the ACM, 54
(4): 86–96, doi:10.1145/1924421.1924442.
^ Brabham, Daren C. (2013), Crowdsourcing, MIT Press.
^ Howe, Jeff (2008), "Crowdsourcing: Why the Power of the
Driving the Future of Business" (PDF), The International Achievement
^ Robson, John (February 24, 2012). "IEM Demonstrates the Political
Wisdom of Crowds". Canoe.ca. Retrieved March 31, 2012.
^ "4 Great Examples of
Crowdsourcing through Social Media".
^ Goldberg, Ken; Newsom, Gavin. "Let's amplify California's collective
intelligence". Citris-uc.org. Retrieved 14 June 2014.
^ Escoffier, N. and B. McKelvey (2014). "Using "Crowd-Wisdom Strategy"
to Co-Create Market Value: Proof-of-Concept from the Movie Industry."
in International Perspective on Business Innovation and Disruption in
the Creative Industries: Film, Video, Photography, P. Wikstrom and R.
DeFillippi, eds., UK: Edward Elgar Publishing Ltd, Chap. 11.
^ Block, A. B. (2009). "How boxoffice trading could flop." The
Hollywood Reporter, (April 22).
^ Chen, A. and R. Panaligan (2013). "Quantifying movie magic with
Google search." Google White Paper, Industry Perspectives+User
^ Williams, Jack (2017-02-17). "An Indoor Football Team Has Its Fans
Call the Plays". The New York Times. ISSN 0362-4331. Retrieved
^ Cunard, C. (2010). "The Movie Research Experience gets audiences
involved in filmmaking." The Daily Bruin, (July 19)
^ MacArthur, Kate. "Squadhelp wants your company to crowdsource better
names (and avoid Boaty McBoatface)". chicagotribune.com. Retrieved
^ "Compete To Create Your Dream Home". FastCoexist.com. June 4, 2013.
^ "Designers, clients forge ties on web". Boston Herald. June 11,
2012. Retrieved 2014-02-03.
^ Stan Nussbaum. 2003. Proverbial perspectives on pluralism.
Connections: the journal of the WEA Missions Committee October, pp.
^ "Oromo dictionary project". OromoDictionary.com. Retrieved
^ "Description of WeSay software and process" (PDF). Retrieved
^ "Developing ASL vocabulary for science and math". Washington.edu.
December 7, 2012. Retrieved 2014-02-03.
^ "Pashto Proverb Collection project". AfghanProverbs.com. Retrieved
^ "Comparing methods of collecting proverbs" (PDF). gial.edu.
^ Edward Zellem. 2014. Mataluna: 151 Afghan Pashto Proverbs. Tampa,
FL: Culture Direct.
^ "Web 2.0-based crowdsourcing for high-quality gold standard
development in clinical Natural Language Processing". Jmir.org.
doi:10.2196/jmir.2426. Retrieved 2014-02-03.
^ Geiger D, Rosemann M, Fielt E.
Crowdsourcing information systems: a
systems theory perspective. In Proceedings of the 22nd Australasian
Conference on Information Systems (ACIS 2011) 2011.
^ Lombard, Amy (May 5, 2013). "Crowdfynd: The First Place to Look".
TIME.com. Retrieved 2014-02-03.
^ Prive, Tanya. "What Is
Crowdfunding And How Does It Benefit The
Economy". Forbes.com. Retrieved 2015-07-02.
^ Choy, Katherine; Schlagwein, Daniel (2016), "Crowdsourcing for a
better world: On the relation between IT affordances and donor
motivations in charitable crowdfunding" (PDF), Information Technology
& People, 29 (1): 221–247, doi:10.1108/ITP-09-2014-0215
^ Barnett, Chance. "
Crowdfunding Sites In 2014". Forbes.com. Retrieved
^ a b c Agrawal, Ajay, Christian Catalini, and Avi Goldfarb. "Some
Simple Economics of Crowdfunding." National Bureau of Economic
Research (2014): 63-97.
^ "Mobile Crowdsourcing". Clickworker. Retrieved 10 December
^ Thebault-Spieker, Terveen, & Hecht. Avoiding the South Side and
the Suburbs: The Geography of Mobile Crowdsourcing Markets.
^ Chatzimiloudis, Konstantinidis & Laoudias, Zeinalipour-Yazti.
"Crowdsourcing with smartphones" (PDF).
^ MIST: Fog-based Data Analytics Scheme with Cost-Efficient Resource
Provisioning for IoT Crowdsensing Applications.
^ Yang, J.; Adamic, L.; Ackerman, M. (2008), "Crowdsourcing and
Knowledge Sharing: Strategic User Behavior on Taskcn" (PDF),
Proceedings of the 9th ACM Conference on Electronic Commerce
^ Gadiraju, U.; Kawase, R.; Dietze, S. (2014), "A Taxonomy of
Microtasks on the Web" (PDF), Proceedings of the 25th ACM Conference
on Hypertext and Social Media
^ Felstiner, Alek (August 2011). "Working the Crowd: Employment and
Labor Law in the
Crowdsourcing Industry" (PDF). BERKELEY JOURNAL OF
EMPLOYMENT & LABOR LAW. 32: 150–151 – via WTF.
^ "View of Crowdsourcing: Libertarian Panacea or Regulatory
Nightmare?". online-shc.com. Retrieved 2017-05-26.
^ Leimeister, J.M.; Huber, M.; Bretschneider, U.; Krcmar, H. (2009),
"Leveraging Crowdsourcing: Activation-Supporting Components for
IT-Based Ideas Competition", Journal of Management Information
Systems, 26 (1): 197–224, doi:10.2753/mis0742-1222260108
^ Ebner, W.; Leimeister, J.; Krcmar, H. (2009), "Community Engineering
for Innovations: The Ideas Competition as a method to nurture a
Virtual Community for Innovations", R&D Management, 39 (4).
^ "DARPA Network Challenge". DARPA Network Challenge. Archived from
the original on August 11, 2011. Retrieved November 28, 2011.
^ "Social media web snares 'criminals'". New Scientist. Retrieved
April 4, 2012.
^ "Beyond XPrize: The 10 Best
Crowdsourcing Tools and Technologies".
February 20, 2012. Retrieved March 30, 2012.
^ a b Kittur, A.; Chi, E.H.; Sun, B. (2008), "Crowdsourcing user
studies with Mechanical Turk" (PDF), CHI 2008
^ Tang, Weiming; Han, Larry; Best, John; Zhang, Ye; Mollan, Katie;
Kim, Julie; Liu, Fengying; Hudgens, Michael; Bayus, Barry (2016).
"Crowdsourcing HIV Test Promotion Videos: A
Noninferiority Randomized Controlled Trial in China". Clinical
Infectious Diseases. 62 (11): 1436–1442. doi:10.1093/cid/ciw171.
ISSN 1537-6591. PMC 4872295. PMID 27129465.
^ a b Zhang, Ye; Kim, Julie A.; Liu, Fengying; Tso, Lai Sze; Tang,
Weiming; Wei, Chongyi; Bayus, Barry L.; Tucker, Joseph D. (November
2015). "Creative Contributory Contests to Spur Innovation in Sexual
Health: 2 Cases and a Guide for Implementation". Sexually Transmitted
Diseases. 42 (11): 625–628. doi:10.1097/OLQ.0000000000000349.
ISSN 1537-4521. PMC 4610177. PMID 26462186.
^ van der Krieke; et al. (2015). "HowNutsAreTheDutch (HoeGekIsNL): A
crowdsourcing study of mental symptoms and strengths". International
Journal of Methods in Psychiatric Research. 25 (2): 123–144.
doi:10.1002/mpr.1495. PMID 26395198.
^ Prpić, J. (2015). "Health Care Crowds: Collective Intelligence in
Public Health. Collective Intelligence 2015. Center for the Study of
Complex Systems, University of Michigan". Papers.ssrn.com.
SSRN 2570593.
^ a b van der Krieke, L; Blaauw, FJ; Emerencia, AC; Schenk, HM;
Slaets, JP; Bos, EH; de Jonge, P; Jeronimus, BF (2016). "Temporal
Dynamics of Health and Well-Being: A
Crowdsourcing Approach to
Momentary Assessments and Automated Generation of Personalized
Feedback (2016)". Psychosomatic Medicine: 1.
doi:10.1097/PSY.0000000000000378. PMID 27551988.
^ Rahman, Mahbubur; Blackwell, Brenna; Banerjee, Nilanjan; Saraswat,
Dharmendra (2015), "Smartphone-based hierarchical crowdsourcing for weed
identification", Computers and Electronics in Agriculture: 14–23,
retrieved 12 August 2015
^ Primarily on the Bridge Winners website
^ Noveck, Beth Simone (2009),
Wiki Government: How Technology Can Make
Government Better, Democracy Stronger, and Citizens More Powerful,
Brookings Institution Press
^ Sarasua, Cristina; Simperl, Elena; Noy, Natalya F. (2012),
"Crowdsourcing Ontology Alignment with Microtasks" (PDF), Institute
AIFB. Karlsruhe Institute of Technology: 2
^ "Crowdfunding and Civic Society in Europe: A Profitable
Partnership?". Open Citizenship Journal. Retrieved April 29,
^ Federal Transit Administration Public Transportation Participation
Pilot Program, U.S. Department of Transportation, archived from the
original on January 7, 2009
^ Peer-to-Patent Community Patent Review Project, Peer-to-Patent
Community Patent Review Project
^ Callison-Burch, C.; Dredze, M. (2010), "Creating Speech and Language
Data With Amazon's Mechanical Turk" (PDF), Human Language Technologies
^ McGraw, I.; Seneff, S. (2011), "Growing a Spoken Language Interface
on Amazon Mechanical Turk" (PDF), Interspeech: 3057–3060
^ a b c Mason, W.; Suri, S. (2010), "Conducting Behavioral Research on
Amazon's Mechanical Turk", Behavior Research Methods.
^ Koblin, A. (2008), "The sheep market", Creativity and
^ "explodingdog 2015". Explodingdog.com. Retrieved 2015-07-02.
^ Linver, D. (2010),
Crowdsourcing and the Evolving Relationship
between Art and Artist
^ "Why". INRIX.com. 2014-09-13. Retrieved 2015-07-02.
^ a b Ross, J.; Irani, L.; Silberman, M.S.; Zaldivar, A.; Tomlinson,
B. (2010). "Who are the Crowdworkers? Shifting Demographics in
Mechanical Turk" (PDF). CHI 2010. Archived from the original (PDF) on
^ Hirth, M.; Hoßfeld, T.; Tran-Gia, P. (2011), Human Cloud as Emerging
Internet Application – Anatomy of the Microworkers Crowdsourcing
Platform (PDF)
^ a b c Brabham, Daren C. (2008). "Moving the
Crowd at iStockphoto:
The Composition of the
Crowd and Motivations for Participation in a
Crowdsourcing Application". First Monday.
^ a b c Lakhani; et al. (2007). "The Value of Openness in Scientific
Problem Solving" (PDF). Retrieved February 26, 2012.
^ Brabham, Daren C. (2012). "Managing Unexpected Publics Online: The
Challenge of Targeting Specific Groups with the Wide-Reaching Tool of
the Internet". International Journal of Communication.
^ a b Brabham, Daren C. (2010). "Moving the
Crowd at Threadless:
Motivations for Participation in a
Information, Communication & Society. 13: 1122–1145.
^ a b Brabham, Daren C. (2012). "The Myth of
Amateur Crowds: A
Critical Discourse Analysis of
Crowdsourcing Coverage". Information,
Communication & Society. 15: 394–410.
^ Saxton, Oh, & Kishore (2013). "Rules of Crowdsourcing: Models,
Issues, and Systems of Control". Information Systems Management. 30.
^ a b Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced
Journalism: Social Impact, Social Change, and Peer Learning".
International Journal of Communication. 9: 3523–3543.
^ a b Kaufmann, N.; Schulze, T.; Veit, D. (2011). "More than fun and
money. Worker Motivation in Crowdsourcing – A Study on Mechanical
Turk" (PDF). Proceedings of the Seventeenth Americas Conference on
Information Systems. Archived from the original (PDF) on
^ Brabham, Daren C. (2012). "Motivations for Participation in a
Crowdsourcing Application to Improve Public Engagement in Transit
Planning". Journal of Applied Communication Research. 40: 307–328.
^ Lietsala, Katri; Joutsen, Atte (2007). "Hang-a-rounds and True
Believers: A Case Analysis of the Roles and Motivational Factors of
the Star Wreck Fans". MindTrek 2007 Conference Proceedings.
^ "State of the World's Volunteerism Report 2011" (PDF). Unv.org.
Archived from the original (PDF) on 2014-12-02. Retrieved
^ Chandler, D.; Kapelner, A. (2010). "Breaking Monotony with Meaning:
Crowdsourcing Markets" (PDF).
^ Aparicio, M.; Costa, C.; Braga, A. (2012). "Proposing a system to
support crowdsourcing" (PDF). OSDOC '12 Proceedings of the Workshop on
Open Source and Design of Communication.
^ Aitamurto, Landemore, Galli (2016). "Unmasking the Crowd:
Participants' Motivation Factors, Expectations, and Profile in a
Crowdsourced Law Reform". Information, Communication &
Society. CS1 maint: Multiple names: authors list (link)
^ Quinn, Alexander J.; Bederson, Benjamin B. (2011). "Human
Computation: A Survey and Taxonomy of a Growing Field, CHI 2011
[Computer Human Interaction conference], May 7–12, 2011, Vancouver,
BC, Canada" (PDF). Retrieved 30 June 2015.
^ a b Paolacci, G; Chandler, J; Ipeirotis, P.G. (2010). "Running
experiments on Amazon Mechanical Turk". Judgment and Decision Making.
5 (5): 411–419.
^ Prpić, J; Shukla, P.; Roth, Y.; Lemoine, J.F. (2015). "A Geography
of Participation in IT-Mediated Crowds". Proceedings of the Hawaii
International Conference on Systems Sciences 2015.
SSRN 2494537.
^ a b Borst, Irma. "The Case For and Against Crowdsourcing: Part 2".
^ Ipeirotis; Provost; Wang (2010). "Quality Management on Amazon
Mechanical Turk" (PDF).
^ Lukyanenko, Roman; Parsons, Jeffrey; Wiersma, Yolanda (2014). "The
IQ of the Crowd: Understanding and Improving Information Quality in
Structured User-Generated Content". Information Systems Research. 25
(4): 669–689. doi:10.1287/isre.2014.0537.
^ Burnap, Alex; Ren, Alex J.; Papazoglou, Giannis; Gerth, Richard;
Gonzalez, Richard; Papalambros, Panos. "When
Crowdsourcing Fails: A
Study of Expertise on Crowdsourced Design Evaluation" (PDF).
^ Kurve, Aditya; Miller, David J.; Kesidis, George (30 May 2014).
"Multicategory Crowdsourcing Accounting for Variable Task Difficulty,
Worker Skill, and Worker Intention". IEEE Transactions on Knowledge
and Data Engineering (99).
^ Hirth; Hoßfeld; Tran-Gia (2011), Human Cloud as Emerging Internet
Application - Anatomy of the Microworkers Crowdsourcing Platform.
^ Ipeirotis (2010). "Analyzing the Amazon Mechanical Turk
Marketplace". XRDS: Crossroads, The ACM Magazine for Students
(PDF)format= requires url= (help). ACM. 17 (2).
doi:10.1145/1870000/1869094. SSRN 1688194 .
^ a b Hosaka, Tomoko A. (April 2008). "Facebook asks users to
translate for free". MSNBC.
^ Britt, Darice. "Crowdsourcing: The Debate Roars On". Retrieved
^ Woods, Dan (28 September 2009). "The Myth of Crowdsourcing". Forbes.
^ a b "The Promise of Idea Crowdsourcing: Benefits, Contexts,
Limitations Tanja Aitamurto". Academia.edu. 1970-01-01. Retrieved
^ "International Translators Association Launched in Argentina". Latin
American Herald Tribune. Retrieved 23 November 2016.
^ Kleemann, Frank (2008). "Un(der)paid Innovators: The Commercial
Utilization of Consumer Work through Crowdsourcing". Sti-studies.de.
^ Jason (2011). "Crowdsourcing: A Million Heads is Better Than One".
Crowdsourcing.org. Retrieved 2015-07-02.
^ Dupree, Steven (2014). "
Crowdfunding 101: Pros and Cons".
Gsb.stanford.edu. Retrieved 2015-07-02.
^ "Fair Labor Standards Act Advisor". Retrieved 28 February
^ Greg Norcie, 2011, "Ethical and practical considerations for
compensation of crowdsourced research participants," CHI WS on Ethics
Logs and VideoTape: Ethics in Large Scale Trials & User Generated
Content, accessed 30 June 2015.
^ Busarovs, Aleksejs (2013). "Ethical Aspects of Crowdsourcing, or is
it a Modern Form of Exploitation" (PDF). International Journal of
Economics & Business Administration. 1 (1): 3–14. Retrieved 26
^ Graham, Mark; Hjorth, Isis; Lehdonvirta, Vili (2017-05-01). "Digital
labour and development: impacts of global digital labour platforms and
the gig economy on worker livelihoods". Transfer: European Review of
Labour and Research. 23 (2): 135–162. doi:10.1177/1024258916687250.
^ "The Crowdsourcing Scam" (Dec. 2014), The Baffler, No. 26.
^ Salehi; et al. (2015). "We Are Dynamo: Overcoming Stalling and
Friction in Collective Action for
Crowd Workers" (PDF). Retrieved June
External links
Crowdsourcing at Wikibooks
Media related to Crowdsourcing at Wikimedia Commons