Crowdsourcing: New Trends and Developments (Part 1)
04/09/2018
Back in mid-2015, we published a brief on crowdsourcing, “Clarifying Crowdsourcing: Contingent and Services Procurement Examples, Definition and Analysis,” to provide procurement professionals with a clearer understanding of this valuable, innovative tool kit for business problem solving. We’re not sure how many practitioners took note at the time or, if they did, what they might have done about it. In any case, after a three-year hiatus, we’re back to review recent trends and developments.
Crowdsourcing, properly speaking, is still neither highly visible nor well understood, except by the managers who use it. Yet in some ways it is also becoming a mainstream sourcing practice. Its use within large enterprises continues to grow, and the range of real, applied solutions crowdsourcing provides is expanding. And as crowdsourcing becomes a new normal, what it is and how enterprises use it also continue to change.
This two-part Spend Matters Plus series explores what forms crowdsourcing is taking in 2018 and how this approach could continue to evolve. Part 1 provides a definition of crowdsourcing (and what it is not), as well as examples of how companies are currently using crowdsourcing platforms. Part 2 looks at how procurement functions are managing the use of crowdsourcing within their organizations and potential new ways in which they could soon be using the technology.
Crowdsourcing: What It Is and What It’s Not
In our 2015 brief, we distinguished crowdsourcing platforms from online work marketplace platforms, where businesses and individual workers “find one another and agree to terms of a specific engagement and a project to be performed.” Examples of providers offering such online marketplaces include Upwork, Freelancer.com, Catalant, Fiverr and Shiftgig. Marketplace platforms essentially help a business find the right worker for a job and engage him or her to perform it, while also facilitating all or part of each engagement.
Crowdsourcing is something different. It’s not about engaging a particular worker to perform a job. Rather, it’s about paying for various kinds of outcomes, deliverables or solutions produced from the responses or activities of members of a crowd. And since crowds can range in size from a hundred to hundreds of thousands of members, a sophisticated, fit-for-purpose technology-based platform is always required to “get the job done.”
While one could oversimplify and say there are two basic types of crowdsourcing models (i.e., (a) microtasking and (b) bounty/contest/challenge), we believe it best to avoid the dichotomy and describe the different dimensions along which crowdsourcing models can vary. What we said in 2015 still applies today:
- Tasks to be performed directly versus competitions. Some crowdsourcing involves presenting a task to be performed directly by crowd respondents (i.e., the task can be broken down or reorganized by the platform). Other crowdsourcing involves presenting the task to initiate a competition that will lead to a winning submission. For example, a business’s data set requiring tagging can be broken down and distributed as microtasks to workers through Amazon Mechanical Turk (a minimal sketch of this microtask pattern appears after this list), or a business in need of a logo design can use 99Designs to run a contest to find the desired design.
- Low-knowledge activities versus high-knowledge activities. Some crowdsourcing requires little knowledge; most data processing, for example, tends to be broken down into trivial microtasks. Other crowdsourcing requires higher levels of knowledge or talent, ranging from software development to molecular biology. See the Mechanical Turk example above, or consider a business grappling with a complex problem that uses Innocentive to run a crowd challenge program to find a solution.
- Open crowd versus vetted crowd. Some crowdsourcing engages a crowd without specific required characteristics. (It is rare to encounter a public crowd that volunteers its contributions.) Other crowdsourcing engages a crowd that must meet certain specifications, and in some cases the crowdsourcing solution provider may provide training. For example, Amazon Mechanical Turk’s crowd is largely open, though workers become vetted algorithmically over time. Applause claims a vetted crowd of 800,000 software testers, and these too become further vetted and “promoted” over time. In general, most crowdsourcing platforms offer some degree of vetting and attribution.
- Unmanaged process versus managed process. In some crowdsourcing models, the digital platform is used on a self-service basis. In other cases, the platform provider can offer a range of services, including vetting of a specific crowd, organization and execution of the process, or even full-service outsourcing. For example, 99Designs graphic design contests are almost always self-service.
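To make the microtasking pattern in the first bullet concrete, below is a minimal, hypothetical Python sketch of how a tagging job might be split into per-record microtasks and posted to Amazon Mechanical Turk through the AWS boto3 SDK. The records, reward amount, redundancy settings and question markup are illustrative assumptions rather than a prescription; a real deployment would also handle answer collection, quality checks and worker qualifications.

```python
import boto3

# Hypothetical records that need tagging; in practice this would be a large data set.
records = ["image_001.jpg", "image_002.jpg", "image_003.jpg"]

# Connect to the Mechanical Turk requester sandbox (swap the endpoint for production use).
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# Simplified HTMLQuestion payload. The form body is illustrative only; a production
# template wires up Mechanical Turk's answer-submission markup.
QUESTION_TEMPLATE = """\
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body><p>Provide a short descriptive tag for: {item}</p></body></html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>
"""

# Each record becomes one microtask (a HIT). Asking several workers per record lets
# the buyer cross-check answers for agreement before accepting them.
for item in records:
    hit = mturk.create_hit(
        Title="Tag one data record",
        Description="Provide a short descriptive tag for a single item.",
        Keywords="tagging, microtask",
        Reward="0.05",                    # payment per assignment, in USD
        MaxAssignments=3,                 # three independent workers per record
        AssignmentDurationInSeconds=600,  # time allowed per assignment
        LifetimeInSeconds=86400,          # how long the HIT stays available
        Question=QUESTION_TEMPLATE.format(item=item),
    )
    print("Created HIT", hit["HIT"]["HITId"], "for", item)
```

Note that in this model the buyer pays per completed microtask rather than for any particular worker’s time, which is what distinguishes it from an online work marketplace engagement.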
These are some examples of various crowdsourcing use cases and some platform providers that support them:
| Use case | Example platform provider |
| --- | --- |
| Large data set processing | ClickWorkers |
| Simple graphic designs | DesignCrowd |
| Language content translation | Unbabel |
| Software testing | Bugcrowd |
| Software development | TopCoder (Wipro) |
| Data analytics/modeling | Kaggle (Google) |
| Creative branding | Zooppa |
| Innovative ideas/solutions | IdeaConnection |
Today, all of these platform providers (as well as many more) are providing these services to a growing number of enterprise and public sector clients. As noted in the table above, since our last brief in 2015, two changes have occurred: Topcoder is now part of Wipro and Kaggle is now part of Google (we would regard both acquisitions as a validation of crowdsourcing).
Large Organizations Are Going for Crowdsourcing
While there is no real industry research on the number of large organizations using crowdsourcing, evidence of its continued and increasing use can be found in many places. For example, current and past crowdsourcing challenges (and often their originating organizations) are visible within the crowdsourcing platforms themselves. Below are some samples of challenges (and some prize amounts) at two established crowdsourcing platform providers:
Innocentive
Kaggle
One example from Kaggle is Zillow’s $1 million award challenge to build an improved home value prediction algorithm.
Clearly, corporate crowdsourcing challenges can be big-ticket items. At HeroX, we found another $1 million prize challenge, this one launched by Coca-Cola to develop “a natural, safe, low/reduced-calorie compound that tastes like sugar when used in beverages and foods.”
Use of challenge crowdsourcing by the U.S. federal government continues to grow as well. NASA and the DOE have been prodigious users of crowdsourcing in recent years, and adoption has been spreading across agencies. The January 2018 launch of a $200,000 challenge by IARPA provides a good illustration.
These are just a sample of actual crowdsourcing challenges, but they suggest that crowdsourcing is already “serious business” at many large organizations.
In Part 1 of this two-part series, we provided an overview of crowdsourcing, defining what it is and how it differs from online freelancer marketplaces. We not only provided examples of different crowdsourcing platform providers (of which there are many) but also offered illustrations of real crowdsourcing in action.
In Part 2 of this series, we will cover the emergence of practices and functions to effectively manage crowdsourcing across organizations and some of the segments where crowdsourcing has grown both on the demand (buyer organizations) and supply (platform providers) side. We will also look at how the space is evolving and provide some highlights and suggestions for practitioners.