Be careful when crowdsourcing data: The crowd could give you something you don't want


Crowdsourcing is a great concept in theory, but it doesn't always work out in practice. Just ask the namers of Boaty McBoatface.


You've likely been in a meeting or two in the last few years where someone referenced the "wisdom of the crowd," suggesting that a crowdsourcing technique might be the answer to the problem at hand. In essence, crowdsourcing is a fancy term for the old idea that two heads are better than one, and covers the techniques and technologies for having multiple people perform a task or provide input on a given stimulus.


Crowdsourcing as a business tool has gained traction in recent years. It's effective for tasks like software development, especially for web designs or social media features used by thousands or even millions of people, who essentially constitute a built-in crowd. Rather than invest the time to perform extensive research, a company can A/B test a new feature or design element, randomly presenting different options to large pools of users. Whichever option achieves the desired response from this crowd wins and is adopted for all users.
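
As a minimal sketch of those mechanics (Python, with hypothetical variant names and made-up events; a real experiment would use dedicated tooling and statistical significance tests), an A/B tally might look like this:

```python
import hashlib
from collections import defaultdict

VARIANTS = ["design_a", "design_b"]  # the two candidate designs under test

def assign_variant(user_id: str) -> str:
    """Bucket each user deterministically so they always see the same design."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return VARIANTS[digest % len(VARIANTS)]

def conversion_rates(events):
    """events: iterable of (user_id, converted) pairs collected from live traffic."""
    shown = defaultdict(int)
    converted = defaultdict(int)
    for user_id, did_convert in events:
        variant = assign_variant(user_id)
        shown[variant] += 1
        converted[variant] += int(did_convert)
    # The variant with the higher conversion rate "wins" and ships to everyone.
    return {v: converted[v] / shown[v] for v in shown}

# Toy traffic; in practice these events stream in from real users.
events = [("user42", True), ("user43", False), ("user44", True), ("user45", False)]
print(conversion_rates(events))
```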

More sophisticated examples of crowdsourcing have been deployed for everything from labor arbitrage, where a complex task is broken into pieces and farmed out to a "crowd" of freelancers for completion, to predicting the outcome of sporting events, on the premise that a large collection of people has better combined predictive ability than one or two smart individuals. Used properly, crowdsourcing can also be an effective tool for designing new products or services, allowing different features and functions, pricing strategies, and marketing techniques to be quickly tested at scale.
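
The sporting-events premise fits in a few lines. This is a hedged sketch with invented guesses, not a real forecasting model:

```python
import statistics

# "Wisdom of the crowd" in miniature: aggregate many independent guesses.
# The median is a common choice because it resists a few wild outliers.
predictions = [38, 45, 52, 41, 47, 39, 44]  # seven fans' guesses at total points

crowd_forecast = statistics.median(predictions)
print(crowd_forecast)  # 44 -- often closer to the truth than most single guesses
```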

When crowdsourcing goes awry

Perhaps the most famous incident of crowdsourcing gone wrong was a 2016 online poll run by the UK's Natural Environment Research Council to name an expensive new polar research ship. The winning name, Boaty McBoatface, was ultimately vetoed in favor of the far less interesting RRS Sir David Attenborough (the crowd's choice was relegated to one of the ship's submersibles), creating public controversy: the "wisdom of the crowd" had been solicited in very public fashion, only to be summarily ignored.


Some editorialists suggested that the crowd was indeed "wise": the vote reflected public sentiment that polar scientists and government officials assigned to naming expensive things shouldn't take themselves so seriously, and that governments and large organizations shouldn't ask the public for an answer they aren't prepared to accept.

The less-nuanced lesson from Boaty McBoatface is that the crowd may not always be "wise" and may have a very different agenda from your own. Furthermore, assembled in an anonymous mass, the crowd may not act as a pool of rational individuals but as a discrete organism that operates under very different rules. This is not a new lesson: throughout human history, crowds have behaved irrationally, often led by a charismatic individual to do things collectively that their members would never consider individually.

Using the crowd the right way

While crowdsourcing shouldn't be relied on by itself, it can be a useful tool in conjunction with other techniques for gathering input. The crowd is at its best when providing directional guidance between a limited set of options. For example, if you're designing a new financial dashboard, presenting two or three options to a "crowd" from your finance department and observing how they interact with each one can be far more effective than hundreds of hours of research and interviews.
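
A hypothetical sketch of tallying that directional guidance (invented option names and engagement data; any real study would capture richer signals than raw counts):

```python
from collections import Counter

# Each event records which dashboard option a finance user chose to work with.
# Counting engagements gives directional guidance between a fixed set of options.
engagement_events = [
    "dashboard_a", "dashboard_b", "dashboard_b",
    "dashboard_c", "dashboard_b", "dashboard_a",
]

tally = Counter(engagement_events)
winner, count = tally.most_common(1)[0]
print(f"{winner} led with {count} of {len(engagement_events)} engagements")
```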


The hardest parts of using crowdsourcing are designing effective experiments and knowing when to replace or augment crowdsourced data with other sources of input. As the Boaty McBoatface debacle shows, poorly defined or open-ended questions are typically not the best problems to solve with crowdsourcing. Restructuring that poll to be less open-ended (say, a vote among a shortlist of pre-approved names) may not have garnered as much public interest, but it would likely have produced a more "acceptable" result.

Similarly, crowdsourcing may provide guidance without identifying the root cause. Consider a survey submitted to a crowd of a restaurant's past customers. The results might show that the crowd loved the food, found the prices reasonable, and rated the service excellent, yet would not return to the restaurant. Supplementing this confusing result with an interview or two might quickly reveal that poor parking availability or difficulty securing a reservation was hindering the restaurant's success: challenges that surface in long-form individual interviews with a far smaller sample, not in crowdsourced, generalized feedback.

Before betting the outcome of a key decision on the purported wisdom of the crowd, apply some of your own wisdom to ensure crowdsourcing is the right tool, and one of several you're applying to the problem at hand. It can be effective in a limited context, but deployed carelessly or improperly, it might leave you with the next Boaty McBoatface.
