Nielsen’s usability heuristics contain many principles relevant to crowdsourcing projects, including: keeping users informed of the system status through appropriate feedback; speaking the users’ language; preventing errors; supporting recovery when errors do occur; following platform conventions; minimising memory load by making actions and options visible; and (where necessary) providing concrete instructions that focus on the users’ task (Nielsen, 1995).37
In design principles specific to crowdsourcing, task ‘size’ can be measured in terms of the amount of source material to process, the time per task, modularity (whether tasks are independent and asynchronous) and cognitive load (roughly, the amount of mental effort required).38 Research has found that microtasks lead to fewer mistakes and an ‘easier’ experience.39 They provide opportunities to learn the skills required for more complex tasks but are easier for novices to complete. If you have to design macro- or more specialist tasks, ensure that motivational text and recruitment are strong enough to match their size or complexity. Finding the sweet spot between tasks that are likely to attract participants, provide useful data and are possible within the available resources can require some creativity.
Most crowdsourcing projects report that up to 80–90% of the work is done by 10% of participants, with many other participants contributing a small amount each.40 Given the role ‘super-contributors’ play in a project’s productivity, it could be tempting to optimise designs for their needs, but projects must cater for both casual and super contributors.
Documentation and tutorials
Ideally, interactive tutorials would show new participants how to complete the task successfully while letting them try it, rather than read about it, but the user experience design and technical resources required to do so are rarely available;41 instead, many tutorials appear as modal windows overlaid on the task window.42 However, many users automatically close tutorials without reading or watching them, so it is important to have a visible link to a Help page that includes the tutorial and/or more detailed documentation.
Help text, whether on the task interface or a separate page, should help reassure potential participants by anticipating and answering their questions. It should be clear and unambiguous, and available at the point at which it is needed (Nielsen, 1995), address ‘boundary cases’, and ideally provide examples of what is expected (Kittur et al., 2013). Balancing the need for simplicity with the need for flexibility is a challenge for projects working with materials that may contain unexpected or inconsistent information. Producing good tutorials and documentation can take several iterations. Including tutorials and help text in usability testing will highlight issues, and test participants may provide more user-friendly alternatives for language used.
Quality control: validation and verification systems
Even the most highly skilled and well-intentioned volunteer makes occasional mistakes, and crowdsourcing projects usually carefully check the information they receive. Most methods involve comparing two or more task results for the same source against each other, with a simple ‘majority rules’ decision to accept the most common answer. The most appropriate method for reaching consensus will depend on the material, even for ‘type what you see’ tasks, where small differences in punctuation may make transcriptions fail ‘exact match’ tests. Ben Brumfield has provided a useful overview of quality control methods for transcription (Brumfield, 2012a). Verifying tags is difficult to do automatically without excluding potentially valuable unique tags from contributors with specialist knowledge,43 but verification tasks can help.44
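The ‘majority rules’ approach described above can be sketched in a few lines of code. This is an illustrative example only, not the implementation used by any particular project: the `normalise` function, the agreement threshold and the sample transcriptions are all hypothetical, and real projects tune normalisation (punctuation, capitalisation, whitespace) to their own material so that trivial differences do not cause ‘exact match’ failures.

```python
from collections import Counter
import string

def normalise(transcription: str) -> str:
    """Lower-case and strip punctuation so trivial differences
    (e.g. a stray comma) do not cause an 'exact match' failure."""
    return transcription.lower().translate(
        str.maketrans("", "", string.punctuation)
    ).strip()

def majority_vote(results, min_agreement=2):
    """Return the most common normalised answer if at least
    `min_agreement` contributors agree; otherwise return None
    to flag the item for expert review."""
    counts = Counter(normalise(r) for r in results)
    answer, votes = counts.most_common(1)[0]
    if votes >= min_agreement:
        return answer
    return None  # no consensus - send for review

# Three volunteers transcribe the same line; two agree once
# punctuation and capitalisation are normalised away.
print(majority_vote(["Mill Lane, 1887", "Mill lane 1887", "Hill Lane 1887"]))
```

Note the trade-off the text describes: aggressive normalisation reduces false disagreements, but comparing normalised strings while storing the original transcriptions preserves detail that may matter later.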
Rewards and recognition
Public recognition of volunteer contributions is important, and can be built into many points of the project interface and communications. Some projects name contributors in project updates45 or list them as co-authors on journal articles.46 Describing, or even better, showing the impact of contributions towards a project’s goals can powerfully link to participant motivations (Rotman et al., 2012).
Metrics for recognition should be chosen carefully. Ben Brumfield has a story that illustrates the dangers of external motivators like leaderboards, where contributors may focus on aspects that are quantified on a leaderboard at the expense of more important but unquantified tasks (Brumfield, 2012b).
Running crowdsourcing projects
The key challenge in running a project is motivating continued participation. In this section I discuss expectations around launching projects and the effect of media stories, consider the role of participant discussion and ongoing communications, and address maintaining participation and planning for a ‘graceful exit’.
Participatory projects can be challenging for organisations used to ‘launch and forget it’ exhibitions and publications. Ideally, iterative design processes will continue after launch. Participants tend to have creative ideas for new tasks,47 suggest sensible tweaks to existing tasks and text, and report bugs. Over the longer term, an interface that looks good in 2019 may look dated in 2022, or you may want to take advantage of emerging technologies. It is important to allow resources for post-launch work.