In Situ Supervision by Professionals
Scientists and survey monitors participated in the sampling activity in 43 % of the examined citizen-science studies. These professionals were responsible for ensuring accurate debris classification and data recording, and for identifying debris items that volunteers had missed or overlooked (e.g. Ribic et al. 2011, 2012a). For example, in a study from South Korea on the impacts of marine debris on wildlife, experts from wildlife, nature and marine research institutes provided data quality assurance on a voluntary basis, contributing photographs of dissections or autoradiographs to demonstrate how animals were affected by the debris (Hong et al. 2013).
Validation of Data and Samples
Citizen-science studies can also incorporate a validation process in which the data gathered by volunteers are compared to data obtained by professional scientists. This comparative approach was applied in 18 % of the studies. For example, Rosevelt et al. (2013) evaluated the quality of citizen-science data by re-counting the litter items, using a microscope to differentiate between biological and synthetic litter. Similarly, Hidalgo-Ruz and Thiel (2013) recounted small plastic particles in samples that had been counted by citizen scientists; in one case, glass shards had been misidentified as small plastic debris. Eliminating samples with this kind of obvious error from the analysis can substantially improve data quality. According to Lindborg et al. (2012), citizen scientists can dissect and analyze seabird boluses with high accuracy, yielding contamination rates similar to those obtained by professional scientists. Validation can also be done by scientists analyzing photographs of samples taken by volunteers (Moore et al. 2009). Technological equipment can be used to generate complementary data: for instance, Seino et al. (2009) used high-frequency ocean radar, airplanes and balloons to take photographs of marine litter, which complemented the data collected by volunteers. Data quality control can also entail the elimination of erroneous data. In a user survey on beach littering, Eastman et al. (2013) explicitly reported the data that were excluded from further analyses. These data came from erroneous, nonsensical or incomplete surveys, such as surveys completed by children too young to answer accurately, or from locations whose characteristics differed from the main surveyed area (Eastman et al. 2013).
The remaining 45 % of citizen-science studies had no data quality control. In some of these studies, no specific validation step may have been necessary because volunteers only gathered qualitative data during beach cleanup activities (n = 11) or only participated in opportunistic sightings and in the collection of dead animals, bird boluses, pellets and drifter buoys found on beaches (n = 5). None of the professional studies examined herein explicitly mentioned data quality control.