Citizen science can be defined as “public participation in organized research efforts” (Dickinson and Bonney 2012), where participants are in most cases involved in data collection or analysis (Bonney et al. 2009a). Citizen science projects can benefit both researchers and participants, as participants may learn or get excited about science and researchers are given an opportunity to collect data or conduct analyses with the help of many volunteers (Bonney et al. 2009a; Riesch, Potter, and Davies 2013). Citizen science projects have been developed within a wide range of scientific disciplines, including projects for which participants help with monitoring biodiversity (Bell et al. 2008; Hobbs and White 2012), transcribing old documents (Causer and Wallace 2012; Eveleigh et al. 2014), or classifying images (Raddick et al. 2013).
While many papers, statements, and guidelines emphasize the importance of providing feedback and acknowledgement to citizen science participants (Bowser et al. 2013; Domroese and Johnson 2017; Jennet et al. 2016), not many papers have examined the extent to which citizen scientists themselves find feedback important. This review gathers available evidence about citizen scientists’ preferences for communication of citizen science project outputs.
Because the sustainability of citizen science projects is dependent upon continued public participation, it is important to take motivational factors of participants into account when designing citizen science projects (Nov, Arazy, and Anderson 2014). Participants often indicate that they are motivated to engage in citizen science because of an interest in a project’s topic (Causer and Wallace 2012; Eveleigh et al. 2014; Hobbs and White 2012; Iacovides et al. 2013; Raddick et al. 2010; Raddick et al. 2013; Seeberger 2014) or in science in general (Iacovides et al. 2013; Jennet et al. 2016; Land-Zandstra et al. 2016a; Land-Zandstra et al. 2016b; Raddick et al. 2010). Participants are also motivated to contribute because they want to learn something new (Alender 2016; Martin et al. 2016; Raddick et al. 2010; Rotman et al. 2014; Seeberger 2014). One of the most often named motivations of participants is to contribute to science or scientific knowledge (Alender 2016; Cappa et al. 2016; Cooper et al. 2010; Curtis 2015; Domroese and Johnson 2017; Evans et al. 2005; Martin 2017; Raddick et al. 2010; Raddick et al. 2013; Tinati et al. 2017). Because contribution to science is such a widely shared motivation for participants, satisfying this motivation is important.
To align with their motivation to contribute to science, participants find it important that the significance of their contributions is clearly communicated (Bowser et al. 2013; Domroese and Johnson 2017; Jennet et al. 2016). Participants’ contribution to research can be recognized by communicating project output during the project as well as at its conclusion. Such communication acknowledges participants and treats them as collaborators with professional scientists, not only as means to an end (Eitzel et al. 2017; Fernandez, Kodish and Weijer 2003; Shalowitz and Miller 2008).
In collaborative or co-created projects (Bonney et al. 2009a), participants may already be engaged in data analysis and thus aware of the project’s results. Therefore, disseminating scientific output may be most important for contributory projects, for which participants may not automatically have access to data and findings.
Participants’ need for communication of project output can be explained by Vroom’s Expectancy Theory (Vroom 1964), which has been applied to understand motivational factors for different types of behavior, including engagement in volunteer tourism (Andereck et al. 2012) and alumni giving to their alma mater (Weerts and Ronca 2007). The theory describes three components of motivation: expectancy, instrumentality, and valence (see Figure 1). Expectancy is a person’s perceived probability that a certain effort will lead to successful performance, and is based on having the right resources, skills, and support to perform the task at hand. Instrumentality is a person’s perceived probability that successful performance will lead to an outcome, and is concerned with receiving some reward as a result of performance. Valence describes how desirable a person finds that outcome.
Vroom’s Expectancy Theory can also be applied to motivational factors for engaging in citizen science. If we define performance as making a contribution to a project, then expectancy relates to participants’ belief that they can make such a contribution. This belief rests on self-efficacy for participation, which can be supported by ensuring that participants have enough knowledge to perform the task at hand, providing a clear user interface, and giving sufficient information about how to submit a contribution. This performance can have multiple outcomes, but one of the most prominent reasons to engage in citizen science is to contribute to science; if this is defined as the outcome, then instrumentality means that participants perceive that their contribution will actually contribute to science. While the desirability of this outcome (valence) can differ from person to person, we can conclude that it is generally positive.
Because instrumentality is one of the three key components of motivation, it is important to take it into account when designing a citizen science project. One way to strengthen instrumentality is to communicate scientific output to participants, which enables participants to perceive that their contribution actually leads to a scientific outcome.
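The multiplicative structure of Vroom’s model can be sketched in a few lines of code. The numeric values below are purely illustrative assumptions, not figures drawn from any of the reviewed studies; the sketch only shows why a low instrumentality score suppresses overall motivational force even when expectancy and valence are high.

```python
# Illustrative sketch of Vroom's multiplicative model (Vroom 1964):
# motivational force = expectancy x instrumentality x valence.
# All numeric values are hypothetical.

def motivational_force(expectancy, instrumentality, valence):
    """expectancy and instrumentality are perceived probabilities in [0, 1];
    valence is the desirability of the outcome (negative, zero, or positive)."""
    return expectancy * instrumentality * valence

# A participant confident in their skills (expectancy 0.9) who values
# contributing to science (valence 1.0) but rarely sees what happens
# with their data (low perceived instrumentality, 0.2):
before = motivational_force(0.9, 0.2, 1.0)

# The same participant after the project starts communicating its
# scientific output, raising perceived instrumentality to 0.8:
after = motivational_force(0.9, 0.8, 1.0)

print(before, after)
```

Because the components are multiplied rather than added, strengthening the weakest component, here instrumentality, has a proportionally large effect on overall motivation.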
The “Ten principles of citizen science” defined by the European Citizen Science Association (ECSA) also underline this need to communicate project findings and acknowledge participants (ECSA 2015). One principle states that project data should be made publicly available and that results should be published in an open access format. Another principle states that citizen scientists need to receive feedback from the project, for example by communicating how participants’ data are used and what the findings are. A third principle states that citizen scientists should be acknowledged in project results and publications.
Three types of scientific output can be recognized in these principles. The first is the data gathered in the project, which should be shared and be accessible. The second is project findings, meaning what project coordinators or researchers have done with the collected data. The third is recognition of project participants in (scientific) publications. Communicating these three types of scientific output can lead to a higher value of Vroom’s component of instrumentality.
This review discusses these three types of output and defines them together as “communication of scientific output.” We use the term “preferences” to indicate what citizen scientists think or find valuable regarding communication of scientific output. Although the importance of communication and feedback to participants is widely accepted among project coordinators, not many academic papers have specifically investigated this topic from the participants’ point of view. Therefore, we provide a systematic literature review of all studies that include participants’ preferences for communication of citizen science project output. This review can serve as a starting point for future research.
We used the Web of Science database to search for relevant literature on citizen scientists’ views on scientific output. Because citizen science projects are organized in various scientific disciplines, we could not use a domain-specific database. Instead, we followed the approach of West and Pateman (2016) and Kullenberg and Kasperowski (2016) in using the multidisciplinary Web of Science database, which indexes academic papers specifically (as opposed to Google Scholar, which contains more noise). Little research has focused specifically on citizen scientists’ preferences for communication of data, findings, and publications, but we found relevant information in papers studying citizen scientists’ motivations, which generally incorporate citizen scientists’ preferences for different aspects of the project. Therefore, the search term ‘“citizen science” motivat*’ was used to search within “topics.” This approach retrieved 92 papers (see Figure 2 for the literature inclusion process), of which 16 included information on participants’ preferences for communication of citizen science project output. We then examined each paper to see whether it referred to relevant literature beyond the scope of the Web of Science database, resulting in 16 additional papers. See Table 1 for an overview of all included studies.
The 32 academic papers that were selected via this process concerned participants’ experiences with engaging in citizen science projects in a wide range of application areas (biodiversity, environmental, astronomy, geography, health). All papers contained results or remarks on participants’ preferences for communication of scientific output. In four papers, relevant information was found in quantitative analyses with either Likert scales or with percentages of participants agreeing with certain statements. In 23 papers, relevant information was found in interviews with participants, open questions in surveys, or in the discussion. In five papers, relevant information was found in both quantitative and qualitative analyses. Many names have been used to refer to those who engage in citizen science projects, for example “citizen scientists”, “volunteers,” or “participants.” For the sake of clarity, we use “participants” for the remainder of this review.
Our findings are structured into three parts concerning participants’ preferences for communication of data, findings, and scientific publications. The first part evaluates participants’ preferences for accessibility of data. The second part discusses communication of citizen science project findings to participants, i.e., what researchers or project leaders have done with the data. The third part consists of an evaluation of participants’ preferences for recognition in scientific publications. Table 1 shows which type of communication every paper touched upon. In total, 12 papers discussed accessibility of data, 20 papers discussed communicating findings, and 11 papers discussed acknowledgment of participants in publications. Papers were generally positive toward communicating scientific output, although 7 papers mentioned practical and ethical issues to take into account when doing so.
|Author(s)||Year||Scientific Discipline||Type of communication||Practical and ethical issues||Relevant information for this review found in quantitative or qualitative analysis?|
|Alender, B.||2016||Environmental||X||X||X||Quantitative: Likert scale|
|Baruch, A., May, A. and Yu, D.||2016||Astronomy||X||Quantitative: Agreement Qualitative: Interviews or open questions|
|Bell, S., Marzano, M., Cent, J., Kobierska, H., Podjed, D., Vandzinskaite, D. et al.||2008||Biology||X||Qualitative: Discussion|
|Bonney, R., Ballard, H., Jordan, R., McCallie, E., Philips, T., Shirk, J. et al.||2009a||Multiple||X||Qualitative: Discussion|
|Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V. et al.||2009b||Biology||X||X||Quantitative: Observational data Qualitative: Discussion|
|Bowser, A., Hansen, D., He, Y., Boston, C., Reid, M., Gunnell, L. et al.||2013||Biology||X||X||Qualitative: Discussion, Interviews or open questions|
|Brossard, D., Lewenstein, B. and Bonney, R.||2005||Biology||X||Qualitative: Discussion|
|Bruyere, B. and Rappe, S.||2007||Environmental||X||Qualitative: Discussion|
|Budhathoki, N. R. and Haythornthwaite, C.||2012||Geography||X||X||Quantitative: Likert scale|
|Carballo-Cárdenas, E. C. and Tobi, H.||2016||Biology||X||X||Qualitative: Interviews or open questions|
|Curtis, V.||2015||Biochemistry||X||Qualitative: Discussion|
|Dickinson, J. L., Shirk, J., Bonter, D., Bonney, R., Crain, R. L., Martin, J. et al.||2012||Environmental||X||X||Qualitative: Discussion|
|Domroese, M. C. and Johnson, E. A.||2017||Biology||X||X||Qualitative: Discussion, Interviews or open questions|
|Druschke, C. G. and Seltzer, C. E.||2012||Environmental||X||Qualitative: Discussion|
|Eveleigh, A., Jennet, C., Blandford, A., Brohan, P. and Cox, A. L.||2014||Environmental||X||Qualitative: Discussion, Interviews or open questions|
|Ferster, C. J., Coops, N. C., Harshaw, H. W., Kozak, R. A. and Meitner, M. J.||2013||Environmental||X||X||Quantitative: Likert scale, Agreement Qualitative: Interviews or open questions|
|Franzoni, C. and Sauermann, H.||2014||Multiple||X||X||Qualitative: Discussion|
|Ganzevoort, W. and Born, R. J. G. van den||2016||Biology||X||X||Qualitative: Interviews or open questions|
|Ganzevoort, W., Born, R. J. G. van den, Halffman, W. and Turnhout, S.||2017||Biology||X||X||X||X||Quantitative: Agreement Qualitative: Interviews or open questions|
|Haywood, B. K.||2016||Biology||X||X||Qualitative: Discussion, Interviews or open questions|
|Hobbs, S. J. and White, P. C. L.||2012||Biology||X||Quantitative: Agreement|
|Iacovides, I., Jennet, C. Cornish-Trestrail, C. and Cox, A. L.||2013||Biochemistry||X||X||Qualitative: Interviews or open questions|
|Jordan, R. C., Gray, S. A., Howe, D. V., Brooks, R. W. and Ehrenfeld, J. G.||2011||Environmental||X||Quantitative: Likert scale|
|Krebs, V.||2010||Health||X||Qualitative: Interviews or open questions|
|Land-Zandstra, A. M., Beusekom, M. M. van, Koppeschaar, C. E. and Broek, J. M. van den||2016a||Health||X||X||Qualitative: Discussion|
|Land-Zandstra, A. M., Devilee, J. L. A., Snik, F., Buurmeijer, F. and Broek, J. M. van den||2016b||Environmental, Health||X||Quantitative: Agreement|
|Martin, V., Smith, L., Bowling, A., Christidis, L., Lloyd, D. and Pecl, G.||2016||Biology||X||Qualitative: Discussion|
|Price, C. A. and Lee, H.||2013||Astronomy||X||Qualitative: Discussion|
|Rotman, D., Preece, J., Hammock, J., Procita, K., Hansen, D., Parr, C. et al.||2012||Biology||X||X||Qualitative: Interviews or open questions|
|Rotman, D., Hammock, J., Preece, J., Hansen, D., Boston, C., Bowser, A. et al.||2014||Multiple||X||Qualitative: Discussion, Interviews or open questions|
|See, L., Mooney, P., Foody, G., Bastin, L., Comber, A., Estima, J. et al.||2016||Geography||X||Qualitative: Discussion|
|Tulloch, A. I. T., Possingham, H. P., Joseph, L. N., Szabo, J. and Martin, T. G.||2013||Biology||X||Qualitative: Discussion|
In several of the reviewed studies, participants noted that they would like to get some insight into the data that they collected. According to See et al. (2016), participants can become motivated when they get something in return for participating in a citizen science project, such as access to the data. Likewise, most of the participants in the study of Ferster et al. (2013) on a project that monitored forest fuels agreed with the statement “Data collected by volunteers should be shared with the volunteers who collected them.” Correspondingly, the number of participants in the eBird project on monitoring birds nearly tripled after participants were given the ability to track their own observations and compare them with those of others (Bonney et al. 2009b).
Land-Zandstra et al. (2016a) argue that making datasets available for project participants could also foster the learning impact of engaging in citizen science, thereby addressing participants’ motivations to learn something new. In addition, participants can be supplied with data visualization and analysis tools (Bonney et al. 2009a), which may increase a project’s learning impact (Dickinson et al. 2012). Providing opportunities for participants in which they can manipulate and study project data may be one of the most educational aspects of citizen science (Bonney et al. 2009b). Additionally, Druschke and Seltzer (2012) remark that by creating ways for participants to view the data points they have contributed, projects can lead to real engagement among participants in addition to educational impact.
Data collected through citizen science projects also may be shared with other individuals and organizations beyond participants and the project team. Some participants are positive about sharing their collected data not only among themselves, but also with others. Most participants in the study of Ferster et al. (2013) agreed that the data collected by participants should be shared with the general public. Participants in the study of Jordan et al. (2011) on a citizen science project for monitoring invasive plants scored an average 1.7 on a scale from 1 (great extent) to 5 (no extent) on the question “To what extent should scientists share data with the public?” Additionally, participants in the research of Budhathoki and Haythornthwaite (2012) who participated in the OpenStreetMap project on collection of geographic data agreed that their digital map data should be available for free with an average score of 6.45 (out of 7). In the study of Ganzevoort et al. (2017) on several citizen science projects for monitoring biodiversity, 49% of participants felt that the data from their citizen science project are part of the public good. However, 27% felt that the data are owned by the organization running the project, 18% felt that the collected data are private property, and 6% did not know. Participants in the research of Alender (2016) even find public data more important than scientific publications.
Some authors of the reviewed studies note that sharing data with the public can have some disadvantages. For example, sharing data collected on biodiversity could evoke a rush on rare species, resulting in the disturbance of the natural environment or in creating opportunities for poaching (Ganzevoort and Van den Born 2016). Moreover, sharing datasets with the public without professional interpretation may result in data taken out of context, possibly leading to incorrect public understanding of the findings (Ferster et al. 2013).
Another issue is whether to make data accessible for commercial organizations, which may use them to make profit. In the study of Ganzevoort et al. (2017), 37% of participants indicated that their data should not be used for financial gain. Another 26% felt that this issue should be left to the person managing the data, and 16% found sharing data acceptable if the volunteers or the organization are acknowledged. Only 12% were in favor of completely unconditional use of their data (2% had no opinion). In open questions, participants mentioned that project data should be used for the “right” purpose and that volunteer data should not be used by private consultancies (Ganzevoort et al. 2017). Most participants in the research of Ferster et al. (2013) were opposed to selling data collected via citizen science projects to private companies. Participants in the research of Budhathoki and Haythornthwaite (2012) indicated that they were hesitant about sharing their data with commercial organizations for free.
Privacy concerns are also present among citizen science participants. In the study of Ferster et al. (2013), 58% of participants objected to sharing data collected on personal private property with the public due to privacy concerns. Participants in the study of Ganzevoort et al. (2017) also mentioned that they were concerned about volunteer privacy.
Some participants mention that they would like to know how their collected data are used in scientific research and what their data are showing (Domroese and Johnson 2017). In the study of Baruch, May, and Yu (2016) on a project aiming at identifying objects and places in satellite images, 28% of the participants noted that they would like to have a follow-up on how the data they collected were used. In the research of Ganzevoort et al. (2017), 69% of the participants were interested in getting insight into how others use their data. Research performed by Land-Zandstra et al. (2016b) indicated that 87% of their participants, who engaged in a citizen science project for measuring aerosols with their smartphone, wanted to know more about what happened with their data. In the study of Alender (2016) on a citizen science project for water quality monitoring, more than 90% of the participants agreed or strongly agreed with the statement “I feel good when data and/or results are shared with me.” Price and Lee (2013) therefore advise to clearly illustrate the participants’ contribution in the overall research project. The following subsections focus on different aspects of communicating findings.
According to Bruyere and Rappe (2007), people are often motivated to contribute to a project when they feel that their invested time is well spent. Participants feel satisfied when their observations are useful (Haywood 2016). Participants often engage in projects with the intention that their contributions directly affect the issue at hand (Alender 2016). Correspondingly, participants want to know in what way their contribution has made an impact (Alender 2016; Baruch, May, and Yu 2016; Bowser et al. 2013). Some participants stop contributing because of a perceived lack of value of their data (Carballo-Cárdenas and Tobi 2016). Hence, communicating to participants the “usefulness” and value of their data is important (Bell et al. 2008; Land-Zandstra et al. 2016a). This idea was also expressed by participants in interviews:
“It’s not about spending time or money. It’s more about the constant feedback to the volunteers that what we’re doing is useful and being used.” (Rotman et al. 2012: 221)
“If you feel like you’ve done something that they [scientists] couldn’t possibly do because they don’t have enough hours in the day, but you’ve done it, and you’ve helped, then you do really feel part of it. It’s very rewarding.” (Iacovides et al. 2013: 1104)
Another effect of sharing findings with participants may be an increased learning impact (Land-Zandstra et al. 2016a), thereby responding to participants’ motivations to learn something new. In the study of Hobbs and White (2012), 29% of participants in the Leeds Garden project and 11% of participants in the BirdWatch project stated that the results helped them learn about a bigger picture. Participants in the study of Haywood (2016) on a citizen science project for documenting seabird population health also indicated that they would like to learn about the “big picture” in which their collected data are situated. Haywood (2016) mentions that communicating information on the project as a whole and the position of participants’ data in the project can increase the learning and knowledge gains that participants associate with the project. Citizen science projects aiming for an increased understanding of how science works among participants need to make participants aware of the scientific process and the ways that participants are involved in it (Brossard, Lewenstein, and Bonney 2005).
Feedback on a project also can be a prominent motivational factor (Rotman et al. 2012). Interviewed participants in the study of Iacovides et al. (2013) on Foldit and Eyewire felt encouraged to contribute to the project when they were given evidence of project progress. Additionally, according to Bell et al. (2008), communicating information on the project can enhance participants’ satisfaction associated with their participation. Martin et al. (2016) mention that clarifying a project’s findings also can confirm and strengthen the motivation of potential new participants. Correspondingly, demonstrating project findings in a public space may motivate other citizens to participate (Bonney et al. 2009b).
Demonstrating a project’s findings also may impact sustained participation. Communicating to participants the extent to which their data are used and valued by scientists or policy makers can be considered a key strategy for participant retention (Bell et al. 2008). Additionally, publicizing findings to non-active participants may have positive effects. In the study of Eveleigh et al. (2014) on the Old Weather project focusing on transcription of old documents such as weather observations, many non-active participants expressed an ongoing interest in the project even after they stopped contributing. One participant mentioned this during an interview:
“I mean, I get all the emails, you know, so I’ll read them and see, you know, what has Old Weather’s community discovered thus far […] the community, as it is, is contributing to science.” (Eveleigh et al. 2014: 2991)
When communicating project findings to non-active participants, some may regain their motivation to contribute. Others may remain non-active but still interested in the project, and are possibly great advocates for the project who spread enthusiasm to potential new participants (Eveleigh et al. 2014).
When findings are not clearly communicated, participants can become dissatisfied and demotivated (Krebs 2010; Rotman et al. 2012). A lack of clarity on how participants’ data are used is also mentioned by participants as a reason to be less active in the project or even to stop contributing completely (Baruch, May, and Yu 2016; Rotman et al. 2012). Dissatisfaction when findings are not communicated also was expressed during interviews:
“There was no feedback and it made me feel as though what I was doing wasn’t even for real.” (Baruch, May, and Yu 2016: 927)
“People won’t come back if there isn’t that loop of credibility and things that they can see that are being accomplished as a result of the data that they are collecting.” (Rotman et al. 2012: 223)
Even though the overall consensus in the literature is that more communication of findings is better, we also specifically searched for possible negative effects of communicating findings, to prevent bias in our findings. We found two cases in which communicating findings may have dampened motivation. In the study of Domroese and Johnson (2017) on participants’ motivations for engaging in the Great Pollinator Project on monitoring bees, the researchers note that two participants indicated that their motivation changed after they received project results. The authors do not, however, elaborate upon this finding. In the study of Carballo-Cárdenas and Tobi (2016) on a project to monitor lionfish, some participants decided to stop participating because, after learning about initial results, they did not see the importance of monitoring. Both cases show that communicating findings can have a negative effect on participation.
Participants also have opinions regarding scientific publications that result from their citizen science projects. In the research of Alender (2016), slightly more than half of the participants agreed or strongly agreed with the statement “It is important for me that our data are used for scientific publications.” Many participants who strongly agreed that one of their motivations was to contribute to science also agreed that using the data for scientific publications is very important (Alender 2016).
Most importantly, participants would like to be acknowledged in scientific publications, understanding that the data are available only due to their commitment (Haywood 2016). Accordingly, in the research of Ganzevoort et al. (2017), 41% of participants indicated that they like to be cited by name when their data are used. Similarly, approximately 40% of the participants in the study of Alender (2016) found “Name recognition in a scientific publication” meaningful. Participants also reported that they were disappointed upon learning they were not acknowledged in scientific publications (Rotman et al. 2014). Likewise, participants in the study of Rotman et al. (2012) expressed the importance of acknowledging them when scientists use their data for publications. Participants also expressed this during interviews:
“If a name ends up in the acknowledgments, the name ends up in a poster, it’s a measurable thing … I can show the family members and make it more of a positive experience.” (Rotman et al. 2012: 221)
“Just a name and this X and that Y was contributed by this or that person. Something simple … is like a big thing for a normal person, this kind of thing makes it a very personal thing, and that way we encourage all to do it more …” (Rotman et al. 2014: 116)
Participants also have opinions on the type of journal in which a publication that acknowledges them appears. The participants in the study of Iacovides et al. (2013) pointed out that they would like to be credited in journals such as Nature. For participants in the study of Rotman et al. (2012), it did not matter whether the resulting publication was peer reviewed.
In the study of Alender (2016), the youngest age group (age 21–29) scored higher on the meaningfulness of name recognition in scientific publications than all other age groups, with an average agreement score of 4.62 out of 5. Alender (2016) notes that younger participants are usually more motivated by advancing their reputation and career than older participants, which may explain this finding. Franzoni and Sauermann (2014) also argue that recognition in publications is most important for those that seek peer recognition.
Tulloch et al. (2013) also argue that the contribution of participants should be clearly acknowledged, for example by recognizing them in papers or even naming them as co-authors of a scientific paper (Curtis 2015). Both participants and project coordinators of citizen science projects should be recognized in the acknowledgements section of a resulting paper (Dickinson et al. 2012).
Granting all contributing participants authorship in a scientific paper may dilute the value of being an author (Franzoni and Sauermann 2014), but other ways to recognize the contributions of participants in scientific publications exist. One option is to reward participants who have contributed with a badge and to communicate a list of publications which have used the dataset to participants (Bowser et al. 2013). Another method is to use a group pseudonym as (co-)author in order to acknowledge the efforts of the whole group of participants without giving individual credit (Franzoni and Sauermann 2014).
This literature review shows a consensus that communication of scientific output is appreciated by citizen science participants, even though some practical and ethical issues are mentioned. In the papers that measured participants’ agreement with statements on the importance of communicating scientific output, agreement was never unanimous. Nevertheless, it is important to attend to those participants who do value communication of scientific output. Below, we discuss our findings with regard to Vroom’s Expectancy Theory, consider ethical issues, and provide some advice on how to include communication about scientific output in a project.
In the introduction, we discussed Vroom’s Expectancy Theory to understand factors that motivate citizen science participants. Our review emphasizes the importance of the second component, instrumentality, which is based on a participant’s perceived probability that their contribution is actually valuable and will lead to scientific output. Throughout the review, participants indicated that they like seeing how their contribution has led to an outcome, whether a collected dataset, important or interesting findings, or an acknowledgement in a publication. Conversely, when participants cannot perceive what happened with their efforts, their motivation may decrease, with a negative effect on sustained participation. This effect is illustrated by the following quote:
“I’ve done other stuff and you know you don’t get feedback, like “here is what we did with the work you did,” and so you don’t feel like it’s being used well and you don’t feel like you want to continue to contribute.” (Rotman et al. 2012: 221)
Hence, by communicating a project's output, participants can perceive that they contribute to science, which enhances their level of instrumentality and thereby their motivation and sustained participation. In addition to the instrumentality component of motivation, project organizers also need to take the first component, expectancy, into account by making sure that sufficient information is provided and that the interface or activity is clear. Organizers can also influence the valence component by clearly communicating the importance of the project and its outcomes, for example for monitoring the environment or advancing scientific knowledge. By taking all three components of motivation into account, citizen science projects can strengthen participants' motivation and thereby their participation.
The results of our review also raise some ethical considerations. On the one hand, ethical guidelines agree with participants that communication of output is vital. According to the British Psychological Society (2014), researchers should consider making the results of their research available to participants. Sales and Folkman (2014) also underline this in their book "Ethics in Research with Human Participants." They write that participants should be seen as respected partners in the research, and that this respect should be communicated to participants. Additionally, researchers should consider providing information about the nature of the research and available results to participants after their contribution, in order to enhance the educational value of their participation. Moreover, Sales and Folkman (2014) write that researchers should share the collected data whenever possible.
On the other hand, sharing data may itself introduce ethical issues. As 58% of the participants in the study by Ferster et al. (2013) indicated, privacy concerns emerge when data collected on personal private property are shared with the public. The British Psychological Society (2014) also states that the privacy of participants should be respected. Sales and Folkman (2014) likewise note that researchers should not share private information with others and that data should be anonymized, without any personal identifiers. Taking privacy into account is especially important in light of the new General Data Protection Regulation for EU and EEA citizens (European Commission n.d.).
Because these guidelines hold for research participants in general, they should also hold for participants in citizen science, who should be seen as fellow scientists in the project. To conclude, it is considered ethical to make scientific output as accessible as possible, but personal privacy has to be taken into account at all times.
Because the general consensus of this review is that communicating citizen science output is important for participants, considering how best to do so is also important. However, no study in our review has evaluated different ways to communicate project output. Rather, studies have focused either on the method currently used or on the possibility of communicating data and output in general. We propose that future research should focus on communication of output to maximize the impact of citizen science projects, including evaluating different data sharing and data comparison options and studying preferred methods of communication.
Based on studies from the gamification field, which is similar to (online) citizen science, one guideline can be to give participants individual feedback about their contributions so that they can monitor their own performance (Goh and Lee 2011; Jung, Schneider, and Valacich 2010). Individual performance feedback can enhance participants' motivation to continue, because it communicates the performance as a result of a participant's efforts (Jung, Schneider, and Valacich 2010). Translated back to the citizen science field, an example of feedback in a bird monitoring project could be a visually appealing overview of the different types of birds and their frequencies monitored by an individual participant, preferably with an additional comparison of one's individual contributions to the findings at a more general (e.g., regional or national) level. Providing individual feedback can also strengthen the instrumentality component of motivation in Vroom's theory, by clearly showing how one's individual performance leads to a contribution to science. One direction for future research could be to study the effect of individual versus more general feedback in different types of projects.
We have aimed to find all relevant papers for this review with a systematic approach of searching through Web of Science and literature review sections. Still, some relevant papers may have been missed. We believe, however, that the content of the current review is comprehensive enough to draw conclusions for future research and to make recommendations for project coordinators.
Because not many papers have focused on communication of scientific output, we have also included comments from discussion sections alongside information from results and interview sections. We note that the information from discussion sections may be positively biased, because the general consensus in the citizen science field is that communication about scientific output is important.
Guidelines such as the “Ten principles of citizen science” defined by the European Citizen Science Association (ECSA) emphasize the importance of communicating feedback on the output of a project to participants and recognizing them in results and publications. However, little empirical evidence supports this guideline. From the evidence we could find, we conclude that participants find it important that scientific output of citizen science projects is communicated to them, if ethical issues are taken into account.
The key recommendations for project organizers are displayed in Figure 3. The main recommendation for those designing and executing citizen science projects is to be aware that participants value communication of their collected data, findings of the project, and publications. This preference is supported and explained by Vroom’s Expectancy Theory. Taking this preference into account can pay off, as sharing data and findings may increase the learning impact of a project. It also may enhance participants’ motivation to engage in the project, sustain their participation, and give participants the feeling that their time was well-spent. Moreover, many participants indicate a desire for being acknowledged in publications. Some practical and ethical issues, however, need to be taken into account when developing tools for sharing scientific output, especially in the case of sharing data.
More research on this topic is strongly needed. Most of the papers in our literature review only evaluate communication of scientific output in their discussion section based on miscellaneous data, rather than incorporating the topic into their questionnaires or interviews. Only a few authors have focused on the topic during interviews, with open questions in surveys or with quantitative analyses. No study in our literature review has evaluated how to communicate scientific output to participants. Because the desire of participants for communication of scientific output is evident from the literature and shared among project coordinators, future research should study different methods for communicating citizen science datasets, findings, and publications to maximize the impact of these projects.
We would like to thank the reviewers for their thoughtful feedback, which has improved this literature review.
The authors have no competing interests to declare.
Alender, B. 2016. Understanding volunteer motivations to participate in citizen science projects: A deeper look at water quality monitoring. Journal of Science Communication, 15(3): A04. Available at: jcom.sissa.it/archive/15/03/JCOM_1503_2016_A04 [Last accessed 8 Aug 2017]. DOI: https://doi.org/10.22323/2.15030204
Andereck, K, McGehee, NG, Lee, S and Clemmons, D. 2012. Experience Expectations of Prospective Volunteer Tourists. Journal of Travel Research, 51(2): 130–141. DOI: https://doi.org/10.1177/0047287511400610
Baruch, A, May, A and Yu, D. 2016. The motivations, enablers and barriers for voluntary participation in an online crowdsourcing platform. Computers in Human Behavior, 64: 923–931. DOI: https://doi.org/10.1016/j.chb.2016.07.039
Bell, S, Marzano, M, Cent, J, Kobierska, H, Podjed, D, Vandzinskaite, D, et al. 2008. What counts? Volunteers and their organisations in the recording and monitoring of biodiversity. Biodiversity and Conservation, 17(14): 3443–3454. DOI: https://doi.org/10.1007/s10531-008-9357-9
Bonney, R, Ballard, H, Jordan, R, McCallie, E, Phillips, T, Shirk, J, et al. 2009a. Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education. A CAISE Inquiry Group Report. Washington, DC: Center for Advancement of Informal Science Education (CAISE). Available at: www.birds.cornell.edu/citscitoolkit/publications/ [Last accessed 9 Aug 2017].
Bonney, R, Cooper, CB, Dickinson, J, Kelling, S, Phillips, T, Rosenberg, KV, et al. 2009b. Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy. BioScience, 59(11): 977–984. DOI: https://doi.org/10.1525/bio.2009.59.11.9
Bowser, A, Hansen, D, He, Y, Boston, C, Reid, M, Gunnell, L, et al. 2013. Using gamification to inspire new citizen science volunteers. In: Proceedings of the First International Conference on Gameful Design, Research, and Applications. Toronto, Canada on 02–04 October 2013. 18–25. DOI: https://doi.org/10.1145/2583008.2583011
British Psychological Society. 2014. Code of Human Research Ethics. Leicester, United Kingdom. Available at: bps.org.uk/news-and-policy/bps-code-human-research-ethics-2nd-edition-2014 [Last accessed 19 June 2018].
Brossard, D, Lewenstein, B and Bonney, R. 2005. Scientific knowledge and attitude change: The impact of a citizen science project. International Journal of Science Education, 27(9): 1099–1121. DOI: https://doi.org/10.1080/09500690500069483
Bruyere, B and Rappe, S. 2007. Identifying the motivations of environmental volunteers. Journal of Environmental Planning and Management, 50(4): 503–516. DOI: https://doi.org/10.1080/09640560701402034
Budhathoki, NR and Haythornthwaite, C. 2012. Motivation for Open Collaboration: Crowd and Community Models and the Case of OpenStreetMap. American Behavioral Scientist, 57(5): 548–575. DOI: https://doi.org/10.1177/0002764212469364
Cappa, F, Laut, J, Nov, O, Giustiniano, L and Porfiri, M. 2016. Activating social strategies: Face-to-face interaction in technology-mediated citizen science. Journal of Environmental Management, 182: 374–384. DOI: https://doi.org/10.1016/j.jenvman.2016.07.092
Carballo-Cárdenas, EC and Tobi, H. 2016. Citizen science regarding invasive lionfish in Dutch Caribbean MPAs: Drivers and barriers to participation. Ocean & Coastal Management, 133: 114–127. DOI: https://doi.org/10.1016/j.ocecoaman.2016.09.014
Causer, T and Wallace, V. 2012. Building A Volunteer Community: Results and Findings from Transcribe Bentham. Digital Humanities Quarterly, 6(2). Available at: http://www.digitalhumanities.org/dhq/vol/6/2/000125/000125.html [Last accessed 22 Aug 2017].
Cooper, S, Khatib, F, Treuille, A, Barbero, J, Lee, J, Beenen, M, et al. 2010. Predicting protein structures with a multiplayer online game. Nature, 466(7307): 756–760. DOI: https://doi.org/10.1038/nature09304
Curtis, V. 2015. Motivation to Participate in an Online Citizen Science Game: A Study of Foldit. Science Communication, 37(6): 723–746. DOI: https://doi.org/10.1177/1075547015609322
Dickinson, JL and Bonney, R. (eds.) 2012. Citizen science: public participation in environmental research. Ithaca, NY: Cornell University Press. DOI: https://doi.org/10.7591/cornell/9780801449116.001.0001
Dickinson, JL, Shirk, J, Bonter, D, Bonney, R, Crain, RL, Martin, J, et al. 2012. The current state of citizen science as a tool for ecological research and public engagement. Frontiers in Ecology and the Environment, 10(6): 291–297. DOI: https://doi.org/10.1890/110236
Domroese, MC and Johnson, EA. 2017. Why watch bees? Motivations of citizen science volunteers in the Great Pollinator Project. Biological Conservation, 208: 40–47. DOI: https://doi.org/10.1016/j.biocon.2016.08.020
Druschke, CG and Seltzer, CE. 2012. Failures of Engagement: Lessons Learned from a Citizen Science Pilot Study. Applied Environmental Education & Communication, 11(3–4): 178–188. DOI: https://doi.org/10.1080/1533015X.2012.777224
Eitzel, MV, Cappadonna, JL, Santos-Lang, C, Duerr, RE, Virapongse, A, West, SE, et al. 2017. Citizen Science Terminology Matters: Exploring Key Terms. Citizen Science: Theory and Practice, 2(1): 1. DOI: https://doi.org/10.5334/cstp.96
European Citizen Science Association (ECSA). 2015. Ten principles of citizen science. London, United Kingdom. Available at: ecsa.citizen-science.net/sites/default/files/ecsa_ten_principles_of_citizen_science.pdf [Last accessed 6 Sep 2017].
European Commission. n.d. Data protection in the EU. Available at: ec.europa.eu/info/law/law-topic/data-protection/data-protection-eu_en [Last accessed 7 Jan 2019].
Evans, C, Abrams, E, Reitsma, R, Roux, K, Salmonsen, L and Marra, PP. 2005. The Neighborhood Nestwatch Program: Participant Outcomes of a Citizen-Science Ecological Research Project. Conservation Biology, 19(3): 589–594. DOI: https://doi.org/10.1111/j.1523-1739.2005.00s01.x
Eveleigh, A, Jennet, C, Blandford, A, Brohan, P and Cox, AL. 2014. Designing for dabblers and deterring drop-outs in citizen science. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Toronto, Canada on 26 April – 01 May 2014. 2985–2994 DOI: https://doi.org/10.1145/2556288.2557262
Fernandez, CV, Kodish, E and Weijer, C. 2003. Informing study participants of research results: An ethical imperative. IRB: Ethics and Human Research, 25(3): 12–19. Available at: jstor.org/stable/3564300 [Last accessed 6 Sep 2017]. DOI: https://doi.org/10.2307/3564300
Ferster, CJ, Coops, NC, Harshaw, HW, Kozak, RA and Meitner, MJ. 2013. An Exploratory Assessment of a Smartphone Application for Public Participation in Forest Fuels Measurement in the Wildland-Urban Interface. Forests, 4(4): 1199–1219. DOI: https://doi.org/10.3390/f4041199
Franzoni, C and Sauermann, H. 2014. Crowd science: The organization of scientific research in open collaborative projects. Research Policy, 43(1): 1–20. DOI: https://doi.org/10.1016/j.respol.2013.07.005
Ganzevoort, W and van den Born, RJG. 2016. Citizen scientists: Een onderzoek naar de motivaties en visies op data delen van vrijwillige natuurwaarnemers (translation: ‘Citizen scientists: A study on the motivations and opinions on sharing data of biodiversity monitoring volunteers’). Nijmegen, the Netherlands: Radboud University. Available at: repository.ubn.ru.nl/handle/2066/158136 [Last accessed 8 Aug 2017].
Ganzevoort, W, van den Born, RJG, Halffman, W and Turnhout, S. 2017. Sharing biodiversity data: citizen scientists’ concerns and motivations. Biodiversity and Conservation, 26(12): 2821–2837. DOI: https://doi.org/10.1007/s10531-017-1391-z
Goh, DH and Lee, CS. 2011. Perceptions, quality and motivational needs in image tagging human computation games. Journal of Information Science, 37(5): 515–531. DOI: https://doi.org/10.1177/0165551511417786
Haywood, BK. 2016. Beyond Data Points and Research Contributions: The Personal Meaning and Value Associated with Public Participation in Scientific Research. International Journal of Science Education, Part B, 6(3): 239–262. DOI: https://doi.org/10.1080/21548455.2015.1043659
Hobbs, SJ and White, PCL. 2012. Motivations and barriers in relation to community participation in biodiversity recording. Journal for Nature Conservation, 20(6): 364–373. DOI: https://doi.org/10.1016/j.jnc.2012.08.002
Iacovides, I, Jennet, C, Cornish-Trestrail, C and Cox, AL. 2013. Do Games Attract or Sustain Engagement in Citizen Science? A Study of Volunteer Motivations. In: CHI ‘13 Extended Abstracts on Human Factors in Computing Systems. Paris, France on 27 April–02 May 2013. 1101–1106. DOI: https://doi.org/10.1145/2468356.2468553
Jennet, C, Kloetzer, L, Schneider, D, Iacovides, I, Cox, AL, Gold, M, et al. 2016. Motivations, learning and creativity in online citizen science. Journal of Science Communication, 15(3): A05. Available at: jcom.sissa.it/archive/15/03/JCOM_1503_2016_A05 [Last accessed 22 Aug 2017]. DOI: https://doi.org/10.22323/2.15030205
Jordan, RC, Gray, SA, Howe, DV, Brooks, RW and Ehrenfeld, JG. 2011. Knowledge Gain and Behavioral Change in Citizen-Science Programs. Conservation Biology, 25(6): 1148–1154. DOI: https://doi.org/10.1111/j.1523-1739.2011.01745.x
Jung, JH, Schneider, C and Valacich, J. 2010. Enhancing the Motivational Affordance of Information Systems: The Effects of Real-Time Performance Feedback and Goal Setting in Group Collaboration Environments. Management Science, 56(4): 724–742. DOI: https://doi.org/10.1287/mnsc.1090.1129
Krebs, V. 2010. Motivations of cybervolunteers in an applied distributed computing environment: Malariacontrol.net as an example. First Monday, 15(2). Available at: firstmonday.org/ojs/index.php/fm/article/view/2783 [Last accessed 9 Aug 2017]. DOI: https://doi.org/10.5210/fm.v15i2.2783
Kullenberg, C and Kasperowski, D. 2016. What Is Citizen Science? – A Scientometric Meta-Analysis. PLoS ONE, 11(1): e0147152. DOI: https://doi.org/10.1371/journal.pone.0147152
Land-Zandstra, AM, Beusekom, MM, van Koppeschaar, CE and van den Broek, JM. 2016a. Motivation and learning impact of Dutch flu-trackers. Journal of Science Communication, 15(1): A04. Available at: jcom.sissa.it/archive/15/01/JCOM_1501_2016_A04 [Last accessed 4 Aug 2017].
Land-Zandstra, AM, Devilee, JLA, Snik, F, Buurmeijer, F and van den Broek, JM. 2016b. Citizen science on a smartphone: Participants’ motivations and learning. Public Understanding of Science, 25(1): 45–60. DOI: https://doi.org/10.1177/0963662515602406
Martin, V, Smith, L, Bowling, A, Christidis, L, Lloyd, D and Pecl, G. 2016. Citizens as Scientists: What Influences Public Contributions to Marine Research? Science Communication, 38(4): 495–522. DOI: https://doi.org/10.1177/1075547016656191
Martin, VY. 2017. Citizen Science as a Means for Increasing Public Engagement in Science: Presumption or Possibility. Science Communication, 39(2): 142–168. DOI: https://doi.org/10.1177/1075547017696165
Nov, O, Arazy, O and Anderson, D. 2014. Scientists@Home: What Drives the Quantity and Quality of Online Citizen Science Participation? PLoS ONE, 9(4): e90375. DOI: https://doi.org/10.1371/journal.pone.0090375
Price, CA and Lee, H. 2013. Changes in Participants’ Scientific Attitudes and Epistemological Beliefs During an Astronomical Citizen Science Project. Journal of Research in Science Teaching, 50(7): 773–801. DOI: https://doi.org/10.1002/tea.21090
Raddick, MJ, Bracey, G, Gay, PL, Lintott, CJ, Cardamone, C, Murray, P, et al. 2013. Galaxy Zoo: Motivations of Citizen Scientists. Astronomy Education Review, 12(1). DOI: https://doi.org/10.3847/AER2011021
Raddick, MJ, Bracey, G, Gay, PL, Lintott, CJ, Murray, P, Schawinski, K, et al. 2010. Galaxy Zoo: Exploring the Motivations of Citizen Science Volunteers. Astronomy Education Review, 9(1). DOI: https://doi.org/10.3847/AER2009036
Riesch, H, Potter, C and Davies, L. 2013. Combining citizen science and public engagement: The Open Air Laboratories Programme. Journal of Science Communication, 12(3): A03. Available at: jcom.sissa.it/archive/12/3-4/JCOM1203%282013%29A03 [Last accessed 22 Aug 2017]. DOI: https://doi.org/10.22323/2.12030203
Rotman, D, Hammock, J, Preece, J, Hansen, D, Boston, C, Bowser, A, et al. 2014. Motivations Affecting Initial and Long-Term Participation in Citizen Science Projects in Three Countries. In: iConference 2014 Proceedings. Berlin, Germany on 04–07 March 2014. 110–124. DOI: https://doi.org/10.9776/14054
Rotman, D, Preece, J, Hammock, J, Procita, K, Hansen, D, Parr, C, et al. 2012. Dynamic changes in motivation in collaborative citizen-science projects. In: Proceedings of the ACM 2012 conference on Computer Supported Cooperative Work. Seattle, WA on 11–15 February 2012. 217–226. DOI: https://doi.org/10.1145/2145204.2145238
See, L, Mooney, P, Foody, G, Bastin, L, Comber, A, Estima, J, et al. 2016. Crowdsourcing, Citizen Science or Volunteered Geographic Information? The Current State of Crowdsourced Geographic Information. International Journal of Geo-Information, 5(5): 55. DOI: https://doi.org/10.3390/ijgi5050055
Seeberger, A. 2014. There’s No Such Thing as Free Labor: Evaluating Citizen Science Volunteer Motivations. Master’s thesis, University of Colorado. University of Colorado Museum of Natural History Graduate Theses & Dissertations. 15. Available at: scholar.colorado.edu/cumuse_gradetds/15 [Last accessed 22 Aug 2017].
Shalowitz, DI and Miller, FG. 2008. The search for clarity in communicating research results to study participants. Journal of Medical Ethics, 34(9): e17. DOI: https://doi.org/10.1136/jme.2008.025122
Tinati, R, Luczak-Roesch, M, Simperl, E and Hall, W. 2017. An investigation of player motivations in Eyewire, a gamified citizen science project. Computers in Human Behavior, 73: 527–540. DOI: https://doi.org/10.1016/j.chb.2016.12.074
Tulloch, AIT, Possingham, HP, Joseph, LN, Szabo, J and Martin, TG. 2013. Realising the full potential of citizen science monitoring programs. Biological Conservation, 165: 128–138. DOI: https://doi.org/10.1016/j.biocon.2013.05.025
Weerts, DJ and Ronca, JM. 2007. Profiles of Supportive Alumni: Donors, Volunteers, and Those Who “Do It All”. International Journal of Educational Advancement, 7(1): 20–34. DOI: https://doi.org/10.1057/palgrave.ijea.2150044
West, S and Pateman, R. 2016. Recruiting and Retaining Participants in Citizen Science: What Can Be Learned from the Volunteering Literature? Citizen Science: Theory and Practice, 1(2): 15. DOI: https://doi.org/10.5334/cstp.8