
Case Studies

Co-Creating and Implementing Quality Criteria for Citizen Science

Authors:

Florian Heigl ,

Institute of Zoology, University of Natural Resources and Life Sciences, Vienna, AT

Barbara Kieslinger,

Centre for Social Innovation – ZSI, Vienna, AT

Katharina T. Paul,

Department of Political Science, Faculty of Social Sciences, University of Vienna, AT

Julia Uhlik,

accent Inkubator GmbH, Tough Tech Incubator of Lower Austria, Wiener Neustadt, AT

Didone Frigerio,

Konrad Lorenz Research Center for Behaviour and Cognition, University of Vienna, Grünau im Almtal, AT; Department of Behavioural and Cognitive Biology, University of Vienna, Vienna, AT

Daniel Dörler

Institute of Zoology, University of Natural Resources and Life Sciences, Vienna, AT

Abstract

Citizen science is increasingly recognized as a valid research methodology by the research community and policy makers alike. In our experience, however, citizen science is sometimes used as a catch-all term for activities that involve scientific and democratic innovation, resource efficiency in scientific processes, outreach, and education. We fear that this use of the term citizen science risks undermining the recognition of citizen science in academia as well as among citizen scientists and the general public in the longer term. Informed by these concerns, we report on a transdisciplinary attempt to establish quality criteria to decide in a transparent manner which citizen science projects are listed on Österreich forscht, an Austrian citizen science platform that is based on an established network of citizen scientists, academic researchers, funding institutions, and research institutions. We present 20 quality criteria and their relationship to existing literature, and describe the process by which they were formulated over a one-year period: a series of transdisciplinary exchanges concerning what shape citizen science should take in the particular context of Austria, and the potential implications of certain quality criteria for individual disciplines and practitioners. While we realize that any demarcation process is bound to produce exclusionary effects, we argue that the bottom-up, transdisciplinary nature of our working group was a necessary step for Österreich forscht to strengthen its identity and purpose.

How to Cite: Heigl, F., Kieslinger, B., Paul, K.T., Uhlik, J., Frigerio, D. and Dörler, D., 2020. Co-Creating and Implementing Quality Criteria for Citizen Science. Citizen Science: Theory and Practice, 5(1), p.23. DOI: http://doi.org/10.5334/cstp.294
Submitted on 12 Nov 2019; Accepted on 11 Sep 2020; Published on 25 Nov 2020

Introduction

Citizen science has recently gained currency in the research landscape among researchers, research institutions, funders, and policymakers (Fritz et al. 2017; Haklay 2014; Hecker et al. 2018). Citizen Science Associations have been established across the globe to support and promote citizen science practices (European Citizen Science Association 2019a; Citizen Science Association 2019; Australian Citizen Science Association 2019; Citizen Science Global Partnership 2019). For some, citizen science methodologies promise to deliver data and analysis more efficiently and to contribute to science education. For others, however, it appears to be used as a catch-all term that combines scientific innovation with democratic innovation while also appearing to be a more efficient use of resources (Guerrini et al. 2018; Turrini et al. 2018). Elsewhere, citizen science is understood as an outreach activity (European Commission 2017). Although different understandings of citizen science are in use (Heigl et al. 2019a), this has not prevented the term from being taken up widely by the general media and becoming popular in science policy discourse (Strasser et al. 2019).

Research funding programs, such as the European Union’s (EU’s) Horizon 2020, have contributed to the growing popularity and adoption of citizen science as a research methodology through dedicated grants (European Commission 2017). In addition, national funding agencies offer support for citizen science (e.g., Fonds zur Förderung der wissenschaftlichen Forschung [FWF], or Austrian Science Fund, 2019; Dörler and Heigl 2019).

The fact that the term citizen science is loosely defined is both an advantage and a challenge for the growing community of practitioners of citizen science, who are continuing to establish the methods of citizen science as recognized scientific practices (Eitzel et al. 2017; Elliott and Rosenberg 2019; Heigl et al. 2019a). On the one hand, this openness enables novel, inclusive, and innovative approaches; on the other hand, scientists across disciplines doubt whether citizen science can truly live up to standards of good scientific practice (Elliott and Rosenberg 2019). Additionally, citizen science is often confused with other forms of participatory projects, such as science education, science communication, and outreach activities (Kasperowski et al. 2017). Indeed, there seems to be a mismatch between the very premise of citizen science embracing methodological innovation (Riesch and Potter 2013) and the need to establish a set of shared criteria to practice it (see discussion in Auerbach et al. 2019; Heigl et al. 2019a, b).

The range of projects listed on citizen science platforms across the world (see, e.g., www.citizen-science.at for Austria, www.buergerschaffenwissen.de for Germany, or www.citizenscience.org.au for Australia) confirms that citizen science can be applied across all disciplines. In fact, platforms listing citizen science projects face two main challenges. First, they are responsible to the general public for the quality of the projects listed. Such responsibility should not be left to potential citizen scientists alone, as their participation relies in part on their trust in citizen science itself. Second, the credibility of citizen science for the scientific community in general can be preserved by including clear statements on each platform about what users can expect from citizen science. Therefore, the coordinators of such platforms need transparent guidelines when it comes to deciding whether or not to list a project. This requirement has recently also been taken up by the European Citizen Science Association (ECSA) working group on citizen science networks (European Citizen Science Association 2019b).

In this paper, we reflect on the development and implementation of quality criteria in the context of the Austrian citizen science platform Österreich forscht, a scientist-led platform that details a range of citizen science projects, facilitates knowledge exchange between citizen science actors, and promotes awareness of citizen science (Dörler and Heigl 2019; Pettibone et al. 2017; Richter et al. 2018). We present the factors that drove a one-year-long open co-creation process, resulting in a set of quality criteria that represent the requirements to which projects need to conform to be listed on Österreich forscht. We follow the definition of Crosby (1978), which states, “quality is conformance with requirements.” We are well aware that such requirements can only be context-related, so we reflect on the initial experiences in applying these criteria to specific projects within the Austrian platform. While we aim to preserve the diversity of citizen science in Austria, we also feel the need to protect its status. In this sense, all criteria should be interpreted as minimum standards that all the projects listed on the Österreich forscht platform must meet. The criteria aim to ensure the quality of the platform and should thus strengthen the citizen science community in Austria. We conclude by presenting a set of caveats for further similar efforts elsewhere.

What does citizen science stand for?

Although there seems to be general agreement that citizen science refers to the participation of the general public in scientific processes, this participation manifests itself in highly varied ways (Bonney 1996; Irwin 1995). Other terms currently in use for similar activities include community science, participatory science, and participatory action research. These terms share distinctive but often tacit ideals of good scientific practice, and particularly ideals of participation. Schäfer and Kieslinger (2016) underline the heterogeneity of citizen science projects, which range from science-driven contributory projects to innovative participatory approaches and citizen-driven action projects. A helpful analysis of the terminology of citizen science and related concepts has been provided by Eitzel et al. (2017), who suggest using the term as broadly as possible, and including a whole set of related terms referring to science that involves the public. Strasser et al. (2019), however, express concern that citizen science is currently being used as a fashionable term that does not delineate itself clearly enough from other scientific practices and thus runs the risk of becoming an obscure label that could be applied to almost anything.

In an effort to develop a set of minimum criteria for good practice in citizen science, the ECSA has identified ten core principles (Robinson et al. 2018), which have been adopted by the United States and Australian Citizen Science associations as well. Although the ten principles provide a sound basis for good practice in citizen science, we as a platform needed assessable and explicit criteria to decide whether to list a project or not. Therefore, in 2017, we established a working group on quality criteria for citizen science projects listed on Österreich forscht.

Methods

The Österreich forscht platform was established in 2014 by two of the authors (FH and DD) with the objectives of (1) connecting citizen science actors in Austria, (2) providing the broadest possible overview of citizen science projects, and (3) further developing citizen science as a methodology (Dörler and Heigl 2018; Richter et al. 2018). The platform originated from an independent consortium of project leaders, showcasing their activities on a shared website, and it formally turned into the Citizen Science Network Austria (CSNA) in 2017 (Citizen Science Network Austria, 2019). The network consists of more than 40 members from universities, public authorities, museums, associations, companies, funding bodies, and NGOs all working together for the advancement of citizen science across institutional and disciplinary boundaries.

As a platform for citizen science projects in Austria, Österreich forscht is committed to guaranteeing the quality of the projects listed for the general public. This meets the expectations of the project leaders who asked for transparent criteria to provide a common baseline for the listed projects. When the platform was launched, the coordinators (FH and DD) evaluated projects for scientific and participatory aspects; however, this was done without transparent documentation of the decision process. Thus, in 2017, the CSNA, as a collective platform owner, reconsidered the selection process for projects listed on the platform.

Developing procedures to promote a joint understanding

Between March 2017 and March 2018, a working group within the CSNA, consisting of representatives from 17 institutions (see Heigl et al. 2018a for the detailed list of contributing institutions), developed criteria for the transparent evaluation of projects in the application phase for listing on Österreich forscht. The starting point of this collaborative process was based on existing classifications of citizen science projects (Cohn 2008; Haklay 2013; Sanz et al. 2014; Wiggins and Crowston 2011), the ECSA Ten Principles of Citizen Science (Robinson et al. 2018), and the Vienna Principles on Scholarly Communication (Kraker et al. 2016). From the outset, the intention was to make this process as open and transparent as possible (Figure 1).

Figure 1 

Process of the development of the criteria catalogue. The catalogue was developed over the course of six meetings (blue framed boxes). Between meetings 2 and 3, and between meetings 4 and 5, the leaders of projects listed on Österreich forscht were asked to comment on the respective versions (project-leader feedback). Between meetings 3 and 4, a public consultation on Version 0.2 of the criteria was held on the Österreich forscht website to give interested citizens the opportunity to comment on the criteria. At meeting 6, the working group incorporated feedback on Version 1.0 collected from the international citizen science community via a workshop organized by colleagues from the German platform Bürger schaffen Wissen at the Austrian Citizen Science Conference 2018. CS: citizen science.

All project leaders listed on Österreich forscht were given the option of joining the working group at any stage in the process, either by attending working group meetings in person or through online communication tools. In this multi-stage process, the conveners collected different forms of knowledge (e.g., scientific, experiential) from working group members as well as the feedback repeatedly provided by external experts from science and technology studies. During each stage of the process, detailed minutes were sent to all project leaders listed on Österreich forscht to get feedback on both the meeting minutes and the most recent version of the criteria. Additionally, throughout September 2017, the general public was invited to comment on the draft criteria through an online consultation process similar to a public opinion survey (Rowe and Frewer 2000) (Figure 1). The platform coordinators issued invitations in a press release, in social media advertisements, and on the Österreich forscht website in a dedicated public consultation section called “Diskutieren Sie mit!” (Join the discussion!). From previous media analysis of our channels, we know that the audience reached through these channels consists of citizens with a genuine interest in science. In addition, project leaders were invited via personal e-mails to give feedback. The online consultation resulted in 57 comments by 15 anonymous users on Version 0.2 of the quality criteria catalogue and 6 e-mails from project leaders.

The combined approach of open working group meetings, their minutes, feedback loops involving project leaders, and the public consultation allowed us to merge different perspectives and research traditions. In particular, we discussed the nature and shape of public participation in scientific processes across disciplines as well as its value for practitioners (such as public authorities or conservation associations). The heterogeneous composition of the working group also challenged members to critically reflect on their traditions, convictions, and experiences as scientists and practitioners. For example, the notion of scientific rigor was the subject of several challenging discussions, and turned out to be understood differently in the different disciplines. In fact, the group was able to formulate the respective criteria only after reaching a common understanding of the research process in different academic fields (natural sciences, humanities, social sciences, and art sciences).

Version 1.0 of the quality criteria catalogue was published on Zenodo (https://zenodo.org/) in German and in English (Heigl et al. 2018a) and was presented at the 4th Austrian Citizen Science Conference in February 2018. The German citizen science platform Bürger schaffen Wissen hosted a workshop at the same conference to discuss the criteria from various perspectives. The results of the conference workshop were incorporated into Version 1.1 of the catalogue, which was published on the Open Science Framework (Heigl et al. 2018b). Version 1.1 is the current version of the quality criteria catalogue. Since 1 February 2018, new projects wishing to be listed on Österreich forscht have been required to meet the criteria at the time of listing. Projects already listed on the platform were asked to adapt to meet the criteria within 16 months.

To ease the implementation of the new criteria, each criterion was converted into a question, i.e., a questionnaire was generated (Heigl et al. 2019). Guidelines, frequently asked questions (FAQs), and links (e.g., to data management plan (DMP) templates) were provided as additional support (Heigl et al. 2019). Following submission of the completed questionnaire, the platform coordinators consult with the working group and, in case of ambiguities, contact the project leaders for clarification and to provide any support needed. The aim of this process is to ensure a shared understanding of the characteristics of citizen science projects and to work jointly towards maintaining and improving it. An open dialogue and respectful interaction between all actors involved are prerequisites for this process.

Results and Discussion

Description and scientific background of the criteria catalogue

The current version (Version 1.1) of the catalogue consists of 20 criteria covering seven areas: (1) what is not citizen science; (2) scientific standards; (3) collaboration; (4) open science; (5) communication; (6) ethics; and (7) data management (Heigl et al. 2018b). While these criteria represent the outcome of the transdisciplinary co-creation process described above, current citizen science literature was also incorporated.

In the following section, we discuss each criterion in relation to existing experiences and literature (Table 1).

Table 1

Criteria for inclusion in Österreich forscht, their relationship to the ECSA and/or Vienna Principles, and the relevant references that formed the basis of discussions in the meetings of the working group. EP: ECSA Ten Principles of Citizen Science; VP: Vienna Principles: A Vision for Scholarly Communication.

Set of criteria | Specific criterion | Based on principle | References
What is not citizen science | A. The catalogue excludes projects that exclusively involve people with project-specific professional and scientific backgrounds. | EP1 | Cohn 2008; Haklay 2013; Sanz et al. 2014
| B. The catalogue excludes projects by professional scientists or scientific institutions, in which people are merely interviewed regarding their opinion/attitude, way of life, etc. | – | Haklay 2013
| C. The catalogue excludes projects by professional scientists or scientific institutions, which merely collect data on participants. | EP1 | Haklay 2013
| D. The catalogue excludes projects by professional scientists or scientific institutions, in which participants provide resources only passively. | – | Wiggins and Crowston 2011
Scientific standards | 1. There must be a stated scientific question, hypothesis or goal that can be answered, tested or achieved with the project. | VP10; EP2 | –
| 2. The methods must be presented in a field-specific, appropriate and comprehensible way. | VP8 | –
| 3. New knowledge must be generated (e.g. improved understanding of certain relationships), or new methods developed. | VP10; EP2 | –
Collaboration | 4. There must be an added value for all participants, both citizen scientists and professional scientists. | EP3 | Tweddle et al. 2012
| 5. The objectives of the project must be unachievable without the citizen scientists’ collaboration. | – | Lave 2012
| 6. Citizen scientists must be involved during at least one project element. Common elements of research projects include: search for a topic and formulation of research questions; method design; data collection; data analysis and interpretation; publication and communication of results; project governance. | VP7; EP1&4 | Shirk et al. 2012
| 7. The project definition and objectives are open, clear, easily found and communicated in a generally comprehensible manner. | VP6 | Tulloch et al. 2013
| 8. The assignment of tasks must be clear and transparent. | – | Newman et al. 2010
Open Science | 9. All data and metadata is made publicly available, provided there are no legal or ethical arguments against doing so. | VP1–4&12; EP7 | European Research Council 2017; Wilkinson et al. 2016
| 10. The results are published in an open-access format, provided there are no legal or ethical arguments against doing so. | VP1–4&12; EP7 | Berlin Declaration 2003; Chan et al. 2002; European Research Council 2017
| 11. The results are findable, reusable, comprehensible and transparent. | VP1–4 | Berlin Declaration 2003; Chan et al. 2002; European Research Council 2017; Wilkinson et al. 2016
Communication | 12. Different interest groups are addressed accordingly. | VP6 | Bonney et al. 2009; Pace et al. 2010
| 13. Contact details (e.g. e-mail address, phone number or contact form on the website) are easy to find, in case of questions or feedback. Interaction between project management and citizen scientists must be possible at all times. | – | Newman et al. 2010
| 14. Citizen scientists receive feedback on the progress and the results of the project. | VP5; EP5 | Mackechnie et al. 2011
| 15. The project results are published in a generally comprehensible manner. | VP6 | Bonney et al. 2009
Ethics | 16. The project objectives must be ethically sound (i.a., in compliance with human and basic rights). | EP10 | European Parliament 2000
| 17. The project must follow transparent ethical principles in compliance with ethical standards, such as obtaining informed consent from participants or the parents of participating children, among others. | EP10 | Kupper et al. 2015
| 18. Clear information on data policy and governance (regarding personal and research data) must be published within the project, and participants must consent to this information prior to participation. | EP10 | Kupper et al. 2015
| 19. Project management must reflect and consider ethical aspects (e.g., diversity, inclusion, gender equality, reflection on in- or exclusion of specific groups). | – | Kupper et al. 2015
Data management | 20. Prior to data collection, all projects must have established a data management plan which conforms to the European General Data Protection Regulation. | – | European Research Council 2017

The table is based on Heigl et al. (2018b). Wording appears exactly as in the published criteria.

The first set of criteria (A–D) is aimed primarily at framing what the core of a citizen science project is (see Cohn 2008; Haklay 2013; Sanz et al. 2014; Wiggins and Crowston 2011). The working group opted for a negative list for this set (i.e., projects that are not citizen science), to retain as much openness as possible to a range of different concepts and disciplines. As a consequence, any project that is not excluded by these four criteria is considered potentially eligible for inclusion on Österreich forscht. Because this first set of criteria relates to the very core of a citizen science project, they are numbered separately (i.e., A–D versus 1–20 for the remaining parts of the catalogue).

The second set of criteria (1–3) refers to scientific standards as they apply to the different characteristics of the research conducted: How is scientific rigor defined and what can scientific rigor mean in different disciplines? Who defines the nature of scientific standards and what implications do these standards have for the participatory nature of citizen science, particularly beyond the natural sciences? These questions were discussed extensively in group meetings, initially producing a diverse range of notions of scientific standards across disciplines. Broadly informed by a Science and Technology Studies perspective (Felt et al. 2016), the working group members realized the contingent nature of standards in their own disciplines, and the performative nature of standards in allowing for some forms of knowledge to be produced but not others. This does not mean that anything goes, but indicates the need for a decentralized approach when it comes to scientific standards: Criteria 1 and 2 thus remain more flexible than the first set (A–D), allowing for hypotheses, research questions, or scientific goals to drive the research design of a project including field-specific methods. It is essential that new knowledge is generated by citizen science projects (criterion 3) even if the purpose is to confirm previous findings.

Collaboration between professional scientists and non-professional scientists (i.e., citizen scientists) is essential in most citizen science projects. As a consequence, the way in which this collaboration is organized is one key area in our criteria catalogue. To collaborate on a research project, people need to see the benefit of this collaboration (criterion 4) (Robinson et al. 2018; Tweddle et al. 2012).

Citizen science is frequently criticized as a way of gathering large amounts of data without financial compensation for participants (e.g., Lave 2012). However, citizen science may also generate insights that could never be achieved without the participation of citizen scientists. To address this aspect, we included criterion 5, which states that the project’s goals must not be realistically achievable without the participation of citizen scientists. This ensures that the collaboration between scientists and citizens is an integral part of the project even when professional scientists play the lead role, as might otherwise not be the case.

Collaboration may take place at different stages in the research process. Citizen scientists may contribute to, or indeed lead, the formulation of research questions, methods design, the collection or interpretation of data, and the communication of results, or they may manage the whole project (Shirk et al. 2012). In criterion 6, we address the specific elements of the research process that allow for this participation. In fact, the goals and objectives of the project must be clearly understood by all parties (e.g., Kraker et al. 2016; Tulloch et al. 2013), allowing potential participants to make an informed decision about whether to participate or not. The clear formulation of the goals and objectives of the project is therefore addressed in criterion 7.

For collaboration to function smoothly, all those involved should be aware of their roles in the project (Newman et al. 2010). Criterion 8 therefore addresses the need to ensure that tasks are communicated clearly and in accessible language.

Because the data and results obtained through citizen science rely on voluntary collaboration, data and results should be openly available, except in cases where publication would lead to legal or ethical issues. This request is consistent with the fundamental demands of the open science movement (e.g., FAIR principles, Budapest Open Access Initiative, Berlin Declaration), funding requirements at the EU level (e.g., European Research Council 2017), the ECSA principles on citizen science (Robinson et al. 2018), and the Vienna Principles (Kraker et al. 2016). Three criteria (9, 10, and 11) therefore relate to open data, open access publication, and the transparent and accessible communication of results.

Communication tasks in citizen science projects do not exclusively address participating citizen scientists. We also request transparent communication with the general public. Any interested person should be able to contact a project delegate and find information about the project in an easily accessible and comprehensible manner (criteria 13 and 15; Bonney et al. 2009; Newman et al. 2010). Participating citizen scientists need bi-directional communication; e.g., they need to receive feedback on the process and results, and they need to be able to interact with others in the project team (criterion 14) (Mackechnie et al. 2011). Finally, good communication in citizen science projects is also defined by targeted communication strategies and methods (Bonney et al. 2009; Pace et al. 2010). Project leaders need to address specific target groups (e.g., school children or senior citizens) in an appropriate manner (criterion 12).

Basic research ethics have become standard and are part of the majority of research funding programs today. In quality criteria 16, 17, 18, and 19, the ethical requirements are rooted in the Charter of Fundamental Rights of the European Union (European Parliament 2000) and in Responsible Research and Innovation (RRI) literature (e.g., Kupper et al. 2015) and practice (e.g., RRI Tools 2019). In addition, the importance of diversity and inclusion was a key finding in the co-creation process of the quality criteria. Because diversity and inclusion are not yet considered in standard scientific practice, these criteria help to raise awareness of an inclusive community and of barriers to participation.

Many research funders now require a data management plan (DMP) that describes how data will be handled during the research project, e.g., the Austrian Science Fund [FWF] (Rieck 2019) and the European Research Council (2017). The working group agreed that citizen science projects would benefit from a DMP, which is reflected in criterion 20.

Current status of adoption, implementation, and reactions

When the criteria were presented in February 2018, 64 projects were listed on Österreich forscht. By the time the criteria went into effect in June 2019, 15 projects had been finalized and archived and 12 projects had been newly listed, resulting in 61 active projects. Of these 61 projects, 39 managed to adapt to the criteria within the given timeframe, 13 were still in the process of adapting to the criteria, and 9 did not send a complete questionnaire for their projects. These 9 projects were therefore removed from the platform (Figure 2).

Figure 2 

Current status of the implementation of the quality criteria by projects in absolute numbers as of June 26, 2019. The institutional backgrounds of the project leaders are represented by different color codes.

Figure 2 indicates that the criteria in general are applicable regardless of whether the project is led by a university, by an association, by a citizen, or by another entity. Some projects have still not been able to complete the implementation process, often because of a lack of time, as personal communication with project leaders revealed. Initial feedback from those who have applied the quality criteria, however, confirms that the process was feasible, although not always easy, to complete. Admittedly, the obligation to make a case for inclusion on the website can prove onerous for some and may create or further exacerbate inequalities of representation between different disciplines and practices. We want to emphasize that despite this hurdle, we have observed an increase in social science and humanities projects (as newer and more experimental forms of citizen science).

At the national level, the quality criteria have been critically reviewed by the working group since they were first established. Based on the experiences of the project leaders who were responsible for implementing the criteria, additional support resources have been put in place. These include explanatory guidelines, templates, and targeted workshops on how to implement the criteria, requirements, and recommendations. Furthermore, the process is being evaluated externally by a consultancy company that is not part of the CSNA to enable changes to the criteria and to the process on an objective basis. The evaluation process should also provide insights into why some projects did not implement the criteria.

Reflections on the process

We have herein described the collaborative development of a set of quality criteria for citizen science projects within the Austrian citizen science platform Österreich forscht. The quality criteria developed have been presented in the contexts of the relevant literature, of previously formulated principles of citizen science and open science, and of discussions within the working group. We now critically reflect on those criteria in light of their applications and of their impacts on perceptions of citizen science in the CSNA.

The contradiction between the very notion of citizen science, which is open to a range of different stakeholders and approaches, and the exclusionary nature of establishing quality criteria was a constant subject of discussion before and during the development process. Defining what constitutes citizen science by establishing a set of criteria may therefore seem a counterintuitive step. The application of such criteria could stifle new developments, especially if this is done in a top-down manner and without any regulatory process that can take new developments into account (Ottinger 2010). In the case of the citizen-led documentation of environmental degradation, for instance, bottom-up initiatives serve an important function in democratizing institutionalized science, and standardization may hamper such efforts to innovate (Ottinger 2010). Moreover, this could lead to a situation in which the criteria are mainly addressed by projects led by universities, and bottom-up initiatives may feel excluded by the complexity of the criteria catalogue. We found that this drawback can be overcome by including bottom-up initiatives in the process of developing such criteria. All stakeholders involved in the process should be treated as equals, regardless of their background. Everyone involved in our process of formulating the criteria for Österreich forscht shared the concerns described above and contributed to making the process as open as possible. We sought to actively engage non-university stakeholders in the development of the criteria, which led to the direct involvement of 12 non-university members (compared with five university members) in the working group, as well as other interested citizens who provided feedback during the online consultation. 
In hindsight, the anonymous collection of feedback from the general public could have been improved by collecting some demographic data, since this would have allowed us to better understand the comments made in context and to learn more about those who took an interest.

The development of quality criteria for citizen science projects and the possible exclusion of projects from Österreich forscht does, of course, pose risks: Excluded projects may not have access to the resources of the platform (e.g., an active community, public relations activities, or recruitment support). Moreover, the platform itself could suffer reputational costs if the quality criteria are so difficult to achieve that many projects are excluded. Indeed, developing quality criteria for citizen science projects implies a trade-off between openness and minimum standards (Ottinger 2010). For example, one of the goals of the criteria was to demarcate citizen science projects from purely educational or non-scientific participatory projects, which are equally important to society but usually have no research goal or question (Kasperowski et al. 2017). In our view, the distinction between citizen science and other forms of participatory projects can only be made provisionally, should always be made pragmatically, and be driven by transdisciplinary input in a bottom-up manner. To address these concerns, the set of quality criteria that we have presented is reviewed continuously based on the experience of applying them to new projects, and on feedback from project leaders, engaged citizens, and new initiatives in the field of citizen science. The working group will therefore evaluate and jointly reflect on the criteria on a regular basis to identify and correct any shortcomings.

As previously stated, citizen science as a whole has a threefold responsibility to citizen scientists, to the scientific community, and to decision makers (Hecker et al. 2018; Kasperowski et al. 2017): (1) citizen scientists must be able to trust that the time and effort they invest in a project serve a greater goal (Land-Zandstra et al. 2016), that their contributions are recognized (Rotman et al. 2012) and that their privacy is not misused in any way (Bowser et al. 2014); (2) the scientific community must have confidence that the results generated through citizen science projects are valid and reliable (Elliott and Rosenberg 2019; Guerrini et al. 2018; Hecker et al. 2018; Newman et al. 2012); and (3) policy-makers have to be sure that citizen science projects are both scientifically and ethically sound (Haklay 2014, 2020; Hecker et al. 2018; Kasperowski et al. 2017). These expectations can be met by applying quality criteria and continuously evaluating citizen science and the criteria applied.

Interdisciplinarity itself is another obstacle when seeking to establish quality criteria for citizen science that are equally valid across all disciplines. Any attempt to impose quality criteria would most likely favor certain disciplines at the expense of others. For example, conceptions of what constitutes a scientific approach in the natural sciences, humanities, social sciences, and the arts will not necessarily be consistent (Bauer 1990; Lélé and Norgaard 2005). Whereas natural scientists strive for the reproducibility of results (the extent to which consistent results are obtained when an experiment is repeated), humanities scholars emphasize the singularity of cases and the inherent perspectivism of qualitative research (e.g., in cultural anthropology). Taking this obstacle into account, the process of formulating quality criteria for Österreich forscht was designed to be open to all citizen science actors. The composition of the group enabled a respectful and wide-ranging discussion of matters such as differing conceptions of scientific approaches, which ultimately led to a consensus on the definition of minimum scientific standards.

Conclusion

The diversity of the institutions that have successfully completed the process of adapting their project to the criteria reflects the diversity of citizen science as a whole (Pettibone et al. 2017; Pocock et al. 2017). This confirms the continued openness of our platform, which we aimed to preserve while at the same time introducing minimum standards. The criteria were developed in collaboration with the project leaders listed on the platform, which promoted personal identification with the criteria, rooting them in the platform community so that they are not perceived as a top-down imposition. Preliminary results of an initial evaluation carried out with project leaders (report in preparation) indicate that the process has strengthened the Austrian citizen science community and that the application of the criteria has created self-confidence. Project leaders appreciate being part of the community, and having their projects listed on the platform gives them more confidence in the work in which they are engaged, in part because it adds another level of quality. The explicit criteria make listing decisions more transparent and understandable for all involved. Furthermore, these decisions rest on the support of the community in the form of the joint agreement on the quality criteria catalogue.

Finally, we conclude that quality criteria for citizen science are helping to promote its credibility and status in academia and with the general public. To maintain this positive effect, however, all stakeholders must be involved in formulating such criteria. Furthermore, the designers of such criteria need to reflect on their impact within local or regional cultural contexts and allow a certain degree of flexibility and openness to new developments that may challenge the validity of individual criteria. Different concepts of citizen science persist, depending on national and historical contexts (Eitzel et al. 2017; Scheliga et al. 2018). The current set of quality criteria is based on the experience of Austrian citizen science projects and the existing international literature. The wholesale adoption of the Austrian criteria in other countries or contexts is therefore not advisable without thorough reflection, but we hope that our experience can serve as a starting point for other initiatives facing similar challenges.

Acknowledgements

We would like to thank all the members of the working group on quality criteria for citizen science projects on Österreich forscht, as well as the external experts and all participants who contributed to the online consultation and the workshops. We especially thank the University of Natural Resources and Life Sciences for funding the Citizen Science Network Austria and the associated online platform Österreich forscht. Our sincere thanks also go to the dedicated project leaders who have collaborated with us in the adaptation process and to all the partner institutions that have supported the CSNA to date. Open access funding provided by BOKU Vienna Open Access Publishing Fund.

Competing Interests

The authors have no competing interests to declare.

Author Contributions

Florian Heigl, Barbara Kieslinger, Katharina T. Paul, Julia Uhlik, Didone Frigerio, and Daniel Dörler contributed equally to the writing and revising of the paper.

References

  1. Auerbach, J, Barthelmess, EL, Cavalier, D, et al. 2019. The problem with delineating narrow criteria for citizen science. Proceedings of the National Academy of Sciences, 116(31): 15336–15337. DOI: https://doi.org/10.1073/pnas.1909278116 

  2. Australian Citizen Science Association. 2019. Australian Citizen Science Association – Citizen science is redefining how we do science. Available at: https://citizenscience.org.au/ (accessed 7 November 2019). 

  3. Austrian Science Fund (FWF). 2019. Top Citizen Science Funding Initiative. Available at: https://www.fwf.ac.at/en/research-funding/fwf-programmes/top-citizen-science-funding-initiative/ (accessed 7 November 2019). 

  4. Bauer, HH. 1990. Barriers against interdisciplinarity: Implications for studies of science, technology, and society (STS). Science, Technology, & Human Values, 15(1): 105–119. DOI: https://doi.org/10.1177/016224399001500110 

  5. Berlin Declaration. 2003. Available at: https://openaccess.mpg.de/Berlin-Declaration (accessed 8 November 2019). 

  6. Bonney, R. 1996. Citizen science: A lab tradition. Living Bird, 15(4): 7–15. 

  7. Bonney, R, Cooper, CB, Dickinson, J, et al. 2009. Citizen science: A developing tool for expanding science knowledge and scientific literacy. BioScience, 59(11): 977–984. DOI: https://doi.org/10.1525/bio.2009.59.11.9 

  8. Bowser, A, Wiggins, A, Shanley, L, et al. 2014. Sharing data while protecting privacy in citizen science. interactions, 21(1): 70–73. DOI: https://doi.org/10.1145/2540032 

  9. Chan, L, Cuplinskas, D, Eisen, M, et al. 2002. Budapest Open Access Initiative. Available at: https://www.budapestopenaccessinitiative.org/read (accessed 8 November 2019). 

  10. Citizen Science Association. 2019. About the Citizen Science Association – Citizen Science Association|Citizen Science. In: Citizen Science Association. Available at: https://www.citizenscience.org/association/about/ (accessed 7 November 2019). 

  11. Citizen Science Global Partnership. 2019. CSGP – About Us. Available at: http://citizenscienceglobal.org/about.html (accessed 7 November 2019). 

  12. Citizen Science Network Austria. 2019. Citizen Science Network Austria. Available at: https://www.citizen-science.at/netzwerk (accessed 7 November 2019). 

  13. Cohn, JP. 2008. Citizen science: Can volunteers do real research? BioScience, 58(3): 192. DOI: https://doi.org/10.1641/B580303 

  14. Crosby, PB. 1978. Quality Is Free: The Art of Making Quality Certain. New York: McGraw-Hill Education – Europe. 

  15. Dörler, D and Heigl, F. 2018. Recent developments in the Austrian citizen science landscape. In: Austrian Citizen Science Conference 2017 – Expanding Horizons, Lausanne, 2018. Frontiers Media SA. 

  16. Dörler, D and Heigl, F. 2019. Citizen science in Austria. Mitteilungen der Vereinigung Österreichischer Bibliothekarinnen und Bibliothekare, 72(2). DOI: https://doi.org/10.31263/voebm.v72i2.2836 

  17. Eitzel, MV, Cappadonna, JL, Santos-Lang, C, et al. 2017. Citizen science terminology matters: Exploring key terms. Citizen Science: Theory and Practice, 2(1): 1–20. DOI: https://doi.org/10.5334/cstp.96 

  18. Elliott, KC and Rosenberg, J. 2019. Philosophical foundations for citizen science. Citizen Science: Theory and Practice, 4(1): 9. DOI: https://doi.org/10.5334/cstp.155 

  19. European Citizen Science Association. 2019a. Community. Available at: https://ecsa.citizen-science.net/community/map (accessed 25 October 2019). 

  20. European Citizen Science Association. 2019b. ECSA Working Group: Citizen Science Networks. Available at: https://ecsa.citizen-science.net/working-groups/ecsa-working-group-citizen-science-networks (accessed 7 November 2019). 

  21. European Commission. 2017. EN Horizon 2020 Work Programme 2018–2020 – 16. Science with and for Society. 27 October. Available at: http://ec.europa.eu/research/participants/data/ref/h2020/wp/2018-2020/main/h2020-wp1820-swfs_en.pdf (accessed 23 July 2018). 

  22. European Parliament. 2000. Charter of Fundamental Rights of the European Union. Available at: http://www.europarl.europa.eu/charter/default_en.htm (accessed 8 November 2019). 

  23. European Research Council. 2017. Guidelines on Implementation of Open Access to Scientific Publications and Research Data in projects supported by the European Research Council under Horizon 2020. Version 1.1. European Commission. Available at: https://ec.europa.eu/research/participants/data/ref/h2020/other/hi/oa-pilot/h2020-hi-erc-oa-guide_en.pdf. 

  24. Felt, U, Fouché, R, Miller, CA, et al. (eds). 2016. The Handbook of Science and Technology Studies. 4th ed. Cambridge, MA: MIT Press. 

  25. Fritz, S, See, L, Perger, C, et al. 2017. A global dataset of crowdsourced land cover and land use reference data. Scientific Data, 4: 170075. DOI: https://doi.org/10.1038/sdata.2017.75 

  26. Guerrini, CJ, Majumder, MA, Lewellyn, MJ, et al. 2018. Citizen science, public policy. Science, 361(6398): 134–136. DOI: https://doi.org/10.1126/science.aar8379 

  27. Haklay, M. 2013. Citizen science and volunteered geographic information: Overview and typology of participation. In: Sui D, Elwood S, and Goodchild M (eds) Crowdsourcing Geographic Knowledge. Netherlands: Springer, pp. 105–122. Available at: http://link.springer.com/chapter/10.1007/978-94-007-4587-2_7 (accessed 18 February 2015). DOI: https://doi.org/10.1007/978-94-007-4587-2_7 

  28. Haklay, M. 2014. Citizen Science and Policy: A European Perspective. Available at: http://wilsoncenter.org/publication/citizen-science-and-policy-european-perspective (accessed 11 February 2015). 

  29. Haklay, M, et al. 2020. ECSA’s Characteristics of Citizen Science. DOI: https://doi.org/10.5281/zenodo.3758668 

  30. Hecker, S, Bonney, R, Haklay, M, et al. 2018. Innovation in citizen science – perspectives on science-policy advances. Citizen Science: Theory and Practice, 3(1): 4. DOI: https://doi.org/10.5334/cstp.114 

  31. Heigl, F, Dörler, D, Bartar, P, et al. 2018a. Quality Criteria for Citizen Science Projects on Österreich forscht. Zenodo. DOI: https://doi.org/10.31219/osf.io/2b5qw 

  32. Heigl, F, Dörler, D, Bartar, P, et al. 2018b. Quality Criteria for Citizen Science Projects on Österreich forscht|Version 1.1. Open Science Framework. DOI: https://doi.org/10.31219/osf.io/48j27 

  33. Heigl, F, Dörler, D, Bartar, P, et al. 2019. Quality criteria catalogue for citizen science projects on Österreich forscht – Questionnaire for project managers. preprint, 11 June. Open Science Framework. DOI: https://doi.org/10.31219/osf.io/2b5qw 

  34. Heigl, F, Kieslinger, B, Paul, KT, et al. 2019a. Opinion: Toward an international definition of citizen science. Proceedings of the National Academy of Sciences, 116(17): 8089–8092. DOI: https://doi.org/10.1073/pnas.1903393116 

  35. Heigl, F, Kieslinger, B, Paul, KT, et al. 2019b. Reply to Auerbach et al.: How our Opinion piece invites collaboration. Proceedings of the National Academy of Sciences, 116(31): 15338. DOI: https://doi.org/10.1073/pnas.1909628116 

  36. Irwin, A. 1995. Citizen Science: A Study of People, Expertise and Sustainable Development. London and New York: Routledge Chapman & Hall. 

  37. Kasperowski, D, Kullenberg, C and Mäkitalo, Å. 2017. Embedding Citizen Science in Research: Forms of engagement, scientific output and values for science, policy and society. DOI: https://doi.org/10.31235/osf.io/tfsgh (accessed 11 November 2019). 

  38. Kraker, P, Dörler, D, Ferus, A, et al. 2016. The Vienna Principles: A Vision for Scholarly Communication in the 21st Century. Zenodo. DOI: https://doi.org/10.31263/voebm.v69i3.1733 

  39. Kupper, F, Klaassen, P, Rijnen, M, et al. 2015. Report on the quality criteria of Good Practice Standards in RRI. Athena Institute, VU University Amsterdam. Available at: https://www.fosteropenscience.eu/content/report-quality-criteria-good-practice-standards-rri (accessed 8 November 2019). 

  40. Land-Zandstra, AM, Devilee, JLA, Snik, F, et al. 2016. Citizen science on a smartphone: Participants’ motivations and learning. Public Understanding of Science, 25(1): 45–60. DOI: https://doi.org/10.1177/0963662515602406 

  41. Lave, R. 2012. Neoliberalism and the production of environmental knowledge. Environment and Society, 3(1): 19–38. DOI: https://doi.org/10.3167/ares.2012.030103 

  42. Lélé, S and Norgaard, RB. 2005. Practicing interdisciplinarity. BioScience, 55(11): 967–975. DOI: https://doi.org/10.1641/0006-3568(2005)055[0967:PI]2.0.CO;2 

  43. Mackechnie, C, Maskell, L, Norton, L, et al. 2011. The role of ‘Big Society’ in monitoring the state of the natural environment. Journal of Environmental Monitoring, 13(10): 2687–2691. DOI: https://doi.org/10.1039/c1em10615e 

  44. Newman, G, Zimmerman, D, Crall, A, et al. 2010. User-friendly web mapping: Lessons from a citizen science website. International Journal of Geographical Information Science, 24(12): 1851–1869. DOI: https://doi.org/10.1080/13658816.2010.490532 

  45. Newman, G, Wiggins, A, Crall, A, et al. 2012. The future of citizen science: emerging technologies and shifting paradigms. Frontiers in Ecology and the Environment, 10(6): 298–304. DOI: https://doi.org/10.1890/110294 

  46. Ottinger, G. 2010. Buckets of resistance: Standards and the effectiveness of citizen science. Science, Technology, & Human Values, 35(2): 244–270. DOI: https://doi.org/10.1177/0162243909337121 

  47. Pace, ML, Hampton, SE, Limburg, KE, et al. 2010. Communicating with the public: Opportunities and rewards for individual ecologists. Frontiers in Ecology and the Environment, 8(6): 292–298. DOI: https://doi.org/10.1890/090168 

  48. Pettibone, L, Vohland, K and Ziegler, D. 2017. Understanding the (inter)disciplinary and institutional diversity of citizen science: A survey of current practice in Germany and Austria. PLOS ONE, 12(6): e0178778. DOI: https://doi.org/10.1371/journal.pone.0178778 

  49. Pocock, MJO, Tweddle, JC, Savage, J, et al. 2017. The diversity and evolution of ecological and environmental citizen science. PLOS ONE, 12(4): e0172579. DOI: https://doi.org/10.1371/journal.pone.0172579 

  50. Richter, A, Dörler, D, Hecker, S, et al. 2018. Capacity building in citizen science. In: Citizen Science – Innovation in Open Science, Society and Policy. London, UK: UCL Press, pp. 269–283. DOI: https://doi.org/10.2307/j.ctv550cf2.26 

  51. Rieck, K. 2019. Austrian Science Fund (FWF) – Research Data Management. Available at: https://www.fwf.ac.at/en/research-funding/open-access-policy/research-data-management/ (accessed 8 November 2019). 

  52. Riesch, H and Potter, C. 2013. Citizen science as seen by scientists: Methodological, epistemological and ethical dimensions. Public Understanding of Science. DOI: https://doi.org/10.1177/0963662513497324 

  53. Robinson, LD, Cawthray, JL, West, SE, et al. 2018. Ten principles of citizen science. In: Citizen Science – Innovation in Open Science, Society and Policy. London, UK: UCL Press, pp. 27–40. DOI: https://doi.org/10.2307/j.ctv550cf2.9 

  54. Rotman, D, Preece, J, Hammock, J, et al. 2012. Dynamic changes in motivation in collaborative citizen-science projects. In: Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, New York, NY, USA, 2012, pp. 217–226. CSCW ’12. ACM. DOI: https://doi.org/10.1145/2145204.2145238 

  55. Rowe, G and Frewer, LJ. 2000. Public participation methods: A framework for evaluation. Science, Technology, & Human Values, 25(1): 3–29. DOI: https://doi.org/10.1177/016224390002500101 

  56. RRI Tools. 2019. Available at: https://www.rri-tools.eu/de (accessed 8 November 2019). 

  57. Sanz, FS, Holocher-Ertl, T, Kieslinger, B, et al. 2014. White Paper on Citizen Science for Europe. Socientize Consortium. Available at: https://ec.europa.eu/futurium/en/content/white-paper-citizen-science. 

  58. Schäfer, T and Kieslinger, B. 2016. Supporting emerging forms of citizen science: a plea for diversity, creativity and social innovation. Journal of Science Communication, 15(2): Y02. DOI: https://doi.org/10.22323/2.15020402 

  59. Scheliga, K, Friesike, S, Puschmann, C, et al. 2018. Setting up crowd science projects. Public Understanding of Science, 27(5): 515–534. DOI: https://doi.org/10.1177/0963662516678514 

  60. Shirk, JL, Ballard, HL, Wilderman, CC, et al. 2012. Public participation in scientific research: a framework for deliberate design. Ecology and Society, 17(2). DOI: https://doi.org/10.5751/ES-04705-170229 

  61. Strasser, BJ, Baudry, J, Mahr, D, et al. 2019. “Citizen science”? Rethinking science and public participation. Science & Technology Studies, 52–76. DOI: https://doi.org/10.23987/sts.60425 

  62. Tulloch, AIT, Possingham, HP, Joseph, LN, et al. 2013. Realising the full potential of citizen science monitoring programs. Biological Conservation, 165: 128–138. DOI: https://doi.org/10.1016/j.biocon.2013.05.025 

  63. Turrini, T, Dörler, D, Richter, A, et al. 2018. The threefold potential of environmental citizen science – Generating knowledge, creating learning opportunities and enabling civic participation. Biological Conservation, 225: 176–186. DOI: https://doi.org/10.1016/j.biocon.2018.03.024 

  64. Tweddle, JC, Robinson, LD, Pocock, MJ, et al. 2012. Guide to citizen science: developing, implementing and evaluating citizen science to study biodiversity and the environment in the UK. London: Natural History Museum. 

  65. Wiggins, A and Crowston, K. 2011. From conservation to crowdsourcing: A typology of citizen science. In: 2011 44th Hawaii International Conference on System Sciences, January 2011, pp. 1–10. DOI: https://doi.org/10.1109/HICSS.2011.207 

  66. Wilkinson, MD, Dumontier, M, Aalbersberg, IJ, et al. 2016. The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3: 160018. DOI: https://doi.org/10.1038/sdata.2016.18