
Building Capacity to Apply Citizen Science Approaches in Policy and Practice for Public Health: Protocol for a Developmental Evaluation of Four Stakeholder-Led Projects

Authors:

Samantha Rowbotham, University of Sydney, AU
Yvonne Laird, University of Sydney, AU
Leah Marks, University of Sydney, AU
Pippy Walker, University of Sydney, AU
Katherine Pontifex, Wellbeing SA, AU
Amani Sobhan, South Western Sydney Local Health District, AU
Karen Wardle, South Western Sydney Local Health District, AU
Kim Jose, University of Tasmania, AU
Kate Garvey, Tasmanian Department of Health, AU
Sean O’Rourke, VicHealth, AU
Ben Smith, University of Sydney, AU

Abstract

Citizen science is gaining attention as an approach to involving communities in gathering data and contributing to decision-making in public health. Stakeholders interested in citizen science have identified a need for support in applying these approaches and in obtaining evidence of their value. However, there have been few attempts to evaluate citizen science approaches within policy and practice contexts in public health. Within this protocol paper, we outline an approach to evaluating stakeholder-led citizen science projects that focuses on fostering innovation and building capacity in the use of citizen science approaches by these stakeholders.

We will use developmental evaluation, which focuses on ongoing reflection and adaptation, to guide the development and delivery of four stakeholder-led citizen science projects in public health. We will employ a multiple embedded case study design, using surveys and interviews, observations of project meetings, reflective journaling, and document review to gather perspectives from a range of stakeholders across the four projects. Data will be synthesised to explore how projects using citizen science approaches operate within policy and practice contexts, including the barriers and facilitators to their application, the circumstances under which they are most useful, and the impacts of these approaches.

A developmental approach to evaluation will enable us to build capacity in the use of citizen science approaches by sharing insights and learnings as project teams navigate their individual projects. We hope that this paper will stimulate further discussion about the application and evaluation of citizen science approaches in public health and beyond.

How to Cite: Rowbotham, S., Laird, Y., Marks, L., Walker, P., Pontifex, K., Sobhan, A., Wardle, K., Jose, K., Garvey, K., O’Rourke, S. and Smith, B., 2022. Building Capacity to Apply Citizen Science Approaches in Policy and Practice for Public Health: Protocol for a Developmental Evaluation of Four Stakeholder-Led Projects. Citizen Science: Theory and Practice, 7(1), p.31. DOI: http://doi.org/10.5334/cstp.488
Submitted on 21 Dec 2021. Accepted on 09 Jun 2022. Published on 19 Jul 2022.

Background

Public engagement approaches that bring together diverse stakeholders to collectively identify problems and generate solutions are recognised as a vital strategy to strengthen research-informed policy and practice to improve health and reduce inequities (Australian Public Service Commission 2007; Frieden 2014; National Health and Medical Research Council 2016; Todd and Nutbeam 2018; World Health Organisation 2017; World Health Organisation 2019). For example, in Australia, the National Health and Medical Research Council (NHMRC) outlines a vision of “community members, researchers and research organisations working in partnership to improve the health and well-being of all Australians through health and medical research” (National Health and Medical Research Council 2016), and public engagement is a key pillar of the strategies of health promotion agencies around Australia (Department for Health and Ageing and Government of South Australia 2017; Tasmanian Government: Department of Health and Human Services 2016; VicHealth 2019; Wellbeing SA and Government of South Australia 2020). However, incorporating community perspectives into research and policy-making in public health has proven challenging, and there is a need to develop capacity and infrastructure to enable policy and practice stakeholders and researchers to engage with the public in meaningful ways (Cacari-Stone et al. 2014; Gudes et al. 2015).

Citizen science approaches are a means of actively involving members of the public in scientific research, for example in collecting and analysing data and contributing to research design (Haklay et al. 2021). While originating in the natural sciences, citizen science approaches are increasingly being used to involve the public in gathering and making sense of data to address a range of public health issues (Kullenberg and Kasperowski 2016; Marks et al. 2022). For example, a recent scoping review (Marks et al. 2022) revealed a growing number of projects in which members of the public gather data on the features of their physical environments that help or hinder them to be healthy (e.g., neighbourhood walkability, green space, access to healthy foods). Such approaches provide opportunities for meaningful involvement of the public in research on issues that affect their health and wellbeing and can lead to community mobilisation and advocacy to address these issues (Barrie et al. 2019; Stanford Medicine 2020).

By involving members of the public as researchers, citizen science approaches can increase the amount of data that is gathered and analysed and increase the cost-effectiveness of research (Hecker et al. 2019; Theobald et al. 2015). The large and novel data sets obtained through citizen science approaches have already made a significant impact on ecology and environmental research (Dickinson, Zuckerberg, and Bonter 2010; Silvertown et al. 2011; Sullivan et al. 2014). Involving community members with diverse knowledge, experience, and perspectives can also lead to the development of new research questions and increase the relevance and applicability of evidence generated. This can lead to the joint discovery of solutions to societal and scientific problems at a local, national, and global scale. Citizen science can also act as a vehicle for public engagement, education, and empowerment, increasing scientific literacy, topic knowledge and interest in science, and awareness, concern, and support for action to address societal issues (Hecker et al. 2019; Resnik, Elliott, and Miller 2015). Further, the use of citizen science approaches can improve policy decision-making and implementation, bringing about benefits for society as a whole (Hecker et al. 2019; Schade et al. 2021).

While citizen science projects in public health have typically been led by academic researchers (Marks et al. 2022), there is growing interest in these approaches amongst policy and practice stakeholders, as a complementary approach to involving members of the public in their work. Amongst these stakeholders there is particular interest in the potential for citizen science to gain new perspectives on issues that affect the health and wellbeing of local communities and increase community support for actions to improve health and wellbeing. Despite increasing attention to the intersection between citizen science and policy, much of the work in this space has been in the environmental sciences (Hecker et al. 2019), and to date, little attention has been paid to understanding the feasibility of policy and practice stakeholder-led citizen science approaches in public health. Within this paper, we introduce the Citizen Science in Prevention project, which aims to address this gap by seeking to support and evaluate the use of citizen science approaches amongst policy and practice stakeholders, and provide the basis for an expansion of these approaches in public health.

Evaluation is vital to demonstrating the impacts of citizen science approaches and the factors that influence their success (Hecker et al. 2018; Kieslinger et al. 2017). While many citizen science projects incorporate evaluative components, there are few commonly established evaluation indicators or frameworks, and the focus of individual project evaluations varies considerably, from the learning of individual participants (citizen scientists) through to the scientific knowledge gained, limiting opportunities for comparability across projects (Kieslinger et al. 2017). As argued by Kieslinger et al. (2017), evaluations of citizen science need to assess the value of citizen science in terms of both processes and outcomes, across a range of domains, including the scientific domain (i.e., the knowledge gained through the project), the citizen scientist domain (i.e., involvement and impacts of individual citizen scientists), and the socio-ecological domain (i.e., the societal impacts of the project). Further, in order to facilitate and demonstrate the success of citizen science approaches in achieving their potential, “thoughtful evaluation needs to be embedded into a project’s design … (and) careful design and definition of desired project outcomes, ongoing monitoring of outcomes and adaptive management, and publishing lessons learned will move the field of citizen science forward” (Hecker et al. 2018). In line with this call for more thoughtful and embedded evaluation, we outline how we will use a developmental approach to evaluation (Patton 2010) to ensure that evaluation is built into citizen science projects from the outset in a way that enables ongoing reflection and adaptation as well as provides rich insights into the feasibility and impacts of citizen science approaches in public health.

The Citizen Science in Prevention project

For citizen science to become embedded within policy and practice, it is vital that stakeholders have the knowledge and skills to apply these approaches, including an understanding of the impacts of these approaches and the barriers and facilitators to their successful implementation. The Citizen Science in Prevention (CSP) project (Australian Prevention Partnership Centre 2021) was established to build capacity in and strengthen the evidence base for the use of citizen science approaches in policy and practice in public health. CSP is a co-produced project, in which we are working closely with four health promotion agencies operating at the local or state level in Australia (South Western Sydney Local Health District, Tasmanian Public Health Services, the Victorian Health Promotion Foundation (VicHealth), and Wellbeing SA; collectively referred to as “project partners”) that had expressed interest in using citizen science approaches in their work. By working closely with these stakeholders, the CSP project aims to provide practice-based insights concerning the design, management, and impacts of citizen science approaches, and to guide capacity-building efforts for stakeholders in public health and related sectors. The CSP project began in April 2020 and will run until March 2023.

A core component of the CSP project is the developmental evaluation of four stakeholder-led citizen science projects. These projects are being resourced and led by the project partners as a means of trialling the use of citizen science approaches within their organisations. The projects span a range of issues in public health, from digital marketing of unhealthy products to community walkability. They engage members of the public in a variety of ways, from capturing screenshots on social media to conducting audits of their local community and engaging in advocacy and action. Policy and practice stakeholders in each of the projects have partnered with councils and/or universities (referred to as “project implementers”) to enable the development and implementation of these projects. An overview of each of the four projects is provided in Figure 1.

Figure 1. Overview of the four citizen science projects included in the evaluation.

Within this paper, we outline our protocol for the developmental evaluation of these four stakeholder-led citizen science projects, in which we will work closely with project partners to enable ongoing reflection and use emerging insights to guide the development and implementation of the projects. Through this approach, we seek to develop a rich understanding of how citizen science projects operate in policy and practice settings, and through the participatory nature of the evaluation process, we aim to build capacity of stakeholders in the use of citizen science approaches beyond the current projects. In this protocol paper, our aim is to contribute to the burgeoning field of citizen science evaluation by setting out an approach to the participatory evaluation of stakeholder-led projects that can be used to go beyond a focus on “what works” to gaining an in-depth understanding of how citizen science projects can be designed and conducted across different policy and practice contexts.

Aim

We aim to understand how citizen science projects operate within policy and practice contexts, including the barriers and facilitators to their application, the circumstances under which they are most useful, and the impacts of these approaches. The following questions will guide this evaluation:

  1. Perspectives: How are citizen science approaches perceived and valued by different actors (including project partners, implementers, other stakeholders within the partner organisations, and citizen scientists)?
  2. Processes: How are citizen science projects implemented in practice? How does the design and implementation of projects align with project goals? What are the barriers and facilitators to implementation?
  3. Impacts: What are the perceived impacts of the citizen science projects (for example, what are the benefits in terms of knowledge gained, change in policy and practice, and for citizen scientists) and how are these brought about?
  4. Context: What contextual factors influence the design, implementation and impacts of citizen science projects in public health? Under what circumstances are citizen science projects feasible (or not)?

Methods

Approach

We will use a developmental evaluation approach (Patton 2010) to examine how citizen science projects operate within policy and practice contexts. Developmental evaluation shares commonalities with other participatory, reflexive, and learning-oriented evaluation approaches (e.g., Arkesteijn, van Mierlo, and Leeuwis 2015; Guijt 2014; Klaassen et al. 2020; Mayne and Stern 2013), and provides a structured way to continually collect, analyse, and use data to support ongoing decision-making. It is particularly suited to innovative, complex, and dynamic projects, where inputs, activities, and outcomes are not known in advance (Patton 2010). Developmental evaluation differs from more traditional forms of evaluation by focusing on supporting the process of innovation rather than seeking to judge the merit and value of a standardised program or to assist in embedding programs into practice (Gamble and The J.W. McConnell Family Foundation 2008).

Within developmental evaluation, the evaluator is situated as part of the team that is responsible for developing and implementing a new approach, and their role is to bring evaluative thinking into the process of innovation, encouraging stakeholders to continually reflect on and learn from actions taken (Patton 2010). In line with this, the evaluation team will work closely with individual project teams to support the design and delivery of the citizen science projects. Several processes have been established to support the reflective and iterative nature of developmental evaluation and to enable feedback of insights and responsiveness to challenges as they emerge. The lead evaluators (SR and YL, who are experienced in evaluation and citizen science) will attend regular project meetings for each of the four projects, and the evaluation team (SR, YL, LM, PW and BS) will meet regularly to critically reflect on insights and issues emerging from each of the four projects.

To ensure that the processes and outcomes of the evaluation are tailored to the needs of the project partners, we have taken a participatory approach to evaluation in which project partners are named investigators on the CSP research team and are engaged regularly through quarterly project meetings, regular email communications, and individual project meetings. Project partners were involved in developing and refining the overall approach to evaluation and provided input on the development of data collection instruments to ensure that the data gathered are relevant and useful. They will also be involved in reviewing and interpreting data and in developing recommendations based on the findings from the evaluation. Given the focus on demonstrating the value of citizen science and building capacity in the use of these approaches in policy and practice contexts, this participatory approach is crucial to ensuring that evaluation findings are relevant, appropriate, and likely to be used (Guijt 2014).

Several key principles underpin our approach within this project, and these are outlined in Figure 2.

Figure 2. Principles underlying our approach to evaluation.

The evaluation protocol has been approved by the University of Sydney Human Research Ethics Committee (Ref: 2020/647).

Design

We will use a multiple embedded case study design, in which cases are examined at different levels of analysis. An embedded case study design is appropriate for generating an in-depth understanding of complex phenomena involving the perspectives of multiple stakeholders (Yin 2008). Within our study, each of the four citizen science projects will be a separate case, with each of the stakeholder groups representing a secondary unit of analysis (see Figure 3). This will enable us to undertake in-depth evaluations of each of the citizen science projects as well as to compare trends across the cases, examining the interplay between contextual factors and project-specific factors in shaping the processes and outcomes of the projects.

Figure 3. Overview of the embedded multiple-case study design (adapted from Yin 2008).

Data collection

To enable a comprehensive evaluation of the citizen science initiatives from the perspectives of the different actors involved, four groups of participants will be recruited across the four projects: project partners, project implementers, citizen scientists, and other stakeholders (see Table 1 for a description of each of these groups and how they will be recruited). Consistent with our aim of understanding how citizen science projects operate within policy and practice contexts, the primary target of evaluation activities is the policy and practice stakeholders leading the citizen science projects, who will be engaged in formal and informal evaluation activities across the duration of the project.

Table 1

Overview of participants and recruitment.


PARTICIPANTS | DESCRIPTION | RECRUITMENT

Project partners | Policy and practice stakeholders who are leading the citizen science projects. | Project partners are co-investigators on the project and have agreed to be involved in the evaluation.

Project implementers | People who are assisting with or responsible for implementing the citizen science projects, including council staff and university-based researchers. | Project implementers will be contacted via project partners and invited to be involved in the evaluation.

Citizen scientists | People who have taken part in one of the citizen science projects. | Citizen scientists will be contacted via project partners and implementers and invited to be involved in the evaluation.

Other stakeholders | Relevant stakeholders from within the partner organisations leading the citizen science projects and other organisations who have engaged with or are likely to be influenced by the results of the citizen science projects. | Other stakeholders will be identified through discussions with project partners and implementers and invited to take part in the evaluation.

We will use a mixed-methods approach with a convergent parallel design (Creswell and Plano-Clark 2011), with methods including surveys and interviews, observations of project meetings, reflective journaling, and document review. Table 2 provides an overview of the data collection methods used to address each of the evaluation questions, and these are outlined in more detail below.

Table 2

Overview of evaluation domains, questions, participants, and data collection methods.


DOMAIN | EVALUATION QUESTIONS | PARTICIPANTS | DATA COLLECTION METHODS

Perspectives | How are citizen science approaches perceived and valued by different actors (including project partners, implementers, other stakeholders within the partner organisations, and citizen scientists)? | Project partners; Implementers; Citizen scientists; Other stakeholders | Interviews; Meeting observations; Reflective journals; Surveys

Processes | How are the citizen science projects implemented in practice? How does design and implementation of the projects align with project goals? What are the barriers and facilitators to implementation? | Project partners; Implementers | Interviews; Document review; Meeting observations; Reflective journals

Impacts | What are the perceived impacts of the citizen science projects (for example on knowledge gained, policy and practice, and citizen scientists)? | Project partners; Implementers; Citizen scientists; Other stakeholders | Interviews; Document review; Meeting observations

Context | What contextual factors influence the design, implementation and impacts of citizen science projects in public health? Under what circumstances are citizen science projects feasible (or not)? | Project partners; Implementers; Citizen scientists | Interviews; Meeting observations; Reflective journals

Project partners and implementers

Semi-structured interviews will be the primary form of data collection within this project, and will enable us to explore the processes, perceptions, impacts, and contextual factors influencing each of the citizen science projects from a range of perspectives. Interviews with project partners and project implementers will be conducted at two timepoints over the course of each of the four projects, with the first interviews taking place during the early stages of project development and implementation and follow-up interviews within 6 to 12 months of project completion. The early interviews will focus on perceptions of citizen science approaches and what they can offer, the goals and expected impacts of each project, and early experiences in planning and implementing projects. Follow-up interviews will explore the implementation and impacts of each citizen science project, including barriers and facilitators to implementation, and contextual factors influencing how projects played out in practice.

In addition to interviews, we will gather data from observations of meetings with project partners and implementers to help us to document how decisions are made, what issues arise, and how the citizen science projects are conceptualised as they progress through development and implementation. Members of the evaluation team will host quarterly meetings with all project partners, and where possible will attend regular meetings with each of the project teams and will gather reflective notes on key insights and issues arising. Where appropriate, members of the evaluation team will prompt project partners and implementers to reflect on issues.

Through ongoing discussions with project partners and implementers we will also identify relevant documentation related to project design, implementation, and expected and actual impacts. These documents will include project plans, interim reports, and final reports. We will also complete a review of key documents from across the CSP project, including notes from Community of Practice sessions to capture any key reflections or insights related to the research questions.

Finally, project partners will be asked to complete a reflective journal as the projects progress to document their experiences in the process of engaging with citizen science, including emerging contextual factors, important decisions, and challenges faced. We will prompt partners to record these notes following key meetings as well as encouraging them to capture notes in an ongoing manner.

Citizen scientists

Upon completion of each project, we will invite the citizen scientists involved to take part in an online survey and a follow-up interview to reflect on their motivations for and experiences of participating in the respective citizen science projects. The online survey will allow us to gather quantitative data that can be compared across the four projects to identify similarities and differences in motivations and experiences, and will enable us to purposively select participants to invite to interviews. Through follow-up interviews we will further explore citizen scientists’ interests in relation to the project, their experiences of taking part, the perceived impacts of the project, and the likelihood of engaging in other citizen science projects in the future. Involvement in the evaluation is independent of involvement in the individual projects, and citizen scientists are not obliged to be involved in the evaluation process. To acknowledge their contribution to the evaluation, citizen scientists will receive a $25 gift card upon completion of a follow-up interview.

Other policy and practice stakeholders

We will undertake semi-structured interviews with a range of other policy and practice stakeholders to explore their perceptions of citizen science approaches and track the impacts of the projects being evaluated, including whether and how the citizen science projects have influenced practices or decision making. Other stakeholders will include relevant people from within the four partner organisations and other agencies who have engaged with or are likely to be influenced by the results of the citizen science projects. These interviews will take place within 6–12 months after completion of each of the four citizen science projects to allow time for individual project findings to be reported and disseminated and for impacts to emerge.

Data analysis

In line with a developmental evaluation approach, data analysis will be iterative and ongoing, with key insights fed back to project partners on a regular basis, as they emerge, through project newsletters and regular meetings. We will also provide opportunities throughout the project for project partners to provide input into data analysis, interpretation and reporting to ensure that the outputs from this project are relevant and applicable to key stakeholders. NVivo qualitative data analysis software (QSR International Pty Ltd 2020) will be used to manage data within this project.

Following each round of interviews, we will conduct thematic analysis of the data to construct themes inductively rather than restricting data analysis to preconceived categories (Braun and Clarke 2006). In accordance with the recommendations of the consolidated criteria for reporting qualitative research (COREQ; Tong, Sainsbury, and Craig 2007), we will use a collaborative and iterative process of analysis to maximise rigour and ensure the credibility of our findings. This process will involve members of the evaluation team reading and reviewing the transcripts, and working collaboratively to develop the codebook, code data, and organise themes. Following the same process, we will undertake thematic analysis of project documents, meeting notes, and reflective journals in an ongoing manner to draw out key insights.

Quantitative data from the citizen scientist survey will be analysed using SPSS software (IBM Corp 2020). Descriptive statistics will be calculated for motivations and experiences of involvement in the citizen science projects. Citizen scientist survey data will be analysed in parallel with citizen scientist interview data, with interview data providing more in-depth insights into the experiences of citizen scientists in the projects.
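To illustrate the kind of descriptive summary intended for the survey data (the analysis itself will be conducted in SPSS, as noted above), the sketch below uses Python and pandas purely as an example; the input file and item names are hypothetical placeholders rather than instruments from the protocol.

```python
# Illustrative sketch only: the protocol specifies SPSS for the survey analysis.
# The input file and column names below are hypothetical placeholders.
import pandas as pd

# Load responses from the citizen scientist survey (hypothetical file name).
survey = pd.read_csv("citizen_scientist_survey.csv")

# Hypothetical Likert-scale items on motivations and experiences
# (e.g., 1 = strongly disagree, 5 = strongly agree).
items = [
    "motivation_contribute_to_community",
    "motivation_learn_new_skills",
    "experience_felt_supported",
    "experience_would_participate_again",
]

# Descriptive statistics per item, reported separately for each of the four projects
# so that motivations and experiences can be compared across projects.
by_project = survey.groupby("project")[items].agg(["count", "mean", "std"])
print(by_project.round(2))

# Distribution of responses for each item, pooled across projects.
for item in items:
    print(f"\n{item}:")
    print(survey[item].value_counts(normalize=True).sort_index().round(2))
```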

A variety of methods exist for integrating data in mixed-methods studies including merging, connecting, and embedding data (Bazeley 2018), and in this study, we will employ complementary integration of data from qualitative and quantitative sources with respect to our key research questions, by mapping the data against each of the four evaluation domains identified. Synthesis of data from the various data sources (interviews, survey, meeting notes, document review, and reflective journals) will be performed for each of the four case studies individually, with comparisons across the four projects to draw broader learnings about the process and impacts of stakeholder-led citizen science projects in public health. In doing so, we will also draw on the Kieslinger et al. (2017) framework to ensure we draw out the processes and impacts across the areas of scientific knowledge, citizen scientists, and socio-ecological impacts.
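The following is a minimal, hypothetical sketch of the complementary integration described above: findings from different data sources are tagged against the four evaluation domains and collated per case before cross-case comparison. The data structures and example entries are assumptions for illustration only, not actual study data or the team’s analysis tooling.

```python
# Minimal sketch of complementary integration: tagging findings from each data source
# against the four evaluation domains and collating them per case (project).
# All entries shown are hypothetical placeholders, not actual study data.
from collections import defaultdict

DOMAINS = {"perspectives", "processes", "impacts", "context"}

# Each finding records its case, data source, evaluation domain, and a short summary.
findings = [
    {"case": "Project A", "source": "interview", "domain": "processes",
     "summary": "Recruitment of citizen scientists took longer than planned."},
    {"case": "Project A", "source": "survey", "domain": "impacts",
     "summary": "Most respondents reported increased awareness of local health issues."},
    {"case": "Project B", "source": "reflective journal", "domain": "context",
     "summary": "Organisational restructuring delayed key project decisions."},
]

# Collate findings by case and domain so each case study can be synthesised
# individually before comparing patterns across the four projects.
matrix = defaultdict(lambda: defaultdict(list))
for f in findings:
    assert f["domain"] in DOMAINS, f"Unknown domain: {f['domain']}"
    matrix[f["case"]][f["domain"]].append((f["source"], f["summary"]))

for case, domains in matrix.items():
    print(case)
    for domain, entries in domains.items():
        print(f"  {domain}:")
        for source, summary in entries:
            print(f"    [{source}] {summary}")
```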

Member validation through presentation of emerging findings at meetings with project partners and implementers will provide opportunities for clarification and confirmation of interpretations, as well as helping to establish trust in the emerging analysis and providing opportunities for ongoing reflection.

Discussion

Within this paper, we present our planned approach to evaluating stakeholder-led citizen science in public health. There is a growing need for public health agencies to better understand the needs of the communities in which they operate, to be more responsive to those communities, and to involve them in planning programs and initiatives that will have an impact on them. Citizen science is one approach to enabling greater involvement of community members, but there is a need for more evidence on how these approaches work in practice and a need to build capacity amongst stakeholders in the application of these approaches (Marks et al. 2022). While there has been increasing interest in the application of citizen science approaches amongst these groups, to date there have been no systematic efforts to support and evaluate stakeholder-led citizen science approaches in public health. This project will provide in-depth, contextualised insights into how policy and practice stakeholders might incorporate citizen science approaches into their work and the value in doing so, as well as key issues to consider when embarking on the use of these approaches. Adoption of a developmental approach to evaluation will provide ongoing feedback to assist in decision making, facilitate capacity building within the project teams, and provide insights concerning barriers and facilitators to the use of these approaches and the contextual factors that influence their implementation and impacts.

The use of an evaluation approach in which project partners are involved in the design of the evaluation and the interpretation of data will ensure that findings are relevant to the needs of the partner agencies. Indeed, the developmental evaluation is a capacity-building exercise in itself (Harper and Dickson 2019), increasing the familiarity and confidence of partners to use citizen science approaches in their own settings. Most project partners involved in this work are trialling the use of citizen science for the first time, and so the opportunities for reflection and learning built into the evaluation process provide a forum for knowledge sharing and support across projects. Rather than simply providing an end-point evaluation, the developmental evaluation approach helps project teams to reflect, in an ongoing manner, on what is going well and what needs to be adapted, sensitising them to the factors that influence how a project plays out.

Given the paucity of comprehensive evaluations of citizen science projects using commonly established indicators, this project will draw on the citizen science evaluation framework by Kieslinger et al. (2017) to inform collection, analysis and reporting of data on processes and impacts across the scientific, citizen scientist, and socio-ecological domains. This will allow us to situate the findings of this project within the broader citizen science evaluation literature and explore similarities and differences in the processes and impacts of citizen science projects across disciplines.

Potential limitations

Findings from each project are necessarily context dependent as each is taking place within unique circumstances with a variety of factors that influence development, implementation, and impacts, limiting the generalisability of the findings from individual projects. However, the use of an approach that utilizes multiple embedded case studies, which seeks to elicit an in-depth understanding of the perspectives of multiple actors and the contextual factors that shape the processes and outcomes of the projects, will enable us to draw out insights that can inform the use of citizen science approaches across settings.

While this project will provide crucial insights into the application of citizen science approaches in policy and practice settings, it is important to note that the four partner organisations are somewhat similar in the sense that they are government health agencies (albeit a mix of local and state level, with differences in funding and governance structures). We will not capture the potential application of citizen science approaches by other key stakeholders, such as non-government organisations. Despite this, we anticipate that the insights will be useful across a range of settings, and through this evaluation and the broader CSP project, we hope to stimulate further application and evaluation of citizen science approaches in policy and practice settings in public health.

Within developmental evaluation, embedded evaluators work closely with project teams, and within our study, project partners are members of the broader research team and have input into the design of the evaluation. This participatory approach to evaluation may be seen to reduce objectivity and introduce potential bias into the research. However, developmental evaluation does not seek to answer the question “Does it work?,” but instead seeks to understand the complexity of how a project plays out in practice and to use the insights that emerge over the course of the project to enable ongoing adaptation. Within this context, our approach would be considered a strength rather than a limitation.

Conclusion

Within this paper we have outlined our planned approach to the evaluation of stakeholder-led citizen science projects. By adopting a developmental approach to evaluation, we ask “What is going on here?”, seeking a deeper understanding of how citizen science projects operate within policy and practice contexts, including the barriers and facilitators to their application, the circumstances under which they are most useful, and the impacts of these approaches from the perspective of different stakeholders. Through the adoption of a participatory approach to evaluation, and a focus on ongoing reflection, we aim to support stakeholders to utilise citizen science approaches within their work.

This detailed presentation of our evaluation protocol is intended to contribute to the growing literature on citizen science evaluation, offering practitioners and evaluators an example of an approach which focuses on fostering innovation and building capacity in the use of citizen science approaches. This approach is likely to be particularly suitable when supporting and evaluating the use of citizen science by stakeholders and agencies that are new to these approaches.

Ethics and Consent

This evaluation protocol has been approved by the University of Sydney Human Research Ethics Committee (Ref: 2020/647). All participants within the evaluation will provide written informed consent prior to participating.

Acknowledgements

We would like to acknowledge the support and input of a number of policy and practice stakeholders in planning and facilitating this program of work, including Dr Karen Turner, Emma Saleeba, Maya Rivis, and Professor Katina D’Onise.

Funding Information

This research was supported by the Australian Prevention Partnership Centre through the NHMRC partnership centre grant scheme (Grant ID: GNT9100003) with the Australian Government Department of Health, ACT Health, Cancer Council Australia, NSW Ministry of Health, Wellbeing SA, Tasmanian Department of Health, and VicHealth. It is administered by the Sax Institute.

Competing Interests

The authors have no competing interests to declare.

Author Contribution

SR and YL led the conception and design of the project, with critical input from BS, LM, PW, KW, AS, KP, SOR, KG, and KJ. SR led the drafting of the manuscript, with input from YL, LM, PW and BS. All authors contributed to critical revision of the article and have provided final approval of the version to be published.

References

  1. Arkesteijn, M, van Mierlo, B and Leeuwis, C. 2015. The need for reflexive evaluation approaches in development cooperation. Evaluation, 21(1): 99–115. DOI: https://doi.org/10.1177/1356389014564719 

  2. Australian Prevention Partnership Centre. 2021. Harnessing the power of citizen science for prevention. Available at: https://preventioncentre.org.au/research-projects/harnessing-the-power-of-citizen-science-for-prevention/ (Last accessed 21 December 2021). 

  3. Australian Public Service Commission. 2007. Changing behaviour: a public policy perspective. Canberra, ACT: Australian Public Service Commission. 

  4. Barrie, H, Soebarto, V, Lange, J, McCorry-Breen, F and Walker, L. 2019. Using citizen science to explore neighbourhood influences on ageing well: pilot project. Healthcare, 7(4). DOI: https://doi.org/10.3390/healthcare7040126 

  5. Bazeley, P. 2018. Integrating Analyses in Mixed Methods Research. London: Sage. DOI: https://doi.org/10.4135/9781526417190 

  6. Braun, V and Clarke, V. 2006. Using thematic analysis in psychology. Qualitative research in psychology, 3(2): 77–101. DOI: https://doi.org/10.1191/1478088706qp063oa 

  7. Cacari-Stone, L, Wallerstein, N, Garcia, AP and Minkler, M. 2014. The promise of community-based participatory research for health equity: A conceptual model for bridging evidence with policy. American Journal of Public Health, 104(9): 1615–1623. DOI: https://doi.org/10.2105/AJPH.2014.301961 

  8. Creswell, J and Plano-Clark, V. 2011. Designing and Conducting Mixed Methods Research. California: Sage Publications. 

  9. Department for Health and Ageing and Government of South Australia. 2017. SA Health Strategic Plan 2017 to 2020. Adelaide, SA: Government of South Australia. 

  10. Dickinson, JL, Zuckerberg, B and Bonter, DN. 2010. Citizen science as an ecological research tool: Challenges and benefits. Annual Review of Ecology, Evolution, and Systematics, 41(1): 149–172. DOI: https://doi.org/10.1146/annurev-ecolsys-102209-144636 

  11. Frieden, TR. 2014. Six components necessary for effective public health program implementation. American Journal of Public Health, 104(1): 17–22. DOI: https://doi.org/10.2105/AJPH.2013.301608 

  12. Gamble, JAA and The J.W. McConnell Family Foundation. 2008. A developmental evaluation primer. Available at: http://mcconnellfoundation.ca/report/a-developmental-evaluation-primer/ (Last accessed 23 Feb 2022). 

  13. Gudes, O, Yigitcanlar, T, Edwards, SJ and Pathak, V. 2015. Competitive smart cities through healthy decision-making. In Thomas, KD (ed.), Handbook of Research on Sustainable Development and Economics. IGI Global. DOI: https://doi.org/10.4018/978-1-4666-8433-1.ch003 

  14. Guijt, I. 2014. Participatory Approaches. Florence: UNICEF Office of Research. 

  15. Haklay, M, Dörler, D, Heigl, F, Manzoni, M, Hecker, S and Vohland, K. 2021. What Is Citizen Science? The Challenges of Definition. In Vohland, K, Land-Zandstra, A, Ceccaroni, L, Lemmens, R, Perelló, J, Ponti, M, Samson, R and Wagenknecht, K (eds.), The Science of Citizen Science. Cham, Switzerland: Springer International Publishing. DOI: https://doi.org/10.1007/978-3-030-58278-4_2 

  16. Harper, LM and Dickson, R. 2019. Using developmental evaluation principles to build capacity for knowledge mobilisation in health and social care. Evaluation, 25(3): 330–348. DOI: https://doi.org/10.1177/1356389019840058 

  17. Hecker, S, Bonney, R, Haklay, M, Hölker, F, Hofer, H, Goebel, C, Gold, M, Makuch, Z, Ponti, M and Richter, A. 2018. Innovation in citizen science: Perspectives on science-policy advances. Citizen Science: Theory and Practice, 3(1). DOI: https://doi.org/10.5334/cstp.114 

  18. Hecker, S, Wicke, N, Haklay, M and Bonn, A. 2019. How does policy conceptualise citizen science? A qualitative content analysis of international policy documents. Citizen Science: Theory and Practice, 4(1). DOI: https://doi.org/10.5334/cstp.230 

  19. IBM Corp. 2020. IBM SPSS Statistics for Windows, Version 27.0. Armonk, NY: IBM Corp. 

  20. Kieslinger, B, Schäfer, T, Heigl, F, Dörler, D, Richter, A and Bonn, A. 2017. The challenge of evaluation: An open framework for evaluating citizen science activities. SocArXiv. September 20. DOI: https://doi.org/10.31235/osf.io/enzc9 

  21. Klaassen, P, Verwoerd, L, Kupper, F and Regeer, B. 2020. Reflexive monitoring in action as a methodology for learning and enacting Responsible Research and Innovation. In Yaghmaei, E and Van De Poel, I (eds.), Assessment of Responsible Innovation: Methods and Practices. London: Routledge. DOI: https://doi.org/10.4324/9780429298998-15 

  22. Kullenberg, C and Kasperowski, D. 2016. What Is citizen science? A scientometric meta-analysis. PLOS ONE, 11(1): e0147152. DOI: https://doi.org/10.1371/journal.pone.0147152 

  23. Marks, L, Laird, Y, Trevena, H, Smith, BJ and Rowbotham, S. 2022. A Scoping Review of Citizen Science Approaches in Chronic Disease Prevention. Frontiers in Public Health, 10. DOI: https://doi.org/10.3389/fpubh.2022.743348 

  24. Mayne, J and Stern, E. 2013. Impact evaluation of natural resouce management research programs: a broader view. Canberra, Australia: Australian Centre for International Agricultural Research. 

  25. National Health and Medical Research Council. 2016. Statement on Consumer and Community Involvement in Health and Medical Research. Canberra: National Health and Medical Research Council. 

  26. Patton, MQ. 2010. Developmental evaluation: Applying complexity concepts to enhance innovation and use. Guilford Press. 

  27. QSR International Pty Ltd. 2020. NVivo. QSR International Pty Ltd. 

  28. Resnik, DB, Elliott, KC and Miller, AK. 2015. A framework for addressing ethical issues in citizen science. Environmental Science and Policy, 54: 475–481. DOI: https://doi.org/10.1016/j.envsci.2015.05.008 

  29. Schade, S, Pelacho, M, van Noordwijk, T, Vohland, K, Hecker, S and Manzoni, M. 2021. Citizen Science and Policy. In Vohland, K, Land-Zandstra, A, Ceccaroni, L, Lemmens, R, Perelló, J, Ponti, M, Samson, R, and Wagenknecht, K (eds.), The Science of Citizen Science. Cham, Switzerland: Springer International Publishing. DOI: https://doi.org/10.1007/978-3-030-58278-4_18 

  30. Silvertown, J, Cook, L, Cameron, R, Dodd, M, McConway, K, Worthington, J, Skelton, P, Anton, C, Bossdorf, O and Baur, B. 2011. Citizen science reveals unexpected continental-scale evolutionary change in a model organism. PLOS ONE, 6(4): e18927. DOI: https://doi.org/10.1371/journal.pone.0018927 

  31. Stanford Medicine. 2020. Our Voice: citizen science for health equity. Available at: http://med.stanford.edu/ourvoice.html (Last accessed 21 December 2021). 

  32. Sullivan, BL, Aycrigg, JL, Barry, JH, Bonney, RE, Bruns, N, Cooper, CB, Damoulas, T, Dhondt, AA, Dietterich, T, Farnsworth, A, Fink, D, Fitzpatrick, JW, Fredericks, T, Gerbracht, J, Gomes, C, Hochachka, WM, Iliff, MJ, Lagoze, C, La Sorte, FA, Merrifield, M, Morris, W, Phillips, TB, Reynolds, M, Rodewald, AD, Rosenberg, KV, Trautmann, NM, Wiggins, A, Winkler, DW, Wong, W-K, Wood, CL, Yu, J and Kelling, S. 2014. The eBird enterprise: An integrated approach to development and application of citizen science. Biological Conservation, 169: 31–40. DOI: https://doi.org/10.1016/j.biocon.2013.11.003 

  33. Tasmanian Government: Department of Health and Human Services. 2016. Healthy Tasmania Five Year Strategic Plan. Tasmanian Government: Department of Health and Human Services, Tasmania. 

  34. Theobald, EJ, Ettinger, AK, Burgess, HK, DeBey, LB, Schmidt, NR, Froehlich, HE, Wagner, C, HilleRisLambers, J, Tewksbury, J, Harsch, MA and Parrish, JK. 2015. Global change and local solutions: Tapping the unrealized potential of citizen science for biodiversity research. Biological Conservation, 181: 236–244. DOI: https://doi.org/10.1016/j.biocon.2014.10.021 

  35. Todd, A and Nutbeam, D. 2018. Involving consumers in health research: what do consumers say? Public Health Research and Practice, 28(2): e2821813. DOI: https://doi.org/10.17061/phrp2821813 

  36. Tong, A, Sainsbury, P and Craig, J. 2007. Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6): 349–357. DOI: https://doi.org/10.1093/intqhc/mzm042 

  37. VicHealth. 2019. Action Agenda for Health Promotion 2019–2023. Melbourne, VIC: Victorian Health Promotion Foundation. 

  38. Wellbeing SA and Government of South Australia. 2020. Wellbeing SA Strategic Plan 2020–2025. Government of South Australia. 

  39. World Health Organisation. 2017. Engagement and participation for health equity. Denmark: World Health Organization Regional Office for Europe. 

  40. World Health Organisation. 2019. A multilevel governance approach to preventing and managing noncommunicable diseases: the role of cities and urban settings. Denmark: WHO Regional Office for Europe. 

  41. Yin, RK. 2008. Case study research: Design and methods. Thousand Oaks, CA: Sage Publications.