This article addresses the question: What are the best practices for online citizen science projects to have a positive impact on project volunteers? It does so by evaluating the Power to the People (PTTP) project and examining the experiences of and impacts on the citizen scientists involved.
Citizen science, or the intentional involvement of the public in the scientific research process (Phillips et al. 2019), promises mutual benefit to professional researchers and volunteer contributors. To professional researchers, it promises the ability to collect or process diverse data at scale, manifesting materially as cost or time savings, which accelerate or improve the work (Cox et al. 2015). To citizen scientists, it promises the ability to contribute to meaningful research, to learn, and to form community with fellow volunteers (Land-Zandstra, Agnello and Gültekin 2021). Many citizen science projects grant access to data or methods that are otherwise difficult to obtain, ranging from photographs of penguins in Antarctica (Jones et al. 2018) to papyrus fragments from Ancient Greece (Williams et al. 2014). Moreover, these projects typically provide direct communication links with otherwise inaccessible scientific researchers. These co-benefits are generally framed as a win-win.
While the benefits of citizen science to professional researchers can be easily quantified (e.g., as the rate of data processed or the diversity of samples collected), the benefits to citizen scientists are more difficult to measure. Intangible benefits of learning, satisfaction, and connection are often implied by the researchers conducting citizen science efforts but are less frequently evaluated directly. Bonney (2016) notes that “very few efforts to determine or measure learning or other social outcomes from participation in citizen science have been undertaken” (p. 5). A survey-based study of citizen science practitioners found that 43% did not evaluate their citizen science projects (Phillips et al. 2018), and those who did evaluate their projects largely used one-off bespoke tools. Standard project evaluation tools for citizen science are generally lacking (Kieslinger et al. 2018); while open frameworks have been developed to tackle this issue (Phillips et al. 2014, 2018; Cox et al. 2015), none are yet accepted as a field-wide standard.
The impacts of citizen science that merit evaluation are well established. As noted by Phillips et al. (2018), the Framework for Evaluating Impacts of Informal Science Education Projects (Friedman et al. 2008) established five core impact areas of informal science education: understanding, interest, attitudes, skills, and behaviors. The National Research Council’s influential Learning Science in Informal Environments report (Bell et al. 2009) echoes these and adds reflection, communication, and self-identification as a person who can undertake science. These themes are reflected in the Center for Advancement of Informal Science Education assessment rubric (Bonney et al. 2009) and align with Archer’s definitions of science capital development along cultural, behavioral, and social dimensions (Archer et al. 2015). The citizen science community thus seems to agree about which volunteer impacts should be evaluated, but does not consistently collect empirical data on them.
The popularity and importance of citizen science are continuously increasing. The method can help tackle some of the most intimidating and pressing global challenges, ranging from achieving the Sustainable Development Goals (Fritz et al. 2019) to mapping the cosmos (Marshall, Lintott, and Fletcher 2015). It can bring powerful interdisciplinarity to align social and scientific research through interest- and place-based communities of practice (Crain, Cooper, and Dickinson 2014). As citizen science continues to grow in popularity, however, its utility to the research community must be counterbalanced with evidence of positive impact on contributors to adhere to the field’s core principles and ethics (European Citizen Science Association [ECSA] 2015). Evidence that these projects result in contributor learning and heightened awareness is “limited but growing” (Bonney et al. 2016, p. 2); this work aims to add to that evidence base.
This paper investigates the impacts of an online remote mapping citizen science project and explores best practices for mutual benefit to contributors and professional researchers. We evaluate PTTP, an online citizen science project that mapped homes in rural Uganda, Kenya, and Sierra Leone for electrical system planning (Leonard, Wheeler, and McCulloch 2022) and created a training dataset for computer vision (Leonard, Wheeler, and McCulloch 2022a, 2022b).1 Through analysis of beta feedback, discussion board posts, an evaluation survey, and project data, we study the community composition of PTTP contributors, their learnings, their motivations, and their experiences.
PTTP collected home annotations on high-resolution satellite imagery of rural off-grid areas of Kenya, Sierra Leone, and Uganda. It was hosted on the Zooniverse online citizen science platform. PTTP ran from 2nd March to 28th August 2020. Throughout the project, approximately 1,267 km² were mapped at an average rate of 7 km²/day by more than 6,000 citizen scientists. While PTTP aimed to map homes for electrical system design, these data could also be useful in health, planning, and governance applications.
Alongside its data collection aims, PTTP aimed to meet the following engagement objectives amongst the citizen science community, which could equally be considered as hypotheses to be tested through this evaluation:
The main satellite imagery annotation interface for rural home mapping in PTTP is shown in Figure 1. Alongside this interface, the platform had “About” and “Learn More” pages, a “Field Guide,” a “Tutorial,” and a “Statistics” page. It also had a forum called “Talk” where citizen scientists could discuss images that caused them difficulty, ask questions, and chat. Further details about the technical implementation of the project are provided in Leonard, Wheeler, and McCulloch (2022).
Citizen science annotation interface (left) alongside the context menu multiple choice questions for each home annotation (right).
To evaluate the impacts of PTTP on citizen scientists, four data sources were used: beta reviews, discussion board feedback, an evaluation survey, and annotation data collected throughout the project. Each required its own methods for collection and analysis.
PTTP was disseminated to a select number of citizen scientists for beta review (i.e., early engagement and platform testing) from 14th January to 3rd February 2020, prior to full launch. Reviewers were encouraged to leave their feedback via (1) “Talk”, and/or (2) a stand-alone feedback form that solicited feedback about user-friendliness as well as project clarity and suitability to the Zooniverse platform. The full set of beta feedback form questions is included in Supplemental File 1. These data were aggregated and analyzed to collate valuable information about first impressions, project functionality, and user-friendliness issues. Suggestions were implemented where appropriate prior to launch.
After project launch (i.e., 2nd March to 28th August 2020), all contributors had access to the “Talk” forum. The following discussion boards were available:
These boards were monitored by the research team, who endeavored to assist citizen scientists and answer their questions. Board posts were exported following project termination for analysis.
An online survey was administered using Jisc Online Surveys to evaluate project impact. It was open for responses from 14th August to 14th September 2020 (i.e., approximately two weeks prior to and two weeks after project completion). The survey followed the methods of Depper (2019), which largely align with the rubric of Bonney et al. (2009), the frameworks of Friedman et al. (2008) and the National Research Council (2009), and the ECSA core principles (2015). It aimed to investigate who contributed to PTTP, their experience while contributing, why they chose to volunteer, what impacts the project had on them, and any benefits or challenges they encountered in PTTP. A full list of survey questions is included in Supplemental File 2. The survey was advertised via an email to the project mailing list, a banner notice on the project homepage, and a post on “Talk.” This survey received ethical approval from the University of Oxford Medical Sciences Interdivisional Research Ethics Committee (R70873/RE001).
The home annotations collected during the project were also analyzed to understand if and how the experience of citizen scientists manifested in data quality issues. The annotations were analyzed for accuracy based on precision, recall, and F1 score (i.e., the harmonic mean of precision and recall), and visualized for intuitive understanding. More details on the technical specifications of the data and a full analysis are available in Leonard, Wheeler, and McCulloch (2022).
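For reference, these metrics follow their standard definitions in terms of true positive (TP), false positive (FP), and false negative (FN) annotation counts:

$$
\text{precision} = \frac{TP}{TP + FP}, \qquad \text{recall} = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2 \cdot \text{precision} \cdot \text{recall}}{\text{precision} + \text{recall}}
$$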
Analysis results are discussed individually for each data type below. Cross-cutting insights from all results are subsequently described in the Discussion section.
During the project beta review, 52 citizen scientists completed the feedback form, nine citizen scientists left 23 notes on discussion boards, and more than 1,300 image classifications were generated. Note that not all respondents completed all feedback form questions.
Over half (57%) of citizen scientists who completed the beta feedback form indicated that the home annotation task was moderately or very easy on a four-point scale from very easy to very hard. Those who found it moderately or very hard largely experienced issues related to image quality. Namely, they indicated that the images were too small, or that their brightness, contrast, or resolution were too low. To address this, images were up-sampled by 200% (i.e., doubled in size), and image histograms were adjusted in pre-processing for a natural color appearance prior to launch.
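As an illustration of this fix, the minimal sketch below applies a 200% up-sampling and a histogram stretch to a satellite image tile using the Pillow library. The bicubic resampling filter and the 1% autocontrast cutoff are illustrative assumptions, as the exact resampling and histogram methods used by the project are not specified here.

```python
# Minimal sketch of the pre-launch image fixes: 200% up-sampling plus a
# histogram stretch for a more natural color appearance. The resampling
# filter and cutoff value are assumptions for illustration only.
from PIL import Image, ImageOps

def preprocess_tile(in_path: str, out_path: str) -> None:
    img = Image.open(in_path)
    # Double the image dimensions (i.e., up-sample by 200%).
    img = img.resize((img.width * 2, img.height * 2), Image.BICUBIC)
    # Stretch the histogram, clipping the brightest and darkest 1% of
    # pixels, to improve brightness and contrast.
    img = ImageOps.autocontrast(img, cutoff=1)
    img.save(out_path)

# Hypothetical filenames; applied to each image tile prior to launch.
preprocess_tile("tile.png", "tile_preprocessed.png")
```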
Beta reviewers seemed to understand the project. They were asked to describe the project goals in an open-form text prompt, and their responses were coded into four categories by the research team: understood, somewhat understood, did not answer question, and did not understand. 80% of responses indicated full understanding, while only 4% of responses explicitly misunderstood project goals. Reviewers were also asked whether the help text available on the project interface was adequate to complete the required annotation task, and 90% indicated that it was adequate (i.e., they responded “Yes” from closed-form options “Yes,” “Somewhat,” and “No”). It is important to note that 61% of respondents reported that they did not access any of the informational pages for the project (see Figure 2), suggesting that most contributors were able to understand the project simply by viewing the project homepage and annotation interface.
Beta form responses to the question “Did you find the additional information on other pages useful?” (n = 49). Note that most beta testers did not read the information pages beyond what was available on the homepage and the annotation interface.
Beta reviewers also requested some changes to project design. For instance, they requested additional information on how the data would be used and clarification and examples about how to label confusing cases, both of which were added prior to launch. Reviewers also requested changes to the interface, including the ability to choose what country to work on, to save question selections and apply them repeatedly, to orient the annotation boxes, to see multiple images at different times of day, to sub-tile images, to rotate images by increments, and to draw points and circles instead of boxes. Some of these changes (i.e., country choice and box orientation) were implemented, but many were impossible due to the data requirements and Zooniverse infrastructure available.
Generally, the beta review indicated a good first impression of the project. 92% of beta form respondents indicated that this project was suitable for the Zooniverse (from closed-form options “Yes” and “No”), and 54% said that they would participate once the project launched (from closed-form options “Yes and I’ll bring friends!,” “Yes,” “Not sure,” and “No”). Nevertheless, some of the most interesting feedback came from the minority who presented critiques. For instance, some were concerned that the collected data could be used nefariously or worried about privacy. Others indicated a disbelief in the project premise (i.e., that increased electricity access reduces poverty). Based on these critiques, explanations were added to help assuage privacy fears and to clarify the research premise.
Discussion board posts left by beta reviewers largely indicated the same issues identified in the beta form regarding image size and resolution. Some also asked more specific questions about how to use the interface or about particular images; these types of questions became routine after project launch.
The “Talk” forum received 914 posts over the course of PTTP, 168 of which were made by researchers or moderators and 746 of which were made by citizen scientists. The forum boards experienced very different levels of engagement, as shown in Figure 3. “Notes” was by far the most popular board, probably due to the design of the Zooniverse platform, which provides a button in the annotation interface linking directly to “Notes” so that citizen scientists can comment on any image they find interesting or difficult.
Number of posts per Power to the People “Talk” discussion board throughout the project, with moderator and researcher posts excluded.
There was notably low engagement with the PTTP “Introductions and Chat” board (2 posts). While this could be interpreted as volunteers not experiencing a sense of community through the project, the contents of the other boards suggest that this interpretation is incorrect. Within the “Notes” board, for instance, volunteers frequently helped one another interpret images and discussed image features without any prompting from researchers or project moderators. An example of this is shown in Figure 4. Such spontaneous interaction indicates a sense of community, even if the purpose-built board meant to spark community interaction was underused.
An example of a Notes board post where citizen scientists assisted each other with images and discussed features. Usernames have been removed.
Beyond interacting with each other, citizen scientists also asked many questions and made suggestions for the research team on “Talk.” These were quite insightful and covered topics ranging from the design of the platform, to data post-processing, to the eventual use of these data. Several illustrative examples are included below:
“Have you considered adapting this project to have a “made for mobile” work stream that focuses on just Yes or No to if houses are visible? Then the desktop users can only be shown the images that are known to have houses and focus on the house mapping?”
“Anyway, how are negatives processed? How many people have to agree there is nothing there before an image is retired?”
“Have you described the next steps for the data and ongoing engagement somewhere? It certainly appears that this data might really help with more than just electrifying projects.”
“What kind of power grids will be provided to the homes? I would love it if these homes didn’t have to later convert to clean energy after using fossil fuel energy!”
The evaluation survey received 142 responses, or approximately a 2% completion rate from all project contributors. As participation was voluntary, respondents may not be fully representative of all citizen scientists contributing to PTTP. We anticipate that these respondents may represent a more engaged subset; however, they still provide a useful indication of project impacts.
Results concerning the composition of the citizen science community, their experiences, their motivations, their learnings, and their platform interaction are discussed in separate subsections below.
Evaluation survey respondents represented six continents (Asia, Europe, North America, South America, Africa, and Australia) and 25 countries. We suspect that even more countries were represented amongst PTTP contributors overall, and that the number of countries captured in the evaluation was limited by the sample size.
Respondents were more highly concentrated in some countries than others. Specifically, there was high representation from the United Kingdom and the United States of America (44% and 27%, respectively). This was not unexpected, as the Zooniverse platform has a high pre-existing contributor base in these countries and was founded in the United Kingdom.
Interestingly, there were more women than men represented among respondents. 59% of respondents identified as women, 36% identified as men, and the remaining 5% either identified as non-binary or preferred to self-describe or not to answer, as shown in Figure 5.
Genders of Power to the People evaluation survey respondents (n = 142). Note the higher proportion of women than men.
This community also represented diverse educational and employment backgrounds. Over half (54%) had no background in science or engineering. Their employment statuses varied: 31% were students, 30% were employed full-time, and 19% were retired. The remainder were either employed part-time, in some other employment status (e.g., on disability), or preferred not to answer. Many (56%) had achieved some higher education degree (i.e., Bachelor’s, Master’s, or PhD); the remainder spanned the educational spectrum, from currently in school (11%) to having completed high school (20%) or vocational training (6%). Respondents also varied significantly in age, as shown in Figure 6.
Ages of Power to the People evaluation survey respondents (n = 142).
Evaluation survey respondents overwhelmingly reported a positive experience contributing to PTTP. 87% qualified their experience as either “Good” or “Excellent” on a closed-form five-point scale, as shown in Figure 7. Respondents were asked to explain any parts of PTTP they found particularly enjoyable in an open-form text question, and responses were reviewed and coded based on the themes they discussed. Five categories of volunteer enjoyment emerged from this exercise:
Evaluation responses for contributor experience on Power to the People (n = 142). Note the overwhelming positivity, with 87% choosing either “Good” or “Excellent.”
When asked for any general comments or thoughts about the project, several respondents indicated that it was a great use of time during COVID-19 lockdowns, and that it was a great way to volunteer and bring meaning to life:
“It has been really great to contribute to this project particularly over the corona-virus lockdown period.”
“Has been a useful distraction during the lockdown imposed by Covid, and has been good to know that I have helped in some small way.”
“Power To The People has been an easy platform for desktop volunteering for the construction industry teams I work with. Covid has prevented our other community investment activities within the UK in local to our sites, but now that 2 community development projects in Uganda have been cancelled (UK personnel being unable to travel to Africa) it’s meant we can still make a contribution while waiting to reorganize the projects.”
Contributors engaged with PTTP with varying frequency based on interest and lifestyle factors. 28% of respondents engaged weekly or more frequently, while 47% engaged 2 to 3 times a month or less, and 17% engaged only once. When asked whether there were things that kept them from spending as much time as they would like on PTTP, 45% of respondents indicated that indeed there were, citing reasons such as work, other commitments, medical or disability issues, forgetfulness, or increasing disinterest over time in a follow-up open-form text prompt. While 32% of respondents reported that they had shared PTTP with others, 91% of respondents had heard about PTTP themselves via the Zooniverse website. This may indicate that word of mouth is not an effective way to promote this type of citizen science project. That said, it could equally indicate that those who completed the evaluation survey were more likely to be keen self-starters who would seek out a project to complete themselves on the Zooniverse website. More study is required on this point.
Citizen scientists were asked to identify all reasons why they chose to engage with PTTP from a closed-form list of 10 predetermined options, including an “Other” option which provided an open-form text input if selected. The predetermined options were defined by the research team based on their observations throughout the project. The most popular reasons reported were a desire to contribute to projects with real world impact (91% reporting) and a desire to contribute to scientific research (79% reporting), as shown in Figure 8.
Motivations reported by the evaluation survey respondents who selected any of the predetermined options (n = 141), for women (n = 84) and for men (n = 50).
Motivations largely align across women and men, though slightly more men than women reported the motivations “I want to contribute to scientific research” and “I am generally interested in science and/or engineering”. This seems to reflect broader gendered societal trends in science interest.
The importance of real-world impact to PTTP volunteers is also illustrated in their choice of other Zooniverse projects. Amongst respondents who had contributed to Zooniverse projects prior to PTTP, the most frequently mentioned previous projects were Bash the Bug, American WW1 Burial Cards, and Every Name Counts. American WW1 Burial Cards and Every Name Counts are human-focused historical projects (i.e., transcribing burial cards of First World War soldiers and records of Holocaust victims, respectively); this interest in human-focused projects may have led respondents to choose our project as well.
Nearly two-thirds (66%) of evaluation respondents reported that they learned something through volunteering on PTTP. These respondents were asked to describe what they had learned in an open-form text prompt, and their responses were reviewed and coded based on their contents. Common themes included learning about rural electrification (40%), home styles and settlement patterns in rural sub-Saharan Africa (38%), the geography of Kenya, Sierra Leone, and Uganda (31%), and satellite imagery analysis (21%). To illustrate, when asked what they learned, responses included the following:
“How to ‘read’ satellite imagery; something about conditions on the ground in rural Kenya”
“I learnt more about the layout and construction of housing in the Uganda [sic].”
“Housing patterns in rural S. L. [sic]”
“The communities in Africa are very numerous and they seem to have great agricultural skills.”
“I learned more about the landscape of Africa buy [sic] looking at the photos and seeing trees and buildings. It gave me a better understanding of the landscape.”
“I learned that even in today’s world there are still many people living without things that we consider basic necessities like electricity.”
“That access to electricity is not something to be taken for granted”
“I learned that access to reliable and sustainable electricity was part of the UN SDGs and how important it is for livelihoods in Africa to have it.”
“It hit home what we take for granted. And, it wasn’t as easy as I thought to locate the homes.”
“That there are SO many rural houses that may not have any power, it’s sad that the governments in these countries aren’t doing more.”
When asked for more details, those who said they did not learn anything in the project indicated that the citizen science task was routine or unengaging, that they did not spend much time on the project, that they were confused and needed more help, that they were already familiar with the topic, or that they were simply not looking to learn anything.
Participating in this project also prompted 18% of respondents to do their own investigations. Most of these were via internet search, and the most popular research topics were the studied countries of Kenya, Sierra Leone, and Uganda.
During their time on PTTP, 94% of respondents visited the “Tutorial” page, 73% visited the “Field Guide,” 68% visited the “About” or “Learn More” pages, and 36% visited the “Statistics” page. It is unsurprising that most visited the “Tutorial” page, as this automatically opened the first time they entered the classification interface. The “Field Guide” may be the next most popular because there is an icon to access it displayed in the classification interface. It was heartening to see that many contributors accessed the “About” or “Learn More” pages, as it was hoped that contributors would explore these pages to gain more context on the project. It is possible that the “Statistics” page was the least visited because this icon was much smaller and harder to find than the others. The Zooniverse team could consider enlarging this icon to ensure contributors know how to access statistics.
Only 22% of respondents reported using the project “Talk” discussion board. As “Talk” is the main communication link between the researchers and citizen scientists during project execution, it is an important finding that only a minority of participants use it. It is possible that “Talk” may be more appealing to certain personality types or motivations than others, as will be further addressed in the Discussion section.
Throughout PTTP, 578,010 georeferenced home annotations were made by citizen scientists on satellite imagery. Compared with a researcher-generated annotation set, the PTTP annotations achieved a precision of 49%, recall of 93%, and F1 score of 64%. The high recall may indicate that citizen scientists found most homes in the images, but is also influenced by the fact that the annotations frequently overlap, as up to 10 citizen scientists were tasked to annotate each image for a higher confidence final dataset. What is more important here is the low precision, which indicates a substantial number of annotations made where no home was present, showing that the citizen scientists struggled, at least in certain circumstances. Visualizing a sample of the annotation data illuminated the issues illustrated in Figure 9, and listed below.
Issues faced by citizen scientists in annotating homes on Power to the People. Individual home annotations are visualized as yellow boxes.
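To make these metrics concrete, the sketch below shows one common way that box annotations could be scored against a reference set using intersection-over-union (IoU) matching, and confirms that the reported precision and recall reproduce the reported F1 score. The greedy matching and the 0.5 IoU threshold are illustrative assumptions; the project’s exact comparison procedure is described in Leonard, Wheeler, and McCulloch (2022).

```python
# Minimal sketch of scoring citizen science box annotations against a
# researcher-generated reference set. The greedy matching and 0.5 IoU
# threshold are assumptions for illustration, not the project's method.
from shapely.geometry import box

def iou(a, b):
    """Intersection-over-union of two shapely geometries."""
    return a.intersection(b).area / a.union(b).area

def precision_recall_f1(predicted, reference, threshold=0.5):
    unmatched = list(reference)
    true_positives = 0
    for p in predicted:
        # Greedily match each predicted box to the first unmatched
        # reference box it sufficiently overlaps.
        match = next((r for r in unmatched if iou(p, r) >= threshold), None)
        if match is not None:
            true_positives += 1
            unmatched.remove(match)
    precision = true_positives / len(predicted)
    recall = true_positives / len(reference)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Toy example: one of two predicted boxes overlaps the single reference box.
predicted = [box(0, 0, 2, 2), box(10, 10, 12, 12)]
reference = [box(0.2, 0.2, 2.2, 2.2)]
print(precision_recall_f1(predicted, reference))  # (0.5, 1.0, ~0.67)

# Sanity check: the reported precision (49%) and recall (93%) reproduce
# the reported F1 score of approximately 64%.
p, r = 0.49, 0.93
print(round(2 * p * r / (p + r), 2))  # 0.64
```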
PTTP succeeded in attracting a diverse global volunteer base from a wide range of educational attainments and employment statuses. This may speak to online citizen science being accessible across socioeconomic lines and class structures; though, of course, those with lower incomes and more precarious living situations may be more time constrained and have less time to invest in such efforts. The diversity of employment statuses may reflect a divergence from the high socioeconomic bracket that is expected to be overrepresented amongst citizen science participants (Garibay Group 2015). However, a larger proportion of contributors had completed higher education than would be expected based on the general population, indicating that there is still work to be done to expand the reach of citizen science. The higher representation of those with more education aligns with previous observations that those with higher pre-existing science capital are more likely to engage with citizen science (Garibay Group 2015; Edwards et al. 2018). This might also indicate higher pre-existing interest in research amongst the university-educated public, though the ability to attend university is itself inextricably linked to socioeconomic status.
It was interesting to see that contributor demographics skewed more towards women than men. This is at odds with Zooniverse’s expectations for their projects, where more men tend to contribute to astronomy and physics projects, and more women to ecological projects (Miller 2020). While gender representation in citizen science varies depending on the specific project and field (Paleco et al. 2021), a high representation of women and gender-divergent people is notable in a project whose aims are most closely linked to engineering. The project’s placement at the intersection of engineering, geography, and sustainable development may have helped it attract genders typically underrepresented in engineering.
While many contributors learned something through the project, there is still room for improvement: 34% of PTTP citizen scientists did not learn anything while contributing, and only 18% sought additional information on their own. This suggests that absorbing information presented within a project demands a different level of effort than being inspired to learn more independently.
Despite the low proportion of respondents proceeding to independent inquiry, this example of citizen scientists taking action based directly on their experience is exciting. While there is a strong sentiment that citizen science can encourage action on social issues, there is limited evidence to support this, as discussed in the literature with regards to climate change (Groulx et al. 2017). This again appears to be a simple lack of empirical data collection: the independent learning noted here adds a piece of evidence in support of this notion, which can be strengthened through further study.
Different modalities and depths of engagement were shown by volunteers on PTTP. Throughout the project, it was clear that some contributors engaged deeply with the content through their insightful, vocal contributions on “Talk.” However, this vocal subset was not the majority: Only 22% of the contributor base engaged with the PTTP discussion forums. This subset made the most noise, and so could easily be presumed to represent the interests of the entire contributor base, putting the research team at risk of building the project to only serve those who engage via communication and community building. However, the 78% of volunteers who did not engage vocally may equally have been engaging deeply, simply through other engagement modalities, such as through independent reflection or action. This more nuanced understanding of engagement aligns with Phillips’ “Dimensions of Engagement Framework,” which divides engagement in citizen science into behavioral, learning, emotional, and social quadrants stemming from both extrinsic and intrinsic motivations (Phillips et al. 2019). Whilst those participating loudly in PTTP forums may find the most satisfaction through deep social engagement, others may instead act, feel, or learn through the project at various depths without vocalizing this. Indeed, PTTP implicitly aimed to provide engagement opportunity in each quadrant through a research activity (behavior), educational opportunities (learning), emphasis on real-world impact (emotional), and discussion forums (social). These quadrants also align with the aspects of PTTP that evaluation respondents indicated to be most enjoyable: their sense of discovery while annotating images and learning about other places by participating (learning), the potential for project impact (emotional), connecting to the research team (social), and the convenience of participating in their own home (behavior).
Reviewing the annotation data acquired through this project revealed several issues that were not discussed as frequently on “Talk” or in feedback forms. For instance, issues with rotating annotations were not frequently discussed, nor was any technical issue causing people to make spurious annotations. Routinely reviewing the data produced while a citizen science project is underway could illuminate such issues earlier in the project, while they can still be addressed.
This work has investigated the impacts of the citizen science project “Power to the People” (PTTP) on volunteer contributors. To accomplish this, we have analyzed beta feedback collected before project launch, discussion board posts made during the project, a project evaluation survey, and mapping data generated during the project, with the aim to understand whether PTTP engaged a diverse and global community in an accessible, enjoyable, and educational experience.
It was found that PTTP did indeed attract a diverse community of volunteers spanning at least six continents and 25 countries. Amongst evaluation survey respondents, 59% identified as women and 36% as men, with the remaining 5% either outside the gender binary or preferring not to self-identify. In terms of employment, 31% were students, 30% were employed full-time, and 19% were retired, with the remainder either employed part-time, in some other employment status (e.g., on disability), or preferring not to disclose. While 54% had no background in science or engineering, 56% had achieved some higher education degree (i.e., Bachelor’s, Master’s, or PhD), indicating that despite the diversity across multiple indicators, a more highly educated population than average was being engaged by the project, as has been observed previously in the literature.
Many citizen scientists were found to engage deeply in the PTTP community. Contributor exchanges on discussion boards frequently indicated high understanding. However, not all contributors engaged at the same depth or through the same modality. Only 22% of evaluation survey respondents used the discussion boards, with others indicating a lack of time, interest, or nerve to contribute. However, those who did not engage vocally may still enjoy deep engagement through other modalities, including through solitary reflection, learning, or action. While PTTP incorporated elements catered to each of these dimensions, further study is required on the depth of engagement along each.
Generally, contributors reported a positive experience with PTTP. 87% of evaluation survey respondents qualified their experience as either “good” or “excellent.” Contributors particularly enjoyed the project’s potential for impact, their sense of discovery while annotating images, learning about other places by participating, connecting to the research team, and the convenience of participating in their own home.
Furthermore, 66% of respondents learned something through participating in PTTP. They reported learning about rural electrification (40%), home styles and settlement patterns in rural sub-Saharan Africa (38%), the geography of Kenya, Sierra Leone, and Uganda (31%), and satellite imagery analysis (21%). Participating in this project also prompted 18% to do their own investigations, typically via internet search, and most commonly about the studied countries of Kenya, Sierra Leone, and Uganda.
Image quality, size, and crowding were the most frequent technical issues discussed by contributors, with beta feedback often indicating that images were too small, or that their brightness, contrast, or resolution were too low. This was addressed with up-sampling and color correction prior to launch. The final resulting data additionally indicated difficulty with crowded images (i.e., images with many homes) even when up-sampled, alongside other less-discussed issues such as under-rotation of annotations.
Based on these findings, the following best practices are recommended for online citizen science projects in remote mapping. They are also applicable more broadly in citizen science projects using the Zooniverse interface and/or which focus on image annotation:
Many of these best practices may appear to be common knowledge or common sense, and indeed they are amongst many in the citizen science community. They confirm various notions from the theoretical literature on citizen science engagement. However, common sense and theory require empirical validation to be justified. This study adds rigor to many of the intuitive practices already employed amongst researchers in citizen science and serves as a guidepost for those newly adopting these methods. By implementing these best practices in future similar projects, and continuing to evaluate project impact, better experiences can be created for the dedicated citizen science community.
Survey details are available in the Supplemental Files. Power to the People project data are shared openly for research use in Leonard, Wheeler and McCulloch (2022a, 2022b).
The Supplementary files for this article can be found as follows:
Supplemental File 1: Beta feedback form. DOI: https://doi.org/10.5334/cstp.534.s1
Supplemental File 2: Evaluation survey. DOI: https://doi.org/10.5334/cstp.534.s2
1. For more information, please see the project landing page at the following link: https://www.zooniverse.org/projects/alycialeonard/power-to-the-people.
The evaluation survey detailed in this article received ethical approval from the University of Oxford Medical Sciences Interdivisional Research Ethics Committee in accordance with standard university practices (R70873/RE001).
This research would be impossible without the dedicated work of the Zooniverse citizen science community. Many thanks to the Zooniverse team for their support of this project and to volunteer project moderator Ravi Kohli.
The Power to the People Citizen Science Project was funded by a Citizen Science Exploration Grant (BB/T01833X/1) provided by the United Kingdom Research and Innovation (UKRI) Public Engagement office. Satellite imagery for the project was provided by Earth-i through the Satellite Applications Catapult.
The authors have no competing interests to declare.
Alycia Leonard conceived the study, ran the citizen science project, conducted the evaluation, analyzed the data, and drafted the manuscript.
Scot Wheeler assisted in conceiving the evaluation study and edited the manuscript.
Malcolm McCulloch guided the conception of the study and edited the manuscript.
Archer, L, Dawson, E, DeWitt, J, Seakins, A and Wong, B. 2015. “Science capital”: A conceptual, methodological, and empirical argument for extending Bourdieusian notions of capital beyond the arts. Journal of Research in Science Teaching, 52(7): 922–948. DOI: https://doi.org/10.1002/tea.21227
Bell, P, Lewenstein, B, Shouse, AW and Feder, MA. 2009. Learning science in informal environments: People, places, and pursuits (Vol. 140). Washington, DC: National Academies Press.
Bonney, R, Ballard, H, Jordan, R, McCallie, E, Phillips, T, Shirk, J and Wilderman, CC. 2009. Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education. A CAISE Inquiry Group Report. Online submission.
Bonney, R, Phillips, TB, Ballard, HL and Enck, JW. 2016. Can citizen science enhance public understanding of science? Public Understanding of Science, 25(1): 2–16. DOI: https://doi.org/10.1177/0963662515607406
Cox, J, Oh, EY, Simmons, B, Lintott, C, Masters, K, Greenhill, A, Graham, G and Holmes, K. 2015. Defining and measuring success in online citizen science: A case study of Zooniverse projects. Computing in Science & Engineering, 17(4): 28–41. DOI: https://doi.org/10.1109/MCSE.2015.65
Crain, R, Cooper, C and Dickinson, JL. 2014. Citizen science: a tool for integrating studies of human and natural systems. Annual Review of Environment and Resources, 39(1): 641–665. DOI: https://doi.org/10.1146/annurev-environ-030713-154609
Depper, A. 2019. Planet Hunters, Zooniverse evaluation report. Oxford.
Edwards, R, Kirn, S, Hillman, T, Kloetzer, L, Mathieson, K, McDonnell, D and Phillips, T. 2018. ‘Learning and developing science capital through citizen science’. In Hecker, S, Haklay, M, Bowser, A, Makuch, Z, Vogel, J and Bonn, A (eds.), Citizen Science: Innovation in Open Science, Society and Policy, 381–390. London: UCL Press. DOI: https://doi.org/10.2307/j.ctv550cf2.33
European Citizen Science Association (ECSA). 2015. ‘Ten principles of citizen science’. Berlin. DOI: https://doi.org/10.17605/OSF.IO/XPR2N
Friedman, A, Allen, S, Campbell, P, Dierking, L, Flagg, B, Garibay, C, Korn, R, Silverstein, G and Ucko, D. 2008. Framework for evaluating impacts of informal science education projects, report from a National Science Foundation workshop. Available at: http://insci.org/resources/Eval_Framework.pdf.
Fritz, S, See, L, Carlson, T, Haklay, MM, Oliver, JL, Fraisl, D, Mondardini, R, Brocklehurst, M, Shanley, LA, Schade, S and Wehn, U. 2019. Citizen science and the United Nations sustainable development goals. Nature Sustainability, 2(10): 922–930. DOI: https://doi.org/10.1038/s41893-019-0390-3
Garibay Group. 2015. Driven to discover: summative evaluation report, University of Minnesota Extension.
Groulx, M, Brisbois, MC, Lemieux, CJ, Winegardner, A and Fishback, L. 2017. A role for nature-based citizen science in promoting individual and collective climate change action? A systematic review of learning outcomes. Science Communication, 39(1): 45–76. DOI: https://doi.org/10.1177/1075547016688324
Jones, FM, Allen, C, Arteta, C, Arthur, J, Black, C, Emmerson, LM, Freeman, R, Hines, G, Lintott, CJ, Macháčková, Z and Miller, G. 2018. Time-lapse imagery and volunteer classifications from the Zooniverse Penguin Watch project. Scientific data, 5(1): 1–13. DOI: https://doi.org/10.1038/sdata.2018.124
Kieslinger, B, Schäfer, T, Heigl, F, Dörler, D, Richter, A and Bonn, A. 2018. ‘Evaluating citizen science-towards an open framework’. In Hecker, S, Haklay, M, Bowser, A, Makuch, Z, Vogel, J and Bonn, A (eds.), Citizen Science: Innovation in Open Science, Society and Policy, 81–96. London: UCL Press. DOI: https://doi.org/10.2307/j.ctv550cf2.13
Land-Zandstra, A, Agnello, G and Gültekin, YS. 2021. ‘Participants in Citizen Science’. In Vohland, K, et al. (eds.), The Science of Citizen Science, 243–260. Springer. DOI: https://doi.org/10.1007/978-3-030-58278-4_13
Leonard, A, Wheeler, S and McCulloch, M. 2022. Power to the people: Applying citizen science and computer vision to home mapping for rural energy access. International Journal of Applied Earth Observation and Geoinformation, 108: 102748. DOI: https://doi.org/10.1016/j.jag.2022.102748
Leonard, A, Wheeler, S and McCulloch, M. 2022a. Rural Home Annotation Dataset Mapped by Citizen Scientists in Satellite Imagery. Data in Brief, 42. DOI: https://doi.org/10.1016/j.dib.2022.108262
Leonard, A, Wheeler, S and McCulloch, M. 2022b. ‘Rural Home Annotation Dataset Mapped by Citizen Scientists in Satellite Imagery (Dataset)’. Mendeley Data. DOI: https://doi.org/10.1016/j.dib.2022.108262
Marshall, PJ, Lintott, CJ and Fletcher, LN. 2015. Ideas for citizen science in astronomy. Annual Review of Astronomy and Astrophysics, 53: 247–278. DOI: https://doi.org/10.1146/annurev-astro-081913-035959
National Research Council. 2009. Learning Science in Informal Environments: People, Places, and Pursuits. In Bell, P, et al. (eds.), Committee on Learning Science in Informal Environments. National Research Council. DOI: https://doi.org/10.1093/clinchem/31.10.1706
Paleco, C, García Peter, S, Salas Seoane, N, Kaufmann, J and Argyri, P. 2021. ‘Inclusiveness and Diversity in Citizen Science’. In Vohland, K, et al. (eds), The Science of Citizen Science, 261–282. Springer. DOI: https://doi.org/10.1007/978-3-030-58278-4
Phillips, T, Ferguson, M, Minarchek, M, Porticella, N and Bonney, R. 2014. Evaluating learning outcomes from citizen science. Ithaca: Cornell Lab of Ornithology.
Phillips, T, Porticella, N, Constas, M and Bonney, R. 2018. A framework for articulating and measuring individual learning outcomes from participation in citizen science. Citizen Science: Theory and Practice, 3(2). DOI: https://doi.org/10.5334/cstp.126
Phillips, TB, Ballard, HL, Lewenstein, BV and Bonney, R. 2019. Engagement in science through citizen science: Moving beyond data collection. Science Education, 103(3): 665–690. DOI: https://doi.org/10.1002/sce.21501
Williams, AC, Wallin, JF, Yu, H, Perale, M, Carroll, HD, Lamblin, AF, Fortson, L, Obbink, D, Lintott, CJ and Brusuelas, JH. 2014, October. A computational pipeline for crowdsourced transcriptions of Ancient Greek papyrus fragments. In 2014 IEEE International Conference on Big Data (Big Data) (pp. 100–105). IEEE. DOI: https://doi.org/10.1109/BigData.2014.7004460