Introduction

This article addresses the question: What are the best practices for online citizen science projects to have a positive impact on project volunteers? It does so by evaluating the Power to the People (PTTP) project and examining the experiences of and impacts on the citizen scientists involved.

Citizen science, or the intentional involvement of the public in the scientific research process (), promises mutual benefit to professional researchers and volunteer contributors. To professional researchers, it promises the ability to collect or process diverse data at scale, manifesting materially as cost or time savings, which accelerate or improve the work (). To citizen scientists, it promises the ability to contribute to meaningful research, to learn, and to form community with fellow volunteers (). Many citizen science projects grant access to data or methods that can be difficult to access, ranging from photographs of penguins in Antarctica () to papyrus fragments from Ancient Greece (). Moreover, these projects typically provide direct communication links with otherwise inaccessible scientific researchers. These co-benefits are generally framed as a win-win.

While the benefits of citizen science to professional researchers can be easily quantified (e.g., as the rate of data processed or the diversity of samples collected), the benefits to citizen scientists are more difficult to measure. Intangible benefits of learning, satisfaction, and connection are often implied by the researchers conducting citizen science efforts but are less frequently evaluated directly. Bonney () notes that “very few efforts to determine or measure learning or other social outcomes from participation in citizen science have been undertaken” (p. 5). A survey-based study of citizen science practitioners found that 43% did not evaluate their citizen science projects (), and those who did evaluate their projects largely used one-off bespoke tools. Standard project evaluation tools for citizen science are generally lacking (); while open frameworks have been developed to tackle this issue (, ; ), none are yet accepted as a field-wide standard.

The impacts of citizen science that merit evaluation are well established. As noted by Phillips (), the Framework for Evaluating Impacts of Informal Science Education Projects () established five core impact areas of informal science education—understanding, interest, attitudes, skills, and behaviors. The National Research Council’s influential Learning Science in Informal Environments report () echoes these and adds reflection, communication, and self-identification as a person who can undertake science. These themes are reflected in the Center for Advancement of Informal Science Education assessment rubric () and align with Archer’s definitions of science capital development along cultural, behavioral, and social dimensions (). The citizen science community thus seems to agree on which volunteer impacts should be evaluated, but does not consistently collect empirical data on them.

The popularity and importance of citizen science are continuously increasing. This method can help tackle some of the most intimidating and pressing global challenges, ranging from the achievement of the Sustainable Development Goals () to the mapping of the cosmos (). It can bring powerful interdisciplinarity, aligning social and scientific research through interest- and place-based communities of practice (). As citizen science continues to grow in popularity, however, its utility to the research community must be counterbalanced with evidence of positive impact on contributors to adhere to the field’s core principles and ethics (). Evidence that these projects result in contributor learning and heightened awareness is “limited but growing” (); this work aims to add to that growing evidence base.

This paper investigates the impacts of an online remote mapping citizen science project and explores best practices for mutual benefit to contributors and professional researchers. We evaluate PTTP, an online citizen science project that mapped homes in rural Uganda, Kenya, and Sierra Leone for electrical system planning () and created a training dataset for computer vision (, ). Through analysis of beta feedback, discussion board posts, an evaluation survey, and project data, we study the community composition of PTTP contributors, their learnings, their motivations, and their experiences.

About Power to the People

PTTP collected home annotations on high-resolution satellite imagery of rural off-grid areas of Kenya, Sierra Leone, and Uganda. It was hosted on the Zooniverse online citizen science platform. PTTP ran from 2nd March to 28th August 2020. Throughout the project, approximately 1,267 km² were mapped at an average rate of 7 km²/day by more than 6,000 citizen scientists. While PTTP aimed to map homes for electrical system design, these data could also be useful in health, planning, and governance applications.

Alongside its data collection aims, PTTP aimed to meet the following engagement objectives amongst the citizen science community, which can equally be considered hypotheses to be tested through this evaluation:

  • Engage a diverse and global community of citizen scientists. While most academic outreach tends to have a local focus, we wanted to provide access on a global scale. As this project mapped rural homes in sub-Saharan Africa, we hoped to engage citizens with knowledge of these housing styles to improve data quality. The timeline of this project coincided precisely (and unintentionally) with the start of the COVID-19 pandemic, providing a serendipitous opportunity to explore the potential of global online academic outreach at a time of high interest in virtual educational opportunities.
  • Provide an accessible, enjoyable, and engaging experience for citizen scientists. We wanted PTTP to be approachable and interesting, leaving a positive impression of the project and citizen science more broadly, thereby encouraging citizen scientists to engage with further projects in the future.
  • Integrate learning opportunities. We aimed to provide learning opportunities throughout PTTP about the research topic (i.e., the geography of sub-Saharan Africa and rural electrification) and methods (i.e., satellite imagery analysis and deep learning).

The main satellite imagery annotation interface for rural home mapping in PTTP is shown in Figure 1. Alongside this interface, the platform had “About” and “Learn More” pages, a “Field Guide,” a “Tutorial,” and a “Statistics” page. It also had a forum called “Talk” where citizen scientists could discuss images which caused them difficulty, ask questions, and chat. Further details about the technical implementation of the project are provided in ().

Figure 1 

Citizen science annotation interface (left) alongside the context menu multiple choice questions for each home annotation (right).

Methods and Materials

To evaluate the impacts of PTTP on citizen scientists, four data sources were used: beta reviews, discussion board feedback, an evaluation survey, and annotation data collected throughout the project. Each required its own methods for collection and analysis.

Beta review

PTTP was disseminated to a select number of citizen scientists for beta review (i.e., early engagement and platform testing) from 14th January to 3rd February 2020, prior to full launch. Reviewers were encouraged to leave their feedback via (1) “Talk” and/or (2) a stand-alone feedback form that solicited feedback about user-friendliness as well as project clarity and suitability to the Zooniverse platform. The full set of beta feedback form questions is included in Supplemental File 1. These data were aggregated and analyzed to collate valuable information about first impressions, project functionality, and user-friendliness issues. Suggestions were implemented where appropriate prior to launch.

Discussion board feedback

After project launch (i.e., 2nd March to 28th August 2020), all contributors had access to the “Talk” forum. The following discussion boards were available:

  • “Notes: General comment/question thread about individual subjects.”
  • “Announcements: Project announcements and updates.”
  • “Technical questions: A place to ask for help with the platform and report any issues or bugs.”
  • “What am I looking at?!: Want help figuring out what’s going on in an unclear image? Share your trickiest images here for feedback and discussion.”
  • “Introductions and chat: Tell us about yourself!! We love hearing from you. Use this space to chat with other contributors and learn a bit about our project community.”
  • “Research questions: Ask us about the science behind this project (including computer vision, electrical grid design, and more!)”

These boards were monitored by the research team, who endeavored to assist citizen scientists and answer their questions. Board posts were exported for analysis after the project closed.

Evaluation survey

An online survey was administered using Jisc Online Surveys to evaluate project impact. It was open for responses from 14th August to 14th September 2020 (i.e., approximately two weeks prior to and two weeks after project completion). The survey followed the methods of Depper (), which largely align with Bonney’s rubric (), Friedman () and the National Research Council (), and the ECSA core principles (). It aimed to investigate who contributed to PTTP, their experience while contributing, why they chose to volunteer, what impacts the project had on them, and any benefits or challenges they encountered in PTTP. A full list of survey questions is included in Supplemental File 2. The survey was advertised via an email to the project mailing list, a banner notice on the project homepage, and a post on “Talk.” This survey received ethical approval from the University of Oxford Medical Sciences Interdivisional Research Ethics Committee (R70873/RE001).

Project data

The home annotations collected during the project were also analyzed to understand if and how the experience of citizen scientists manifested in data quality issues. The annotations were analyzed for accuracy based on precision, recall, and F1 score (i.e., the harmonic mean of precision and recall), and visualized for intuitive understanding. More details on the technical specifications of data and a full analysis are available in ().
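For reference, writing TP, FP, and FN for the counts of true positive, false positive, and false negative annotations (where what counts as a true positive depends on how a volunteer annotation is matched against the reference set), these standard metrics are defined as:

\[ \text{precision} = \frac{TP}{TP + FP}, \qquad \text{recall} = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2 \cdot \text{precision} \cdot \text{recall}}{\text{precision} + \text{recall}}. \]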

Results

Analysis results are discussed individually for each data type below. Cross-cutting insights from all results are subsequently described in the Discussion section.

Beta review

During the project beta review, 52 citizen scientists completed the feedback form, nine citizen scientists left 23 notes on discussion boards, and more than 1,300 image classifications were generated. Note that not all respondents completed all feedback form questions.

Over half (57%) of citizen scientists who completed the beta feedback form indicated that the home annotation task was moderately or very easy on a four-point scale from very easy to very hard. Those who found it moderately or very hard largely experienced issues related to image quality. Namely, they indicated that the images were too small, or that their brightness, contrast, or resolution were too low. To address this, images were up-sampled by 200% (i.e., doubled in size), and image histograms were adjusted in pre-processing for a natural color appearance prior to launch.
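To make this concrete, the sketch below shows what such a pre-processing step might look like in Python using the Pillow imaging library. It is a minimal illustration under our own assumptions (bicubic resampling, a 1% percentile contrast stretch, and placeholder file names), not the actual PTTP pipeline.

from PIL import Image, ImageOps

def preprocess_tile(path_in, path_out):
    # Load the raw satellite image tile.
    img = Image.open(path_in).convert("RGB")
    # Up-sample by 200% (i.e., double both dimensions) using bicubic resampling.
    img = img.resize((img.width * 2, img.height * 2), Image.BICUBIC)
    # Stretch each channel's histogram, clipping the darkest and brightest 1%
    # of pixels, to raise brightness and contrast toward a natural appearance.
    img = ImageOps.autocontrast(img, cutoff=1)
    img.save(path_out)

preprocess_tile("tile_raw.png", "tile_enhanced.png")  # placeholder file names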

Beta reviewers seemed to understand the project. They were asked to describe the project goals in an open-form text prompt, and their responses were coded into four categories by the research team: understood, somewhat understood, did not answer question, and did not understand. 80% of responses indicated full understanding, while only 4% explicitly misunderstood the project goals. Reviewers were also asked whether the help text available on the project interface was adequate to complete the required annotation task, and 90% indicated that it was (i.e., they responded “Yes” from the closed-form options “Yes,” “Somewhat,” and “No”). It is important to note that 61% of respondents reported that they did not access any of the informational pages for the project (see Figure 2), which suggests that most contributors were able to understand the project simply by viewing the project homepage and annotation interface.

Figure 2 

Beta form responses to the question “Did you find the additional information on other pages useful?” (n = 49). Note that most beta testers did not read the information pages beyond what was available on the homepage and the annotation interface.

Beta reviewers also requested some changes to project design. For instance, they requested additional information on how the data would be used, as well as clarification and examples of how to label confusing cases, both of which were added prior to launch. Reviewers also requested changes to the interface, including the ability to choose which country to work on, to save question selections and apply them repeatedly, to orient the annotation boxes, to see multiple images taken at different times of day, to sub-tile images, to rotate images by increments, and to draw points and circles instead of boxes. Some of these changes (i.e., country choice and box orientation) were implemented, but many were not feasible given the project’s data requirements and the available Zooniverse infrastructure.

Generally, the beta review indicated a good first impression of the project. 92% of beta form respondents indicated that this project was suitable for the Zooniverse (from closed-form options “Yes” and “No”), and 54% said that they would participate once the project launched (from the closed-form options “Yes and I’ll bring friends!,” “Yes,” “Not sure,” and “No”). Nevertheless, some of the most interesting feedback came from the minority who presented critiques. For instance, some were concerned that the collected data could be used nefariously or worried about privacy. Others indicated disbelief in the project premise (i.e., that increased electricity access reduces poverty). Based on these critiques, explanations were added to help assuage privacy fears and clarify the research premise.

Discussion board posts left by beta reviewers largely indicated the same issues identified in the beta form regarding image size and resolution. Some also asked more specific questions about how to use the interface or about particular images; these types of questions became routine after project launch.

Discussion board feedback

The “Talk” forum received 914 posts over the course of PTTP, 168 of which were made by researchers or moderators and 746 of which were made by citizen scientists. The forum boards experienced very different levels of engagement, as shown in Figure 3. “Notes” was by far the most popular board, probably due to the design of the Zooniverse platform, which provides a button in the annotation interface that links directly to “Notes,” allowing citizen scientists to comment on any image they find interesting or difficult.

Figure 3 

Number of posts per Power to the People “Talk” discussion board throughout the project, with moderator and researcher posts excluded.

There was notably low engagement with the PTTP “Introductions and Chat” board (2 posts). While this could be interpreted as volunteers not experiencing a sense of community through the project, the contents of the other boards suggest that this interpretation is likely incorrect. Within the “Notes” board, for instance, volunteers frequently helped one another interpret images and discussed image features without any prompting from researchers or project moderators. An example of this is shown in Figure 4. Such spontaneous interaction indicates a sense of community, even if the purpose-built board intended to spark community interaction was underused.

Figure 4 

An example of a Notes board post where citizen scientists assisted each other with images and discussed features. Usernames have been removed.

Beyond interacting with each other, citizen scientists also asked many questions and made suggestions for the research team on “Talk.” These were quite insightful and covered topics ranging from the design of the platform, to data post-processing, to the eventual use of these data. Several illustrative examples are included below:

“Have you considered adapting this project to have a “made for mobile” work stream that focuses on just Yes or No to if houses are visible? Then the desktop users can only be shown the images that are known to have houses and focus on the house mapping?”

“Anyway, how are negatives processed? How many people have to agree there is nothing there before an image is retired?”

“Have you described the next steps for the data and ongoing engagement somewhere? It certainly appears that this data might really help with more than just electrifying projects.”

“What kind of power grids will be provided to the homes? I would love it if these homes didn’t have to later convert to clean energy after using fossil fuel energy!”

Evaluation survey

The evaluation survey received 142 responses, a response rate of approximately 2% across all project contributors. As participation was voluntary, respondents may not be fully representative of all citizen scientists contributing to PTTP. We anticipate that these respondents may represent a more engaged subset; however, they still provide a useful indication of project impacts.

Results concerning the composition of the citizen science community, their experiences, their motivations, their learnings, and their platform interaction are discussed in separate subsections below.

Community composition

Evaluation survey respondents represented six continents (Asia, Europe, North America, South America, Africa, and Australia) and 25 countries. We suspect that even more countries were represented amongst PTTP contributors, and that the number of countries captured in the evaluation was limited by the sample size.

Respondents were more highly concentrated in some countries than others. Specifically, there was high representation from the United Kingdom and the United States of America (44% and 27%, respectively). This was not unexpected, as the Zooniverse platform has a high pre-existing contributor base in these countries and was founded in the United Kingdom.

Interestingly, there were more women than men represented among respondents. 59% of respondents identified as women, 36% identified as men, and the remaining 5% either identified as non-binary or preferred to self-describe or not to answer, as shown in Figure 5.

Figure 5 

Genders of Power to the People evaluation survey respondents (n = 142). Note the higher proportion of women than men.

This community also represented diverse educational and employment backgrounds. Over half (54%) had no background in science or engineering. Their employment statuses varied: 31% were students, 30% were employed full-time, and 19% were retired. The remainder were either employed part-time, in some other employment status (e.g., on disability), or preferred not to answer. Many (56%) had achieved some higher education degree (i.e., Bachelor’s, Master’s, or PhD); the remainder spanned the educational spectrum, from currently in school (11%) to having completed high school (20%) or vocational training (6%). Respondents also varied significantly in age, as shown in Figure 6.

Figure 6 

Ages of Power to the People evaluation survey respondents (n = 142).

Experience

Evaluation survey respondents overwhelmingly reported a positive experience contributing to PTTP. 87% rated their experience as either “Good” or “Excellent” on a closed-form five-point scale, as shown in Figure 7. Respondents were asked to explain any parts of PTTP they found particularly enjoyable in an open-form text question, and responses were reviewed and coded based on the themes they discussed. Five categories of volunteer enjoyment emerged from this exercise:

Figure 7 

Evaluation responses for contributor experience on Power to the People (n = 142). Note the overwhelming positivity, with 87% choosing either “Good” or “Excellent.”

  • Potential for impact: Enjoying the possibility for this project to make a real-world difference.
  • Discovery: Enjoying the discovery of interesting buildings, artifacts, and geographical features in the satellite imagery.
  • Learning about other places: Enjoying learning about different places through engagement with the project.
  • Connecting with researchers: Enjoying researcher engagement on project forums.
  • Convenience: Enjoying the ability to contribute on their own time and in their own home.

When asked for any general comments or thoughts about the project, several respondents indicated that it was a great use of time during COVID-19 lockdowns, and that it was a great way to volunteer and bring meaning to life:

“It has been really great to contribute to this project particularly over the corona-virus lockdown period.”

“Has been a useful distraction during the lockdown imposed by Covid, and has been good to know that I have helped in some small way.”

“Power To The People has been an easy platform for desktop volunteering for the construction industry teams I work with. Covid has prevented our other community investment activities within the UK in local to our sites, but now that 2 community development projects in Uganda have been cancelled (UK personnel being unable to travel to Africa) it’s meant we can still make a contribution while waiting to reorganize the projects.”

Contributors engaged with PTTP with varying frequency based on interest and lifestyle factors. 28% of respondents engaged weekly or more frequently, while 47% engaged 2 to 3 times a month or less, and 17% engaged only once. When asked whether anything kept them from spending as much time as they would have liked on PTTP, 45% of respondents indicated that there was, citing reasons such as work, other commitments, medical or disability issues, forgetfulness, or waning interest over time in a follow-up open-form text prompt. While 32% of respondents reported that they had shared PTTP with others, 91% of respondents had heard about PTTP themselves via the Zooniverse website. This suggests that word of mouth may not be an effective way to promote this type of citizen science project. That said, it could equally indicate that those who completed the evaluation survey were more likely to be keen self-starters who would seek out a project themselves on the Zooniverse website. More study is required on this point.

Motivations

Citizen scientists were asked to identify all reasons why they chose to engage with PTTP from a closed-form list of 10 predetermined options, including an “Other” option which provided an open-form text input if selected. The predetermined options were defined by the research team based on their observations throughout the project. The most popular reasons reported were a desire to contribute to projects with real world impact (91% reporting) and a desire to contribute to scientific research (79% reporting), as shown in Figure 8.

Figure 8 

Motivations reported by the evaluation survey respondents who selected any of the predetermined options (n = 141), for women (n = 84) and for men (n = 50).

Motivations largely aligned across women and men, though slightly more men than women reported the motivations “I want to contribute to scientific research” and “I am generally interested in science and/or engineering.” Interestingly, this seems to reflect gendered societal trends in science interest.

The importance of real-world impact to PTTP volunteers is also illustrated in their choice of other Zooniverse projects. Amongst respondents who had contributed to Zooniverse projects prior to PTTP, the most frequently mentioned previous projects were Bash the Bug, American WW1 Burial Cards, and Every Name Counts. American WW1 Burial Cards and Every Name Counts are human-focused historical projects (transcribing burial cards of First World War soldiers and records of Holocaust victims, respectively); this interest in human-focused projects may have led respondents to choose our project as well.

Learning

Nearly two-thirds (66%) of evaluation respondents reported that they learned something through volunteering on PTTP. These respondents were asked to describe what they had learned in an open-form text prompt, and their responses were reviewed and coded based on their contents. Common themes included learning about rural electrification (40%), home styles and settlement patterns in rural sub-Saharan Africa (38%), the geography of Kenya, Sierra Leone, and Uganda (31%), and satellite imagery analysis (21%). To illustrate, when asked what they learned, responses included the following:

“How to ‘read’ satellite imagery; something about conditions on the ground in rural Kenya”

“I learnt more about the layout and construction of housing in the Uganda [sic].”

“Housing patterns in rural S. L. [sic]”

“The communities in Africa are very numerous and they seem to have great agricultural skills.”

“I learned more about the landscape of Africa buy [sic] looking at the photos and seeing trees and buildings. It gave me a better understanding of the landscape.”

“I learned that even in today’s world there are still many people living without things that we consider basic necessities like electricity.”

“That access to electricity is not something to be taken for granted”

“I learned that access to reliable and sustainable electricity was part of the UN SDGs and how important it is for livelihoods in Africa to have it.”

“It hit home what we take for granted. And, it wasn’t as easy as I thought to locate the homes.”

“That there are SO many rural houses that may not have any power, it’s sad that the governments in these countries aren’t doing more.”

When asked for more details, those who said they did not learn anything in the project indicated that the citizen science task was routine or unengaging, that they did not spend much time on the project, that they were confused and needed more help, that they were already familiar with the topic, or that they were simply not looking to learn anything.

Participating in this project also prompted 18% of respondents to do their own investigations. Most of these were via internet search, and the most popular research topics were the studied countries: Kenya, Sierra Leone, and Uganda.

Platform interaction

During their time on PTTP, 94% of respondents visited the “Tutorial” page, 73% visited the “Field Guide,” 68% visited the “About” or “Learn More” pages, and 36% visited the “Statistics” page. It is unsurprising that most visited the “Tutorial” page, as this automatically opened the first time they entered the classification interface. The “Field Guide” may be the next most popular because there is an icon to access it displayed in the classification interface. It was heartening to see that many contributors accessed the “About” or “Learn More” pages, as it was hoped that contributors would explore these pages to gain more context on the project. It is possible that the “Statistics” page was the least visited because this icon was much smaller and harder to find than the others. The Zooniverse team could consider enlarging this icon to ensure contributors know how to access statistics.

Only 22% of respondents reported using the project “Talk” discussion board. As “Talk” is the main communication link between the researchers and citizen scientists during project execution, it is an important finding that only a minority of participants use it. It is possible that “Talk” may be more appealing to certain personality types or motivations than others, as will be further addressed in the Discussion section.

Project data

Throughout PTTP, 578,010 georeferenced home annotations were made by citizen scientists on satellite imagery. Compared with a researcher-generated annotation set, the PTTP annotations achieved a precision of 49%, recall of 93%, and F1 score of 64%. The high recall may indicate that citizen scientists found most homes in the images, but it is also influenced by the fact that annotations frequently overlap, as up to 10 citizen scientists were tasked with annotating each image to produce a higher-confidence final dataset. More important here is the low precision, which indicates a substantial number of annotations made where no home was present, showing that citizen scientists struggled, at least in certain circumstances. Visualizing a sample of the annotation data illuminated the issues illustrated in Figure 9 and listed below (a sketch of how these accuracy metrics can be computed follows the list).

Figure 9 

Issues faced by citizen scientists in annotating homes on Power to the People. Individual home annotations are visualized as yellow boxes.

  • Citizen scientists struggled with crowded images (i.e., with many homes). This was flagged on “Talk” and in the beta review. However, it was particularly stark when viewing the resulting data. Citizen scientists either struggled to find all homes or did not have the time or patience to label all homes in crowded images, resulting in variable annotation quality and density.
  • At times, citizen scientists seemingly misunderstood non-home features to be homes. This could be due to low image resolution, or inadequate explanation of the features they might be looking for in help material.
  • Citizen scientists seemed to occasionally make spurious annotations, where there was no object in the image which could have been misunderstood as a home. There are multiple reasons that could explain this, including difficulty with the project interface, frustration, or willfully submitting incorrect data.
  • Citizen scientists showed different understandings of what constitutes a “home.” While most identified each structure, some grouped multiple structures, interpreting the group as one home. As the instructions referred to homes and not structures, this is valid, and indicates an interesting difference in opinion.
  • Citizen scientists frequently under-rotated their annotations on rectangular homes. This misalignment lowers data quality. While multiple factors could be at play, this may indicate difficulty using the rotation tool, or unwillingness to take the extra time to do the additional rotation task.
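The sketch below illustrates how precision, recall, and F1 can be computed by matching volunteer boxes against a reference set. It is a minimal example under our own assumptions, using greedy one-to-one matching of axis-aligned boxes at a 0.5 intersection-over-union (IoU) threshold; PTTP annotations could be rotated, and the matching criterion actually used is detailed in the cited technical analysis.

from itertools import product

def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (xmin, ymin, xmax, ymax).
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def precision_recall_f1(volunteer, reference, threshold=0.5):
    # Rank all volunteer-reference pairs by IoU, then match greedily
    # one-to-one so that the strongest overlaps are counted first.
    pairs = sorted(
        ((iou(v, r), i, j)
         for (i, v), (j, r) in product(enumerate(volunteer), enumerate(reference))),
        reverse=True,
    )
    matched_v, matched_r = set(), set()
    for score, i, j in pairs:
        if score < threshold:
            break
        if i not in matched_v and j not in matched_r:
            matched_v.add(i)
            matched_r.add(j)
    tp = len(matched_v)           # volunteer boxes matching a reference home
    fp = len(volunteer) - tp      # volunteer boxes where no home was present
    fn = len(reference) - tp      # reference homes that no volunteer box matched
    precision = tp / (tp + fp) if volunteer else 0.0
    recall = tp / (tp + fn) if reference else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

As a consistency check on the reported figures, the harmonic mean of the reported precision and recall is 2 × 0.49 × 0.93 / (0.49 + 0.93) ≈ 0.64, matching the stated F1 score of 64%.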

Discussion

PTTP succeeded in attracting a diverse global volunteer base from a wide range of educational attainment and employment statuses. This may speak to online citizen science being accessible across socioeconomic lines and class structures, though, of course, those with lower incomes and more precarious living situations may be more time constrained and have less time to invest in such efforts. The diversity of employment statuses may reflect a divergence from the high socioeconomic bracket that is expected to be overrepresented amongst citizen science participants (). However, a larger proportion of contributors had completed higher education than would be expected based on the general population, indicating that there is still work to be done to expand the reach of citizen science. The higher representation of those with more education aligns with previous observations that those with higher pre-existing science capital are more likely to engage with citizen science (; ). This might also indicate higher pre-existing interest in research amongst the university-educated public, though their ability to attend university is, of course, inextricably linked to their socioeconomic status.

It was interesting to see that contributor demographics skewed more towards women than men. This is at odds with Zooniverse’s expectations for their projects, where more men tend to contribute to astronomy and physics projects, and more women to ecological projects (). While gender representation in citizen science varies depending on the specific project and field (), the high representation of women and gender-divergent people in a project whose aims are most closely linked to engineering is notable. The project’s placement at the intersection of engineering, geography, and sustainable development may have helped it attract genders typically underrepresented in engineering.

While many contributors learned something through the project, there is still room for improvement: 34% of respondents reported not learning anything while contributing, and only 18% sought additional information on their own. This gap illustrates that absorbing information presented to you and feeling inspired to learn more independently require different levels of effort and engagement.

Despite the low proportion of respondents proceeding to independent inquiry, this example of citizen scientists taking action based directly on their experience is exciting. While there is a strong sentiment that citizen science can encourage action on social issues, there is limited evidence to support this, as discussed in the literature with regard to climate change (). This again appears to reflect a simple lack of empirical data collection; the independent learning noted here adds a piece of evidence for the notion, which further study can strengthen.

Volunteers on PTTP showed different modalities and depths of engagement. Throughout the project, it was clear that some contributors engaged deeply with the content through their insightful, vocal contributions on “Talk.” However, this vocal subset was not the majority: Only 22% of the contributor base engaged with the PTTP discussion forums. This subset made the most noise, and so could easily be presumed to represent the interests of the entire contributor base, putting the research team at risk of building the project to serve only those who engage via communication and community building. However, the 78% of volunteers who did not engage vocally may equally have been engaging deeply, simply through other modalities, such as independent reflection or action. This more nuanced understanding of engagement aligns with Phillips’ “Dimensions of Engagement Framework,” which divides engagement in citizen science into behavioral, learning, emotional, and social quadrants stemming from both extrinsic and intrinsic motivations (). Whilst those participating loudly in PTTP forums may find the most satisfaction through deep social engagement, others may instead act, feel, or learn through the project at various depths without vocalizing this. Indeed, PTTP implicitly aimed to provide engagement opportunity in each quadrant through a research activity (behavior), educational opportunities (learning), emphasis on real-world impact (emotional), and discussion forums (social). These quadrants also align with the aspects of PTTP that evaluation respondents indicated were most enjoyable: their sense of discovery while annotating images and learning about other places by participating (learning), the potential for project impact (emotional), connecting to the research team (social), and the convenience of participating in their own home (behavior).

Reviewing the annotation data acquired through this project revealed several issues that were not discussed as frequently on “Talk” or in feedback forms. For instance, issues with rotating annotations were rarely discussed, nor was any technical issue causing people to make spurious annotations. Routinely reviewing the data produced while a citizen science project is running could illuminate such issues earlier, while they can still be addressed.

Conclusions

This work has investigated the impacts of the citizen science project “Power to the People” (PTTP) on volunteer contributors. To accomplish this, we have analyzed beta feedback collected before project launch, discussion board posts made during the project, a project evaluation survey, and mapping data generated during the project, with the aim to understand whether PTTP engaged a diverse and global community in an accessible, enjoyable, and educational experience.

It was found that PTTP did indeed attract a diverse community of volunteers spanning at least six continents and 25 countries. Amongst evaluation survey respondents, 59% identified as women and 36% as men, with the remaining 5% either outside the gender binary or preferring not to self-identify. In terms of employment, 31% were students, 30% were employed full-time, and 19% were retired, with the remainder either employed part-time, in some other employment status (e.g., on disability), or preferring not to disclose. While 54% had no background in science or engineering, 56% had achieved some higher education degree (i.e., Bachelor’s, Master’s, or PhD), indicating that despite the diversity across multiple indicators, a more highly educated population than average was being engaged by the project, as has been observed previously in the literature.

Many citizen scientists were found to engage deeply in the PTTP community. Contributor exchanges on discussion boards frequently indicated high understanding. However, not all contributors engaged at the same depth or through the same modality. Only 22% of evaluation survey respondents used the discussion boards, with others indicating a lack of time, interest, or nerve to contribute. However, those who did not engage vocally may still enjoy deep engagement through other modalities, including through solitary reflection, learning, or action. While PTTP incorporated elements catered to each of these dimensions, further study is required on the depth of engagement along each.

Generally, contributors reported a positive experience with PTTP: 87% of evaluation survey respondents rated their experience as either “good” or “excellent.” Contributors particularly enjoyed the project’s potential for impact, their sense of discovery while annotating images, learning about other places by participating, connecting to the research team, and the convenience of participating in their own home.

Furthermore, 66% of respondents learned something through participating in PTTP. They reported learning about rural electrification (40%), home styles and settlement patterns in rural sub-Saharan Africa (38%), the geography of Kenya, Sierra Leone, and Uganda (31%), and satellite imagery analysis (21%). Participating in this project also prompted 18% to do their own investigations, typically via internet search, and most commonly about the studied countries of Kenya, Sierra Leone, and Uganda.

Image quality, size, and crowding were the most frequent technical issues discussed by contributors, with beta feedback often indicating that images were too small, or that their brightness, contrast, or resolution were too low. This was addressed with up-sampling and color correction prior to launch. The final data additionally indicated difficulty with crowded images (i.e., images with many homes) even after up-sampling, alongside other less-discussed issues such as under-rotation of annotations.

Best Practice Recommendations

Based on these findings, the following best practices are recommended for online citizen science projects in remote mapping. They are also applicable more broadly to citizen science projects using the Zooniverse interface and/or focusing on image annotation:

  • Consider intersecting demographics if targeting a diverse citizen science community. As highlighted in the results, PTTP attracted many women to volunteer; however, most were from the United States and United Kingdom. Similarly, PTTP attracted volunteers from diverse employment backgrounds; however, more than expected were university educated, in agreement with previous literature (). The analysis of the PTTP volunteer community composition conducted here shows that when targeting a diverse group of citizen scientists, it is important to look across multiple axes to ensure that more marginalized intersections are represented.
  • Emphasize inter-disciplinary aspects and real-world impact. The potential for real-world impact was found to be a huge driver of citizen scientist motivation on PTTP. This interdisciplinary work attracted a larger diversity of contributors than would be expected on a typical engineering citizen science project, particularly with regards to gender. Based on these findings, it is recommended that the potential for impact be clearly communicated to citizen scientists. Interdisciplinary elements, for instance which draw together social and physical sciences (), are also recommended as drivers of diverse engagement.
  • Provide engagement opportunities at multiple depths and using multiple modalities to make learning accessible to the whole community. While many PTTP citizen scientists engaged vocally and reported learning, the entire community did not engage at the same depth and via the same modalities. It is therefore recommended to provide learning and engagement opportunities ranging from surface-level to deep dive along multiple dimensions including social, emotional, behavioral, and educational (). By doing this and providing extension and action opportunities for the enthusiastic subset of learners, a project can create the most wide-reaching learning impacts.
  • Remember that the most vocal contributors do not necessarily represent the majority or the entire community. The most vocal contributors on PTTP were found not to be the majority, with only 22% using the “Talk” discussion forum. As such, it is important to try to elicit feedback from the quieter majority of contributors in order to “sanity-check” feedback from the vocal minority and ensure the project does not become biased to meet the needs of only one type of contributor.
  • Evaluate data quality regularly to identify silent comprehension or interface issues. Several issues which appeared when evaluating data directly did not emerge in citizen scientist feedback. It is possible that citizen scientists will not volunteer information on particular struggles because they are embarrassed, or because they do not realize they are having issues and making mistakes. However, these issues will still be evident in the data. By reviewing data regularly, these can be identified. One can then elicit feedback on these issues and minimize their impact.

Many of these best practices may appear to be common knowledge or common sense, and indeed they are common knowledge amongst many in the citizen science community. They confirm various notions from the theoretical literature on citizen science engagement. However, common sense and theory require empirical validation to be justified. This study adds rigor to many of the intuitive practices already employed by citizen science researchers and serves as a guidepost for those newly adopting these methods. By implementing these best practices in future similar projects, and by continuing to evaluate project impact, better experiences can be created for the dedicated citizen science community.

Data Accessibility Statement

Survey details are available in the Supplemental Files. Power to the People project data are shared openly for research use at (, ).

Supplementary Files

The Supplementary files for this article can be found as follows:

Supplemental File 1

Beta feedback form. DOI: https://doi.org/10.5334/cstp.534.s1

Supplemental File 2

Evaluation survey. DOI: https://doi.org/10.5334/cstp.534.s2