Introduction

Large datasets are often required to study wildlife across geographically large areas, but collection of these data can be costly, time consuming, and logistically challenging. Scientists are increasingly looking to citizen science as a potential solution because it can allow economical and efficient collection of data over large spatial and temporal scales (). Observations by volunteers from projects like eBird and iNaturalist have been useful for mapping species distributions (; ; ); however, presence-only data (such as those in iNaturalist) limit inferential capability (). In some cases, volunteers can report effort, such as how long a citizen spent observing birds (as is possible in eBird; ), but this is not common. Though indirect measures of effort can be used (e.g., areas with higher human population have proportionally more observations; ), an alternative approach is to recruit citizens to collect data with sensors (e.g., camera traps, acoustic monitors, etc.) that record effort (e.g., sampling time/intervals) automatically. Indeed, there are several examples of citizen science projects using specialized sensors or smart phone applications that also record effort (e.g., bats (Barlow et al. 2015), air pollution (; ), and noise pollution (; )).

This sensor-based approach has opened new areas of research for citizen science and can provide more standardized and verifiable data. However, it also introduces new challenges in terms of how participants gain access to the equipment and learn new, potentially complicated techniques (). The sensor-based approach requires technological training and dedication by the volunteers, and more complicated logistics and planning by researchers to manage the equipment and data. One large-scale citizen science sensor-based project, Snapshot Wisconsin, dedicated two full-time staff members purely for volunteer management and project growth (). Dedicated staff can promote a more stable and efficient management system (), but are costly to implement.

Whether or not they use sensors, citizen science projects face many challenges with respect to study design, volunteer management, and scientific and learning outcomes (; ). The development and implementation of all citizen science projects require notable effort and strategic engagement by project managers (; ). This is particularly true of contributory-style projects (), in which scientists ask the public to collect and contribute data to answer particular research questions. Although there is no set tool or volunteer management framework to follow, some researchers have constructed conceptual frameworks for those seeking to start their own citizen science endeavors (Yadav and Darlington 2017). Others have sought new strategies for recruiting and retaining volunteers (; ), as well as assessing their individual learning outcomes (). However, much remains to be learned about how to develop and manage projects to help citizen science achieve its full potential (; ). Diverse case studies can help expand the knowledge base for citizen science project management and give a better understanding of how sensors affect the development and management of citizen science projects.

In this paper, we describe our use of sensors (camera traps) to collect large-scale (i.e., statewide and over three years) wildlife records with citizen scientists through the North Carolina’s Candid Critters (NCCC) Project, a partnership between NC State University, NC Wildlife Resources Commission (NCWRC), NC Museum of Natural Sciences, eMammal, and NC Cardinal Libraries. The objectives of NCCC were to test whether large-scale citizen science camera trapping surveys were conducive to collecting wildlife records (and, with these records, estimating various species’ distribution, occupancy, and recruitment) and to involving the public in meaningful science. We discuss the challenges and successes of NCCC, including those related to study design, volunteer recruitment and management, equipment distribution, outreach, training, and data management, and make recommendations on how to maximize project benefits while minimizing financial costs and logistical problems associated with data collection across large scales. Our experiences are relevant to other researchers interested in large-scale, sensor-based citizen science projects for collecting large, robust (defined here as stable when conducting calculations with smaller subsets of data) datasets that can lead to further understanding of wildlife populations (R. Kays, M. Lasky, B. Pease, A.W. Parsons, and K. Pacifici, in review).

Methods

Study design

The scientific goals of the NCCC project were to collect data to estimate white-tailed deer (Odocoileus virginianus) recruitment (i.e., fawn-doe ratios), coyote (Canis latrans) occupancy, and the distribution and abundance of multiple mammal species across the state. To meet these objectives, we aimed to get a representative sample of each county of North Carolina by sampling at least 28 locations per county (as suggested in ) every season (winter [December 1–February 28], spring [March 1–May 14], summer [May 15–July 31], and fall [August 1–November 30]), 2016 through 2019. For our idealized study design, we stratified sampling between public and private lands and within three main habitat types: open (a continuous area of at least 0.02 km² treeless land), forested (less than 0.02 km² treeless), and developed (impervious surfaces accounting for 80% to 100% of the total cover). Stratification goals within each habitat type were proportional to the habitat makeup of each county. For example, Carteret County is 61% forest, 11% developed, and 29% open, so we aimed to have at least 17 cameras set in forests, three in developed areas, and eight in open areas. While proportionality was a useful guideline during the project, final analyses can account for habitat effects as long as an adequate sample (40–60) per category is acquired ().
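To make the allocation arithmetic concrete, the following sketch (illustrative only, not project code; the `habitat_targets` helper and the rounded Carteret County proportions are constructs for this example) computes per-habitat camera targets from a county’s habitat makeup and the 28-site minimum, reproducing the 17/3/8 split described above.

```python
import math

MIN_SITES_PER_COUNTY = 28  # minimum deployments per county per season

def habitat_targets(proportions, total=MIN_SITES_PER_COUNTY):
    """Allocate camera sites to habitats in proportion to county habitat cover.

    Uses a largest-remainder rule so the targets always sum to `total`.
    `proportions` maps habitat name -> fraction of county area (may not sum
    exactly to 1 because of rounding in land-cover summaries).
    """
    scale = sum(proportions.values())
    raw = {h: total * p / scale for h, p in proportions.items()}
    targets = {h: math.floor(v) for h, v in raw.items()}
    # hand out any remaining sites to the habitats with the largest remainders
    leftover = total - sum(targets.values())
    for h in sorted(raw, key=lambda h: raw[h] - targets[h], reverse=True)[:leftover]:
        targets[h] += 1
    return targets

# Illustrative figures for Carteret County from the text
carteret = {"forested": 0.61, "developed": 0.11, "open": 0.29}
print(habitat_targets(carteret))  # -> {'forested': 17, 'developed': 3, 'open': 8}
```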

Although we had an a priori study design, we did not tell volunteers exactly where to run cameras; instead, we asked them to choose a pre-selected project site or provide us with information on a site of their choosing. Though we asked volunteers to avoid pointing cameras at roads and trails, we allowed them to choose a site on private property (92.2% of all NC lands are privately owned; ) or on public land (7.8%; ) where we had already secured permits. We worked with the National Park Service, the US Fish and Wildlife Service, the US Forest Service, North Carolina’s Wildlife Resources Commission (NCWRC), The Nature Conservancy (TNC), NC State Parks, and multiple smaller agencies to obtain 39 research permits that granted volunteers access to 175 separate public lands (ranging in size from a few to several hundred acres) around the state where they could set cameras. Land managers sometimes had concerns about volunteers accessing permitted sites when the area might be scheduled for other events (e.g., logging or controlled burns); therefore, volunteers were required to contact managers before deploying cameras, giving managers a chance to deny access as needed. We also worked with 25 private landowners to allow volunteers access to large tracts of private land (i.e., volunteers had the option of setting cameras on these 25 private properties instead of in their own yards or on public lands). Volunteers could choose from all of these potential camera locations (shared private or public lands) using an interactive map on the NCCC website. The map allowed users to zoom in on locations and search by physical address or coordinates, and when a site was selected, it produced a pop-up window listing information about the public land on which the site was located.

For cameras set at a site of volunteers’ choosing (often their yards), we asked volunteers to fill out a site description form that described aspects of their camera location that might affect local wildlife (hunting permissions, domestic animal presence, known animal feeding locations nearby, etc.). Each time they set a camera in a new location (deployment protocol described below), public or private, we asked volunteers to record deployment information (dates, location, and equipment type) and use the following field protocol:

  1. Attach the camera to a tree at knee height, aimed parallel to the ground, at a relatively open area. Clear vegetation within 2m if necessary (to prevent false camera triggering).
  2. Do not use bait or aim camera at features such as bird feeders, roads, game trails, or houses.
  3. Camera settings should include multiple (3–5) pictures for each trigger with no delay between triggers.
  4. If on a slope, the camera should face across the slope whenever possible.
  5. Conduct a “walk test” to record the maximum distance the camera trap will trigger on a human (walk test protocol was included in training).
  6. Obtain GPS coordinates of the camera’s exact location using a volunteer-supplied GPS device (i.e., smart phone).

We used un-baited, motion- and heat-sensitive, infrared-flash camera traps (mostly Reconyx HC500 models) to conduct the wildlife survey year-round for three years. Citizen scientists set camera traps for two-week (fall) or three-week (all other seasons) deployments and spaced cameras at least 200 m from current or previous camera sites to maximize spatial coverage and reduce spatial correlation in counts (). Volunteers could borrow one of our camera traps through North Carolina Public Libraries (Figure 1) or use their own, pre-approved camera trap model (camera traps with an infrared flash and a trigger speed of less than 0.5 seconds were allowed). Volunteers who used their own camera trap (33% of volunteers) used 28 different models from 10 manufacturers. Overall, 83.4% of deployments used Reconyx cameras, 9% used Bushnell cameras, and the remainder used cameras from eight other manufacturers.
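As a minimal illustration of the 200 m spacing rule (a hypothetical helper, not part of the project’s tooling), the sketch below uses the haversine formula to check whether a proposed site lies at least 200 m from every current or previous camera location.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres
MIN_SPACING_M = 200         # required distance from current/previous sites

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def site_ok(proposed, existing_sites, min_spacing=MIN_SPACING_M):
    """Return True if the proposed (lat, lon) is at least `min_spacing`
    metres from every existing camera site."""
    return all(haversine_m(*proposed, *site) >= min_spacing for site in existing_sites)

# Hypothetical coordinates near Raleigh, NC
previous = [(35.7796, -78.6382), (35.7822, -78.6400)]
print(site_ok((35.7800, -78.6385), previous))  # False: roughly 50 m from the first site
print(site_ok((35.7900, -78.6200), previous))  # True: well over 200 m from both sites
```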

Figure 1 

Camera kits (camera, batteries, cable lock, keys, and memory cards), as shown in the bottom left, were distributed to 63 libraries and checked out to trained volunteers to monitor wildlife across the state of North Carolina.

After obtaining a camera, volunteers chose a deployment site from our interactive map or deployed the camera on their own private land and submitted a site description form. Once the two- or three-week deployment was complete, volunteers retrieved the camera, uploaded photos through the eMammal desktop application (), and identified the animals seen in the pictures (; Figure 2a). We verified species identifications using eMammal’s online expert review application, after which the data were archived in the Smithsonian Digital Repository (Figure 2b). The majority of camera trap photos (excluding photos containing sensitive information, e.g., photos of endangered species or photos collected on certain private properties) and all photo metadata were made public immediately through the eMammal website (emammal.si.edu). To protect privacy, photos of people were never made available for public viewing. After photos were uploaded, volunteers could run their camera again at a different site or return it to the library from which they borrowed it. Three three-week camera trap deployment sessions were conducted in winter, spring, and summer. Though cameras could be borrowed at any time and set between these deployment sessions, volunteers were asked to follow the deployment sessions as closely as possible to keep survey efforts aligned. We provided one-month breaks between seasons to give volunteers time to return cameras, if desired, or for new volunteers to complete training and prepare for the upcoming season.

Figure 2 

(a) Potential volunteers signed up on the North Carolina’s Candid Critters website, went through training, deployed a camera, identified species in photos, and then uploaded photos to eMammal. (b) Camera trap data uploaded by volunteers to eMammal were reviewed and validated by expert biologists before being archived in the Smithsonian Digital Repository. The data in this repository were then made available on the eMammal website.

During the fall, we focused sampling on ten representative counties for a deer-specific population study known as Fall Fawn Frenzy (FFF), which focused on estimating deer recruitment through fawn-doe ratios that can be used to help manage deer populations in NCWRC units. Concentrating cameras in ten representative counties concentrated our sampling effort at a time when fawns were most likely to be detected (i.e., when fawns were active rather than mostly bedded and the spots on their coats were still visible, making identification easier), allowing more precise estimates (R. Kays, M. Lasky, B. Pease, A.W. Parsons, and K. Pacifici, in review). The fall season was also separated from the other seasons by a one-month hiatus, but fall deployments were two weeks long (rather than the usual three weeks) and their timing differed by county. The sampling timeline can thus be summed up as three three-week deployment sessions in winter, spring, and summer, and three two-week deployment sessions in the fall.
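The seasonal windows and deployment lengths described above can be summarized in a small helper; the following sketch is illustrative only (the function name is hypothetical), with the date boundaries and the two- versus three-week lengths taken from the study design.

```python
from datetime import date

def season_and_length(d: date):
    """Return (season, deployment length in weeks) for a given date, using the
    NCCC season boundaries: winter Dec 1-Feb 28, spring Mar 1-May 14,
    summer May 15-Jul 31, fall Aug 1-Nov 30. Fall deployments ran two weeks;
    all other seasons ran three weeks."""
    m, day = d.month, d.day
    if m == 12 or m <= 2:
        return "winter", 3
    if m in (3, 4) or (m == 5 and day <= 14):
        return "spring", 3
    if (m == 5 and day >= 15) or m in (6, 7):
        return "summer", 3
    return "fall", 2  # August 1 through November 30

print(season_and_length(date(2017, 5, 14)))  # ('spring', 3)
print(season_and_length(date(2017, 9, 10)))  # ('fall', 2)
```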

In addition to research goals outlined above, NCCC was also motivated by outreach and engagement objectives. The project provided a unique mechanism to connect a diverse network of volunteers around the state to wildlife science and conservation (), potentially changing the way the public thinks and acts with respect to wildlife. Achievement of this goal required creative and comprehensive approaches to volunteer recruitment and management, described in the section, “Volunteer recruitment and management”.

Volunteer recruitment and management

We hired one full-time volunteer coordinator and two part-time project managers to coordinate NCCC. Two to four part-time (5–10 hours per week, year-round) interns were also recruited to help ship cameras, upload data, and coordinate volunteers for the entire length of the project. Though anyone could participate in the project, we targeted recruitment efforts toward college students, naturalists, hikers, hunters, library patrons, and primary school teachers to obtain a wide variety of volunteers. Initial volunteer recruitment was achieved through press releases in local news sources across the state, a targeted Facebook advertisement, and an email campaign to subscribers of the NCWRC’s eNews Blast. To continue recruiting volunteers throughout the project and to share results with existing volunteers, we sent out quarterly email newsletters, posted on social media outlets daily, and posted videos on YouTube (both semi-annual webinars and various educational videos). We directly involved primary school classes in the project and created lesson plans that would allow NCCC to be implemented in secondary school (5th–9th grade) classes as part of the NC STEM curriculum (). We managed volunteer information internally through the NCCC website, a series of linked Google Forms and Sheets, and later with a custom web application.

Our recruitment campaigns highlighted the project’s goals of learning about wildlife, especially in participants’ yards and neighborhoods. During the fall, we centered recruitment initiatives around white-tailed deer because of our FFF sub-survey. Recruitment techniques almost always included wildlife photos and posed questions to the public such as “What wildlife is in your yard?”

Participants signed up on the NCCC website and were then assigned a training module consisting of written instructions, video tutorials, and an online quiz; the module took approximately 40 minutes to complete. Once trained, participants were asked to create an account on eMammal, an online data management system for camera trap data. eMammal offers a desktop application that allows volunteers to identify species and upload photos to a database that can later be accessed by researchers for quality control and, eventually, by the public to see the results of a project (). After successful training, volunteers were approved to borrow a camera from a nearby library or to use a personal camera if it had an infrared flash and a trigger speed of less than 0.5 seconds. We partnered with 63 libraries across North Carolina to distribute camera traps and accessories (lock, key, batteries, memory card) to volunteers (Figure 1). We used shared Google Sheets for librarians to log camera inventory information and for project staff to update the names of approved volunteers (i.e., those who had passed training) on an ongoing basis.

We sent prizes to volunteers after they reached certain milestones to further encourage long-term participation (; ). These prize incentives were described to volunteers at the beginning of the project, and volunteers were reminded of them in quarterly webinar updates. A drink snap-koozie was sent after the first deployment, and a project-themed t-shirt was sent after more than one year of participation. To keep volunteers engaged, share results, and provide real-time feedback, we directed volunteers to eMammal’s auto-generated graphs and data overviews (see the NCCC eMammal page for an example data overview: https://emammal.si.edu/north-carolinas-candid-critters). At the end of the project, we also created an interactive Tableau visualization on our website (Figure 3) to provide an overview of project findings.

Figure 3 

Volunteers can view species profiles built from North Carolina’s Candid Critters data, which include each species’ distribution across the state, its daily activity pattern, and photos obtained during the project.

To understand and enhance volunteer recruitment and retention (), we integrated an assessment of participation outcomes into the project participation process. Everyone who successfully completed training was asked to complete a pre-project web survey before deploying a camera. We then asked volunteers who deployed at least one camera to complete an optional post-project survey 6–12 months after their initial training completion date. Whereas pre-project surveys were distributed to align with the rolling project enrollment process, post-project surveys were distributed to cohorts of volunteers at pre-specified times (e.g., the end of a monitoring season) to facilitate administration and minimize the time burden on project and research staff.

Challenges and Solutions

Study design

The NCCC study design made it easy for volunteers to run cameras at their preferred locations, but it did not allow us to meet all of our a priori sampling goals. We found it difficult to obtain data on public lands, in rural areas, and with even spatial coverage. Volunteers were more likely to set cameras on their own private property (54% of cameras set by volunteers were on private lands), resulting in slightly fewer data points from public lands. We presume that this bias toward private land reflected a combination of volunteer curiosity about the animals living on their property, the convenience of running cameras near home, and the added difficulty of surveying public lands, which required coordination with land managers. In addition, people sampling near their home could provide the camera’s location using an online map or smartphone-obtained coordinates rather than a handheld GPS unit, requiring less training and equipment. We assume the small sample size from rural areas (see the relative distribution of camera traps in Figure 4a) reflects the fact that fewer people live in these areas and that those living in urban areas are disinclined to travel to rural areas.

Figure 4 

(a) The number of camera traps per North Carolina county. (b) The intended (left-hand pie chart) and resulting (right-hand pie chart) habitat distribution of camera traps. (c) The majority (64.5%) of camera traps were set by citizen scientists. (d) The distribution of sites across private (48.1%) and public (51.9%) lands was near even.

To address habitat and spatial sampling gaps (i.e., obtaining data on public lands, in rural areas, and with even spatial coverage), we adopted a hybrid sampling design: we monitored where volunteers were setting cameras and supplemented data collection ourselves. To meet the objectives of FFF, we recruited the help of 3–5 NCWRC staff members in each of ten counties (30–50 individuals in total) and a lead NCWRC coordinator to set cameras across an even distribution of habitat types during the fall. With this assistance, 253 more cameras were set on public lands (7.4% of total deployments; Table 1). NCWRC staff assistance also led to more even spatial coverage and to the acquisition of 800,000 more photos (2.2 million total obtained in the project) and 42,842 more wildlife observations (120,671 total) for NCCC (Table 1). Even with this assistance, the majority of camera deployments were still set by citizen science volunteers (65% of all cameras were set by citizen scientists across all seasons, and 82% outside of the fall season; Figure 4c). This combined effort resulted in a large sample distributed across the state (Figure 4a).

Table 1

Effects of the addition of North Carolina Wildlife Resources Commission (NCWRC) sampling efforts on volunteer sampling efforts.


Goal unreached: Spatial coverage (optimal = 11% of cameras set in developed areas, 63% in forested, and 26% in open habitats).
Realized by volunteers: 8% developed (too low); 74% forested (too high); 18% open (too low).
Realized with supplementation by NCWRC: 32% developed (increased cameras set in this habitat type); 33% forested (decreased); 51% open (increased).
Outcome of staff supplementation: Much more even distribution of camera traps across the state as compared to the optimal spatial coverage of habitats.

Goal unreached: 50% of camera traps set on public lands, 50% on private lands.
Realized by volunteers: 1,613 deployments on private lands, 1,378 deployments on public lands.
Realized with supplementation by NCWRC: 37 deployments on private lands, 253 deployments on public lands.
Outcome of staff supplementation: More even distribution of camera traps across public and private lands (51%/49% with staff + volunteer sampling vs 54%/46% with volunteer-only sampling).

Goal unreached: Large dataset: number of wildlife photos.
Realized by volunteers: 1.4 million wildlife photos*.
Realized with supplementation by NCWRC: 800,000 wildlife photos*.
Outcome of staff supplementation: The number of wildlife photos went from 1.4 million with volunteer effort alone to 2.2 million when including staff effort, a 57% increase*.

Goal unreached: Large dataset: number of wildlife observations.
Realized by volunteers: 77,829 wildlife observations.
Realized with supplementation by NCWRC: 42,842 wildlife observations.
Outcome of staff supplementation: The number of wildlife observations went from 77,829 with volunteer effort alone to 120,671 when including staff effort, a 55% increase.

* These are estimated values based on camera trap deployment effort and average number of wildlife photos per deployment.

Despite staff filling in sampling gaps, our goal of proportional habitat sampling was not met in open habitats (18% sampled versus 26% intended; Figure 4b). We also did not achieve our seasonal goal of 28 deployments per county, which may be partly due to the concentration of deployments in ten specific counties during FFF. Nevertheless, with our combined sampling approach (mostly citizen scientists supplemented by professional camera trapping), we achieved a minimum sample size of at least 200 deployments in every habitat type (forest = 2,248, open = 250, developed = 225), adequate to address our wildlife science objectives (i.e., estimating species distributions and coyote occupancy; R. Kays, M. Lasky, B. Pease, A.W. Parsons, and K. Pacifici, in review). Furthermore, the sample was tested for robustness and for representativeness of all habitat types across the state. This test found that the data were robust when sub-sampled for the project’s originally proposed species distribution occupancy models and that the sample was large enough to be representative of habitat categories at the statewide level as well as for 98.4% of land at the ecoregion level (R. Kays, M. Lasky, B. Pease, A.W. Parsons, and K. Pacifici, in review).
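The formal robustness analysis used the project’s occupancy models (R. Kays et al., in review); as a much simpler illustration of the underlying idea of sub-sampling, the sketch below repeatedly draws smaller subsets of deployments and checks how stable a naive occupancy estimate (the fraction of deployments with at least one detection) remains. The detection data here are simulated, not NCCC records.

```python
import random
import statistics

random.seed(42)

# Simulated deployments: 1 = species detected at least once, 0 = not detected
# (a stand-in for real detection histories; true detection rate set to 0.35)
deployments = [1 if random.random() < 0.35 else 0 for _ in range(2500)]

def naive_occupancy(sample):
    """Fraction of deployments with at least one detection."""
    return sum(sample) / len(sample)

full_estimate = naive_occupancy(deployments)
print(f"full sample (n={len(deployments)}): {full_estimate:.3f}")

# Re-estimate from progressively smaller random subsets; a "robust" dataset
# keeps the estimate close to the full-sample value with little spread.
for fraction in (0.75, 0.50, 0.25):
    n = int(len(deployments) * fraction)
    estimates = [naive_occupancy(random.sample(deployments, n)) for _ in range(200)]
    print(f"{int(fraction * 100):>3}% subsample: mean={statistics.mean(estimates):.3f}, "
          f"sd={statistics.stdev(estimates):.3f}")
```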

Citizen science projects face a tradeoff between obtaining more samples with fewer restrictions and using a strict sampling design that makes volunteer recruitment more challenging (). This may be less of a problem for small-scale studies; for example, Kays et al. () were able to recruit volunteers to run cameras at specific pre-determined sites, but this targeted approach becomes difficult as the number of volunteers and the geographic area increase. NCCC’s hybrid study design provided us with a large-scale dataset, including data from private lands (which represent more than 85% of the state) that are otherwise difficult to access. However, this required staff to monitor progress toward sampling goals during the study and to supplement efforts with additional field work by project staff or associates, especially to sample open habitats.

Based on our experience, we recommend that large projects needing large amounts of data allow flexible sampling protocols that encourage participation. Once volunteers are engaged with the project, they can be encouraged to set sensors in specific areas through special programs or incentives to obtain data in certain locations or habitat types; for example, a paired deployment strategy could ask volunteers to set one camera at an open site for every two cameras set at forested sites. Alternatively, managers could target participation by local conservation groups or clubs that can commit to meeting stratification goals before the project is initiated. This would allow project managers to determine realistically whether their goals will be met or whether they need a different approach. For smaller citizen science projects, requiring volunteers to follow a stricter protocol may make recruitment more difficult but will allow for more targeted data collection to fulfill project goals.
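One way to operationalize this kind of monitoring is to compare running habitat counts against proportional targets and flag shortfalls for targeted recruitment or staff supplementation. The sketch below is illustrative only (the target proportions are the statewide figures from Table 1, and the mid-project counts are hypothetical).

```python
def habitat_shortfalls(target_props, counts):
    """Compare the realized habitat distribution of deployments with target
    proportions and report how many additional deployments each habitat needs
    to reach its proportional share of the current total."""
    total = sum(counts.values())
    shortfalls = {}
    for habitat, prop in target_props.items():
        needed = round(prop * total) - counts.get(habitat, 0)
        if needed > 0:
            shortfalls[habitat] = needed
    return shortfalls

# Hypothetical mid-project snapshot using the statewide targets from Table 1
targets = {"developed": 0.11, "forested": 0.63, "open": 0.26}
running_counts = {"developed": 80, "forested": 740, "open": 180}
print(habitat_shortfalls(targets, running_counts))
# -> {'developed': 30, 'open': 80}: habitats to prioritize next season
```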

Volunteer recruitment and management

Recruitment

The initial launch of NCCC led to 147 volunteer signups within the first week, and the project received 3,126 signups overall (for reference, North Carolina’s population is 10.5 million; ). We tried several strategies to promote ongoing recruitment throughout the project; some were less successful (namely, targeted Facebook advertising, blogs on the project’s website, and newsletters sent through email blasts), while others provided a significant influx of new recruits. Our first webinar (signups = 147 within a week of release), first YouTube video (signups = 246), and an email promotion by the NCWRC (signups = 197) were the most successful initiatives. All webinars and YouTube videos were promoted through our email blasts, website, and newsletters. Press releases and promotions by partner organizations greatly assisted recruitment by reaching new audiences, and the email promotion by the NCWRC was particularly effective for obtaining volunteers from rural areas. Our multiple ongoing recruitment efforts resulted in 84.5% of our signups arriving after the project launch, evidence that continual recruitment can drastically increase the number of volunteers acquired.

When we asked volunteers why they signed up for the project (asked during the sign-up process, n = 3,126), 30.4% signed up after receiving personal emails (from a friend organization), 24.4% by word of mouth (non-electronic communication), and 45.2% because of our promotional efforts (i.e., our website, newspaper articles, newsletters, YouTube videos, Facebook page, and NCWRC promotions). The NCCC website was the top reason volunteers gave for signing up (8.1% of volunteers), followed by YouTube videos (7.2%) and newsletters (6.8%). Despite the low viewership of our newsletters, they had a recruitment effect similar to that of our YouTube videos, which received hundreds to thousands of views. However, YouTube videos were produced less frequently and took less time to create than newsletters, making them a more efficient way to recruit volunteers.

Overall, our volunteer recruitment helped us obtain a large wildlife record dataset (2.2 million photos and 120,671 wildlife observations across the entire state, of which 1.4 million photos and 77,829 observations were obtained by volunteers; Table 1). For study designs that require volunteer presence in particular regions (e.g., certain neighborhoods or communities), targeting those audiences with the help of influential local individuals or organizations is vital (). For NCCC, the NCWRC’s promotion led to an influx of volunteers from rural NC, and working with local libraries allowed us to connect more directly with these rural communities. We therefore suggest enlisting existing organizations to promote citizen science projects to potential volunteers. Specifically, we suggest that project managers search for nearby agencies or non-government organizations with similar initiatives and partner with them to achieve shared goals. In our case, we worked with the NCWRC because they also conduct public outreach and educate the public about wildlife science.

Training

NCCC’s online training module allowed volunteers to complete training on their own time and at their own pace. Nevertheless, using the project’s protocol was challenging for many volunteers, and the time commitment for online training and photo uploading (40 minutes and 30 minutes, respectively) was another barrier to participation. Less than half of the people (44.6%) who signed up for the project completed training, and only 21.5% of those who completed training for NCCC (n = 580, 9.6% of total signups) followed through to set a camera trap for the project. A portion of these volunteers may not have completed training because they realized partway through how large a time commitment the project would entail.

To improve training completion, we sent reminder emails and limited the amount of information presented to volunteers at any one time to reduce potential frustration. We also revised our online training several times over the course of the project to reduce training time and to emphasize points that generated common volunteer questions and errors. Quizzing volunteers at the end of training led to fewer questions later on and seemed to encourage participants to pay attention to all of the training material ().

There are many tradeoffs to consider with respect to volunteer training. In general, online training may be more feasible for large-scale projects, but less engaging and–depending on the kind of training–less effective than in-person training. In-person training can lead to higher mean achievement by trainees () and more dedication by volunteers (e.g., only 8% of volunteers dropped out of a previous similar project with in-person training; ), but may be impractical for large-scale projects. Thought must be given to developing training materials that are visually compelling enough to engage the volunteers, while also teaching them correct protocol. In cases where complicated online training is required, we recommend including a portion of the budget to work with professionals to develop engaging training materials to minimize volunteer dropout in future projects.

Equipment

We worked with 63 libraries across the state to loan cameras to volunteers. Libraries were key to equipment distribution, which would have been unsustainable at such a large scale from a single facility without additional funds. However, coordinating equipment reservations with participating libraries proved logistically challenging. Because we started by working with only 15 libraries, each facility initially received a large inventory (~30 cameras), which many libraries could not maintain because of limited space and staff. We later redistributed camera kits among many more libraries (n = 63) and spread out the 500 cameras to better match local demand (2–30 cameras per library). We originally provided online training for one or more librarians at each facility on how to use a Google Sheets inventory tracking system we developed for the project. However, staff turnover was common and the system was not used consistently, resulting in incomplete camera inventory tracking. To compensate, we regularly contacted librarians by phone or email to record camera locations. Nevertheless, we successfully lent cameras (n = 1,047 lendings) to 77% (n = 435) of all NCCC volunteers, and cameras were borrowed from nearly all (82%) participating libraries across the state. This averages to approximately 8.8 camera deployments per camera available (n = 500), which is fairly low camera use considering that there were 36 deployment sessions (3 sessions per season over 3 years) throughout the project. Therefore, we could have provided fewer cameras to volunteers and saved on initial purchasing costs; based on the greatest number of cameras borrowed across the state, we believe the project could easily have collected the same amount of data with 250 cameras. Of the 500 cameras made available to volunteers through libraries over three years, 29 were stolen or lost, a typical proportion (~2% per year for cameras deployed for one month or more) for professionally run camera trap studies ().

Libraries in North Carolina do not have a unified computer system to track lending materials, so we created our own system with Google Sheets. Tracking cameras and streamlining checkouts to volunteers will be more efficient where libraries share an inventory management system. A further challenge is making sure libraries check out equipment only to volunteers who have completed the required training. Using our Google Sheets system, we added the names of trained volunteers to a shared list that librarians checked before loaning a camera kit. Alternatively, to avoid using Google Sheets, volunteers could be required to provide libraries with proof of training completion before checking out equipment.

Project management

Managing volunteers and equipment over large scales requires sound logistics and organizational tools. We originally used 60 different Google Sheets to manage volunteer signups, training, deployments, and equipment movements, but this quickly became cumbersome and error prone. Furthermore, there are potential security and participant privacy issues when using public platforms such as Google Sheets, and such platforms are often discouraged by official entities (e.g., under the European Union’s General Data Protection Regulation). We created a custom-built web application to replace Google Sheets (including the library Google Sheets system) two-thirds of the way through the project, cutting volunteer management time by approximately 30%. We had few issues harmonizing our website and outreach initiatives with the new database beyond beta testing, and there were few changes for volunteers (the new system mimicked the old forms as closely as possible). The management application was built using the Laravel MVC framework with a MySQL database backend, facilitating camera management, tracking of volunteer training, site mapping, and custom report generation (Supplemental File 1: Appendix A). Though NCCC project managers have been the only ones to use the database thus far, we made the source code for this web application publicly available through GitHub, accessible via the link in Appendix A.
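The application itself was built in Laravel with a MySQL backend; as a language-neutral illustration of the kind of data model and checks involved (the class names and fields below are hypothetical, not the application’s actual schema), the following sketch tracks approved volunteers and camera kits and enforces the training check at checkout.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Volunteer:
    email: str
    trained: bool = False  # passed the online training quiz

@dataclass
class CameraKit:
    kit_id: str
    library: str
    checked_out_to: Optional[str] = None

@dataclass
class InventoryManager:
    volunteers: dict = field(default_factory=dict)  # email -> Volunteer
    kits: dict = field(default_factory=dict)        # kit_id -> CameraKit

    def check_out(self, kit_id: str, email: str) -> bool:
        """Lend a kit only to trained volunteers; mirrors the approved-volunteer
        list that librarians consulted before handing over a camera."""
        volunteer = self.volunteers.get(email)
        kit = self.kits.get(kit_id)
        if not volunteer or not volunteer.trained or kit is None or kit.checked_out_to:
            return False
        kit.checked_out_to = email
        return True

    def kits_by_library(self) -> dict:
        """Simple report: number of available kits at each library."""
        report = {}
        for kit in self.kits.values():
            if kit.checked_out_to is None:
                report[kit.library] = report.get(kit.library, 0) + 1
        return report

# Hypothetical usage
mgr = InventoryManager()
mgr.volunteers["pat@example.org"] = Volunteer("pat@example.org", trained=True)
mgr.kits["NC-001"] = CameraKit("NC-001", library="Wake County")
print(mgr.check_out("NC-001", "pat@example.org"))  # True
print(mgr.kits_by_library())                       # {} (the only kit is checked out)
```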

Whether a project can use simple data management tools (e.g., Google Sheets) or needs to develop a new system will depend on the size of the project and the needs of its managers. We suggest that large citizen science projects establish an application to manage volunteers and inventory before launch. If possible, using applications developed by similar citizen science projects will be more time and cost effective than creating new solutions. For example, citizen science projects distributing technology to volunteers across a large geographic area could use the system we developed and provide in Appendix A.

Data retrieval

To ensure cameras were set correctly and species were identified accurately, we reviewed all photos and rejected deployments that did not follow protocol. We rejected 0.7% of deployments for cameras set too low, 3.2% for cameras set too high, and 4.9% for other problems (e.g., equipment malfunctions, camera destroyed by bears, nearby food bait, white flash, facing up or down a slope, unknown location). We also corrected species misidentifications and animal miscounts. On average, species were identified with 69.7% accuracy per deployment (91.2% average accuracy across all species). Individual volunteer accuracy varied greatly, with some volunteers identifying all species correctly (100%) and others identifying none correctly (0%); on average, however, individual volunteers had a species identification accuracy of 89.1%. Some species were identified almost perfectly by volunteers (e.g., white-tailed deer and wild turkey [Meleagris gallopavo] were identified with 98% and 96% accuracy, respectively), whereas others were identified less accurately (e.g., North American river otters [Lontra canadensis] were identified with only 56% accuracy). Species that were unfamiliar to volunteers, that closely resembled related species, or that appeared in unclear photos (animal blocked by vegetation, far from the camera, or photographed at night) were often misidentified.
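To make these accuracy measures concrete, the sketch below shows how per-deployment, per-volunteer, and per-species identification accuracy can be computed from paired volunteer and expert-confirmed labels; the records are fabricated examples, not NCCC data.

```python
from collections import defaultdict

# Each record: (deployment_id, volunteer_id, volunteer_label, expert_label)
records = [
    ("d1", "v1", "white-tailed deer", "white-tailed deer"),
    ("d1", "v1", "coyote",            "gray fox"),
    ("d2", "v2", "river otter",       "mink"),
    ("d2", "v2", "wild turkey",       "wild turkey"),
    ("d3", "v1", "white-tailed deer", "white-tailed deer"),
]

def grouped_accuracy(records, key_index):
    """Accuracy of volunteer labels grouped by the field at `key_index`
    (0 = deployment, 1 = volunteer, 3 = expert-confirmed species)."""
    correct, total = defaultdict(int), defaultdict(int)
    for rec in records:
        key = rec[key_index]
        total[key] += 1
        correct[key] += rec[2] == rec[3]
    return {k: correct[k] / total[k] for k in total}

print(grouped_accuracy(records, 0))  # per-deployment accuracy
print(grouped_accuracy(records, 1))  # per-volunteer accuracy
print(grouped_accuracy(records, 3))  # per-species accuracy (by confirmed species)
```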

We used the eMammal system to manage camera data, which allowed users to upload data from their personal computers to the cloud, where we reviewed species identifications before making the data publicly available through the eMammal website (). eMammal underwent several rounds of software development that introduced errors into the system, at times preventing volunteers from submitting data. Across the project’s lifetime, approximately 65 camera traps (1.5% of the total) were set but their data were never uploaded because volunteers deleted the data after experiencing problems with eMammal.

Because of the issues we faced, we view reliable, user-friendly software as one of the most important aspects of scaling up sensor-based citizen science research. Many large-scale citizen science projects have developed software packages for data management. For example, Ellul et al. () developed a mobile phone application for citizen science projects that can be used by administrators without programming skills (i.e., it provides a user interface), and SciStarter has a Participation API that can help project coordinators manage their volunteers (). Currently, Wildlife Insights is developing cloud-based software for camera trap users to make it easier to import images, to incorporate AI that accelerates species identification, and to provide a storage solution that promotes global camera trap data sharing (). For sensor-based projects, user-friendly and error-free data management software is of the utmost importance.

Engagement and evaluation

Volunteer engagement in citizen science projects offers a great opportunity for science communication (), and creating a dialogue between project managers and volunteers can be useful for discussing the project and its scientific goals (); maintaining a consistent social media presence can therefore support a more successful outreach program. To facilitate regular engagement between the project and volunteers, we had developed a weekly calendar of social media posts by the last year of the project. Posts included quizzes on animal identification, recent wildlife news articles, project findings and updates, invitations to caption photos, the best camera trap photos of the week, and fun animal facts. We acquired the most followers on Facebook (955), followed by Twitter (551) and Instagram (335). Though live webinars received about 5–20 live views and 5–10 Q&A comments, they accrued approximately 100–200 views over time (viewership decreased over the project’s lifetime). YouTube videos received 163,902 views in total (23,415 per video on average); one video received 146,133 views in two years, likely because of its “click bait” title (“You Won’t Believe what this Chupacabra really is”). Quarterly newsletters (15% viewership among all volunteers) and semiannual webinars (10–20 views) were less effective.

Pre- and post-project survey data helped our team understand the participant experience and the broader impacts of NCCC on volunteers, but obtaining these data proved more challenging than expected. Overall, slightly less than 40% of adult volunteers who completed NCCC training elected to complete the voluntary pre-project survey, and of those, 38% completed the post-project survey. These relatively low response rates underscore the need for project managers to incentivize engagement and embed evaluation strategies directly into project protocols (without compromising human subjects research requirements), making them part of the overall project experience. Still, 314 adult volunteers and 1,158 youth volunteers (youth responses mostly from school classes) completed at least one survey associated with the project. Survey data revealed that 93% of adult respondents, most of them first-time citizen science participants (74%), were likely or very likely to continue participating. When asked what they liked most about the project, common responses from adults and youth focused on contributing to scientific research, experiencing a sense of adventure and suspense, seeing and learning new things about wildlife, and sharing the experience with others. All of these benefits and motivations have been documented in other citizen science–based wildlife monitoring projects (). A majority of adult (60%) and youth (59%) volunteers indicated that participation positively affected their views of wildlife. Analyses of survey responses for both adults and youth indicated that NCCC participation enhanced wildlife knowledge, bolstered sense of place and connection to nature, and increased science efficacy (). Similar wildlife camera trapping projects have been shown to increase environmental literacy, science efficacy, and conservation advocacy (; ). Collectively, these results provide evidence that our engagement efforts in NCCC led to significant broader impacts, and they accentuate the idea that citizen science can benefit both science and society ().

Financial Considerations

Citizen science has been touted as a cost-effective method for collecting large-scale wildlife records (). To help shed light on the value of citizen science, we compared the cost of running a statewide camera survey with citizen scientist volunteers (the citizen science portion of our project) against the projected cost of using paid field technicians (Supplemental File 2: Appendix B). To calculate costs, we used a formula developed by Welbourne et al. () for estimating the costs of camera trap surveys, and we presume that both approaches would require the same upfront data management costs. Equipment costs were $107,500 USD for the citizen science approach and $64,500 USD for the technician approach. The citizen science approach had total expenses of $217,087, which included the equipment, a three-year salary for a volunteer coordinator ($100,227, also covering initial project planning and preparation), mailing cameras to libraries across the state ($6,600), Facebook advertisements to help recruit volunteers ($300), and equipment replacement (described below). To conduct the same survey (i.e., the same number of camera trap days) using field assistants instead of citizen scientists, we estimated the approach would require 110 weeks of field work by a pair of technicians (paid at $12/hour), including $21,805 in mileage expenses and $38,259 in out-of-town lodging. With citizen scientist usage, we had a 0.11% loss of equipment (either stolen or destroyed in the field) per 64,953 camera trap nights, resulting in $2,460 in replacement costs. In our own recent field work with technicians placing cameras on public land, we had a 0.02% equipment loss, which would result in $1,687 in replacement costs for a technician-based statewide survey. The greater equipment loss under the citizen science approach was due to lending cameras through libraries, which had some difficulty keeping track of inventory. Nonetheless, the citizen science approach had total costs of $217,087.00, whereas the technician approach had total costs of $284,069.66, making the citizen science approach more economical.

It is important to note that, to meet the objectives of FFF, we recruited NCWRC staff members to help set cameras during the fall season, which required significant time and travel by paid NCWRC staff across the state. Neither the camera trap effort nor the costs of supplementation by the NCWRC were included in our cost considerations because we specifically analyzed the efficacy of citizen science for surveying wildlife. For this cost analysis, we included only the effort of the 580 citizen science volunteers who set 3,093 camera sites, whereas the total sample size including both citizen scientists and NCWRC staff was 4,295. Details on NCWRC sampling efforts are given in Table 1.

Other external costs associated with the citizen science approach should also be considered. By using libraries to distribute cameras rather than mailing them to individual volunteers, we estimate we avoided $20,940 in shipping costs (based on a $20 certified-mail cost per camera movement and 1,047 lendings). Even when these costs are included, the citizen science approach would still be cheaper than the technician approach ($238,027 versus $284,069.66, respectively). The other externality worth noting is the development of the eMammal system, from which both the citizen science and technician approaches could benefit in terms of data management. While not crucial to the technician approach, eMammal allowed us to collect data from volunteers remotely, which was vital for the citizen science approach. We therefore note that the eMammal system decreased the costs of the citizen science approach drastically, though to an unknown extent.
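The arithmetic behind this comparison can be laid out explicitly. The sketch below simply tallies the cost components reported above (it is not the Welbourne et al. formula itself, and the technician labor figure is backed out from the reported total rather than itemized).

```python
# Reported cost components for the citizen science approach (USD)
citizen_science = {
    "equipment": 107_500,
    "volunteer coordinator (3-year salary, incl. planning)": 100_227,
    "mailing cameras to libraries": 6_600,
    "Facebook advertising": 300,
    "equipment replacement (loss/theft)": 2_460,
}

# Reported components for the hypothetical technician approach (USD); labor is
# taken here as the remainder of the reported $284,069.66 total, not itemized.
TECHNICIAN_TOTAL = 284_069.66
technician_known = {
    "equipment": 64_500,
    "mileage": 21_805,
    "lodging": 38_259,
    "equipment replacement (loss/theft)": 1_687,
}
technician_labor = TECHNICIAN_TOTAL - sum(technician_known.values())

cs_total = sum(citizen_science.values())
print(f"citizen science total:        ${cs_total:,.2f}")        # $217,087.00
print(f"technician labor (remainder): ${technician_labor:,.2f}")
print(f"technician total:             ${TECHNICIAN_TOTAL:,.2f}")

# External cost avoided by library distribution: certified mail at $20 per
# camera movement across 1,047 lendings
mail_savings = 20 * 1_047
print(f"shipping avoided via libraries: ${mail_savings:,}")                 # $20,940
print(f"citizen science incl. shipping: ${cs_total + mail_savings:,.2f}")   # $238,027.00
```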

These cost evaluations should be treated as rough estimates, and we can imagine a variety of strategies to reduce costs under both approaches. It is also important to consider the non-economic benefits of the citizen science approach, including access to private lands (which would be impossible or much more difficult with technicians), engagement of the public with science and nature, and the fact that the volunteer coordinator can also help with other project activities such as data analysis and writing.

Conclusions

North Carolina’s Candid Critters was an ambitious citizen science project that collected 2.2 million wildlife photos from 4,295 camera traps deployed in all 100 counties in NC, including 120,671 detections of 30 mammal species and three terrestrial bird species (Figure 5). Most published camera trap papers use data from far fewer (e.g., 30–100) camera trap deployments (). In comparison to existing biodiversity records, the world’s museums contain about 15,000 records of North Carolina mammals and iNaturalist contains about 6,500 mammal records (iNaturalist.org; DOI: 10.15468/ab3s5x). Thus, in three years, we collected roughly five times the number of verified mammal records previously available for North Carolina, though the total number of records is not the only meaningful metric. Furthermore, unlike many other datasets, ours includes a standardized measure of effort (the number of days a camera was at a given location), which makes the data more valuable for studying wildlife populations. These data have already been used to analyze species activity patterns, distributions, and relative abundance in relation to multiple environmental variables (; ).

Figure 5 

Top 10 species detections across the state. We photographed elk (a reintroduced species), feral hog (an invasive species), and spotted skunk (a species of concern according to the 2015 North Carolina Wildlife Resources Commission Wildlife Action Plan).

To maximize the success of large-scale, sensor-based citizen science projects, we agree with other researchers that the overall participant experience must be emphasized to improve volunteer recruitment and retention (; ; ). Modern data management cyberinfrastructure and improvements in sensor technology have opened the possibility of a new type of large-scale environmental monitoring. However, as our experience with NCCC suggests, scaling up this more complicated style of citizen science while still adhering to best practices for volunteer management also introduces new challenges. We hope the lessons described in this paper (Table 2) are useful for project managers wishing to develop and expand similar efforts in the future.

Table 2

Challenges and potential solutions for citizen science projects using camera traps to gather large-scale datasets (based on insights from North Carolina Candid Critters [NCCC]).


Category: Study design.
Challenge: Persistent coverage gaps; many volunteers wanted to monitor sites based on personal preferences, not sampling protocols.
Mitigation: 1) Use an a priori study design to monitor sampling goals and adjust volunteer recruitment to meet those sampling goals. 2) Providing flexibility in study design will attract more volunteers to citizen science projects; however, this may also require supplementing data collection with professional staff to meet sampling goals.

Category: Recruitment.
Challenge: Difficult to reach a wide range of people, particularly those in remote areas.
Mitigation: 3) The outreach initiatives that reached the most volunteers were press releases and promotions by partner organizations. Future projects should work to identify outreach initiatives that have the greatest recruitment potential and align with volunteer motivations.

Category: Training.
Challenge: Online training, though efficient, may not be as effective as face-to-face training and may contribute to dropouts.
Mitigation: 4) Although online training had a higher drop-out rate than in-person training, it was less cumbersome and more efficient when training large numbers of volunteers. Future projects should weigh the pros and cons of using online training, though it may be necessary for larger citizen science endeavors.

Category: Implementation.
Challenge: Collaborating with public libraries to distribute equipment led to loss of inventory and managerial challenges.
Mitigation: 5) Develop a single equipment management system, either in partnership with a distribution center (e.g., a regional library system) or as part of an existing volunteer management database.

Category: Logistics.
Challenge: Difficult to facilitate timely and error-free photo uploads.
Mitigation: 6) Data management remains a tremendous challenge for large-scale studies using sensors and might require a combination of custom and off-the-shelf solutions.

Category: Outreach and education.
Challenge: Involvement in science does not always increase knowledge or science efficacy.
Mitigation: 7) NCCC participation enhanced wildlife knowledge, bolstered sense of place and connection to nature, and increased science efficacy. Targeted engagement and outreach efforts (rather than relying on project involvement alone) are key to enhancing participant outcomes within citizen science endeavors.

Category: Financial considerations.
Challenge: Relatively high cost of managing volunteers as well as collecting large datasets.
Mitigation: 8) Our fiscal expenditure for the citizen science approach was lower than what would have been spent on a professionally employed research endeavor, especially because of savings in travel costs associated with sampling large areas.

Data Accessibility Statements

The script for the volunteer database and data for the cost analysis are provided in Appendices A and B, respectively. Furthermore, the wildlife records obtained by the North Carolina’s Candid Critters Project are available for public download from https://emammal.si.edu/analysis/data-download.

Supplementary Files

The Supplementary Files for this article can be found as follows:

Supplemental File 1

Appendix A. Volunteer Management Web Application. DOI: https://doi.org/10.5334/cstp.343.s1

Supplemental File 2

Appendix B. Project Cost Calculation. DOI: https://doi.org/10.5334/cstp.343.s2