Introduction

Environmental monitoring is essential for examining changes in ecosystems over time. As both climate change and human development intensify, ecosystems worldwide are changing rapidly. One method for monitoring the physical environment over time is photography (), which can be used to assess on-the-ground environmental changes such as variation in vegetation (; ; ), recovery from natural disasters (), and shifts in phenology (). Such practices can provide insight into visual patterns of ecological change, which can then be used for research purposes ().

Photo-point monitoring, also known as fixed-point photography, can be used to assess changes in local environments (). This approach often includes “repeat photography,” which involves taking multiple similar photos at specific sites over time (; ; ; ). This strategy dates back to the 19th century () and can be both reliable and cost-effective (). Repeat photography projects can allow for examination of short- and long-term ecological changes ().

Traditionally, researchers take the photographs in repeat photography projects (). However, an increasing number of repeat photography projects now involve citizen science. The international CoastSnap program (www.coastsnap.com) has 200 monitoring locations in 21 countries () and allows citizen scientists to contribute photos of coastlines that can be used to detect and map changes in shorelines (). The Changing Coasts Program in the United Kingdom asks citizen scientists to submit photos of the coastline along the Pembrokeshire Coast (https://www.pembrokeshirecoast.wales/get-involved/changing-coasts/). Other projects seek to replicate historical photos in present day to assess long-term landscape changes (). Program objectives are both scientific and data-based (, ), as well as educational, including encouraging members of the public to think about environmental issues (; ).

These programs can generate large photo archives, but are subject to the same challenges with data quality and participation rates as other citizen science projects (; ). As citizen science projects involving repeat photography by participants proliferate, formal study and assessment of these programs have begun to emerge, although most work has focused on assessing programs in marine coastal areas (; ; ; , but see ; ).

In this case study, we discuss a citizen science repeat photography program that collects photos of freshwater and terrestrial systems (the “PhotoMon Project”) in Pinery Provincial Park, Canada. Pinery Park is a mixed-use park that contains rare habitat and high species diversity and also receives high visitation. To preserve the integrity of its natural environment, the park is heavily managed, including regular controlled burns, a deer herd management strategy, and active control of some invasive plant species. The objective of the Pinery PhotoMon Project is to act as a low-budget mechanism for collecting and archiving standardized photos of ecologically significant areas within the park that future park managers may reference to understand the short- and long-term effects of previous management decisions and to guide future ones. Metrics that may be collected from photos include plant density, height, diversity/richness, and abundance of foliage. The PhotoMon Project consists of a series of sites located in significant habitats throughout the park. Park visitors are asked to take photos of standardized fields of view at each site and submit them to the Project. These photos create a valuable record of changes in the park ecosystem over time, and are primary data that may be used in the future to address research questions and to guide park management practices. To be effective, the Project must receive consistent submissions of photos that align with project criteria. The minimum goal of the PhotoMon Project is to archive photos that provide coarse (one photo per season) documentation of ecosystem reference states at various locations in the park. However, some elements of the landscape are most effectively observed using more frequent photos (e.g., presence and abundance of ephemeral wildflowers [; ]; plant phenology []), and the more ambitious goals of the project are to provide higher-resolution archives of five or ten photos per site per season. Additionally, Pinery Park has an active interpretive program, and a further Project goal is to provide visitors with opportunities for park stewardship.

We investigated the effectiveness of the Pinery Park PhotoMon Project at meeting its research/monitoring objective of developing a longitudinal archive of standardized photos that may be used for data collection, as well as its reach as an interpretive (educational) tool. Because the success of the program requires both consistent submission of photos over time, as well as submission of photos that meet the project criteria, we investigated how the number (quantity) of submissions varied over time, as well as how closely those photos met a series of idealized criteria (quality). We investigated variation in photo quantity and quality among seasons and years for the first seven years of the program, as well as during the COVID-19 pandemic.

Methods

Case study site

Pinery Provincial Park (Figure 1) is a 21 km2 park in Ontario on the southeastern shore of Lake Huron. It is a mixed-use park that receives approximately 750,000 visitors annually. Most visitation occurs during the summer, when many visitors camp in the park. Recreational opportunities include swimming in Lake Huron, paddling in the Old Ausable River Channel, hiking and biking on the park's 11 trails, and attending interpretive programs. Pinery Park contains significant habitat, including rare Oak Savanna and Carolinian Forest ecosystems and freshwater coastal dune systems. It is home to numerous species designated as Species at Risk, with legal protection under provincial and national statutes. The park is actively managed, including large-scale interventions such as annual prescribed burns, invasive species mitigation, and white-tailed deer herd management. The deer management initiatives necessitate short (week-long) annual park closures in autumn. The park was also closed from mid-March to May 2020 at the onset of the COVID-19 pandemic.

Figure 1 

Locations of ten PhotoMon Project sites within Pinery Provincial Park, Ontario. Sites 1 and 2 capture images of Oak Savanna woodland; sites 3, 4, 5, and 6 have views at various points along the Old Ausable River Channel; sites 7 and 9 are located in the freshwater dune system and look out over the beaches and water of Lake Huron; site 10 looks out over a small pond within Carolinian forest; and site 8 looks at a wet meadow where a road was removed and restored to natural vegetation. Sites 2, 3, 4, 7, and 9 have year-round vehicle access (red), while the remaining sites are gated for parts of the year (yellow). Total numbers of photos submitted per site from 2014 through 2021 are provided (n = 2,567).

The PhotoMon Project

The PhotoMon Project started in May 2014 with ten sites located in significant habitats (Figure 1). Sites are located in Oak Savanna (1 and 2), Carolinian Forest (10), and wet meadow (8) habitats; looking out over freshwater coastal dune systems and the Lake Huron beach (7 and 9); overlooking the Old Ausable River Channel (3, 4, 5, and 6); and overlooking a pond (also 10). Seasonal vehicle access varies among sites: road access to site 1 is gated when there is snow (December–April), and the road to sites 5, 6, 8, and 10 is gated from November 1 until Easter (March/April). When roads are gated, the associated PhotoMon sites can still be accessed by hikers, skiers, and cyclists. All PhotoMon sites were active throughout the study period except two sites along the Lake Huron shoreline, which washed out during storms (site 7 in November 2017 and site 9 in April 2020).

Each site consists of an explanatory sign and a post with a grooved top (Figure 2). All photo submissions are made by email and reviewed by one designated volunteer with the Friends of Pinery Park (FoPP) organization (RF, 2015 to present), who manages the submissions approximately weekly. Photos are immediately posted on the FoPP website (https://pinerypark.on.ca/photomon/) unless they diverge substantially from Project criteria (e.g., facing the wrong direction, filtered) or they feature people. To protect visitor privacy, no photos of people are uploaded; these are either withheld or (site 5 only) people on the featured dock are digitally removed from the photos. All submitted photos, including those not uploaded to the website, are stored online using an FTP client. The total cost of the PhotoMon Project comprises site materials and installation (approximately $300 CAD per site); website photo storage ($25 CAD per year); and the in-kind contribution of the designated volunteer’s time.

Figure 2 

Example PhotoMon Project site. Each of the ten PhotoMon sites in Pinery Provincial Park was marked with (a) a sign affixed to a grooved post for participants to use as a camera rest (illustrated here by site 9, overlooking the Lake Huron coastline), including (b) instructions for participation.

Park visitors discover PhotoMon sites during their regular movements. Further, the Project is publicized through several channels, including the FoPP website and via the Pinery Park social media accounts. Park staff have learned about the project through an internal newsletter, and park interpreters encourage park visitors to participate during interpretive programs. The Project was previously highlighted in the park tabloid (discontinued in 2018), a paper booklet for all park visitors.

Data collection from photos

We counted and analyzed all photos submitted between Project inception (May 2014) and February 2021. We reported both the raw number of photo submissions and the number relative to park visitation. We quantified park visitation rates based on data collected by an automatic speed sign that was installed near the park’s front gate in spring 2017 (TrafficLogix SafePace digital radar speed sign), hereafter referred to as the traffic counter. The traffic counter is triggered each time a vehicle approaches it while entering the park. While not a perfect measure of visitation (there may be multiple people in a car; the same car may enter and exit the park multiple times during a camping visit), this metric provides an index of visitor activity. We calculated the rate of submissions relative to visitor activity by dividing the number of photo submissions by the number of vehicle triggers per unit time. The traffic counter failed for multiple days in December, January, and February in 2018, 2019, and 2020, necessitating the removal of the winter season from analyses that used traffic data.
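The relative submission rate is a simple ratio of seasonal photo counts to seasonal vehicle triggers. The following R sketch illustrates this calculation using base R only; the data frames (photos, traffic) and their column names are hypothetical placeholders rather than the project's actual data structures.

```r
# A minimal sketch, assuming data frames `photos` and `traffic`, each with one
# row per event (photo submission or vehicle trigger) and columns `year` and `season`.

# Count events per year-season combination
photo_counts   <- aggregate(cbind(n_photos = rep(1, nrow(photos))),
                            by = photos[, c("year", "season")], FUN = sum)
vehicle_counts <- aggregate(cbind(n_vehicles = rep(1, nrow(traffic))),
                            by = traffic[, c("year", "season")], FUN = sum)

# Relative submission rate: photos submitted per vehicle trigger in each season
rates <- merge(photo_counts, vehicle_counts, by = c("year", "season"))
rates$photos_per_vehicle <- rates$n_photos / rates$n_vehicles
```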

To assess photo submission quality, we developed a scoring method that assigned photos quality scores between zero and two in each of several categories (Table 1), where zero corresponded to a photo exactly meeting the project’s ideal criteria, one indicated minor divergence from criteria, and two indicated major divergence. To provide an overall measure of individual photo quality, we generated a total error score for each photo, calculated as the sum of scores from all error categories. Photos could receive a score between 0 and 16, where a photo that met all project criteria scored zero; the higher the score, the more the photo diverged from the project criteria. We (ASM) blurred the faces of any people included in photo submissions using the Smudge Tool in Adobe Photoshop (v. 23.4.1) before the photos were scored. We conducted an interobserver reliability test to measure the replicability of the scoring method. Two observers (CF and VF) independently scored 30 photos from the photo-monitoring project; we then compared results by summing the absolute differences between observers for each photo-quality category and converting this number to a percentage of the total potential difference between observers. We also recognized that some unexpected elements of photo quality might not be captured by our rating system, and so noted additional issues as we observed them.
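The scoring arithmetic can be summarized in a few lines of R. The sketch below is illustrative only: the data frames (scores for all photos; obs1 and obs2 for the two observers' scores of the same 30 photos) and column names are hypothetical, and the interpretation of the "total potential difference" as two points per category per photo is our assumption.

```r
# A minimal sketch, assuming a data frame `scores` with one row per photo and
# one column (scored 0, 1, or 2) per error category.
error_cols <- c("filter", "effect", "lighting", "orientation",
                "pitch", "direction", "people", "animals")

# Total error score per photo: sum of the eight category scores (range 0-16)
scores$total_error <- rowSums(scores[, error_cols])

# Interobserver reliability, assuming data frames `obs1` and `obs2` holding the two
# observers' category scores for the same 30 photos, in the same row order
total_diff   <- sum(abs(as.matrix(obs1[, error_cols]) - as.matrix(obs2[, error_cols])))
max_possible <- 2 * length(error_cols) * nrow(obs1)  # assumes a maximum difference of 2 per category per photo
pct_diff     <- 100 * total_diff / max_possible
```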

Table 1

Photo-quality scoring categories and associated descriptions.


CATEGORY | SCORE | DESCRIPTION
Filter | 0 | No filter; natural colors
Filter | 1 | Slight filter; some color alteration
Filter | 2 | Heavy filter; obvious color alteration
Effect | 0 | No effect; photo not blurry or magnified
Effect | 1 | Slight effect; photo slightly blurry or magnified
Effect | 2 | Heavy effect; photo highly blurry or magnified
Lighting | 0 | Suitable lighting; all features clearly visible
Lighting | 1 | Slightly bright or dark; some features not visible (i.e., slightly dark trees, slight reflection of sun on water)
Lighting | 2 | Highly bright or dark; most features not visible (i.e., photo captures direct sun, dark landscape at sunset)
Orientation | 0 | Landscape style photo
Orientation | 1 | Diagonal or square style photo
Orientation | 2 | Portrait style photo
Vertical angle (pitch) | 0 | Photo upright; no landscape features cut out (horizon of land is centered in photo)
Vertical angle (pitch) | 1 | Photo not perfectly upright but still captures most of the landscape; some features cut out (horizon of land is slightly tilted up or down)
Vertical angle (pitch) | 2 | Photo not upright and captures mostly ground or sky; most features cut out (horizon of land is not at all centered in photo)
Direction | 0 | Photo captures correct landscape
Direction | 1 | Photo captures parts of the correct landscape
Direction | 2 | Photo captures a different landscape than what is asked for
People | 0 | No people in photo
People | 1 | People present but cause little to no disruption of photo (people at a distance, non-recognizable)
People | 2 | People present; disrupt photo (people close to the camera, recognizable)
Animals | 0 | No animals in photo
Animals | 1 | Animals present but cause little to no disruption of photo (animals at a distance)
Animals | 2 | Animals present; disrupt photo (animals close to camera)

Data analysis

For all analyses, we used the software packages R (v.4.0.0) and RStudio (v.1.2.5033) and the “stats” and “rstatix” packages. To determine whether we could include data from the pandemic in our broader seasonal analysis, we first investigated the impact of the COVID-19 pandemic on PhotoMon submissions using two sets of analyses. First, we compared the raw and relative numbers of photos submitted at the eight active PhotoMon sites during each of the nine months following the re-opening of the park after the onset of the pandemic (June through February) with the corresponding months preceding the onset of the pandemic. We used paired t-tests, conducted using the “t.test” function, to make these comparisons, and Wilcoxon signed rank tests, conducted using the “wilcox_test” function, when the assumptions for parametric tests were not met. Second, we conducted Wilcoxon signed rank tests to compare the same response variables at each of the eight sites, before and during the pandemic, during the period of greatest submissions (June through August).
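As an illustration of these paired comparisons, the R sketch below matches each pandemic month with its pre-pandemic counterpart; the data frame `monthly` and its columns are hypothetical placeholders, not the project's actual analysis script.

```r
# A minimal sketch, assuming a data frame `monthly` with one row per month and
# paired columns for the pre-pandemic and pandemic periods.
library(rstatix)

# Paired t-test on raw monthly photo counts (base R "stats" package)
t.test(monthly$photos_pre, monthly$photos_pandemic, paired = TRUE)

# Wilcoxon signed rank alternative (rstatix::wilcox_test) when parametric
# assumptions are not met, illustrated on relative submission rates in long format
long <- data.frame(
  rate   = c(monthly$rate_pre, monthly$rate_pandemic),
  period = rep(c("pre", "pandemic"), each = nrow(monthly))
)
wilcox_test(long, rate ~ period, paired = TRUE)
```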

To investigate variation in PhotoMon activity among seasons, we defined the seasons as summer (June through August); fall (September through November); winter (December through February); and spring (March through May). We assigned winter seasons to the year that included December. We then compared the total number of photos submitted, the number from the five PhotoMon sites with year-round vehicle access, and the quality of photos from all sites among seasons from 2014 through 2021 using one-way ANOVA tests with the “aov” and “TukeyHSD” functions. We excluded quantity data from spring 2020 because of pandemic-related closures. We further compared the number of traffic counter triggers and the relative number of photo submissions between summer and fall from 2017 through 2021, including all sites and the subset with year-round vehicle access, using Kruskal-Wallis tests (“kruskal.test” function). We excluded winter and spring from these latter analyses because of malfunctions in the traffic counter (winter) and extremely small sample size (spring, n = 2).
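The seasonal comparisons follow a standard pattern in base R, shown in the sketch below; the data frame `seasonal` and its column names are hypothetical placeholders.

```r
# A minimal sketch, assuming a data frame `seasonal` with one row per
# year-season combination and columns `n_photos`, `photos_per_vehicle`,
# and `season` (factor with levels spring, summer, fall, winter).

# One-way ANOVA comparing photo counts among seasons, with Tukey HSD post-hoc tests
fit <- aov(n_photos ~ season, data = seasonal)
summary(fit)
TukeyHSD(fit)

# Kruskal-Wallis test comparing relative submission rates between summer and fall only
summer_fall <- droplevels(subset(seasonal, season %in% c("summer", "fall")))
kruskal.test(photos_per_vehicle ~ season, data = summer_fall)
```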

Results

From inception until February 19, 2021, 2,567 photos were submitted to the PhotoMon Program (Figure 3a; Supplemental File 1: Appendix A). Excluding the first partial year and the pandemic year, a mean +/– standard deviation of 405 +/– 100.9 photos were submitted per year, ranging from 336 (2015) to 574 (2016). The total number of photos submitted per site was highly variable: site 3 (Old Ausable River Channel bridge) received 471 submissions and site 10 (Carolinian Trail pond) received 479, while site 8 (Burley Campground Meadow) received only 125.

Figure 3 

Time series from summer 2014 through winter 2021 showing (a) the total number of photos submitted per season to the PhotoMon Project; (b) the total number of vehicles counted per season by the traffic counter, which was installed in summer 2017 (winter data omitted due to multiple winter season malfunctions); and (c) the number of photos submitted per vehicle per season. The onset of the COVID-19 pandemic (spring 2020) is indicated with dotted vertical lines; the park was closed to visitors from mid-March through mid-May, 2020.

Ninety-eight percent (249 of 254) of site-season combinations met the minimum project objective of receiving at least one submission per season. Only six sites had zero photo submissions in a season, and three of these instances occurred while the site was gated (Figure 4a). Seventy-one percent (180) of site-season combinations met the more ambitious project goal of receiving 5 or more photo submissions, and 43% received 10 or more. The mean +/– standard deviation number of photos submitted per site per season (starting summer 2014) was 10.09 +/– 8.23, with a maximum of 43 (site 10). Notably, photo submissions regularly came from gated sites (n = 32). The traffic counter was triggered 573,762 times (Figure 3b). Seasonal (summer and fall) submission rates relative to park traffic suggest very low participation (0.0009 to 0.005 photos/vehicle; Figure 3c).

Figure 4 

Frequency histograms of photo submission quantity and quality. (a) Most sites were the subject of five or more photos per season, with only six sites having 0 photo submissions in a single season. Sample size n = 254 site-season combinations. (b) Most photos had low quality scores, indicating that they closely matched the PhotoMon Project specifications. Sample size n = 2,567.

There was a 1.4% difference between raters’ total quality scores in the inter-rater reliability comparison. Slight differences occurred in the filter (1.6%), effect (3.3%), and lighting (6.6%) categories. We accepted these results and proceeded. Sixty-five percent of photos had error scores of 0 or 1 (Figure 4b), indicating that they met project criteria completely or very closely, and 83% had scores of 2 or less. Characteristics that commonly increased photo scores were poor lighting (7% major; 30% minor) and incorrect orientation (20% portrait; 2% square/diagonal). A few photos captured the wrong field of view entirely (2%) or partially (2%) (Table 2). During the study period, the small pond associated with site 10 was encroached upon by invasive Phragmites australis. Consequently, tall grasses now dominate the foreground of submitted photos in summer, obscuring the original landscape, which was previously visible year-round.

Table 2

Percentage of photos that diverged from PhotoMon Project criteria in major (score = 2) and minor (score = 1) ways. Individual photos may have received scores of 2 or 1 in multiple categories (n = 2,567).


SCORE TYPE | FILTER | EFFECT | LIGHTING | ORIENTATION | VERTICAL ANGLE (PITCH) | DIRECTION | PEOPLE | ANIMALS
Major (score = 2) | 0.58 | 4.32 | 6.86 | 19.56 | 1.56 | 1.87 | 0.51 | 0.04
Minor (score = 1) | 0.97 | 8.41 | 29.57 | 1.79 | 8.22 | 1.87 | 4.52 | 0.23
Sum | 1.56 | 12.74 | 36.42 | 21.35 | 9.78 | 3.74 | 5.03 | 0.27

Impacts of the COVID-19 pandemic on participation in the PhotoMon Program

We detected no effect of the COVID-19 pandemic on participation in the PhotoMon Program (Supplemental File 2: Appendix B). There was no difference in the raw number of photos submitted (paired t-test: df = 8; t = 0.176; p = 0.864; mean +/– standard deviation photos submitted per month pre-pandemic = 26.7 +/– 20.7 and during pandemic = 27.9 +/– 22.9) or the number of photos relative to traffic (Wilcoxon signed rank test: df = 4; W = 9.0; p = 0.812; mean +/– standard deviation photos relative to traffic per month pre-pandemic = 0.002 +/– 0.002 and during pandemic = 0.002 +/– 0.001) (Supplemental File 2: Appendix B). Further, there was no difference in the mean number of photos submitted per site during the summer (paired t-test: df = 7; t = 1.8426; p = 0.1079; mean +/– standard deviation photos submitted per site during summer pre-pandemic = 16.38 +/– 11.34 and during pandemic = 10.75 +/– 9.57) and no difference in the number of photos submitted per site relative to traffic in summer (paired t-test: df = 7; t = 1.3573; p = 0.2168; mean +/– standard deviation photos submitted per site relative to traffic during summer pre-pandemic = 1.55 × 10–4 +/– 1.07 × 10–4 and during pandemic = 1.14 × 10–4 +/– 1.02 × 10–4).

Seasonal variation in participation in the PhotoMon Program

The raw number of photos submitted to the PhotoMon Program was lower in winter than in any other season; marginally fewer photos were also submitted in spring than in summer (Table 3; one-way ANOVA: df = 3, 22; F = 14.13; p < 0.01; Tukey HSD: winter-fall p < 0.01; winter-spring p = 0.01; winter-summer p < 0.01; spring-summer p = 0.087; mean +/– standard deviation: spring = 100.0 +/– 43.3; summer = 146.0 +/– 34.6; fall = 106.0 +/– 25.4; winter = 39.3 +/– 20.4). When only sites with year-round road access were considered, fewer photos were submitted in winter than in summer (Figure 5a) (one-way ANOVA: df = 3, 22; F = 6.054; p < 0.01; Tukey HSD: winter-summer p < 0.01; mean +/– standard deviation: spring = 47 +/– 22.3; summer = 63.6 +/– 16.1; fall = 47.6 +/– 13.8; winter = 26.4 +/– 14.1).

Table 3

Summary of photo quantity and quality comparisons among seasons. Comparisons of photo quantity and quality were made across seasons, including photo quantity and quality data from the complete set of ten sites and photo quantity data from only the five sites with year-round vehicle access (data from spring 2020 not included).


SITES INCLUDED | FACTORS COMPARED | SIGNIFICANCE OVERALL | PAIRWISE COMPARISONS (only significant [p < 0.05] and marginally significant [0.05 < p < 0.10] differences shown)
All (n = 10) | Total (raw) number of photos/site/season (2014–2021) | p < 0.01 | Winter < fall (p < 0.01); winter < summer (p < 0.01); winter < spring (p = 0.01); spring < summer (p = 0.087)
All (n = 10) | Number of photos/site/season relative to vehicles triggering traffic sign (winter and spring data not included; 2017–2021) | p = 0.02 | Summer < fall (p = 0.02)
All (n = 10) | Photo quality scores (2014–2021) | p = 0.577 | None
Sites with year-round vehicular access (n = 5) | Total (raw) number of photos/site/season (2014–2021) | p < 0.01 | Winter < summer (p < 0.01)
Sites with year-round vehicular access (n = 5) | Number of photos/site/season relative to vehicles triggering traffic sign (winter data not included; 2017–2021) | p = 0.02 | Summer < fall (p = 0.02)
N/A | Number of vehicles triggering traffic sign/season (winter and spring data not included; 2017–2021) | p = 0.02 | Fall < summer (p = 0.02)

Figure 5 

Total photo submissions and photo submissions relative to park traffic from the five PhotoMon sites with year-round vehicle access. (a) More photos were submitted overall in summer than in winter, whereas (b) fewer photo submissions relative to park traffic (number of vehicles entering park) occurred in summer than in fall. Sample sizes in parentheses. Because of the small sample size, the spring data in the lower panel were not included in the statistical analyses, but are shown here for illustration. Traffic data were missing from multiple winter seasons, and so winter is not included in the lower panel or associated analysis.

The traffic counter was triggered more in summer than in fall (Kruskal-Wallis test: df = 1; X2 = 5.33; p = 0.02; mean +/– standard deviation: spring = 15,625 +/– 6,677; summer = 96,213 +/– 13,507; fall = 30,727 +/– 5,930), and fewer photos were submitted relative to traffic in summer than in fall (Kruskal-Wallis test: df = 1; X2 = 5.33; p = 0.02; mean +/– standard deviation: spring = 0.004 +/– 0.001; summer = 0.001 +/– 0.00; fall = 0.004 +/– 0.001). Similarly, the sites with year-round vehicle access received fewer photo submissions relative to traffic in summer than in either spring or fall (Kruskal-Wallis test: df = 1; X2 = 5.33; p = 0.02; mean +/– standard deviation: spring = 0.002 +/– 0.001; summer = 0.001 +/– 0.00; fall = 0.002 +/– 0.001) (Figure 5b). Although spring data were not included in the tests that involved traffic data, the mean number of vehicles counted in spring was lower than in summer, and the mean number of photos/vehicle was higher in spring than in summer, both for all sites and for the subset of sites with year-round vehicle access. There was no variation in the mean quality scores of the submitted photos among seasons (one-way ANOVA: df = 3, 23; F = 0.354; p = 0.577; mean +/– standard deviation: spring = 1.07 +/– 0.118; summer = 1.36 +/– 0.225; fall = 1.28 +/– 0.256; winter = 1.11 +/– 0.264) (Figure 4b).

Discussion

Overview

Overall, the Pinery PhotoMon Project met its scientific objective of compiling regular, seasonal reference photos of key ecosystems in the park that met project quality criteria. The minimum scientific objective was achieved, with most sites receiving at least one photo submission per season. The more ambitious objectives of receiving 5 or 10 submissions per site-season combination were partially met, with the majority of sites receiving at least five submissions per season and fewer than half receiving ten or more.

Most photos closely met the PhotoMon Project ideal criteria, with low error scores and very few photos containing critical errors. The COVID-19 pandemic had no effect on the quantity of photo submissions, which otherwise varied with season, reaching its lowest level during the winter. Notably, although the raw number of photos submitted during the summer was higher than during winter, the proportion of park visitors who participated in the project was lowest in summer. The number of photo submissions relative to park traffic was extremely low (five or fewer photos per thousand vehicles), suggesting that there is potential for the project to more effectively meet its educational objective of providing learning and stewardship opportunities for park visitors. Given that the project operated on a low budget and required little ongoing maintenance, its success in achieving its minimum scientific objectives demonstrates the value of such citizen science-based photo-point monitoring programs.

Quantity of submissions to the PhotoMon Project

Participation varied substantially among sites, with the most photographed sites receiving severalfold more submissions than the least photographed ones. Inter-site variability has been documented in other citizen science photo-point monitoring programs, where monthly photo submission rates may vary from 0.44 to 52.75 (). The PhotoMon mean seasonal submission rate of approximately 10 photos per site converts to roughly 3.4 photos per site per month, which falls within this range.

Our findings support the idea that foot traffic is a primary driver of participation (). PhotoMon sites 3 and 4 were two of the most popular sites; they are located on either side of a busy bridge close to canoe rentals, several docks, the park store, and the visitor centre. Similarly, site 10 was the most photographed site and is located at a lookout platform along a popular hiking trail. Foot traffic appears to be a stronger driver of participation than vehicular traffic, as visitors must drive past the much less photographed sites 5 and 6 to access site 10. However, not all sites located in busy areas received high submission rates. Prior to being washed out, site 9 was located at the access point to a busy beach. Despite this location, the number of photo submissions from this site was relatively low. Harley and Kinsela () similarly found lower participation at a residential beach access point than at a site along a popular walking trail with a lookout. They suggested that people passing the access point were less likely to participate because doing so would require them to alter their natural behaviour by stopping. The same may be true at PhotoMon site 9, where visitors are primarily moving past the site en route to the beach.

A unique element of this study is our demonstration of clear seasonality in participation. Harley and Kinsela () reported relatively consistent photo submission rates across 44 Australian CoastSnap sites (all temperate locations), but suggested that more seasonality might be observed at sites with greater climatic extremes, as our results support. Park vehicular traffic was approximately three times greater in summer than in fall and six times greater than in spring, so it follows that the raw number of photo submissions was highest in summer. The seasonal variation in the proportion of visitors participating in the PhotoMon Project suggests concurrent shifts in park visitor demographics.

A limitation of the PhotoMon Project is that little information is available about the participants. However, we can look to other citizen science projects to propose some likely reasons for seasonal variation in the proportion of visitors who participate. Two important types () of participants in citizen science photo-point monitoring projects have previously been identified: transient participants, who may participate only once (often tourists), and local residents (called local champions), who may participate multiple times as part of a regular routine and can be critical to the long-term success of a program (). Within the CoastSnap Program, sites at high-profile tourist locations may have submissions dominated by single-submission participants, whereas more rural locations may rely on local champions for regular photo submissions (; ). Pinery Park has exceptional recreational opportunities and highly significant ecological features (), appealing to visitors with a broad range of interests. Different visitor activities peak at different times of the year (e.g., lake swimming in summer; bird watching during spring migration; cross-country skiing during winter), with overall visitation at its highest in summer. It is reasonable to expect that visitor demographics, and accompanying motivations to participate in citizen science, also vary among seasons. It is possible that a relatively small number of local champions are responsible for the majority of submissions, and that these people represent a much smaller proportion of overall park visitors in summer than in other seasons.

Research on the motivations of citizen scientists has built on an existing body of literature about the motivations of volunteers more generally (e.g., ; ), and has proliferated in recent years (). There is limited research specifically on the motivations of participants in citizen science photo-point monitoring programs. Common reasons participants gave for submitting photos included that they liked contributing to scientific knowledge or to their community, that the projects were easy or fun/interesting (Roger et al. 2019), and that they enjoyed doing activities in the local environment (a beach) (). Citizen scientist motivations likely vary among people and projects (), and there has been recent emphasis on understanding the demographics () and associated motivations of particular groups of participants in order to tailor citizen science projects and associated communications to them (e.g., ).

Communications for the Pinery PhotoMon Program emphasize the importance of participation in the program as a means to increase visitor stewardship in the park, with instructions stating “park ecologists will have a large library of photos to monitor changes from week to week and year to year.” The project website further encourages participation by asking park visitors to “contribute to monitoring and protection of some of your favourite trails and beaches.” This approach to messaging seeks to encourage participation by appealing to potential participants’ values-driven or altruistic () motivations to contribute to science and protect the park. Such an approach is reasonable given previous findings about participants in similar projects (, ), research suggesting that values-driven or altruistic motivations often are important drivers in citizen science participation (e.g., ; ), and the current success of this approach at soliciting the existing collection of photos.

However, project participation may also be improved by targeting other motivations. For example, some programs have successfully motivated volunteers through extrinsic () motivators, encouraging participants to compete (), to earn badges (), and to engage in other games (). The CoastSnap program publishes a leaderboard, and participants can like and comment on photos (). A similar approach could be implemented at Pinery Park, where many visiting groups are families whose children already complete tasks toward a junior naturalist certification. Diversification of messaging about the program may be particularly important given the significant variation in project participation rates among seasons.

It is also important to consider barriers to participation. The user experience (UX; ) is a critical component of citizen science participation, particularly in programs with mobile phone components (). PhotoMon submissions are accepted by email only, which may discourage users who prefer other modes of digital communication. Participants may prefer different submission modes, and offering a range of these can encourage a diversity of participants (). For example, email submission is the preferred mode for local champions of the CoastSnap project, and some CoastSnap sites have predominantly email submissions, whereas sites with more tourist participants often receive submissions predominantly through Instagram (, ).

Limited internet connectivity is an important barrier to participation for the PhotoMon Project. There is no wireless internet and limited cellular service throughout many areas of the park, limiting photo submission in the moment. It may be helpful to provide visitors with reminders at key locations to submit PhotoMon photos when they are able. Many citizen science projects have associated mobile applications, and a PhotoMon application could allow users to store their photos and receive later notifications to upload (), as well as other useful features (e.g., allow project organizers to broadcast messages, multilingual program offerings, etc.; ).

Quality of submissions to the PhotoMon Program

A main objective of the PhotoMon Project is to compile photos that managers may use to guide management decisions. Consequently, regular submissions of standardized photos that feature the requested fields of view and are clear and well lit are critical to the success of the project. The PhotoMon Project fared quite well at collecting photos that met or approached the ideal criteria, although there were some deviations. Most commonly, photos had unsuitable lighting, meaning that some details were not easily viewable as a result of shadows or overexposure. These lighting issues might preclude park managers from collecting some data (e.g., species identification, exact counts of individual plants). Also common were photos submitted in portrait orientation instead of the requested landscape orientation. A photo library with varying fields of view complicates future efforts to make standardized measurements of plant density or abundance.

Relatively few data are currently available on the characteristics of photo submissions to citizen science photo-point monitoring projects. Only 54 of the 396 photos submitted in 2018–19 to the Bournemouth site of the CoastSnap project were usable for the project objectives (), with the main issues including image dimensions, overall quality, timing of the tide, and the presence of people obscuring relevant features. Submissions to the CoastSnap program more broadly may include selfies, which also obscure important landscape features (). Several elements of the PhotoMon Project likely contribute to its relative success in avoiding these particular issues. The coastal sites looked out over Lake Huron, which does not have tides. Further, many of the PhotoMon sites look over relatively inaccessible fields of view (bodies of water and habitats with poison ivy), limiting the potential for people and pets to be included in photos. The low incidence of selfies may be related to the project’s use of email submissions only, as Harley and Kinsela () discuss selfies specifically with respect to submissions made using social media, where selfies are very common.

To improve the extent to which future submissions meet project criteria, we look to other citizen science projects with similar methods. To support participants in standardizing their images, we suggest including a sample photo of each landscape on its associated sign (e.g., ). We further suggest making training videos available to potential participants (). To minimize variation in the height of the horizon (i.e., pitch, or vertical angle), camera backrests or purpose-built phone cradles (e.g., ; www.coastsnap.com) could be installed. If a mobile application were developed, accelerometer measurements could help smartphone users position their devices (), an innovation that might also provide automated feedback on photo lighting.

Finally, given the frequency of lighting issues, participants may benefit from specific guidance on the timing of photos (e.g., avoiding times close to sunrise or sunset). Variation in lighting may also result from the settings and capabilities of different cameras and phones (). Collecting metadata about camera or phone type is beyond the scope of the project, but these factors likely play an important role in explaining variation in some elements of photo quality. Monitoring changes over time in the proportions of foreground and background is also critical, as evidenced by the encroachment of tall grasses at the pond-based site.

Limitations and challenges

A limitation of our study is that we do not have detailed information about project participants. Specifically, it is unknown whether participants are one-time or repeat visitors, whether they live locally, or how many photos each has submitted. We can only speculate on their motivations for participating. All of these are important areas for future research that will help to better connect patterns of participation in the PhotoMon Project with those of other, similar projects.

Several challenges arose during the study period. The loss of both beachfront sites resulted in the loss of potential data. Installation of sites in appropriate locations is also cited as a challenge for the CoastSnap program (), although their main challenge was finding safely accessible sites that provide unhindered views. There are many unhindered views of the Pinery Lake Huron shoreline, but the undeveloped dune system means that small-scale infrastructure is prone to being lost during large storms. This will likely be an enduring issue for the Pinery PhotoMon Project.

Further, the COVID-19 pandemic occurred during our study period, although we detected no effects of the pandemic on project participation. This is surprising, as the pandemic necessitated large changes in human behaviour, and other citizen science projects experienced shifts in participation rates and data outcomes (e.g., ; ; ). However, overall participation at 44 Australian CoastSnap sites also did not decline during the pandemic, although participation at one site located in a tourism hotspot did (). After re-opening in May 2020, Pinery Park had very high visitation rates during the first year of the pandemic. It is possible that the pandemic resulted in changes in participant behaviour (e.g., relative frequency of long-distance tourists versus local residents, and frequency of repeat photo submissions) that were not detected through the relatively coarse metrics of our study.

A final, emerging challenge relates to photo copyright. Based on the PhotoMon instructions, participants implicitly provided permission for their photos to be publicly posted on the project website and to be archived and viewed by park biologists, but they retained ownership of the images. Consequently, these images could not be reused (e.g., in figures in manuscripts or reports, or in interpretive talks). Given the intended scientific purpose of the Project, there is a real need to present photos publicly as evidence for decision-making. The PhotoMon Project has since changed the wording of the participant instructions to request that participants transfer ownership of the photo copyright to the Project.

Conclusions

We have provided an assessment of the first seven years of the Pinery PhotoMon Project, a citizen science photo-point monitoring initiative. We assessed the effectiveness of the project at meeting its scientific objective of establishing seasonal reference photos of important ecosystems in the park, as well as its educational objective of providing a stewardship opportunity for visitors. In the emerging field of citizen science photo-point monitoring, this case study describes a relatively early project, which allows examination over a multi-year timescale. It further complements the existing literature, which focuses mainly on coastal monitoring projects, by providing data on photo-point monitoring of terrestrial habitats.

There was large variation in the quantity of photo submissions, both among sites and among seasons. Patterns in site-to-site variation support previous assertions that areas with high foot traffic promote participation, with the possible exception of thoroughfares where potential participants are fully engaged in their movements (e.g., beach access points). Seasonal variation in raw photo submissions mirrored park visitation, but relative participation declined during the busy summer season, suggesting varied motivations among park visitors.

The Project met its minimum scientific goal of collecting one photo per site per season, and partially met the more ambitious goals of collecting 5 and 10 photos per site per season. Most photo submissions met Project criteria exactly or very closely. Proportionally, very few visitors appear to participate in the Project, and so there is room to improve at meeting the educational objective.

We have generated a series of specific suggestions to improve photo submission quantity and adherence to project criteria (quality), including suggestions for providing clearer instructions to participants, for better physical infrastructure (e.g., camera cradles), for diversification of messaging about the project, and for consideration of some barriers to participation. However, we further identify strengths of the project as its low maintenance and cost effectiveness, and we recognize that some of our suggestions may detract from these obvious strengths. Finally, we highlight gaps in knowledge about the motivations and demographics of the participants in the program, and recognize that future research in these areas will help us to better understand the drivers of variation in participation.

Data Availability Statement

Most PhotoMon Project photos are publicly accessible through the Friends of Pinery Park website: https://pinerypark.on.ca/photomon/. Project data are available through the Open Science Framework: https://osf.io/4kgvx/?view_only=a1c1088cde354ae99682ecf3e35654b1

Supplementary Files

The Supplementary Files for this article can be found as follows:

Supplemental File 1

Appendix A: Total number of photos submitted per site/season/year combination for the duration of the study period. DOI: https://doi.org/10.5334/cstp.558.s1

Supplemental File 2

Appendix B. Effects of the onset of the COVID-19 pandemic on park visitation rates and participation in the PhotoMon project. DOI: https://doi.org/10.5334/cstp.558.s2