
Research Papers

Quiahua, the First Citizen Science Rainfall Monitoring Network in Mexico: Filling Critical Gaps in Rainfall Data for Evaluating a Payment for Hydrologic Services Program

Authors:

Xoco A. Shinbrot, Cornell University, US
Lyssette Muñoz-Villers, Universidad Nacional Autónoma de México, MX
Alex Mayer, Michigan Technological University, US
Melissa López-Portillo, Universidad Nacional Autónoma de México, MX
Kelly Jones, Colorado State University, US
Sergio López-Ramírez, Michigan Technological University, US
Carlos Alcocer-Lezama, Universidad Nacional Autónoma de México, MX
Miriam Ramos-Escobedo, Global Water Watch, MX
Robert Manson, Instituto de Ecología, A.C., MX

Abstract

Citizen science data can fundamentally advance the natural sciences, but concerns remain about its accuracy, reliability, and overall value. While some studies have evaluated accuracy of citizen science data, few have also assessed its potential contribution to conservation policy. This study focuses on rainfall data collection, with four goals: (1) to examine motivations of, and barriers for, volunteer participation in citizen science; (2) to evaluate accuracy of citizen science rainfall data in comparison to automatic rain gauge data; (3) to incorporate citizen science rainfall datasets into hydrological models; and (4) to apply the hydrologic model to gauge the contribution of citizen science data to the efficient design of payment for hydrological services (PHS) programs. Twelve citizen science volunteers were trained and collected rainfall data between June 2017 and February 2019 across two watersheds in Veracruz, Mexico. We found that these volunteers were highly motivated by conservation values and learning, while only a few volunteers faced barriers related to time availability for making daily measurements. The mean error in daily rainfall, computed by comparing the manual and automated gauge measurements, was less than 1 mm, or 12% of the average daily rainfall. Approximately one-third (29%) and two-thirds (71%) of the errors were attributed to missing data and misread data, respectively. Spatial patterns of rainfall distribution across the watersheds were similar between citizen science and automatic gauge data, revealing a large fraction of rainfall in middle elevations. Furthermore, the results show that if PHS areas are determined using the existing national rainfall network alone, without citizen science data, critical areas that contribute to dry-season flows would be missed. To our knowledge, this is the first citizen science network for collecting rainfall data in Mexico that has produced results that are relevant to conservation policy design.

How to Cite: Shinbrot, X.A., Muñoz-Villers, L., Mayer, A., López-Portillo, M., Jones, K., López-Ramírez, S., Alcocer-Lezama, C., Ramos-Escobedo, M. and Manson, R., 2020. Quiahua, the First Citizen Science Rainfall Monitoring Network in Mexico: Filling Critical Gaps in Rainfall Data for Evaluating a Payment for Hydrologic Services Program. Citizen Science: Theory and Practice, 5(1), p.19. DOI: http://doi.org/10.5334/cstp.316
Submitted on 09 Mar 2020. Accepted on 24 Jul 2020. Published on 07 Sep 2020.

Introduction

In recent years, citizen science has emerged as a way to collect data for scientific efforts (Follett and Strezov 2015) across disciplines as diverse as evolution (e.g., Evolution MegaLab; Worthington et al. 2012), astronomy (e.g., GalaxyZoo; Fortson et al. 2012), ornithology (e.g., eBird; Sullivan et al. 2014), plant phenology (e.g., Project BudBurst; Wolkovich and Cleland 2011), and water surveillance (e.g., Global Water Watch; Deutsch and Ruiz-Córdova 2015). In addition to the broad objective of providing data for scientific research efforts, citizen science projects often include goals of environmental education, community engagement, and citizen empowerment. A common denominator among citizen science projects, as opposed to traditional scientific monitoring, is the volunteer base committed to collecting data. Choosing citizen science over traditional monitoring may involve tradeoffs between lower costs of citizen science data collection and loss of data accuracy. Although the literature on citizen science data collection methods is rich (e.g., Hochachka et al. 2012), less is known about the reliability or accuracy of hydrologic citizen science data and its application for policy makers.

Data collection by citizen scientists commonly involves periodic but infrequent snapshots of, for example, wildlife, vegetation, or soil (Roy et al. 2012; Vianna et al. 2014). However, such intermittent observations are less useful in most hydrometeorological applications, where continuous time series may be required to understand and solve a scientific or engineering problem (Gomani et al. 2010; Liu et al. 2008; Walker et al. 2016). To date, only a few published examples exist within the hydrology and water resources literature that describe continuous citizen science monitoring programs (Buytaert et al. 2014; Conners et al. 2001; Deutsch et al. 2005; Gomani et al. 2010; Lowry et al. 2013; Walker et al. 2016). Such case studies demonstrate that continuous monitoring programs can be successfully implemented for hydrologic data collection to study, for example, groundwater levels, stream heights, rainfall patterns, and other climatic variables.

Indeed, with proper protocols, training, and oversight, volunteers can collect data of similar quality to those collected by experts (Bonney et al. 2014; Cooper, Shirk, and Zuckerberg 2014; Kosmala et al. 2016; Theobald et al. 2015). Several studies have also recommended that the experimental design be chosen to best match the potential uncertainty in the data collected by citizen scientists (Bonney et al. 2014; Hochachka et al. 2012; Kremen, Ullman, and Thorp 2011; Riesch and Potter 2014). Some researchers have attempted to relate citizen science data quality to predictors such as participant experience and demographics (Crall et al. 2011; Danielsen et al. 2014), but the results have been inconsistent (Kelling et al. 2015; Kosmala et al. 2016).

The continuous collection of accurate data depends largely on the commitment of motivated volunteers who contribute their unpaid time. Understanding volunteer motivations for, and barriers to, participation is therefore essential for improving program retention (Alender 2016). In a systematic review of 888 citizen science articles, Follett and Strezov (2015) found that only 3% of studies investigated volunteer motivations to participate. Increasingly, citizen science projects have examined motivations such as the desire to contribute to science (Land-Zandstra et al. 2016; Raddick et al. 2013), to help the environment (Domroese and Johnson 2017; Hobbs and White 2012), to learn about the topic (Land-Zandstra et al. 2016), to spend time with others who share the same values (Rotman et al. 2012), and to spend time in nature (Wright et al. 2015). However, the influence of any one of these motivations on the decision to volunteer for citizen science varies. For example, although some studies show that helping the environment is a key factor in environmental volunteer participation (Bruyere and Rappe 2007), others found it had no influence on the duration of volunteers’ involvement (Asah and Blahna 2012). Without designs that account for such motivations, citizen scientists are likely to drop out of projects. Water-monitoring programs specifically have found it difficult to maintain group motivation, which leads to volunteer fatigue and dropouts (Deutsch and Ruiz-Córdova 2015). Such turnover is concerning for scientists, who often have limited funding and time to implement multiple rounds of training (West and Pateman 2016).

Finally, to the authors’ knowledge, no published studies have attempted to evaluate, quantitatively, the contribution of citizen science data collection efforts to the scientific understanding of a hydrologic system and the translation of that understanding to watershed conservation policy. We explore the issue of participant motivations and data accuracy in Quiahua, one of the first citizen science rainfall monitoring projects in Mexico, carried out in two main catchments located in the upper Antigua river watershed in central Veracruz, Mexico. In the past few decades, this area has experienced substantial deforestation of tropical montane cloud forests (TMCF) (Muñoz-Villers and López-Blanco 2008). Low flows in rivers used for local urban water supplies in the 1990s prompted local interest in how land-use change can impact hydrologic cycling, particularly streamflow, as well as the creation of payment for hydrological services (PHS) programs (Nava-López et al. 2018).

Mexico’s national PHS program has been managed by the National Forest Commission (CONAFOR in Spanish) in the study area since 2003. The stated purpose of the national PHS program is to support the provision of hydrologic services while promoting poverty alleviation by paying rural landowners to conserve forests within priority watersheds (McAfee and Shapiro 2010). The national PHS program has been gradually replaced by matching funds from programs managed by local water operators, which now provide more than 50% of the funding for program operations (Nava-López et al. 2018). Although CONAFOR publishes a detailed explanation of the methodology used to select eligible zones in target watersheds that can receive payments, no guidance is provided to prioritize where the payments should be targeted within these zones. Targeting criteria are typically assigned by local PHS program operators using a mix of spatially explicit biophysical and socio-economic data, such as a combination of deforestation risk and degree of socioeconomic marginalization (Mokondoko et al. 2018; Von Thaden et al. 2019). According to interviews with local program operators, they do not consider spatially explicit hydrological information, such as locations where the greatest amount of groundwater recharge occurs, in their targeting methodologies. Extensive hydrologic data collection is typically too expensive to maintain by local PHS program managers.

While efforts have been made to quantify the hydrological impacts associated with forest conversion in headwater catchments (García Coll et al. 2008; López-Ramírez et al. 2020; Muñoz-Villers and McDonnell 2013; Muñoz-Villers et al. 2015), a complete understanding of the local hydrology, of the impacts that the conversion of tropical montane cloud forests to pasture and crop lands has on that hydrology, and of the potential success of PHS programs has been limited by the paucity of data, especially rainfall data. We recruited citizen science volunteers to measure daily rainfall using manual gauges over a 20-month period, with four goals: (1) to examine motivations and barriers of volunteers to participate in the rainfall monitoring program; (2) to evaluate the accuracy of citizen science rainfall data in comparison with automatic rain gauge data; (3) to incorporate citizen science rainfall datasets into hydrological models; and (4) to apply the hydrologic models to assess the efficiency of local PHS programs and thereby demonstrate the potential utility of citizen science data to inform PHS program design.

Methods

Study area

This research was conducted in the Pixquiac and the Gavilanes catchments in the upper Antigua river watershed in central Veracruz, Mexico. The drainages of the Pixquiac (~100 km2; 1,024–3,759 meters above sea level [m.a.s.l.]) and the Gavilanes (~41 km2; 1,090–2,960 m.a.s.l.) rivers are located on the eastern (windward) slopes of the Sierra Madre Oriental mountain range (19°28’, –97°01’; Figure 1). These two adjacent catchments are important water supplies for the cities of Xalapa (providing 38% of the water supply for 500,000 inhabitants) and Coatepec (providing 90% of the water supply for 92,000 inhabitants). Dominant land covers in the catchments consist of forests (76%), pasture and agricultural lands (22%), and urban areas (2%) (Von Thaden et al. 2019). The forests are primarily composed of TMCF (>50%), followed by mixed temperate forest (~25%) (ESA 2015). The climate is temperate humid with abundant summer rains (García 1988). About 80% of the annual rainfall falls as convective storms during the wet season (May–October), followed by a prolonged dry season (November–April). Maximum groundwater recharge and runoff also occur during the wet season (Muñoz-Villers and McDonnell 2013).

Figure 1 

(a) Study area location in central Mexico and (b) locations of existing national rainfall gauges and new gauges associated with this project. Also shown are locations of areas receiving payment for hydrologic services (PHS) (blue) and cities (green) that depend on water supplies from the catchments.

The local climate varies markedly with elevation. Data from climate stations in or near the catchments indicate that mean annual rainfall is 1,120 mm at 1,200 m.a.s.l., peaks at 3,185 mm at 2,100 m.a.s.l., and decreases to 855 mm at 3,000 m.a.s.l. Mean daily temperatures (5°C to 19°C) and mean annual evapotranspiration (855–1,215 mm, estimated as reference evapotranspiration) also vary considerably with elevation (Holwerda et al. 2013; Muñoz-Villers et al. 2012). Although these spatial trends in climate have been recognized at the regional scale, climatic data in the two catchments is sparse, which has limited the characterization of critical hydrologic processes controlling runoff. As a result, the impacts of land use on hydrologic cycling, and the effectiveness of conservation programs focusing on the linkages between forest conservation and hydrologic services, are poorly understood.

The Pixquiac and Gavilanes catchments meet several of the enrollment criteria for Mexico’s national PHS program, including the presence of priority ecosystems, proximity to downstream cities, and water scarcity issues; active local PHS programs also operate in the region. Figure 1 shows the areas receiving payments from PHS programs in the study area between 2003 and 2014, covering 26% of both catchments. Climatic data is especially limited in the mid-range elevation portions of these catchments, i.e., 1,800–2,500 m.a.s.l., which, if regional spatial trends hold true, is where most of the rainfall occurs. The goal of the Quiahua network was to concentrate the collection of rainfall measurements at mid-range elevations (1,500–2,500 m.a.s.l.) across our study catchments and thus to contribute information toward more efficient targeting of PHS payments.

Data collection

Recruiting and training volunteers

The local PHS programs are managed by the Fideicomiso Coatepecano para la Conservación del Bosque y el Agua (FIDECOAGUA) in the Gavilanes watershed and supported by the Senderos y Encuentros para un Desarrollo Autónomo Sustentable (SENDAS) in the Pixquiac watershed (Nava-López et al. 2018). We worked with these organizations as well as with the largest water monitoring and citizen science organization in the country, Global Water Watch-Mexico (GWW-Mexico). Established in 2008 in Veracruz state, GWW-Mexico has trained more than 750 citizen scientists to date in 12 Mexican states in both stream water quality and discharge methods (Flores-Díaz et al. 2013).

Recruitment occurred between September and November 2016. Two training workshops were held in March 2017: one in the Pixquiac catchment in the town of Rancho Viejo with ten volunteers, and one in the Gavilanes catchment in the city of Coatepec with five volunteers. The workshops were conducted by GWW-Mexico, SENDAS, and hydrologists from the Universidad Nacional Autónoma de México and Michigan Technological University. Each workshop began with a discussion of the general goals of the rainfall monitoring program and the roles of the rainfall volunteers. Participants were trained to measure daily rainfall from manual gauges and report the corresponding data on forms developed for the project. The training followed the Community Collaborative Rain, Hail and Snow Network (CoCoRaHS) training protocol, which has been used to train thousands of daily rainfall observers in the US, including Spanish-language instruction in Puerto Rico. For identification and engagement purposes, the word quiahua, meaning rainfall in the indigenous Náhuatl (Aztec) language, was proposed in the workshops as the name of the network. In total, 16 volunteers were recruited and trained through these workshops to participate in rainfall monitoring.

Quiahua rainfall monitoring network

A total of 12 out of the 16 volunteers trained in the workshops participated in the Quiahua network throughout the entire study period. Six participants monitored a site individually, and three sites were each monitored by two volunteers, for a total of nine locations. Volunteers measured rainfall within their home watersheds. Rainfall was measured once per day, and volunteers were advised to take rainfall measurements in the early morning, between 6 a.m. and 10 a.m. local time. Program technicians visited volunteers one to two weeks after the workshops to install the equipment on volunteers’ properties and to answer general questions regarding data collection. In each location, volunteers measured rainfall with a 250-mm manual rain gauge with 0.2 mm increments (WeatherYourWay/CoCoRaHS, https://weatheryourway.com/), and an RG3 automatic tipping bucket rain gauge (Onset, USA; resolution of 0.2 mm) was installed nearby. Rainfall monitoring points covered an altitude range between 1,309 and 2,581 m.a.s.l., corresponding mostly to mid-range elevations in the watersheds. For this study, we collected daily rainfall data from June 2017 to February 2019, including two dry seasons (November 2017–April 2018; November 2018–February 2019) and one complete wet season (May–October 2018). The nine sites were visited monthly by a field technician to collect the data forms and to download the data from the paired automatic rain gauges. Daily rainfall information from volunteers was entered manually in Excel spreadsheets. Data from the automatic gauges was checked for errors using quality assurance protocols similar to those outlined in Goodrich et al. (2008).
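
The quality-assurance protocols themselves (similar to Goodrich et al. 2008) are not reproduced in the paper; the sketch below is a minimal illustration, under our own assumptions, of the kind of range and completeness checks that can be applied to a daily gauge record before analysis. The column names (date, rain_mm) and the 250 mm capacity threshold are illustrative, not the authors' protocol.

```python
# Minimal sketch (not the authors' protocol): basic range and completeness
# checks for a daily rainfall series with assumed columns "date" and "rain_mm".
import pandas as pd

def qa_daily_rainfall(df, start, end, max_daily_mm=250.0):
    """Flag negative values, implausibly large values, and missing days."""
    df = df.copy()
    df["date"] = pd.to_datetime(df["date"])
    df = df.set_index("date").sort_index()

    # Reindex to the full study period so gaps become explicit missing days.
    full_index = pd.date_range(start, end, freq="D")
    df = df.reindex(full_index)

    df["flag_missing"] = df["rain_mm"].isna()
    df["flag_negative"] = df["rain_mm"] < 0
    df["flag_too_large"] = df["rain_mm"] > max_daily_mm  # gauge-capacity check
    return df

# Toy record with one gap and one negative value.
toy = pd.DataFrame({"date": ["2017-06-01", "2017-06-03"], "rain_mm": [12.4, -0.2]})
checked = qa_daily_rainfall(toy, "2017-06-01", "2017-06-03")
print(checked[["rain_mm", "flag_missing", "flag_negative", "flag_too_large"]])
```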

For comparison, we used the automatic gauges as a benchmark to validate volunteers’ rainfall data. Differences between daily rainfall for the automatic and manual gauges were divided into errors for the days when a reading was not taken (“missing data error” = manual gauge – automatic gauge; note that manual gauge = 0 for all of these days) and errors for the days when a reading was taken (“misread data error” = manual gauge – automatic gauge). We used these differences to evaluate the accuracy of the citizen science rainfall data relative to the automatic rain gauge data.
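
As a concrete illustration of the error definitions above, the following sketch decomposes paired daily series into missing-data and misread errors. The series names and the toy values are assumptions for demonstration only.

```python
# Minimal sketch (assumed data layout): decompose volunteer-vs-automatic
# differences into "missing data" and "misread data" errors as defined above.
import pandas as pd

def decompose_errors(auto_mm, manual_mm):
    """auto_mm, manual_mm: daily rainfall series indexed by date; manual_mm
    is NaN on days with no volunteer reading."""
    df = pd.DataFrame({"auto": auto_mm, "manual": manual_mm})

    missing = df["manual"].isna()
    # Missing data error: the manual value is treated as 0 on days with no reading.
    missing_error = 0.0 - df.loc[missing, "auto"]
    # Misread data error: manual minus automatic on days with a reading.
    misread_error = df.loc[~missing, "manual"] - df.loc[~missing, "auto"]
    return missing_error, misread_error

# Example: three days, the last with no volunteer reading.
auto = pd.Series([5.0, 0.0, 12.0], index=pd.date_range("2018-06-01", periods=3))
manual = pd.Series([4.8, 0.0, float("nan")], index=auto.index)
miss, misread = decompose_errors(auto, manual)
print(miss.mean(), misread.mean())
```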

Survey data collection and analysis

We surveyed the 12 volunteers who continued to participate in rainfall monitoring throughout the study period at three points: prior to the workshop, two weeks after the workshop, and six months after the workshop. The surveys were designed to capture motivations for and barriers to participation and any changes in these variables over time, and were self-administered in Spanish (Supplemental File 1). In the pre-survey, we collected information about (1) demographics—specifically, sex, age, number of family members, number of children, years in the community, education level, and income; and (2) a 17-item scale of motivations derived from Ryan, Kaplan, and Grese (2001), covering (a) sense of responsibility to care for nature, (b) motivations to learn, (c) motivations to be a part of something, (d) social motivations, and (e) motivation to escape daily routines and spend time in nature. Demographic factors were recorded using fill-in-the-blank questions, except income, which was measured with a multiple-choice question with income ranges. Motivations were measured using a five-point Likert scale of totally disagree, disagree, neutral, agree, or totally agree. Each Likert scale construct—responsibility, learning, desire to be part of something, and escape—had a minimum of three related questions to ensure reliability, validity, and generalizability (Carifio and Perla 2007; Vaske 2008). The two post-surveys collected information on the 17-item scale of motivations, and the final survey included an open-ended question about motivations and barriers to participation. The average time to complete the surveys was 20 minutes. The survey protocol, including sampling methods, survey instrument, and written informed consent procedure, was approved by administrative review of the IRB at Colorado State University (#264-17-H). Survey participants also provided their written informed consent to participate in this study.

We used descriptive statistics to characterize the demographics of the volunteer citizen scientists. For motivational factors, we conducted an exploratory factor analysis with non-orthogonal rotation (Sarabia et al. 1993). To determine the reliability and consistency of the statements, we evaluated Cronbach’s alpha for each factor; all but one were above 0.8. While values of 0.6 or 0.7 are acceptable (van Griethuijsen et al. 2015), a Cronbach’s alpha of 0.8 or higher is preferred (Cortina 1993). The recombination of factors led to new typologies of motivations, including community (“You like to meet new people” and “You are interested in being part of a well-organized team”), public service (“You are interested in improving public knowledge about water” and “You like to spend time with people who have similar interests as you”), and conservation motivations, which combined motivations to be part of something and to escape. To understand whether motivations to participate changed over time, we conducted a repeated measures analysis of variance (ANOVA) across the three survey rounds (before the workshop, two weeks after, and six months after), followed by a post hoc Tukey-Kramer test based on the studentized range distribution for unequal sample sizes (Abdi and Williams 2010).
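
For readers unfamiliar with the reliability check mentioned above, the sketch below computes Cronbach's alpha for a single factor from an items matrix. The respondent data are fabricated for illustration; the study's own analysis is not reproduced here.

```python
# Minimal sketch: Cronbach's alpha for one factor, computed from an items
# matrix (rows = respondents, columns = Likert items). Data are made up.
import numpy as np

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the factor
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses from 12 volunteers to a three-item factor (1-5 scale).
rng = np.random.default_rng(0)
base = rng.integers(3, 6, size=(12, 1))
items = np.clip(base + rng.integers(-1, 2, size=(12, 3)), 1, 5)
print(round(cronbach_alpha(items), 2))
```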

Answers to the open-ended question on most important motivations and barriers to participate in citizen science were transcribed. Two bilingual researchers used an iterative process of content and thematic analysis, for intercoder reliability, following a six-step process: gaining familiarity with the data, coding, searching for themes, reviewing themes, defining and naming themes, and writing up the results (Saldaña, 2015). The resulting codes were then entered into the qualitative software NVivo 12 Pro for analysis of the text.

Modeling watershed hydrologic services

The point-based daily rainfall data comprised 5,083 observations over 621 days and was interpolated over the study area with ArcGIS to map dry-season (November–April), wet-season (May–October), and annual precipitation. Three different datasets of rainfall information were compared: (1) the national climate stations alone, (2) the citizen scientist volunteer data plus the national climate stations, and (3) the automatic gauge data plus the national climate stations. Thiessen polygons were used to create the rainfall distribution maps for the national climate stations alone, since this procedure is appropriate when only a few observation points are available. The inverse-distance-weighted (IDW) technique was used to generate the precipitation maps for the volunteer rainfall data plus the national climate stations and for the automatic gauge data plus the national climate stations.
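
The interpolation itself was done in ArcGIS; purely as an illustration of the IDW technique named above, the sketch below estimates a rainfall total at one target point from a few gauge points. The coordinates, totals, and power parameter are hypothetical.

```python
# Minimal sketch: inverse-distance-weighted (IDW) interpolation of seasonal
# rainfall totals from gauge points to a target location. Values are illustrative.
import numpy as np

def idw(xy_gauges, values, xy_target, power=2.0):
    """Interpolate a value at xy_target from gauge coordinates and totals."""
    d = np.linalg.norm(np.asarray(xy_gauges) - np.asarray(xy_target), axis=1)
    if np.any(d == 0):                 # target coincides with a gauge
        return float(values[np.argmin(d)])
    w = 1.0 / d**power                 # inverse-distance weights
    return float(np.sum(w * values) / np.sum(w))

# Three hypothetical gauges (x, y in km) with wet-season totals in mm.
gauges = [(0.0, 0.0), (5.0, 2.0), (2.0, 6.0)]
totals = np.array([1800.0, 2400.0, 2100.0])
print(idw(gauges, totals, (2.0, 2.0)))
```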

A hydrologic model, the Soil Water Assessment Tool (SWAT) (Watson and Philip 1985), was parameterized for the study area and calibrated against available streamflow data (López-Ramírez et al. 2020). SWAT is a semi-distributed rainfall-runoff model that has been used to simulate watersheds in tropical montane regions similar to our study area (Arnold et al. 2012; Kim et al. 2017; López-Ramírez et al. 2020; Plesca et al. 2012; Schmaltz and Fohrer 2009). Soil-type and hydrophysical property maps (INEGI 2013a; Strauch et al. 2017), land-use/land-cover maps (CONANP et al. 2015), watershed boundaries, and digital elevation models (INEGI 2013b) were obtained from government sources and used as input to the SWAT model with the ArcSWAT tool. Calibration was based on minimizing the difference between simulated and observed daily stream flows at the two watershed outlets, as is typical of hydrologic modeling efforts (Arnold and Allen 1999; Gupta, Sorooshian, and Yapo 1999; Winchell et al. 2013). SWAT streamflow output includes spatial distributions of surface runoff and baseflow contributions. In this case, the spatial distributions of baseflow contributions were used to indicate locations of hydrologic significance, since baseflow is the critical element for maintaining streamflow during the dry season and buffering against droughts. We used the annual average baseflow contributed over the simulation period as a simple metric representing baseflow magnitude.
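
To make the annual-average baseflow metric concrete, the sketch below collapses simulated daily baseflow per spatial unit into that metric. The table layout (subbasin, date, baseflow_mm) is an assumption for illustration; actual SWAT output files would first need to be parsed into this shape.

```python
# Minimal sketch (assumed layout): annual average baseflow per spatial unit,
# i.e., sum daily baseflow to annual totals and average across years.
import pandas as pd

def annual_average_baseflow(df):
    df = df.copy()
    df["year"] = pd.to_datetime(df["date"]).dt.year
    annual_totals = df.groupby(["subbasin", "year"])["baseflow_mm"].sum()
    return annual_totals.groupby("subbasin").mean()

# Toy example: one subbasin, two years of constant 0.5 mm/day baseflow.
toy = pd.DataFrame({
    "subbasin": 1,
    "date": pd.date_range("2017-01-01", "2018-12-31", freq="D"),
})
toy["baseflow_mm"] = 0.5
print(annual_average_baseflow(toy))  # 182.5 mm/yr for the toy subbasin
```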

Results

Volunteer motivations and barriers

We had a 100% response rate prior to the workshop and two weeks after the workshop, and a 75% (n = 9) response rate six months after the workshop. Volunteers were, on average, in their late thirties (mean 39.2 ± 17.8 yrs), had lived most of their lives in their community (21.7 yrs on average), were highly educated (67% [n = 8] had completed more than a high school education), and 58.3% (n = 7) reported earning above-average incomes (>8,170 pesos/month) (Table 1). Volunteers were equally split between men and women, had an average of 3.83 family members, and few had children less than 12 years old (Table 1).

Table 1

Demographic variables of citizen scientists (number of observations = 12).

Variables Mean (SD)

Age in years 39.25 (17.79)
Fraction (%) with more than high school education 66.67 (NA)
Fraction (%) women 50.00 (0.54)
Number of family members 3.83 (3.98)
Number of children less than 12 years old 0.50 (0.54)
Years in the community 21.66 (22.72)
Fraction (%) high income (>8,170 pesos/month) 58.33 (NA)

The factor analysis with non-orthogonal rotation reported in Table 2 shows that motive-related questions on responsibility (4), learning (4), being part of something (3), socializing (2), and escape (3) loaded onto five factors. Factor 1, conservation values, explains 4.4% of the variance and results from a recombination of the motivations to be part of a conservation team and to be in nature (Ryan, Kaplan, and Grese 2001). Factor 2, motivations for learning, reflects the same interest in learning new things as conceptualized by Ryan, Kaplan, and Grese (2001) and explains 4.2% of the variance. Factor 3, feelings of responsibility for taking care of nature, explains 3.6% of the variance and largely reflects the themes described by Ryan, Kaplan, and Grese (2001), while integrating one escape item, “you like to explore the environment.” Factor 4, public service, reflects a recombination of social and learning motivations and explains 2.6% of the variance. Finally, Factor 5, community, reflects a recombination of social motivations and motivations to be part of something and explains 2.1% of the variance.

Table 2

Principal component analysis of motivational factors for participation in citizen science.

Survey question Motivation code F1 F2 F3 F4 F5

Conservation values
You would like to work with an effective leader in conservation. Organization_2 0.87
You like to be in a natural environment. Escape_1 0.86
You have a desire to be part of an organization that values your work. Organization_3 0.84
You like to have time for quiet reflection. Escape_2 0.76
Learning
It’s important to you to learn more about nature Learning_4 0.94
You are interested in learning about water Learning_2 0.91
You are interested in learning new things generally. Learning_3 0.75
Responsibility
You like to explore the environment. Escape_3 0.88
You are worried about the impact of humans on water Responsibility_2 0.84
You feel obligated to conserve the environment. Responsibility_1 0.78
In your opinion, it’s a public responsibility to consider how your actions affect the environment. Responsibility_3 0.66
You would like to help others in the community. Responsibility_4 0.55
Public service
You are interested in improving public knowledge about water. Learning_1 0.92
You like to spend time with people who have similar interests as you. Social_1 0.61
Community
You like to meet new people. Social_2 0.96
You are interested in being part of a well-organized team. Organization_1 0.82
Cronbach’s alpha 0.87 0.88 0.84 0.61 0.82
Variance explained (%) 4.4 4.2 3.6 2.6 2.1

Before the training, volunteer citizen scientists were most highly motivated by learning (mean 4.81) and conservation values (mean 4.65). Motivations of responsibility for the environment, public service, and community were also high (Table 3). Two weeks after the training, the highest motivation continued to be learning (mean 4.81), while public service became more important (mean 4.63). Six months after training, the highest stated motivations for those who responded were conservation values (mean 4.60) and learning (mean 4.50), although both were lower than originally stated before the training.

Table 3

Volunteer motivations to participate in rainfall monitoring before, two weeks after, and six months after training by factor (5-point Likert Scale).

Motivational factors Before mean (SD) 2 weeks mean (SD) 6 months mean (SD)

F1. Conservation values 4.65 (0.44) 4.43 (0.85) 4.60 (0.39)
F2. Learning 4.81 (0.33) 4.81 (0.38) 4.50 (0.47)
F3. Responsibility 4.55 (0.50) 4.42 (0.70) 4.30 (0.95)
F4. Public service 4.50 (0.67) 4.63 (0.56) 4.55 (0.49)
F5. Community 4.46 (0.58) 3.64 (0.98) 4.35 (0.52)
Observations 12 12 9

A repeated measures ANOVA for the five factors yielded no significant variation over time for Factor 1 (p = 0.61), Factor 3 (p = 0.459), or Factor 4 (p = 0.807) (Supplemental File 2). However, the repeated measures ANOVA demonstrated significant changes over time for Factor 5 (p = 0.026) and Factor 2 (p = 0.049); a post hoc Tukey-Kramer test for Factor 5 showed significant differences between before and two weeks after training (p = 0.031), but not between before and six months after training (p = 0.937). The post hoc Tukey-Kramer test did not show significant differences at p < 0.05 between any time points for Factor 2.

Qualitative data from the open-ended survey question on the greatest motivation for participation in rainfall monitoring were consistent with the survey data, showing that volunteers were overwhelmingly interested in learning generally and in learning about water specifically. Of the nine who responded six months after the training, six mentioned their motivation to learn, for example, “I’d like to know more.” Another important theme, mentioned by four volunteers, was their interest in helping research, for example, “I’d like to help research on precipitation.” Two people mentioned their interest in contributing to the community, and one person mentioned an interest in learning about the region they live in, which we coded as “sense of place.”

Data from the open-ended question on the greatest barriers to participation in rainfall monitoring show that, after six months, the most frequently cited barriers were insufficient time (n = 4), difficulty organizing others to collect rainfall data (n = 3), and traveling to the site (n = 2); two people mentioned more than one barrier. However, the majority (n = 5) indicated that there were no barriers to participation.

Evaluating accuracy of manual versus automatic rainfall gauges

On average, volunteers measured precipitation 565 times (91% of days) between June 2017 and February 2019 (621 days), for a total of 5,083 observations. Table 4 shows the results of the comparison between the automatic gauges and the volunteer rainfall data annually and for the wet and dry seasons. Overall, data collected by volunteers underestimated daily rainfall by almost 1 mm (a 12% error), and t-tests indicated significant differences between the automatic gauge and volunteer rainfall data (p < 0.0001). Most of the underestimation occurred during the wet season, when volunteer data underestimated rainfall by 16%, and the t-test again indicated a significant difference between the means (p < 0.0001).

Table 4

Comparison of automatic gauge and volunteer data by season, with p-values from results of paired t-tests.

Season Automatic gauge mean daily rainfall in mm (n) Volunteer mean daily rainfall in mm (n) Difference in mm (%) p-value for difference between means

Annual 7.05 (5520) 6.19 (5083) –0.86 (–12%) <0.0001
Wet season 11.73 (2682) 9.85 (2499) –1.88 (–16%) <0.0001
Dry season 2.63 (2838) 2.58 (2584) –0.05 (–2%) <0.001
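
The paired comparisons reported in Table 4 can, in principle, be reproduced with a standard paired t-test applied to days for which both gauges have a reading. The sketch below uses scipy's ttest_rel on synthetic series; the series, the 12% low bias, and the sample size are illustrative only, not the project data.

```python
# Minimal sketch: paired t-test between automatic and volunteer daily rainfall
# on days where both readings exist. The two series here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
auto = rng.gamma(shape=0.8, scale=9.0, size=400)        # synthetic daily rainfall (mm)
manual = auto * 0.88 + rng.normal(0.0, 1.0, size=400)   # volunteer readings with low bias

t_stat, p_value = stats.ttest_rel(auto, manual)          # paired t-test
print(f"mean difference = {np.mean(manual - auto):.2f} mm, p = {p_value:.4f}")
```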

The mean missing data error and mean misread data error were –4.5 mm/day (506 missing observations) and –1.0 mm/day, respectively. The missing data and misread errors were responsible for 29% and 71% of the overall mean error of –0.86 mm/day, respectively. The highest fraction of missing days for any volunteer was 29%; with this particular volunteer removed, the mean fraction of missing days was less than 7%. Of the days when readings were taken by volunteers, approximately one-third (32%) were dry days on which both the automatic and manual gauges registered zero rainfall, yielding a misread error of zero. Figure 2 shows the cumulative frequency of the misread errors, excluding the days on which both the automatic and manual gauges registered zero rainfall; the distribution of errors is roughly symmetric around zero, with a slight bias toward negative values. One-third (33%) of the misread errors in daily rainfall fall between –1 mm and +1 mm, and more than three-quarters (82%) fall between –10 mm and +10 mm.

Figure 2 

Cumulative distribution of daily misread errors (n = 3,440). Dry days on which both the automatic and manual gauges registered zero rainfall (n = 1,592) have been removed.

Although volunteers’ measurements tended to underestimate total rainfall, the errors did not translate into substantial differences in spatial patterns when compared with the patterns obtained from the automatic gauges, as shown in Figure 3. The results in Figure 3 show dramatic differences between the rainfall distributions generated using the national climate stations alone and the distributions generated using volunteer rainfall data or automatic gauge data together with the national gauge data. Especially important is that the annual and wet-season rainfall in the central portions of the study area were substantially higher (by more than 400 mm and 300 mm for the annual and wet-season rainfall, respectively) when the volunteer or automatic gauge data were combined with the national gauge data than with the national gauge data alone. Compared with the automatic gauge data plus the national gauge data, the volunteer rainfall data plus the national gauge data underestimate annual and wet-season rainfall by 100 mm to 200 mm in the central portion of the watersheds, in agreement with the results in Table 4. In particular, the gauge located in the center of the watersheds (ZAP_P2) was responsible for much of the underestimation by volunteers.

Figure 3 

Annual, wet-season (June–October 2017), and dry-season (November 2017–May 2018) rainfall distributions (in millimeters) from the national climate stations alone, from volunteer rainfall data plus the national climate stations, and from the automatic gauge data plus the national climate stations. Squares and triangles indicate the locations of the national gauges and the paired volunteer/automatic gauges, respectively.

Incorporating rainfall datasets into hydrologic models

The three sets of rainfall data were used to generate the annual baseflow contribution maps produced with the SWAT model in Figure 4. As expected, since rainfall in the central-to-lower portion of the Pixquiac and the lower portion of the Gavilanes watersheds was higher when volunteer rainfall data or automatic gauge data were combined with the national gauge data, the simulated baseflow contribution rates were substantially higher in these maps (Figure 4b and 4c) than in the map simulated with the national gauge data alone (Figure 4a). However, the increase in simulated baseflow contribution rates in the middle portion of the study area when volunteer rainfall or automatic gauge data were combined with the national gauge data (Figure 4b and 4c) was not as large as one might expect, given the large differences in rainfall relative to the national gauges alone (Figure 3). This result could be explained by the relatively steep slopes in most of the middle portion of the study area (see Figure 1), which tend to increase runoff and reduce soil infiltration, ultimately decreasing the water available for baseflow. Consistent with the rainfall distributions, the simulated baseflow contribution was substantially lower when the volunteer rainfall data were combined with the national gauge data than when the automatic gauge data were combined with the national gauge data.

Figure 4 

Annual average baseflow contribution distributions (in mm) obtained from the Soil Water Assessment Tool (SWAT) simulations with (a) the national climate stations alone, (b) volunteer rainfall data plus the national climate stations, and (c) the automatic gauge data plus the national climate stations. Gray overlay indicates the areas receiving payment for hydrologic services (PHS) program payments.

The average baseflow contributions for the entire study area and for the areas receiving PHS payments alone are shown in Figure 5. The results in Figure 5 (white and gray bars) indicate that the average baseflow contributions predicted by the SWAT model increase substantially when moving from the national gauges alone to the volunteer rainfall data plus the national gauges, and to the automatic gauge data plus the national gauges, matching the spatially distributed results in Figure 4. The results in Figure 5 also indicate that the current targeting results in 13%–18% greater contributions to baseflow in the areas receiving payments, compared with the mean baseflow contribution across the watershed.

Figure 5 

Average baseflow contributions for the whole watershed area, for the areas currently receiving payments, and for an equivalent total payment area reassigned to the maximum-baseflow areas, from the Soil Water Assessment Tool (SWAT) simulations driven by the national climate stations alone, by the volunteer rainfall data plus the national climate stations, and by the automatic gauge data plus the national climate stations.

To further assess the efficiency of the payments with respect to capturing baseflow, we identified the areas with the highest baseflow contributions corresponding to the same total area (3,990 ha) currently receiving payments. These results (black bars in Figure 5) indicate that, as expected, baseflow capture increases when the PHS payments target the areas that maximize baseflow contribution, compared with the mean baseflow contribution across the watershed. The value of the volunteer rainfall data for increasing targeting efficiency is demonstrated by the improvement in captured baseflow when payments are retargeted to maximize baseflow contributions: 30% with the national gauges alone versus 54% with the national gauges plus volunteer rainfall data.
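
One simple way to operationalize this retargeting is a greedy selection of spatial units by baseflow contribution until the fixed payment-area budget is filled. The sketch below illustrates that idea with hypothetical units and values; it is not the study's actual GIS procedure.

```python
# Minimal sketch: retarget a fixed payment-area budget (e.g., 3,990 ha) to the
# spatial units with the highest baseflow contribution. Values are hypothetical.
import pandas as pd

def select_max_baseflow_area(units, area_budget_ha):
    """units: DataFrame with 'area_ha' and 'baseflow_mm' per spatial unit."""
    ranked = units.sort_values("baseflow_mm", ascending=False)
    cumulative = ranked["area_ha"].cumsum()
    return ranked[cumulative <= area_budget_ha]   # units that fit the budget

# Hypothetical spatial units.
units = pd.DataFrame({
    "unit_id": [1, 2, 3, 4, 5],
    "area_ha": [1500, 1200, 900, 1100, 800],
    "baseflow_mm": [420, 510, 380, 650, 300],
})
selected = select_max_baseflow_area(units, area_budget_ha=3990)
print(selected["unit_id"].tolist(), selected["area_ha"].sum())
```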

Discussion

Designing citizen science to fulfill motivations and reduce missing and misread data

Understanding who signs up to participate in citizen science, why they stay, and what barriers they face are essential questions to address when engaging volunteers in citizen science projects. This study supports previous research showing that citizen science typically attracts educated, financially stable individuals who are motivated by learning about the project’s topic (Deutsch et al. 2009; Deutsch and Ruiz-Córdova 2015; Moriasi et al. 2015) and by learning generally (Land-Zandstra et al. 2016; Newman et al. 2012; Brossard et al. 2005). In this study, we found that citizen scientists were also motivated by their conservation values. Motivations did not change significantly over time; rainfall volunteers remained engaged, motivated by learning and conservation values, and very few listed barriers to volunteering beyond time constraints. The results suggest that recruitment should target individuals who are interested in engaging intellectually with the topic and that programs should provide support for ongoing learning opportunities.

As in many citizen science projects, understanding how to maintain a consistent volunteer effort in data collection is only part of the puzzle; ensuring that the data itself is accurate is another critical issue. Our results show that volunteers consistently underestimated rainfall compared with the automatic tipping bucket data, particularly in the wet season. There is a possibility that the automatic gauges were biased; however, most authors have noted that automated tipping bucket gauges are subject to underestimation, especially during high-intensity rainfall events (Crall et al. 2013; Molini, Lanza, and La Barbera 2005; Upton and Rahimi 2003; Lanza and Vuerich 2009), rather than overestimation. In contrast to our findings, Reges et al. (2016) reported that manual gauge measurements by CoCoRaHS volunteers tended to be higher than those from automated tipping bucket gauges. Parsing the time series of errors revealed that missing data and misread errors contributed roughly one-third (29%) and two-thirds (71%) of the mean error of –0.86 mm/day, respectively. While strategies such as collecting data forms from volunteers more frequently, or providing other opportunities for interaction, may decrease the incidence of missing data, reducing the misread error may be more difficult. However, analysis of the distribution of misread errors, illustrated in Figure 2, indicates that eliminating larger misread errors in daily rainfall, using < –10 mm and > +10 mm as thresholds, would decrease the misread error from –1.0 mm/day to –0.2 mm/day overall. Thus, applying quality control protocols (similar to those used to analyze automated gauge data) to eliminate outlier errors in volunteer data could improve volunteer estimates substantially.
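
The threshold-based screening discussed above amounts to discarding misread errors outside a plausible band before averaging. The sketch below shows the mechanics on a synthetic error series; the error values and outliers are invented and the resulting numbers will not match the study's figures.

```python
# Minimal sketch: effect of discarding misread errors outside +/-10 mm,
# as discussed above. The error series is synthetic, not the project data.
import numpy as np

rng = np.random.default_rng(2)
errors = rng.normal(-0.3, 2.0, size=3000)          # bulk of misread errors (mm)
errors = np.append(errors, [-60.0, -45.0, 38.0])   # a few large outliers

kept = errors[(errors >= -10.0) & (errors <= 10.0)]  # apply +/-10 mm thresholds
print(f"mean error before: {errors.mean():.2f} mm, after: {kept.mean():.2f} mm")
```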

Incorporating citizen science into hydrological models to better inform payment for hydrologic services programs

Evaluations of PHS effectiveness typically use changes in forest cover as a surrogate measure of ecosystem service provision (Shedekar et al. 2016). In Mexico, CONAFOR targets payments to areas of high deforestation risk, with primary forest cover, particularly cloud forests, receiving priority, under the assumption by policy makers that greater forest cover increases hydrological services (Wunder 2007). However, other metrics of success for PHS programs include additionality, which measures program-caused forest conservation (Muñoz-Piña et al. 2008). Recent studies have shown that the majority of payments in Mexico (61%) go to areas with low to medium deforestation risk (Von Thaden et al. 2019), bringing into question the additionality of such programs. Adding to efficacy concerns, studies have shown that PHS programs are often lacking in areas identified as priorities for hydrological service provisioning (Mokondoko et al. 2018). Recent work (Costedoat et al. 2015; Berry et al. in review) suggests that hydrologic metrics, such as measures of dry-season streamflows, can be valuable indicators of the ecosystem services provided by forests and converted land uses. The results in Figure 5 show that including a baseflow metric for prioritizing PHS payments can substantially increase the baseflow captured by payments in the study area. The reliability of such hydrologic metrics depends on the availability of accurate, frequently collected hydroclimatic data, such as the data collected in the Quiahua citizen science rainfall effort. The results in Figure 5 also show that, despite the fact that the citizen scientists consistently underestimated rainfall, particularly during the wet season, data contributed by the citizen scientists could lead to substantially better targeting of baseflow contributions in the PHS policy design, at a much lower cost than with automated tipping bucket gauges.

Conclusions

To our knowledge, the Quiahua citizen science rainfall project is the first citizen science network for collecting rainfall data in Mexico that has produced results relevant to conservation policy design. Surveys revealed that volunteers participated in the network because they were highly motivated to be in nature and to learn about watershed hydrology, and that these motivations had not changed six months after the training. The most important challenges for volunteer citizen scientists were insufficient time to make the rainfall measurements at the same time each day, and organizing among partner volunteers to determine who was responsible for making measurements on a given day. While volunteer-collected data tended to be lower than data from the automatic gauges on average, a substantial fraction of the underestimation (29%) resulted from missing observations. The spatial patterns of rainfall distribution across the watersheds were similar between the citizen science and automatic gauge data. These patterns reveal that a large fraction of the rainfall falls in the middle elevations of the watersheds, which was not apparent from the existing rainfall station network. Use of the data from the paired citizen science and automatic gauge stations as inputs to a calibrated hydrologic model revealed areas of the watersheds that should be conserved because they contribute substantially to dry-season flows. The importance of these areas would not have been recognized with the existing, sparse rainfall measurement network. We encourage the use of citizen science programs to supplement government-run monitoring networks because they can advance the understanding of hydrologic systems and correspondingly improve the effectiveness of forest conservation programs.

Data Accessibility Statement

The datasets generated during and/or analyzed during the current study are available from the corresponding author, XAS, on reasonable request.

Supplementary Files

The Supplementary files for this article can be found as follows:

Supplemental File 1

Pre/Post-Survey of Motivations for Participation in Rainfall Monitoring. DOI: https://doi.org/10.5334/cstp.316.s1

Supplemental File 2

Predicted Estimate of Least Squares Means for Factor 5, Community Motivations, and Factor 2, Learning Motivations. DOI: https://doi.org/10.5334/cstp.316.s2

Acknowledgements

We thank all of the citizen science volunteers who participated in this study and continue to monitor rainfall; project collaborators SENDAS, FIDECOAGUA, and CoCoRaHS; and all GWW-Mexico staff for their tireless support. This work was funded by the National Science Foundation [Award CBET 1644860].

Competing Interests

The authors have no competing interests to declare.

Author Contributions

Xoco Shinbrot, Lyssette E. Muñoz-Villers, Alex Mayer, Kelly W. Jones, Miriam G. Ramos-Escobedo, and Robert H. Manson conceived the ideas and designed the methodology; Xoco Shinbrot, Lyssette Muñoz-Villers, and Carlos Alcocer-Lezama collected the data; Xoco Shinbrot, Alex Mayer, Melissa López-Portillo, and Sergio López-Ramírez analyzed the data; Xoco Shinbrot, Lyssette Muñoz-Villers, and Alex Mayer led the writing of the manuscript. All authors contributed critically to the drafts and gave final approval for publication.

References

  1. Abdi, H and Williams, LJ. 2010. Tukey’s honestly significant difference (HSD) test. Encyclopedia of Research Design. Thousand Oaks, CA: Sage, pp. 1–5. 

  2. Alender, B. 2016. Understanding volunteer motivations to participate in citizen science projects: a deeper look at water quality monitoring. Journal of Science Communication, 15(3): A04. DOI: https://doi.org/10.22323/2.15030204 

  3. Arnold, JG and Allen, PM. 1999. Automated methods for estimating baseflow and ground water recharge from streamflow records 1. JAWRA Journal of the American Water Resources Association, 35(2): 411–424. DOI: https://doi.org/10.1111/j.1752-1688.1999.tb03599.x 

  4. Arnold, JG, Moriasi, DN, Gassman, PW, Abbaspour, KC, White, MJ, Srinivasan, R, Santhi, C, Harmel, RD, Van Griensven, A, Van Liew, MW and Kannan, N. 2012. SWAT: Model use, calibration, and validation. Transactions of the ASABE, 55(4): 1491–1508. DOI: https://doi.org/10.13031/2013.42256 

  5. Asah, ST and Blahna, DJ. 2012. Motivational functionalism and urban conservation stewardship: implications for volunteer involvement. Conservation Letters, 5(6): 470–477. DOI: https://doi.org/10.1111/j.1755-263X.2012.00263.x 

  6. Berry, ZC, Jones, KW, Gomez Aguilar, L, Congalton, RG, Holwerda, F, Kolka, R, Looker, N, Manson, R, Mayer, A, et al. In review. Evaluating ecosystem service trade-offs along a land-use intensification gradient in central Veracruz, Mexico. Ecosystem Services. 

  7. Bonney, R, Shirk, JL, Phillips, TB, Wiggins, A, Ballard, HL, Miller-Rushing, AJ and Parrish, JK. 2014. Next steps for citizen science. Science, 343(6178): 1436–1437. DOI: https://doi.org/10.1126/science.1251554 

  8. Brossard, D, Lewenstein, B and Bonney, R. 2005. Scientific knowledge and attitude change: The impact of a citizen science project. International Journal of Science Education, 27(9): 1099–1121. DOI: https://doi.org/10.1080/09500690500069483 

  9. Bruyere, B and Rappe, S. 2007. Identifying the motivations of environmental volunteers. Journal of Environmental Planning and Management, 50(4): 503–516. DOI: https://doi.org/10.1080/09640560701402034 

  10. Buytaert, W, Zulkafli, Z, Grainger, S, Acosta, L, Alemie, TC, Bastiaensen, J, De Bièvre, B, Bhusal, J, Clark, J, Dewulf, A and Foggin, M. 2014. Citizen science in hydrology and water resources: opportunities for knowledge generation, ecosystem service management, and sustainable development. Frontiers in Earth Science, 2: 26. DOI: https://doi.org/10.3389/feart.2014.00026 

  11. Carifio, J and Perla, RJ. 2007. Ten common misunderstandings, misconceptions, persistent myths and urban legends about Likert scales and Likert response formats and their antidotes. Journal of Social Sciences, 3(3): 106–116. DOI: https://doi.org/10.3844/jssp.2007.106.116 

  12. CONANP, CONAFOR, INECC and FMCN (Proyecto C6 Cuencas Costeras). 2015. “Mapa de uso de suelo y vegetación 2014 en la cuenca del río La Antigua,” a partir de los productos elaborados por Brockmann Consult con apoyo de GeoVille y financiados por el programa EOWorld de la Agencia Espacial Europea-Banco Mundial. 

  13. Conners, DE, Eggert, S, Keyes, J and Merrill, MD. 2001. Community-based water quality monitoring by the Upper Oconee Watershed Network. Proceedings of the 2001 Georgia Water Resources Conference, (March 2001), University of Georgia, Athens, Georgia, USA (2001). 

  14. Cooper, CB, Shirk, J and Zuckerberg, B. 2014. The invisible prevalence of citizen science in global research: migratory birds and climate change. PloS One, 9(9). DOI: https://doi.org/10.1371/journal.pone.0106508 

  15. Cortina, JM. 1993. What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology, 78(1): 98. DOI: https://doi.org/10.1037/0021-9010.78.1.98 

  16. Costedoat, S, Corbera, E, Ezzine-de-Blas, D, Honey-Rosés, J, Baylis, K and Castillo-Santiago, MA. 2015. How effective are biodiversity conservation payments in Mexico? PloS One, 10(3). DOI: https://doi.org/10.1371/journal.pone.0119881 

  17. Crall, AW, Jordan, R, Holfelder, K, Newman, GJ, Graham, J and Waller, DM. 2013. The impacts of an invasive species citizen science training program on participant attitudes, behavior, and science literacy. Public Understanding of Science, 22(6): 745–764. DOI: https://doi.org/10.1177/0963662511434894 

  18. Crall, AW, Newman, GJ, Stohlgren, TJ, Holfelder, KA, Graham, J and Waller, DM. 2011. Assessing citizen science data quality: an invasive species case study. Conservation Letters, 4(6): 433–442. DOI: https://doi.org/10.1111/j.1755-263X.2011.00196.x 

  19. Danielsen, F, Pirhofer-Walzl, K, Adrian, TP, Kapijimpanga, DR, Burgess, ND, Jensen, PM, Bonney, R, Funder, M, Landa, A, Levermann, N and Madsen, J. 2014. Linking public participation in scientific research to the indicators and needs of international environmental agreements. Conservation Letters, 7(1): 12–24. DOI: https://doi.org/10.1111/conl.12024 

  20. Deutsch, WG, Busby, AL, Orprecio, JL, Bago-Labis, JP and Cequina, EY. 2005. Community-based hydrological and water quality assessments in Mindanao, Philippines. Forests, Water and People in the Humid Tropics. Cambridge University Press, UNESCO, pp. 134–148. DOI: https://doi.org/10.1017/CBO9780511535666.014 

  21. Deutsch, W, Lhotka, L and Ruiz-Cordova, S. 2009. Group dynamics and resource availability of a long-term volunteer water-monitoring program. Society and Natural Resources, 22(7): 637–649. DOI: https://doi.org/10.1080/08941920802078216 

  22. Deutsch, WG and Ruiz-Córdova, SS. 2015. Trends, challenges, and responses of a 20-year, volunteer water monitoring program in Alabama. Ecology and Society, 20(3). DOI: https://doi.org/10.5751/ES-07578-200314 

  23. Domroese, MC and Johnson, EA. 2017. Why watch bees? Motivations of citizen science volunteers in the Great Pollinator Project. Biological Conservation, 208: 40–47. DOI: https://doi.org/10.1016/j.biocon.2016.08.020 

  24. ESA. 2015. Report of the European Space Agency (www.brockmann-consult.de) provided as part of the “Coastal Watersheds Conservation in the Context of Climate Change Project,” financed by the Global Environmental Facility (GEF) through the World Bank. http://www.brockmann-consult.de. Accessed 11 September 2019. 

  25. Flores-Díaz, AC, Ramos-Escobedo, MG, Ruiz-Córdova, SS, Manson, R, Aranda, E and Deutsch, WG. 2013. Monitoreo comunitario del agua: retos y aprendizaje desde la perspectiva de Global Water Watch-México. México, DF: GWW. Available at: http://www.researchgate.net/publication/268803861. 

  26. Follett, R and Strezov, V. 2015. An analysis of citizen science based research: usage and publication patterns. PloS One, 10(11): p.e0143687. DOI: https://doi.org/10.1371/journal.pone.0143687 

  27. Fortson, L, Masters, K, Nichol, R, Edmondson, EM, Lintott, C, Raddick, J and Wallin, J. 2012. Galaxy zoo. Advances in Machine Learning and Data Mining for Astronomy, 2012: 213–236. DOI: https://doi.org/10.1201/b11822-16 

  28. García, E. 1988. Modificaciones al sistema de clasificación climática de Köppen. México, DF: Offset Larios. 

  29. García Coll, I, Martínez Otero, A, Ramírez Soto, A, Niño Cruz, A, Juan Rivas, A and Domínguez Barrada, L. 2008. La relación agua-bosque: Delimitación de zonas prioritarias para Pago de Servicios Ambientales Hidrológicos en la cuenca del Río Gavilanes, Coatepec, Veracruz. In: Cotler, H (Ed.), El manejo integral de cuencas en México: Estudios y reflexiones para orientar la política ambiental. Instituto de Ecología, pp. 99–115. 

  30. Gomani, MC, Dietrich, O, Lischeid, G, Mahoo, H, Mahay, F, Mbilinyi, B and Sarmett, J. 2010. Establishment of a hydrological monitoring network in a tropical African catchment: An integrated participatory approach. Physics and Chemistry of the Earth, Parts A/B/C, 35(13–14): 648–656. DOI: https://doi.org/10.1016/j.pce.2010.07.025 

  31. Goodrich, DC, Keefer, TO, Unkrich, CL, Nichols, MH, Osborn, HB, Stone, JJ and Smith, JR. 2008. Long-term precipitation database, Walnut Gulch Experimental Watershed, Arizona, United States. Water Resources Research, 44(5): W05S04. DOI: https://doi.org/10.1029/2006WR005782 

  32. Gupta, HV, Sorooshian, S and Yapo, PO. 1999. Status of automatic calibration for hydrologic models: Comparison with multilevel expert calibration. Journal of Hydrologic Engineering, 4(2): 135–143. DOI: https://doi.org/10.1061/(ASCE)1084-0699(1999)4:2(135) 

  33. Hobbs, SJ and White, PC. 2012. Motivations and barriers in relation to community participation in biodiversity recording. Journal for Nature Conservation, 20(6): 364–373. DOI: https://doi.org/10.1016/j.jnc.2012.08.002 

  34. Hochachka, WM, Fink, D, Hutchinson, RA, Sheldon, D, Wong, WK and Kelling, S. 2012. Data-intensive science applied to broad-scale citizen science. Trends in Ecology & Evolution, 27(2): 130–137. DOI: https://doi.org/10.1016/j.tree.2011.11.006 

  35. Holwerda, F, Bruijnzeel, LA, Barradas, VL and Cervantes, J. 2013. The water and energy exchange of a shaded coffee plantation in the lower montane cloud forest zone of central Veracruz, Mexico. Agricultural and Forest Meteorology, 173: 1–13. DOI: https://doi.org/10.1016/j.agrformet.2012.12.015 

  36. Instituto Nacional de Estadística, Geografía e Informática. 2013a. Conjunto de datos de Perfiles de suelos. Escala 1:250 000. Serie II (Continuo Nacional). Available at: https://www.inegi.org.mx/app/biblioteca/ficha.html?upc=702825266707. 

  37. Instituto Nacional de Estadística, Geografía e Informática. 2013b. Continuo de Elevaciones Mexicano (CEM). Available at: http://www.beta.inegi.org.mx/app/geo2/elevacionesmex/index.jsp. 

  38. Kelling, S, Fink, D, La Sorte, FA, Johnston, A, Bruns, NE and Hochachka, WM. 2015. Taking a ‘Big Data’ approach to data quality in a citizen science project. Ambio, 44(4): 601–611. DOI: https://doi.org/10.1007/s13280-015-0710-4 

  39. Kim, M, Boithias, L, Cho, KH, Silvera, N, Thammahacksa, C, Latsachack, K, Rochelle-Newall, E, Sengtaheuanghoung, O, Pierret, A, Pachepsky, YA and Ribolzi, O. 2017. Hydrological modeling of fecal indicator bacteria in a tropical mountain catchment. Water Research, 119: 102–113. DOI: https://doi.org/10.1016/j.watres.2017.04.038 

  40. Kosmala, M, Wiggins, A, Swanson, A and Simmons, B. 2016. Assessing data quality in citizen science. Frontiers in Ecology and the Environment, 14(10): 551–560. DOI: https://doi.org/10.1002/fee.1436 

  41. Kremen, C, Ullman, KS and Thorp, RW. 2011. Evaluating the quality of citizen-scientist data on pollinator communities. Conservation Biology, 25(3): 607–617. DOI: https://doi.org/10.1111/j.1523-1739.2011.01657.x 

  42. Land-Zandstra, AM, Devilee, JL, Snik, F, Buurmeijer, F and van den Broek, JM. 2016. Citizen science on a smartphone: Participants’ motivations and learning. Public Understanding of Science, 25(1): 45–60. DOI: https://doi.org/10.1177/0963662515602406 

  43. Lanza, LG and Vuerich, E. 2009. The WMO field intercomparison of rain intensity gauges. Atmospheric Research, 94(4): 534–543. DOI: https://doi.org/10.1016/j.atmosres.2009.06.012 

  44. Liu, BM, Abebe, Y, McHugh, OV, Collick, AS, Gebrekidan, B and Steenhuis, TS. 2008. Overcoming limited information through participatory watershed management: Case study in Amhara, Ethiopia. Physics and Chemistry of the Earth, Parts A/B/C, 33(1–2): 13–21. DOI: https://doi.org/10.1016/j.pce.2007.04.017 

  45. López-Ramírez, SM, Sáenz, L, Mayer, A, Muñoz-Villers, LE, Asbjornsen, H, Berry, ZC and Aguilar, LRG. 2020. Land use change effects on catchment streamflow response in a humid tropical montane cloud forest region, central Veracruz, Mexico. Hydrological Processes. DOI: https://doi.org/10.1002/hyp.13800 

  46. McAfee, K and Shapiro, EN. 2010. Payments for ecosystem services in Mexico: nature, neoliberalism, social movements, and the state. Annals of the Association of American Geographers, 100(3): 579–599. DOI: https://doi.org/10.1080/00045601003794833 

  47. Mokondoko, P, Manson, RH, Ricketts, TH and Geissert, D. 2018. Spatial analysis of ecosystem service relationships to improve targeting of payments for hydrological services. PloS One, 13(2). DOI: https://doi.org/10.1371/journal.pone.0192560 

  48. Molini, A, Lanza, LG and La Barbera, P. 2005. The impact of tipping-bucket raingauge measurement errors on design rainfall for urban-scale applications. Hydrological Processes: An International Journal, 19(5): 1073–1088. DOI: https://doi.org/10.1002/hyp.5646 

  49. Moriasi, DN, Zeckoski, RW, Arnold, JG, Baffaut, C, Malone, RW, Daggupati, P, Guzman, JA, Saraswat, D, Yuan, Y, Wilson, BN and Shirmohammadi, A. 2015. Hydrologic and water quality models: Key calibration and validation topics. Transactions of the ASABE, 58(6): 1609–1618. DOI: https://doi.org/10.13031/trans.58.11075 

  50. Muñoz-Piña, C, Guevara, A, Torres, JM and Braña, J. 2008. Paying for the hydrological services of Mexico’s forests: Analysis, negotiations and results. Ecological Economics, 65(4): 725–736. DOI: https://doi.org/10.1016/j.ecolecon.2007.07.031 

  51. Muñoz-Villers, LE, Geissert, DR, Holwerda, F and McDonnell, JJ. 2015. Factors influencing stream water transit times in tropical montane watersheds. Hydrology and Earth System Sciences Discussions, 12(10). DOI: https://doi.org/10.5194/hessd-12-10975-2015 

  52. Muñoz-Villers, LE, Holwerda, F, Gómez-Cárdenas, M, Equihua, M, Asbjornsen, H, Bruijnzeel, LA, Marín-Castro, BE and Tobón, C. 2012. Water balances of old-growth and regenerating montane cloud forests in central Veracruz, Mexico. Journal of Hydrology, 462: 53–66. DOI: https://doi.org/10.1016/j.jhydrol.2011.01.062 

  53. Muñoz-Villers, LE, and López-Blanco, J. 2008. Land use/cover changes using Landsat TM/ETM images in a tropical and biodiverse mountainous area of central-eastern Mexico. International Journal of Remote Sensing, 29(1): 71–93. DOI: https://doi.org/10.1080/01431160701280967 

  54. Muñoz-Villers, LE and McDonnell, JJ. 2013. Land use change effects on runoff generation in a humid tropical montane cloud forest region. Hydrology and Earth System Sciences, 17(9): 3543. DOI: https://doi.org/10.5194/hess-17-3543-2013 

  55. Nava-López, M, Selfa, TL, Cordoba, D, Pischke, EC, Torrez, D, Ávila-Foucat, S, Halvorsen, KE and Maganda, C. 2018. Decentralizing payments for hydrological services programs in Veracruz, Mexico: Challenges and implications for long-term sustainability. Society & Natural Resources, 31(12): 1389–1399. DOI: https://doi.org/10.1080/08941920.2018.1463420 

  56. Neitsch, SL, Arnold, JG, Kiniry, JR and Williams, JR. 2011. Soil and water assessment tool theoretical documentation version 2009. Texas Water Resources Institute. 

  57. Newman, G, Wiggins, A, Crall, A, Graham, E, Newman, S and Crowston, K. 2012. The future of citizen science: emerging technologies and shifting paradigms. Frontiers in Ecology and the Environment, 10(6): 298–304. DOI: https://doi.org/10.1890/110294 

  58. Plesca, I, Timbe, E, Exbrayat, JF, Windhorst, D, Kraft, P, Crespo, P, Vaché, KB, Frede, HG and Breuer, L. 2012. Model intercomparison to explore catchment functioning: Results from a remote montane tropical rainforest. Ecological Modelling, 239: 3–13. DOI: https://doi.org/10.1016/j.ecolmodel.2011.05.005 

  59. Raddick, MJ, Bracey, G, Gay, PL, Lintott, CJ, Cardamone, C, Murray, P, Schawinski, K, Szalay, AS and Vandenberg, J. 2013. Galaxy Zoo: Motivations of citizen scientists. arXiv preprint arXiv:1303.6886. 

  60. Reges, HW, Doesken, N, Turner, J, Newman, N, Bergantino, A and Schwalbe, Z. 2016. CoCoRaHS: The evolution and accomplishments of a volunteer rain gauge network. Bulletin of the American Meteorological Society, 97(10): 1831–1846. DOI: https://doi.org/10.1175/BAMS-D-14-00213.1 

  61. Riesch, H and Potter, C. 2014. Citizen science as seen by scientists: Methodological, epistemological and ethical dimensions. Public Understanding of Science, 23(1): 107–120. DOI: https://doi.org/10.1177/0963662513497324 

  62. Rotman, D, Preece, J, Hammock, J, Procita, K, Hansen, D, Parr, C, Lewis, D and Jacobs, D. 2012, February. Dynamic changes in motivation in collaborative citizen-science projects. In Proceedings of the ACM 2012 conference on computer supported cooperative work, pp. 217–226. DOI: https://doi.org/10.1145/2145204.2145238 

  63. Roy, HE, Pocock, MJ, Preston, CD, Roy, DB, Savage, J, Tweddle, JC and Robinson, LD. 2012. Understanding citizen science and environmental monitoring: final report on behalf of UK Environmental Observation Framework. 

  64. Ryan, RL, Kaplan, R and Grese, RE. 2001. Predicting volunteer commitment in environmental stewardship programmes. Journal of Environmental Planning and Management, 44(5): 629–648. DOI: https://doi.org/10.1080/09640560120079948 

  65. Saldaña, J. 2015. The coding manual for qualitative researchers. Sage. 

  66. Sarabia, L, Ortiz, MC, Leardi, R and Drava, G. 1993. A program for non-orthogonal rotation in factor analysis. DOI: https://doi.org/10.1016/0165-9936(93)87061-2 

  67. Schmalz, B and Fohrer, N. 2009. Comparing model sensitivities of different landscapes using the ecohydrological SWAT model. Advances in Geosciences, 21. DOI: https://doi.org/10.5194/adgeo-21-91-2009 

  68. Shedekar, VS, King, KW, Fausey, NR, Soboyejo, AB, Harmel, RD and Brown, LC. 2016. Assessment of measurement errors and dynamic calibration methods for three different tipping bucket rain gauges. Atmospheric Research, 178: 445–458. DOI: https://doi.org/10.1016/j.atmosres.2016.04.016 

  69. Strauch, M, Kumar, R, Eisner, S, Mulligan, M, Reinhardt, J, Santini, W, Vetter, T and Friesen, J. 2017. Adjustment of global precipitation data for enhanced hydrologic modeling of tropical Andean watersheds. Climatic Change, 141(3): 547–560. DOI: https://doi.org/10.1007/s10584-016-1706-1 

  70. Sullivan, BL, Aycrigg, JL, Barry, JH, Bonney, RE, Bruns, N, Cooper, CB, Damoulas, T, Dhondt, AA, Dietterich, T, Farnsworth, A and Fink, D. 2014. The eBird enterprise: an integrated approach to development and application of citizen science. Biological Conservation, 169: 31–40. DOI: https://doi.org/10.1016/j.biocon.2013.11.003 

  71. Theobald, EJ, Ettinger, AK, Burgess, HK, DeBey, LB, Schmidt, NR, Froehlich, HE, Wagner, C, HilleRisLambers, J, Tewksbury, J, Harsch, MA and Parrish, JK. 2015. Global change and local solutions: Tapping the unrealized potential of citizen science for biodiversity research. Biological Conservation, 181: 236–244. DOI: https://doi.org/10.1016/j.biocon.2014.10.021 

  72. Upton, GJG and Rahimi, AR. 2003. On-line detection of errors in tipping-bucket rain gauges. Journal of Hydrology, 278(1–4): 197–212. DOI: https://doi.org/10.1016/S0022-1694(03)00142-2 

  73. van Griethuijsen, RA, van Eijck, MW, Haste, H, den Brok, PJ, Skinner, NC, Mansour, N, Gencer, AS and BouJaoude, S. 2015. Global patterns in students’ views of science and interest in science. Research in Science Education, 45(4): 581–603. DOI: https://doi.org/10.1007/s11165-014-9438-6 

  74. Vaske, JJ. 2008. Survey research and analysis: Applications in parks, recreation and human dimensions. Venture Publishing. 

  75. Vianna, GM, Meekan, MG, Bornovski, TH and Meeuwig, JJ. 2014. Acoustic telemetry validates a citizen science approach for monitoring sharks on coral reefs. PloS One, 9(4). DOI: https://doi.org/10.1371/journal.pone.0095565 

  76. Von Thaden, J, Manson, RH, Congalton, RG, López-Barrera, F and Salcone, J. 2019. A regional evaluation of the effectiveness of Mexico’s payments for hydrological services. Regional Environmental Change, 19(6): 1751–1764. DOI: https://doi.org/10.1007/s10113-019-01518-3 

  77. Walker, D, Forsythe, N, Parkin, G and Gowing, J. 2016. Filling the observational void: Scientific value and quantitative validation of hydrometeorological data from a community-based monitoring programme. Journal of Hydrology, 538: 713–725. DOI: https://doi.org/10.1016/j.jhydrol.2016.04.062 

  78. Watson, DF and Philip, GM. 1985. A refinement of inverse distance weighted interpolation. Geo-processing, 2(4): 315–327. 

  79. West, SE and Pateman, RM. 2016. Recruiting and retaining participants in citizen science: What can be learned from the volunteering literature? Citizen Science: Theory and Practice. DOI: https://doi.org/10.5334/cstp.8 

  80. Winchell, M, Srinivasan, R, Di Luzio, M, and Arnold, J. 2013. ArcSWAT interface for SWAT2012: user’s guide. Blackland Research and Extension Center, Texas Agrilife Research. Grassland, Soil and Water Research Laboratory, USDA Agricultural Research Service, Texas. 

  81. Wolkovich, EM and Cleland, EE. 2011. The phenology of plant invasions: a community ecology perspective. Frontiers in Ecology and the Environment, 9(5): 287–294. DOI: https://doi.org/10.1890/100033 

  82. Worthington, JP, Silvertown, J, Cook, L, Cameron, R, Dodd, M, Greenwood, RM, McConway, K and Skelton, P. 2012. Evolution MegaLab: a case study in citizen science methods. Methods in Ecology and Evolution, 3(2): 303–309. DOI: https://doi.org/10.1111/j.2041-210X.2011.00164.x 

  83. Wright, DR, Underhill, LG, Keene, M and Knight, AT. 2015. Understanding the motivations and satisfactions of volunteers to improve the effectiveness of citizen science programs. Society & Natural Resources, 28(9): 1013–1029. DOI: https://doi.org/10.1080/08941920.2015.1054976 

  84. Wunder, S. 2007. The efficiency of payments for environmental services in tropical conservation. Conservation Biology, 21(1): 48–58. DOI: https://doi.org/10.1111/j.1523-1739.2006.00559.x 
