Abstract
Several United States Department of Agriculture (USDA) conservation programs are promoting conversion of cropland to nonagricultural use, e.g., buffer establishment for wildlife habitat and the Conservation Reserve Program (CRP). These areas set aside for nonagricultural use may serve as barriers to mitigate runoff within a watershed where agricultural activities are conducted. CRP and wildlife buffer areas were established in Beasley Lake watershed (BLW), Sunflower County, Mississippi, United States, in 2003 and 2006, respectively. The objectives of this study were to assess catchment runoff and runoff water quality through both measured data and model simulations from three scenarios of land management in BLW from 2011 to 2017: row crop with no edge-of-field buffer (Crop), row crop fields adjacent to wildlife buffers (CropBuff), and CRP. Measured data were collected at the field scale from nine subcatchments. Median runoff from Crop sites was greater than that from CRP catchments (123 m3 ha−1 versus 39 m3 ha−1 across seasons and years). Median suspended solids loads in runoff were 108 kg ha−1 for Crop compared to 42 kg ha−1 for CropBuff (across seasons and years). Similar trends were observed for median total Kjeldahl nitrogen (TKN) and total phosphorus (TP) loads in runoff: TKN 0.26, 0.18, and 0.07 kg ha−1; and TP 0.06, 0.06, and 0.02 kg ha−1 for Crop, CropBuff, and CRP, respectively. For both CRP and CropBuff, the proportion of TKN and TP lost in runoff as soluble/fine particles (<0.45 μm) was greater than the proportion lost as particulate-bound TKN and TP from Crop. Overall, efficacy in mitigating runoff losses was generally in the order CRP > CropBuff > Crop. In addition to field measurements, watershed simulations were conducted with and without the practices using the Annualized Agricultural Non-Point Source (AnnAGNPS) pollution model.
AnnAGNPS simulations independently confirmed the measured results and accounted for differences in soil types among the land use practices that could have influenced measured results. These results demonstrated that wildlife buffer and CRP conservation practices helped improve and maintain the environmental quality of soil and water resources in BLW.
Introduction
Agriculture's challenge is maintaining the balance between accommodating the growing demands for food, fiber, and biofuel production and protecting water, soil, and biodiversity resources (Amundson et al. 2015; Assouline et al. 2015). For decades, government programs enacted through US farm policy have addressed the goals of agriculture (whether by surplus control or promoting production), protecting farmers from natural disasters through crop insurance subsidies, or mitigating the effects of agriculture on vulnerable natural resources. However, benefits of agricultural programs have not always been clearly quantifiable. Benefits of a management practice observed at small plot scales may not be clearly demonstrated at watershed scales. Several factors contribute to these discrepancies (Tomer and Locke 2011). For example, management practices should be strategically targeted in vulnerable areas within a watershed to maximize efficacy (Tomer and Locke 2011). Strategic targeting not only includes placement, but also optimum coverage of the practice within the targeted area. Improper placement of practices within a watershed would prevent proper assessment of benefits due to those practices. Another factor complicating assessment of management practice efficiency at watershed scales is the difficulty in delineating the contribution of a single practice to the outcome when multiple practices are implemented within a farm system.
Concern for the cost-effectiveness of agricultural programs and benefits to society led to increasing demand for better accountability (Mausbach and Dedrick 2004). The need for better accountability was underscored by substantial increases in funding levels for conservation programs in the 2002 Farm Bill (Farm Security and Rural Investment Act of 2002). In response, the USDA Natural Resources Conservation Service (NRCS) partnered with the USDA Agricultural Research Service (ARS) in 2003 to establish and lead the Conservation Effects Assessment Project (CEAP), a multi-institution collaboration with the goal of measuring environmental benefits of conservation practices at watershed and national scales.
Ever since the Dust Bowl era in the United States in the early part of the twentieth century, the USDA has used many programs such as the Soil Bank Program (Agricultural Act of 1956, P.L. 84-540) to promote conservation of soil and water resources (Locke et al. 2010). Within the last three decades, USDA conservation programs were established to promote the conversion of cropland to nonagricultural use. One such program administered by the USDA Farm Service Agency jointly with the USDA NRCS is the Conservation Reserve Program (CRP) enacted under the 1985 Farm Bill (Food Security Act of 1985) to remove environmentally sensitive land from agricultural production and implement resource-conserving plant cover in those areas for a period of 10 to 15 years. Vegetative land cover was reestablished in CRP areas, and expected outcomes included improved runoff water quality, reduced erosion, and enhanced wildlife habitat.
Many special practices were developed under the Continuous CRP (CCRP) in the 1996 Farm Bill (Federal Agriculture Improvement and Reform Act of 1996). One of those, implemented in 2004, is the “Habitat Buffers for Upland Birds” practice (CP-33), which, as the name implies, was promoted to remove field edges from agriculture and establish native grasses and legumes in these buffer areas (CRP CP-33 Fact Sheet). While the primary purpose of these buffer areas is to attract wildlife and provide wildlife habitat, buffers may also serve as a barrier to mitigate erosion and runoff from adjacent cropland.
The USDA ARS established a network of monitored watersheds to address goals of CEAP (Duriancik et al. 2008). One of these watersheds is Beasley Lake watershed (BLW), an oxbow lake watershed located in the state of Mississippi in the alluvial plain of the Mississippi River (Locke et al. 2008). Resource issues of concern in this region include water availability, soil erosion, water quality (including hypoxia), and agricultural sustainability. Conservation practices implemented in BLW (625 ha) include CRP (NRCS Practice 612) in approximately 18% of the area in 2003 and approximately 9 ha of bird habitat buffers (CP-33, NRCS Practice 601) along the edges of row crop fields in 2006.
A conventional hypothesis of edge-of-field assessments is that runoff water quality will improve under agricultural land management practices designed to intercept, process, and remove sediment and nutrients (Cullum et al. 2010; Locke et al. 2008). Practices such as tillage management (no-till, reduced till), cover crops, structural pipes and pads, and vegetated buffers have exhibited varying degrees of success in mitigating sediment and nutrients in runoff (Aryal et al. 2018; Baker et al. 2018; Cullum et al. 2010; Helmers et al. 2012; Lerch et al. 2015; Lizotte and Locke 2018; Locke et al. 2015). Objectives of this study were to assess impacts of land management practices in BLW on surface runoff, sediment, and nutrients utilizing measured data at the field scale and simulations with the AnnAGNPS watershed model, specifically focused on CRP and row crop areas with edge-of-field riparian buffer conservation practices, as compared to row crop subdrainages with no buffers. The AnnAGNPS watershed model has been utilized extensively in the Mississippi Alluvial Plain for simulation of runoff and sediment yield (Yuan et al. 2001, 2008), nitrogen (N) loading (Yuan et al. 2003), conservation practices (Yuan et al. 2002; Lizotte et al. 2017; Yasarer et al. 2017), and crop irrigation (Momm et al. 2019). Therefore, the AnnAGNPS model was utilized in this study to provide a parallel analysis of water quality reductions from conservation practices in addition to the measured data. The model simulations are able to evaluate the effects of changing conservation practices while keeping soil, climate, and topography constant.
Materials and Methods
Study Area. Beasley Lake watershed, Sunflower County, Mississippi, United States (33°24′15″ N, 90°40′05″ W), was selected as a CEAP watershed in 2003. Beasley Lake is an abandoned meander (oxbow lake) adjacent to and just south of the Big Sunflower River (figure 1), located within the Big Sunflower 8-digit hydrologic unit code (HUC) (08030207) and the Indian Bayou—Big Sunflower 10-digit HUC (0803020710). The watershed encompasses approximately 625 ha of land with low topography typical of the alluvial plain of the Lower Mississippi River Basin and is located within the Mississippi Delta Cotton and Feed Grains Region, Southern Mississippi River Alluvium (USDA NRCS 2006). Approximately 339 ha or 54% of the watershed was in intensive row crop production, with soybeans (Glycine max [L.] Merr.), corn (Zea mays L.), cotton (Gossypium hirsutum L.), sorghum (Sorghum bicolor [L.] Moench), and winter wheat (Triticum aestivum L.) produced during the study period of 2011 to 2017. Typical deltaic fluvial floodplain soil types and textures occur throughout the watershed, with soil textures ranging from sandy loam to clays. Major soil series in the watershed include Alligator (very fine, smectitic, thermic Chromic Dystraquerts); Bosket (fine-loamy, mixed, active, thermic Mollic Hapludalfs); Dowling (very fine, smectitic, nonacid, thermic Vertic Endoaquepts); Dundee (fine-silty, mixed, active, thermic Typic Endoaqualfs); Forestdale (fine, smectitic, thermic Typic Endoaqualfs); and Sharkey (very fine, smectitic, thermic Chromic Epiaquerts) (figure 1) (Locke et al. 2008; Lizotte et al. 2017; Lizotte and Locke 2018; USDA NRCS Soil Survey Staff 2018a, 2018b).
Nine runoff study subdrainage basins were established per methods of Cullum et al. (2010) with triplicate slotted-inlet drainage pipe locations in each of three land use categories: (1) three areas of conventional row crop production with no edge-of-field buffer (Crop), (2) three areas of conventional row crop production with established edge-of-field riparian vegetated buffers (CropBuff), and (3) three areas established in CRP (CRP).
Row crop production areas were planted in corn, soybean, or cotton and were managed with agricultural practices conventional to the region (table 1). Agricultural management decisions were made by farmers and typically included tillage and bed preparation in the fall after crop harvest, with tillage again in the spring prior to planting the next crop. Fertilizer, when applied, was added in the spring prior to planting a crop, and was usually knifed in. Crop-only subdrainage basins are delineated as follows from east to west in figure 1: Crop 1 (3.07 ha), Crop 2 (3.5 ha), and Crop 3 (4.8 ha); these were planted in cotton (2011 only) and/or soybean (2012 to 2017) row crops during the study period (table 1).
In CropBuff subdrainage basins that included a composite of both row crop and edge-of-field buffer vegetation, proportions of cropped area ranged from 34% to 70% of the drainage area. Within the row crop portion of the CropBuff subdrainage areas, soybean was the only crop planted during the 2011 to 2017 period (table 1). The buffer portion of the CropBuff areas was established in 2006, and vegetated buffers were planted to attract northern bobwhite quail (Colinus virginianus) according to NRCS practice CP-33. By 2010, buffer areas were colonized with a variety of annual and perennial vegetation, including eastern cottonwood (Populus deltoides Bartr. ex Marsh.) and various oak (Quercus sp.) trees, Alamo switchgrass (Panicum virgatum), Canadian horseweed (Conyza canadensis [L.] Cronquist), little bluestem (Schizachyrium scoparium), and goldenrod (Solidago spp.). CropBuff subdrainage basin areas are delineated as follows from east to west in figure 1: CropBuff 1 (2.6 ha), CropBuff 2 (4.3 ha), and CropBuff 3 (1.4 ha). Field observations indicated that runoff flowing into CropBuff sites was primarily sheet flow. Within CropBuff sites, flows were a combination of sheet flow and concentrated flow until topography produced preferential flows concentrating in small channels draining to the sampling site culvert.
Map of Beasley Lake watershed with varying land management subdrainage basin runoff sampling sites: conventional row crop production with no edge-of-field buffer (Crop; triangles) 1, 2, 3; buffers adjacent to row crop fields (CropBuff; circles) 1, 2, 3; and Conservation Reserve Program (CRP; squares) 1, 2, 3, from 2011 to 2017. US Geological Survey (USGS) gauges used in model validation are also shown at location B1 (grey circle) and B3 (same as CRP 3). The lower six images show runoff drainage basins in greater detail and their respective soil series.
Influence of Climate and Geography. Landscape, climate, and agricultural practices all contributed to runoff patterns in BLW. The terrestrial landscape in BLW is typical for the Mississippi Delta alluvial plain, with relatively flat topography (elevation above sea level ranges from 30.4 to 35.8 m across the watershed) and highly variable soils. Striated patterns for soils across the landscape are a result of stream meandering across the floodplain over millennia, with higher elevations associated with remnants of old river banks and depressions occurring with distance from the old river banks (Aslan and Autin 1999). Soils in this region are inherently low in organic matter due to the humid, warm climate, and are predominantly fine textured, slowly permeable, and highly erodible (Iqbal et al. 2005; Vanderford 1975). In subdrainage areas within BLW associated with this study, silt loam-silty clay loams are primarily Dundee and Forestdale (47% of land area), and clay-silty clays are predominantly Alligator, Dowling, and Sharkey (49% of land area) (Web Soil Survey, SSURGO). Because of level topography and low permeability, networks of ditches are required to accommodate drainage from agricultural fields (figure 1).
During periods of heavy rainfall, finely textured soils in BLW are vulnerable to erosion and loss of sediment, which moves through drainage ditches to the adjacent lake (Lizotte and Locke 2018). Every year of the study (2011 to 2017), fields were tilled and harrowed following harvest to prepare row beds for planting the next spring. Tillage leaves the field soil bare and vulnerable to heavier precipitation the next spring. This is in stark contrast to the CRP area where soils have not been tilled since 2004. Vegetative buffers in BLW provided a barrier mitigating sediment and nutrient loss.
CRP subdrainage basin runoff sampling sites were vegetated with eastern cottonwood, oak, and hickory (Carya sp.) trees planted from 2003 to 2004 (Cullum et al. 2010; Lizotte and Locke 2018; Locke et al. 2008). CRP subdrainage basin areas are delineated as follows from east to west in figure 1: CRP 1 (6.0 ha), CRP 2 (2.4 ha), and CRP 3 (0.8 ha).
Runoff Collection and Analysis. Runoff events occurring in study subdrainage basins within each land use category from October of 2011 to December of 2017 were measured and sampled according to methods described in Cullum et al. (2010) and Lizotte and Locke (2018). Water sampling occurred throughout the year whenever runoff was generated. Within each subdrainage, runoff was measured and surface runoff water was sampled from a slotted-inlet drainage pipe equipped with a Doppler area-velocity sensor for flow measurement integrated with an ISCO automatic electronic composite water sampler (Teledyne ISCO, Lincoln, Nebraska). Composite water sampler programming automatically triggered surface runoff water collection when flow was detected. Samples collected were flow proportional with initial sampling occurring when water volumes exceeded thresholds based upon subdrainage area size. Samplers were set to initiate based upon 0.4 mm of water falling per 0.4 ha. As a result, sample collection volume intervals ranged from approximately 3 to 23 m3 for subdrainage areas from 0.8 to 6.0 ha. Surface runoff water samples were retrieved, preserved on wet ice at 4°C and immediately transported to the USDA ARS National Sedimentation Laboratory, Oxford, Mississippi, for sample processing and laboratory water quality analysis. Runoff to slotted-inlet drainage pipes flowed based on topography of the landscape, which may be influenced by producers' management practices. Sheet flow is deterred from entering the ditch directly due to a slightly elevated earthen berm that directs water to the inlet pipe. Rainfall was measured using a tipping bucket-style rain gauge with an accuracy of 0.025 mm.
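The sampler trigger setting described above implies a simple proportional relationship between subdrainage area and the runoff volume per composite subsample; a minimal sketch of that arithmetic (the function name is ours, not from the study):

```python
def trigger_volume_m3(area_ha, trigger_depth_mm=0.4):
    """Runoff volume (m^3) between composite subsamples: the sampler
    initiates on 0.4 mm of water per 0.4 ha, i.e., a 0.4 mm depth
    equivalent over the whole subdrainage area."""
    area_m2 = area_ha * 10_000          # 1 ha = 10,000 m^2
    depth_m = trigger_depth_mm / 1_000  # mm -> m
    return depth_m * area_m2            # m^3 per sampling interval

# Smallest and largest subdrainages in the study:
v_small = trigger_volume_m3(0.8)  # ≈3.2 m^3 (reported as ~3 m^3)
v_large = trigger_volume_m3(6.0)  # ≈24 m^3 (reported as ~23 m^3)
```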
Farm management in Beasley Lake watershed subdrainage areas of interest with row crops (row crop without buffer [Crop] and row crop with buffer [CropBuff]) from 2011 to 2017.
Surface runoff water samples were processed and analyzed using the following American Public Health Association standard procedures (APHA 2005). Unfiltered samples were subsampled and analyzed for total suspended solids (TSS) by drying at 103°C to 105°C. TSS was measured in runoff due to its potential for causing a wide range of environmental impacts, including effects on algae, aquatic invertebrates, and fish, as well as transport of other sorbed contaminants into receiving aquatic systems (Davies-Colley and Smith 2001). When water volume was sufficient, additional subsampled water was analyzed for nutrients as described by Lizotte and Locke (2018). Nutrients, N and phosphorus (P), were measured in runoff due to their strong potential for exacerbating eutrophication (Baulch 2013). Briefly, unfiltered samples were analyzed for total phosphorus (TP) and total Kjeldahl nitrogen (TKN) after digestion using sulfuric acid (H2SO4) with mercuric oxide (HgO) and potassium sulfate (K2SO4) using a Lachat QuickChem 8500 Series II auto analyzer (Lachat Instruments, Hach Company, Loveland, Colorado) with detection limits (DL) of 0.015 mg L−1 and 0.018 mg L−1 for TP and TKN, respectively. Samples filtered through a 0.45 μm cellulose nitrate filter were analyzed for dissolved solids and soluble N and P, including: nitrate-nitrogen (NO3-N, DL = 0.002 mg L−1), nitrite-nitrogen (NO2-N, DL = 0.002 mg L−1), ammonium-nitrogen (NH4-N, DL = 0.003 mg L−1), and soluble reactive phosphate (PO4-P, DL = 0.001 mg L−1) (APHA 2005). Filtered samples also were analyzed for TP and TKN after digestion as described above for unfiltered samples. As with TP and TKN, all soluble nutrient constituents were analyzed using the Lachat QuickChem 8500 Series II auto analyzer. Total (digestible) N and P were categorized into the following fractions.
Overall, total TKN and TP were each calculated as the sum of a soluble to fine particulate phase (<0.45 μm) and a coarse particulate solid phase (>0.45 μm). The soluble to fine particulate phase fraction was measured in water with particulates <0.45 μm (total dissolved solids [TDS]). The coarse particulate solid phase fraction (particles >0.45 μm) was obtained as the difference between the total and the soluble to fine particulate phase fraction (APHA 2005). Total dissolved organic carbon (TDOC, DL = 0.004 mg L−1) was determined by filtering water through a 0.45 μm filter and analyzing it on a Shimadzu TOC-L analyzer (Shimadzu Corporation, Kyoto, Japan). TDOC was measured in runoff due to its potential to affect primary productivity and nutrient bioavailability in aquatic systems (Brett et al. 2017).
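The fraction bookkeeping just described, with the coarse particulate fraction obtained by difference between the unfiltered and filtered digests, can be sketched as follows; the concentrations are hypothetical, for illustration only:

```python
def partition_nutrient(total_mg_L, filtered_mg_L):
    """Split a digestible nutrient (TKN or TP) into the soluble/fine
    (<0.45 um, measured on the filtered sample) and coarse particulate
    (>0.45 um, total minus filtered) fractions."""
    coarse = max(total_mg_L - filtered_mg_L, 0.0)  # clip small negatives
    return filtered_mg_L, coarse

# Hypothetical TP result (mg/L): 0.30 total, 0.12 in the filtered sample.
soluble_fine, coarse = partition_nutrient(0.30, 0.12)
```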
Runoff Data Analysis. For data analysis, loads were calculated for each water quality constituent using measured concentration and discharge volume. Concentrations below DLs were assigned a value of one-half the DL (e.g., PO4-P, DL = 0.001 mg L−1 × ½ = 0.0005 mg L−1) for analysis. Final water quality data are combined for all years (2011 to 2017) and reported as loads per unit area as determined by total discharge of each sampled runoff event within each subdrainage basin as previously described. Runoff data from Crop subdrainage basins were used as negative controls (business-as-usual farming practices) relative to CRP and CropBuff for the purposes of assessing buffer effectiveness. To address potential seasonal effects on runoff, each year was divided into three four-month sets: February to May, representing winter to early spring; June to September, representing late spring to early fall; and October to January, representing early fall to early winter. The rationale used for dividing the years into these four-month “seasons” included farming operations and rainfall and temperature patterns. In this region, February through May can be described as winter to early spring and is a period when farming operations phase from fallow conditions to the season for planting of crops and early row crop establishment. This period includes most of the pesticide and fertilizer applications and spring tillage operations. June through September, or late spring to early fall, is the major portion of the growing season for row crops. This season is characterized by the highest temperatures and lowest rainfall (figure 2) for the year, and row crops in the region often require irrigation. October through January, early fall to early winter, is the transition period from crop harvest to fallow season. Farming operations during this time include tillage after harvest, crop row and bed preparation, and planting of cover crops (cover crops not applicable in the present study).
Rainfall patterns during this period transition from low precipitation in October and November to higher in December and January (figure 2).
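The event-load calculation and half-DL substitution described above amount to the following arithmetic; this is a hedged sketch with made-up numbers, since the actual data-processing code is not part of the study:

```python
def event_load_kg_per_ha(conc_mg_L, discharge_m3, area_ha, dl_mg_L=None):
    """Per-area constituent load (kg/ha) for one runoff event.
    Concentrations below the detection limit are replaced with one-half
    the DL, as in the text (e.g., PO4-P: 0.001 -> 0.0005 mg/L)."""
    if dl_mg_L is not None and conc_mg_L < dl_mg_L:
        conc_mg_L = 0.5 * dl_mg_L
    # mg/L * m^3 * (1000 L/m^3) = mg; mg / 1e6 = kg, so divide by 1000.
    load_kg = conc_mg_L * discharge_m3 / 1000.0
    return load_kg / area_ha

# Illustrative event: 150 mg/L TSS, 500 m^3 discharge, 3.0 ha subdrainage.
load = event_load_kg_per_ha(150.0, 500.0, 3.0)  # 25.0 kg/ha
```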
Gaps in sampling at one or more runoff sampling sites occurred throughout the 2011 to 2017 period of study. Various reasons included malfunction of equipment, flooding of sample sites, and inaccessibility due to landowner restrictions, particularly during hunting seasons. On average, 25% of events were not sampled in the CropBuff and Crop sites, and 38% of events were missing from CRP sites. CRP was more affected by landowner restrictions during hunting seasons and flooding. As a result, runoff sampling data were not assessed as annual loads or summed mass values across the entire study period (Aryal et al. 2018), but rather examined seasonally using nonparametric analyses to minimize the influence of outliers and missing data as described below.
For statistical analysis, Kruskal-Wallis One Way Analysis of Variance (ANOVA) on Ranks with Dunn's Multiple Comparison test was conducted due to uneven sample sizes and the nonparametric nature of the water quality data (Burton and Pitt 2002). Analysis of variance was conducted separately for water quality parameters measured and reported in two different units. First, an ANOVA was conducted on parameters reported in units of load (solids and nutrients [kg ha−1], and water volumes [m3 ha−1]). Because runoff literature has sometimes reported values in concentrations in conjunction with or instead of loads (Helmers et al. 2012; Lerch et al. 2015; Aryal et al. 2018; Harmel et al. 2018; Lizotte and Locke 2018), a second ANOVA was conducted on the same parameters (except water volumes) reported in units of concentration (mg L−1). Data were sorted by season (as described previously) and land management for statistical analyses. An ANOVA on Ranks was conducted across land management within season. Next, an ANOVA on Ranks across seasons within management practices was conducted. If p-values were marginally statistically significant (Wasserstein and Lazar 2016), greater than 0.05 and less than 0.15, a one-way ANOVA on rank-transformed data was used (Conover and Iman 1981; Locke et al. 2013), and a Tukey's Multiple Comparison was used to test for differences among groups. For rank transformation, all data within a defined variable across all treatments are listed from lowest to highest value, and each value is assigned an integer rank. Tied values share an average rank, which is the average of the ranks that would have been assigned to the tied values had they not been tied (Conover and Iman 1981).
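The tie handling described for the rank transformation is the standard "average" method; a brief sketch using SciPy (the load values below are hypothetical, not study data):

```python
import numpy as np
from scipy.stats import kruskal, rankdata

# Hypothetical TKN loads (kg/ha) for three land managements within one season.
crop     = [0.31, 0.26, 0.22, 0.26]
cropbuff = [0.18, 0.15, 0.22]
crp      = [0.07, 0.05, 0.09]

# Kruskal-Wallis one-way ANOVA on ranks across land managements.
H, p = kruskal(crop, cropbuff, crp)

# Rank transformation with tied values sharing an average rank:
# 0.22 and 0.26 each appear twice in the pooled data.
pooled = np.array(crop + cropbuff + crp)
ranks = rankdata(pooled)  # method="average" is the default
```

`rankdata` assigns the two 0.22 values the average of ranks 6 and 7 (6.5) and the two 0.26 values the average of ranks 8 and 9 (8.5), exactly as described in the text.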
Beasley Lake AnnAGNPS Simulation Development. The Annualized Agricultural Non-Point Source pollution model (AnnAGNPS) provides a continuous, spatially distributed simulation of water, sediment, and chemical transport in agricultural watersheds (Bingner et al. 2015). AnnAGNPS integrates technology from the Revised Universal Soil Loss Equation (RUSLE) to simulate effects of crops and vegetation, farm operations, and management practices on soil disturbance and to estimate sheet and rill erosion (Renard et al. 1997). Runoff is estimated using the curve number method, accounting for differences between dry and wet antecedent conditions. The Hydro-geomorphic Universal Soil Loss Equation (HUSLE) is used to calculate the delivery ratio relating sediment yield from erosion to sediment delivered to the stream (Theurer and Clarke 1991). Chemical routing considers N and P in both the soluble and adsorbed phases, and field-scale routines consider the uptake of N and P by plants, application of fertilizers, residue decomposition, and downward movement of chemicals. Equations for these processes are derived from a variety of sources such as the Environmental Policy Integrated Climate (EPIC), Trace Element Transport (TETRANS), and RUSLE models (Bingner et al. 2018). The AnnAGNPS model was chosen for use in this study as it has been validated in several watersheds within the Mississippi Delta. A full description of AnnAGNPS can be found in Bingner et al. (2018).
Seasonal temperature and precipitation in Beasley Lake watershed during the study period from 2011 to 2017 (USDA NRCS 2018).
The Beasley Lake model simulation utilized a 1.5 m resolution hydrologically corrected LiDAR digital elevation model to generate the watershed boundary, cells, flow paths, and cell slopes. Soil shapefiles and characteristics were derived from the SSURGO database (USDA NRCS Soil Survey Staff 2018a). Climate information during the simulation period was from the USDA NRCS SCAN climate station within BLW (USDA NRCS 2018). Other inputs for Beasley Lake simulations were derived from long-term data sets collected by the USDA ARS National Sedimentation Laboratory. Land use and associated management practices were recorded for each field from 1996 to 2018. Average fertilizer application rates were calculated for each crop type and applied across the watershed. Variation in tillage practices was simulated for each crop and each year to represent the reality of the practices within the watershed. Dates for planting, tilling, fertilizing, and harvesting were typical dates determined from the 22-year record.
Runoff was validated for two locations (B1 and B3) within the watershed using data collected by the US Geological Survey between 2000 and 2002 (USGS 2010). Event-based Nash-Sutcliffe Efficiency and percentage bias values of 0.66 and −15% for location B3, and 0.51 and −3.6% for location B1 were determined (figure 1). Manual calibration of runoff at B1 and B3 determined that the default AnnAGNPS values produced the best representation of the runoff events. Historical and alternative scenarios were run from 1996 to 2014 to include the background period before CRP and grass buffers were implemented. Model scenario results were then analyzed utilizing annual average loads from 2003 to 2014, which includes the time period when both CRP and CropBuff were in place within the watershed. Alternative scenarios with CRP and CropBuff converted to cropland were also simulated, and the difference between historical and alternative scenarios was assessed. The scenarios with and without CRP and buffer practices allow the effects of the practices alone to be isolated, while keeping soil, climate, and background conditions constant, as these factors may differ among the field subcatchments. Also, as the AnnAGNPS model was not calibrated with the runoff data utilized in this study, model simulation results provide an independent source for comparison with measured results.
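The event-based goodness-of-fit statistics cited above can be reproduced with the standard formulas sketched below; note that the sign convention for percentage bias varies among authors (under the convention used here, negative values mean the model over-predicts), and the observed/simulated values are hypothetical:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit; 0 means the model
    is no better than the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias_percent(obs, sim):
    """Percentage bias, 100 * sum(obs - sim) / sum(obs); negative values
    indicate over-prediction under this convention."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Hypothetical event runoff depths (mm), observed vs. simulated:
obs = [12.0, 5.0, 30.0, 8.0]
sim = [14.0, 6.0, 27.0, 9.0]
print(round(nse(obs, sim), 2), round(pbias_percent(obs, sim), 1))  # 0.96 -1.8
```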
Results and Discussion
Weather and Measured Hydrology. The climate in this region is humid subtropical, and the long-term (1971 to 2017) average annual, summer, and winter temperatures measured at Moorhead, Sunflower County, Mississippi (14 km from BLW), are reported in table 2 (NOAA 2018). Average annual temperatures at BLW during the study period (2011 to 2017) were comparable to long-term averages (table 2). However, annual precipitation during the study period was almost 200 mm lower than the long-term average, and this trend was apparent for all seasons (February to May, June to September, and October to January).
Seasonal weather patterns for BLW for each year during the study period of interest (2011 to 2017) are presented in figure 2. Average temperatures during June through September did not vary greatly among years during the study period. Average temperatures during February through May were the lowest from 2013 to 2015, while October through January temperatures were highest from 2015 to 2016. Average precipitation during October through January was highest in 2012 and 2015, and precipitation during February through May was highest from 2013 to 2016. During the spring of 2016, precipitation in the region resulted in the Sunflower River backflowing into the BLW. Precipitation at BLW in spring of 2016 was 653 mm (figure 2) compared to averages for that season during the study period (478 mm, figure 2) and long term (524 mm, table 2). Overflow of the lake banks flooded several sites with runoff gauges or rendered them inaccessible for sampling. Damage to equipment at the sampling sites and slow receding of floodwaters resulted in the loss of data for several months in 2016. This was the first time that the elevation of the Sunflower River was high enough to flood BLW since USDA ARS began monitoring activities in the watershed in 1995.
Comparison of averages (± standard deviation) for selected weather data parameters at the Beasley Lake watershed site during the study period (2011 to 2017) with long-term (1971 to 2017) averages for weather data from a nearby location (USDA NRCS 2018; NOAA 2018).
Typical of watershed studies, runoff results over seven years (January of 2011 to December of 2017) were highly variable within and among watershed practices, with runoff volumes ranging widely among seasons and years. However, within treatments, no differences in runoff were observed among years (p > 0.15), so data over years were pooled to provide a comparison among landscape management areas (Crop, CropBuff, and CRP) for each season of the year (February to May, June to September, and October to January).
For all three seasons, the general pattern was that the greatest runoff came from Crop areas, followed by CropBuff, with the least runoff from CRP (table 3). Low runoff from the CRP areas was attributed to greater water infiltration, with overland flow impeded by the dense vegetation that had developed in the years since CRP establishment in 2004. However, interpretation of the runoff patterns with respect to the other two land management practices (Crop and CropBuff) was complicated by confounding factors during some years. For example, during the 2016 and 2017 harvest seasons (October to January), average precipitation was 310 mm and 265 mm, respectively (figure 2), compared to the study period harvest season average of 433 mm (table 2). In October of 2016, total precipitation was only 0.30 mm. Although soil moisture was not measured consistently, it is logical to conclude that soil moisture levels were also low with such limited precipitation. Low soil moisture levels together with soil tillage in the fall would have resulted in more water infiltration and, thus, less runoff in the Crop areas. Additionally, access to sampling sites was particularly limited by landowner restrictions during the hunting season (October to January) in the 2016 and 2017 harvest seasons, which resulted in under-sampling of events during this time period.
Another notable exception to typical runoff patterns occurred in 2014. Based on long-term mean precipitation (table 2), lower quantities of runoff would be anticipated during the June through September season. However, in the first week of June of 2014, 8.45 cm of rainfall over the course of three days coincided with excessive runoff from one Crop area relative to other years during that period at that site. This precipitation and the resulting runoff produced high median runoff in the Crop areas for that period (table 3). During the 12 days preceding the three-day rainfall event, 10.1 cm of rain fell. If the soil was already wet from this earlier rainfall, it is feasible that the soil quickly became saturated, resulting in higher runoff.
Irrigation applications also influenced runoff in the Crop and CropBuff areas. There was a series of runoff events in July of 2014 for one of the Crop sites (Crop 2; figure 1), but no recorded rainfall for some of those events. These events may reflect runoff from irrigation, but this is uncertain because records of irrigation events were inconsistent. Other Crop sites (Crop 1 and Crop 3) did not register as much runoff during those dates; therefore, runoff from Crop 2 may bias the average for the June to September period. Similarly, one factor that may have contributed to greater runoff and reduced buffer efficiency was concentrated flow through wheel tracks from the center pivot irrigation units, particularly in CropBuff 1 (figure 1). For center pivot irrigation systems, ruts are cut to facilitate movement of wheels as the irrigation system circulates. These ruts are apparent in LiDAR imagery collected in 2018 and show a slightly lower elevation where the wheels pass through the buffer. Concentrated surface flow through the diminished vegetation along the wheel track rut may have impaired the buffer's ability to limit runoff and contributed to higher runoff in CropBuff 1 than in the other two CropBuff sites, which did not have irrigation wheel ruts.
Median and interquartile range (IQR 25th to 75th percentile) for total runoff (m3 ha−1), and runoff water quality parameter concentrations (mg L−1) and loads (kg ha−1): total dissolved organic carbon (TDOC), total suspended solids (>0.45 μm) (TSS), and total dissolved solids (<0.45 μm) (TDS) from row crop without buffer (Crop), row crop with buffer (CropBuff), and Conservation Reserve Program (CRP) subdrainage areas for each season (2011 to 2017).
Measured Runoff Water Quality. Similar to the runoff data, loads of solids measured in runoff were highly variable within and among land management practices. Total suspended solids rarely differed among years, and there was no clear trend in median suspended solids over time. Sediment loss in runoff tended to follow the general pattern of total runoff, with losses greatest in Crop, followed closely by CropBuff, and the least sediment loss from CRP.
Runoff loads (kg ha−1) of TSS from Crop areas with no buffer were higher than those from the other two management areas during the February to May period but were similar to those from CropBuff in the June to September and October to January seasons (table 3). For the fall to winter and winter to spring seasons, concentrations (mg L−1) of TSS in runoff were highest from Crop sites and lowest from CRP sites. Although the TSS results indicated that edge-of-field buffers and, particularly, CRP management were effective in trapping runoff solids, runoff loss of the smaller-sized (<0.45 μm) particles (TDS) did not follow the same pattern as TSS (table 3). For all seasons, both concentrations and loads of TDS were lowest from CRP areas and highest from CropBuff areas (table 3).
Though not always consistent, a similar pattern was observed for TDOC, a constituent of the TDS fraction. Concentrations and loads of TDOC tended to be higher in runoff from CropBuff areas than from the other two management practices (table 3). Higher TDS and TDOC and lower TSS in runoff from CropBuff may result from the vegetative buffer trapping larger particles while finer-sized constituents pass through. Although finer-sized particles may be lost in runoff despite the presence of a vegetative buffer, the relative proportion of TDS to the total amount of solids lost is small. When ratios of TDS to TSS loads lost in runoff are compared across the three management practices, the ratio increases dramatically for CropBuff and CRP (0.046 in Crop versus 0.159 and 0.542 for CropBuff and CRP, respectively). The implication for CRP is that the vegetation (100% of the area) and increased infiltration reduce runoff, and that the finer-sized (likely organic) fraction accounts for a load roughly half that of the coarser fraction. This was observed to a lesser extent in CropBuff, where the percentage of the area under dense vegetation is much lower than in CRP (30% to 66%). In both CRP and CropBuff, labile organic material from decaying vegetative residues likely contributes to the organic material in runoff.
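The ratio comparison above is simple arithmetic on median loads. A minimal Python sketch illustrates it; the load pairs below are hypothetical, chosen only to reproduce the ratios quoted in the text, and are not values from the study tables:

```python
# Ratio of fine (<0.45 um) to coarse (>0.45 um) solids lost in runoff.
def tds_tss_ratio(tds_load, tss_load):
    return tds_load / tss_load

# Hypothetical TDS and TSS load pairs (kg ha-1) chosen to reproduce the
# ratios reported in the text (0.046, 0.159, and 0.542).
examples = {"Crop": (4.6, 100.0), "CropBuff": (15.9, 100.0), "CRP": (54.2, 100.0)}
for practice, (tds, tss) in examples.items():
    print(f"{practice}: TDS:TSS = {tds_tss_ratio(tds, tss):.3f}")
```

As the share of a subdrainage under dense vegetation grows, the ratio shifts toward the fine fraction, consistent with coarse particles being trapped by vegetation.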
Nitrate-N was the predominant soluble form of N found in runoff (table 4). Generally, the pattern for greatest to least NO3 loads and concentrations in runoff was in the order of Crop, CropBuff, and CRP, respectively (table 4). Ammonium-N in runoff followed a pattern of the greatest loads and concentrations in Crop and CropBuff and the least in CRP (table 4). Total N includes all soluble and solid phase components of N in the runoff (table 5). Runoff from Crop areas tended to have the most TN loss (loads and concentrations), followed by CropBuff and CRP. This trend was also observed in the TKN associated with the solid phase (>0.45 μm), but was less consistent for the TKN in the soluble or fine particle (<0.45 μm) phase of runoff (table 5).
For P components of runoff, patterns were inconsistent. Soluble P (PO4-P) concentrations in runoff tended to be higher in the CropBuff and CRP areas than in the Crop areas, particularly in the wetter seasons (table 4). However, for TP, both runoff loads and concentrations were higher in Crop and CropBuff than in CRP (table 6). Similar trends were observed for TP associated with the solid fraction (>0.45 μm) (table 6). Consistent with soluble PO4, concentrations of TP associated with the soluble or fine particle (<0.45 μm) phase of runoff tended to be higher in CropBuff and CRP and lower in Crop (table 6).
Analyzing ratios of nutrient components in runoff may provide further insight into the effect of management practices. Typically, a soluble phase to solid phase ratio greater than 1 indicates that nutrient runoff is predominantly in the soluble to fine particle phase, whereas a ratio less than 1 indicates that runoff is more in the solid to coarse particle phase, in association with elevated TSS. For both concentrations and loads, the ratio of the TKN soluble and fine particle phase to the TKN solid phase increased in the order Crop < CropBuff < CRP (e.g., for loads in February to May, 0.92 < 1.87 < 7.21, respectively). Similarly, for concentrations and loads of TP, the ratio of the TP soluble and fine particle phase to the TP solid phase also increased in the order Crop < CropBuff < CRP (e.g., for loads in February to May, 0.17 < 0.67 < 2.00). This pattern was consistent for the February to May and October to January seasons for both TKN and TP, but was less consistent in the drier June to September season when runoff was lower. The implications of this observation are likely similar to those described previously for solids lost in runoff.
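The threshold interpretation described above can be sketched as a small helper. The ratios are the February to May load ratios quoted in the text; the classify() function and its labels are our own illustration, not part of the study's analysis:

```python
# Classify nutrient transport mode from the soluble:solid phase ratio.
def classify(ratio):
    # Ratio > 1: predominantly soluble/fine-particle (<0.45 um) transport;
    # ratio < 1: predominantly solid/coarse-particle transport (elevated TSS).
    return "soluble/fine" if ratio > 1.0 else "solid/coarse"

# February to May load ratios quoted in the text.
tkn = {"CRP": 7.21, "CropBuff": 1.87, "Crop": 0.92}
tp = {"CRP": 2.00, "CropBuff": 0.67, "Crop": 0.17}
for practice in ("CRP", "CropBuff", "Crop"):
    print(f"{practice}: TKN {classify(tkn[practice])}, TP {classify(tp[practice])}")
```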
Summarizing the analysis of ratios for runoff constituents, CRP nutrient runoff was predominantly in the soluble to fine particle phase and not the solid coarse particle phase (i.e., ratio >1). Crop nutrient runoff consisted more of the solid to coarse particle phase in association with elevated TSS (i.e., ratio <1), and constituents in CropBuff runoff were intermediate to that of Crop and CRP. The increase in the ratio of soluble and fine particle nutrient fraction to that of the solid fraction as the proportion of the drainage area was covered with vegetation (Crop < CropBuff < CRP) is indicative of nutrients associated with larger particles being trapped or settling out due to vegetation impeding runoff flow. In addition, a greater proportion of nutrients transported in runoff from CRP and CropBuff may be derived from more labile organic material such as decaying vegetative residue in those areas.
Another relationship of interest among runoff water quality constituents is the ratio of NO3-N to PO4-P. This ratio for both concentrations and loads in runoff increased in the order CRP < CropBuff < Crop (e.g., for loads in February to May, 0.45 < 0.68 < 5.86). This pattern was consistent for all seasons. The implication is that the areas with more cropland (Crop > CropBuff) could be contributing more labile N in runoff.
For many water quality parameters, there was a seasonal effect on concentrations in runoff for Crop and CropBuff (data not shown). Concentrations of TSS, NO3, TDOC, TKN solid, TKN soluble, TN, and TP solid in runoff from Crop and CropBuff areas were higher during the winter to spring season than during the summer to early fall season. The same trend was observed for TSS, TDOC, TKN soluble, and TP solid in runoff from CRP areas. For runoff loads from CRP areas, there was a consistent pattern of higher loads during the winter to spring season than during the summer to early fall season for all water quality parameters (TSS, TDS, NO3, PO4, NH4, TDOC, TKN solid, TKN soluble, TN, TP solid, and TP soluble). However, no seasonal patterns were observed for loads of water quality constituents from Crop and CropBuff areas. Variability in runoff likely contributed to the lack of a seasonal effect in the loads of water quality parameters lost in runoff from Crop and CropBuff areas, even though a seasonal effect was observed in water quality concentrations. Seasonal effects in CRP water quality loads were likely due to consistently low runoff volumes during the summer to early fall period, resulting from low precipitation, dry soil, and high infiltration, and to higher runoff during the winter to late spring period, driven by higher precipitation and more saturated soils.
AnnAGNPS Simulated Runoff and Water Quality. The historical simulation of Beasley Lake found that average annual unit area loads were higher for cropland than for subbasins with grass buffers or CRP land use (table 7). Sediment load reductions were 20% and 99% in grass buffers and CRP, respectively. Similarly, TN load reductions were 38% and 91% in grass buffers and CRP, respectively. Total P load reductions were 12% in the grass buffers and 70% in CRP. Large simulated changes in P between cropland and the conservation practices were likely absent because watershed soils are naturally rich in P (Shields et al. 2009).
Median and interquartile range (IQR 25th to 75th percentile) for soluble runoff water quality parameter concentrations (mg L−1) and loads (kg ha−1): nitrate-N (NO3-N), ammonium-nitrogen (NH4-N), and phosphate-phosphorus (PO4-P) from row crop without buffer (Crop), row crop with buffer (Crop-Buff), and Conservation Reserve Program (CRP) subdrainage areas for each season (2011 to 2017).
Comparing unit area loads from different land uses demonstrated the difference in expected nutrient and sediment loads from subwatersheds under current conditions. However, these results do not show the full effects of applying conservation practices, as variability in topography and soil conditions between subbasins can complicate comparisons of nutrient runoff results. Alternatively, "what if" simulations can be performed that convert CRP and grass buffer land back to cropland and assess the difference in loads between the scenario with conservation practices and the scenario with only cropland. When CRP land was converted back to cropland, runoff increased an average of 223 mm, or 41.8%; total suspended sediment, TN, and TP increased 99.8%, 93.7%, and 62.8%, respectively. When grass buffers were converted back to cropland, runoff increased an average of 172 mm, or 30.7%; total suspended sediment, TN, and TP increased 33.2%, 46.5%, and 34.7%, respectively. The AnnAGNPS simulations with and without conservation practices provide a point of comparison for the measured data and also remove the differences in topography, soil, and microclimate that are present in the observed runoff results from the subbasins.
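The percentage increases reported for these "what if" runs follow from comparing the with-practice and without-practice loads. A short Python sketch shows the computation; the baseline runoff depth below is hypothetical, chosen only to reproduce the reported 223 mm (41.8%) increase:

```python
# Percentage increase in a load or flow when a practice is removed
# (i.e., CRP or buffer land converted back to cropland).
def pct_increase(with_practice, without_practice):
    return 100.0 * (without_practice - with_practice) / with_practice

# Hypothetical runoff depths (mm): 533 with CRP, 756 without --
# a 223 mm difference, or about 41.8%, matching the reported values.
print(f"Runoff increase after CRP removal: {pct_increase(533.0, 756.0):.1f}%")
```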
Land Management Effectiveness. The current study clearly demonstrated improved water quality in runoff from CRP. Sediment yield reductions of >90% relative to Crop management were observed, an improvement over an earlier study by Cullum et al. (2010), which reported 36% to 85% reductions in sediment yield from the same watershed measured more than five years earlier. Such results indicate improved efficiency for sediment yield reductions with CRP and provide valuable information about the benefits of long-term CRP establishment (Harmel et al. 2018). CropBuff management reduced sediment yield, but mitigation of sediment loss was less effective than with CRP, due, in part, to the smaller buffer area and the interaction of crop management and buffer at the edge of field (Helmers et al. 2012). This interaction, together with variation in the ratio of cropped area to buffer area among the CropBuff subdrainages (34% to 70%), increased variation in measured sediment yields and reduced the ability to assess the effectiveness of these smaller edge-of-field buffers. Continued runoff monitoring in CropBuff areas will be valuable in assessing long-term changes in efficiencies (Harmel et al. 2018) and/or additional management practices, such as tillage management (no-till or reduced till), stacked or integrated with CropBuff management (Tomer 2018).
Median and interquartile range (IQR 25th to 75th percentile) for runoff water quality parameter concentrations (mg L−1) and loads (kg ha−1): total nitrogen (TN), total Kjeldahl N (TKN), TKN solid fraction, and TKN soluble from row crop without buffer (Crop), row crop with buffer (CropBuff), and Conservation Reserve Program (CRP) subdrainage areas for each season (2011 to 2017).
Similar to sediment reductions, mitigation of the nutrients N and P in the current study was significant in CRP relative to Crop management. Nitrogen load reductions ranged from 56% to 100% for NO3-N and TN, and TP load reductions ranged from 69% to 100%. These load reductions were comparable with those observed by Cullum et al. (2010) from the same watershed measured more than five years earlier, who noted reductions of 71%, 45%, and 97% for NO3-N, TN, and TP, respectively. The comparison indicates modestly improved efficiency for nutrient load reductions with increased longevity of CRP, providing useful information about achieving steady state conditions with respect to nutrient mitigation. Although reductions in nutrient loads from CropBuff management areas relative to Crop management areas were often observed, results were mixed, due, in part, to the more soluble and transportable nature of nutrients through buffers (Lizotte and Locke 2018). Several previous studies at the plot and farm level have documented the efficacy of vegetated buffers, no-till, and other structural conservation practices in mitigating topsoil erosion and associated TSS in runoff (Zeimen et al. 2006; Liu et al. 2008; Schreiber et al. 2001), and USDA NRCS has specifically designed several of these practices to reduce erosion and TSS transport (USDA NRCS 2019). While physical structural barriers such as vegetated buffers are effective at mitigating TSS, soluble nutrients can vary substantially with time of year (winter versus summer), biological activity (senescent versus actively growing plants), bioavailability, and even release of the more water-soluble nutrient species from plant senescence (Kröger et al. 2007).
Median and interquartile range (IQR 25th to 75th percentile) for water quality parameter concentrations (mg L−1) and loads (kg ha−1): total phosphorus (TP), TP solid fraction, and TP soluble fraction from row crop without buffer (Crop), row crop with buffer (CropBuff), and Conservation Reserve Program (CRP) subdrainage areas for each season (2011 to 2017).
Watershed Sustainability. Watershed-scale assessment of environmental sustainability requires a variety of tools. Projecting long-term changes in land management practices and other climatic and edaphic factors requires models to predict how these changes will affect nonpoint source runoff and downstream water quality (Yasarer et al. 2018). The current study utilized the AnnAGNPS model to assess relative trends in runoff and water quality constituents under the applied management practices and to compare them with field data. Percentage decreases in TN, TP, and TSS among land management systems were highly similar between modeled and measured runoff. Projected reductions in nutrient and sediment loads matched closely with observed differences in runoff among land management practices; for example, modeled TSS reductions of >90% in CRP relative to Crop were also observed in measured runoff loads. Comparable results for TN and TP occurred with some seasonal differences. The good agreement between AnnAGNPS-generated reductions in nutrient and TSS loads and those observed in measured runoff indicates that the model can represent relative differences in nonpoint source runoff of nutrients and TSS in the study watershed. This would allow for reliable long-term predictions of land management changes, climate, and water quality that can assist farmers in improving agricultural production and maintaining environmental sustainability (Yasarer et al. 2017).
Simulated average annual unit-area loads of sediment, nitrogen, and phosphorus for row crop without buffer (Crop), row crop with buffer (Crop-Buff), and Conservation Reserve Program (CRP) land use.
Summary and Conclusions
Our study clearly demonstrated that land management using edge-of-field vegetated buffers and conservation reserve can be an integral component of an agricultural landscape, reducing topsoil loss and downstream transport of nutrients while concomitantly mitigating water quality impacts on rivers and lakes. In BLW, well-managed conservation practices combined at the watershed scale can produce significant reductions in TSS and TP in agricultural runoff that are clearly observed in changes in lake water quality. However, despite observed reductions in TN in runoff, these reductions did not translate to within-lake TN changes, indicating a need for better understanding of within-lake biogeochemical processes. These results help improve our understanding of the effectiveness and limitations of land management practices in improving and maintaining environmental quality of soil and water resources.
Acknowledgements
We thank support personnel from the USDA Agricultural Research Service National Sedimentation Laboratory, Oxford, Mississippi, including agricultural research science technician Calvin Vick, biologists Mark Griffith and Wade Steinriede, and physical science technician Tim Sullivan for site setup, maintenance, sample collection, and data management; and biological science technician Lisa Brooks and chemist James Hill for laboratory water chemistry analysis. This research and assessment was supported by the USDA Natural Resources Conservation Service Conservation Effects Assessment Project Watershed Assessment Studies and Agricultural Research Service National Program 211.
Footnotes
Disclaimer
Mention of trade names or commercial products in this publication is solely for the purpose of providing specific information and does not imply recommendation or endorsement by the USDA. USDA is an equal opportunity provider and employer.
- © 2020 by the Soil and Water Conservation Society