Abstract
Despite efforts to improve water quality in US watersheds, recent assessments of water quality trends indicate that concentrations of nitrogen (N) and phosphorus (P) show minimal changes in the majority of studied streams across the nation. Moreover, results of the US Geological Survey (USGS) National Water-Quality Assessment (NAWQA) indicate that nutrient concentrations in streams and groundwater in basins with significant agriculture or urban development are substantially greater than naturally occurring or background levels. Long-term hydrologic and water quality data from watersheds are a key component in building the understanding needed to determine what changes in watershed management must be made, and where, to achieve improvements in water quality. Forty-one years of hydrologic and water quality data from the Little River Experimental Watershed (LREW) were assembled to evaluate trends in streamflow quantity and quality from the LREW and their relationships to changes in land cover and management. Concentrations and loads of chloride (Cl), ammonium-N (NH4-N), nitrate plus nitrite-N (NO3-N), total Kjeldahl N (TKN), total P (TP), and dissolved molybdate reactive P (DMRP) from 1974 to 2014 were determined. In general, concentrations of N and P have remained low and stable, with some increases observed from 1980 to 1999. In contrast, Cl concentration appears to be steadily increasing, possibly due to increased fertilization in the LREW. Flow-weighted mean concentrations were 0.21 mg L−1 for NO3-N, 0.08 mg L−1 for NH4-N, 1.41 mg L−1 for TKN, 9.30 mg L−1 for Cl, 0.15 mg L−1 for TP, and 0.03 mg L−1 for DMRP. Average annual loads were 0.64 kg ha−1 y−1 for NO3-N, 0.25 kg ha−1 y−1 for NH4-N, 4.47 kg ha−1 y−1 for TKN, 29.54 kg ha−1 y−1 for Cl, 0.46 kg ha−1 y−1 for TP, and 0.10 kg ha−1 y−1 for DMRP. The low nutrient loading in the watershed is attributed to the dense riparian buffers within the LREW, which have been shown to be effective in reducing N loading to the stream. Due to (1) small variations in nutrient concentrations and loads, (2) large variability in precipitation, (3) large variability in streamflow, and (4) small variations in land cover, no clear relationships between changes in land cover and management and nutrient loads were found. Loading rates of N and P, although somewhat influenced by changes in concentration, are largely dictated by changes in streamflow volume in the LREW.
Introduction
Water resource quantity and quality issues will continue to be of concern throughout the United States as the population expands, land development continues, and food and bioenergy production demands grow. While water quality improvements have been made, assessments of water quality trends from 1993 to 2003 in the United States indicate that concentrations of nitrogen (N) and phosphorus (P) show minimal changes in the majority of studied streams across the nation, with more upward than downward trends in concentration at sites where changes occurred (Dubrovsky et al. 2010; Murphy et al. 2013). Moreover, results of the US Geological Survey (USGS) National Water-Quality Assessment (NAWQA) indicate that nutrient concentrations in streams and groundwater in basins with significant agriculture or urban development are substantially greater than naturally occurring or background levels. A large body of research shows that nutrients originating from agricultural activities are associated with increased levels of nitrate (NO3) in groundwater, degraded stream and lake water quality, and environmental problems in estuaries and coastal waterways (Ritter and Bergstrom 2001). As a result, reducing the negative effects of nutrients on ground, surface, and coastal waters has been a major focus of agricultural research in recent decades (Mostaghimi et al. 2001).
Many federally supported conservation programs are targeted toward improving the environmental conditions of agricultural lands. The Conservation Effects Assessment Project (CEAP), established in 2003, is a multiagency effort formed to quantify conservation practice adoption rates and impacts across the United States. The original goals of CEAP were to establish the scientific understanding of the effects of conservation practices at the watershed scale and to estimate conservation impacts and benefits for reporting at the national and regional levels. The USDA Natural Resources Conservation Service (NRCS) and Agricultural Research Service (ARS) are working together to quantify the effects of conservation measures on water quality at the watershed scale (Mausbach and Dedrick 2004). There are three principal components of CEAP—national assessments, watershed assessment studies, and bibliographies and literature reviews. Watershed assessments are viewed as a good means for evaluating the aggregated effectiveness of conservation practices implemented at the field scale. The Tomer et al. (2014) summary of the past 10 years of research on ARS CEAP watersheds suggested (1) wider adoption of minimum disturbance technologies to reduce runoff risks associated with applying manure, nutrients, and agrichemicals; (2) adoption of winter cover crops; and (3) a renewed emphasis on riparian corridors to control loads of sediment, P, and other contaminants originating from within (and near) stream channels.
Long-term streamflow and chemistry records are of particular value in assessing watershed-scale effects of trends in climate, land cover, and land management on water resources, and in gaining an understanding of governing watershed processes. Such records provide the basis for long-term trend analysis and a standard against which to calibrate and validate simulation models. A long-term hydrology and stream chemistry record exists for the Little River Experimental Watershed (LREW) in southwestern Georgia, near Tifton—one of the original 12 benchmark ARS watersheds identified in the CEAP initiative (Mausbach and Dedrick 2004). Streamflow and nutrient data through 2003 have been analyzed and published previously (Bosch and Sheridan 2007; Bosch et al. 2007a, 2007b; Feyereisen et al. 2007, 2008; Sheridan et al. 1983). Sheridan et al. (1983) developed streamflow NO3-N, orthophosphate (PO4-P), and chloride (Cl) budgets for a 1,665 ha LREW subwatershed (subwatershed K) using data collected from 1975 to 1978. Feyereisen et al. (2008) reported LREW annual loading rates for NO3-N, Cl, and dissolved molybdate reactive phosphorus (DMRP) for the period from 1974 to 2003.
Early research on the LREW indicates that nutrient transport processes in the watershed are heavily influenced by dense riparian buffers. The LREW is in an area of broad floodplains, river terraces, and gently sloping uplands. Moderately wide interstream areas separate relatively broad valleys. Land cover over the entire LREW has been classified recently as 50% forest, 41% mixed agricultural, 7% urban, and 2% water (Bosch et al. 2006). Agricultural cropping rotations have changed over the decades from peanut (Arachis hypogaea L.), corn (Zea mays L.), soybean (Glycine max L.), winter wheat (Triticum aestivum L.), and tobacco (Nicotiana tabacum L.) to cotton (Gossypium hirsutum L.), peanut, and vegetable crops. Forested lands consist of pines in the upland areas and hardwoods in the dense riparian vegetation in the flat, broad swamp areas. Bosch et al. (2006) reported that a substantial portion of the watershed forestland is composed of wooded riparian buffers bordering the stream channels. Lowrance et al. (1984a) hypothesized that the riparian ecosystem acts as a nutrient sink and reduces the concentrations and loads of nutrients in the shallow aquifer before the nutrients reach the stream channel. Their research indicated that the riparian forest acted as a sink for nutrients, with a 20× reduction in NO3-N and a 6× reduction in total N (TN; NO3-N plus total Kjeldahl nitrogen [TKN]). Research on the LREW indicates that of the NO3-N entering the riparian zone from upland water discharge and bulk precipitation, only about one-third was discharged through streamflow (Lowrance et al. 1984b). The annual streamflow load of TN was about 29% of the precipitation input, and NO3-N concentration was highest during the winter months (January, February, and March) and lowest during the summer months (July, August, and September) (Lowrance et al. 1984a).
Flow and water quality data from 2004 to 2014 were added to the 1974 to 2003 database (Feyereisen et al. 2007) and analyzed as a whole for this study. Historical trends in land cover and cropping practice were quantified for comparison to hydrologic and water quality trends. The objectives of this research were to (1) evaluate 41 years of hydrologic and water quality data from the Little River watershed in the Gulf Atlantic Coastal Plain, (2) quantify changes in land cover and management in the watershed over the 41-year period, and (3) examine relationships among flow quantity and quality from the watershed and changes in land cover.
Materials and Methods
The LREW, located near Tifton, Georgia, in the Southeastern Plains US Environmental Protection Agency (USEPA) Level III Ecoregion of the United States (31°28’54” N, 83°35’03” W), was selected for the study (figure 1). Data for this study were obtained from the 334 km2 primary drainage area, Little River Station B (LRB; figure 1). The LREW is at the headwaters of the Suwannee River Basin. Hydrology and water quality of the LREW have been monitored by the Southeast Watershed Research Laboratory (SEWRL) since 1967 (Bosch et al. 2007a). The LREW is currently instrumented to measure streamflow for LRB and seven subwatersheds that range from 3 km2 to 115 km2. Construction of the original eight streamflow measurement devices began in 1967 and was completed in 1972 (Bosch and Sheridan 2007). Watershed stream slopes range from 0.1% to 0.5%, while upland slopes are typically less than 5%. The mean sea level (MSL) elevation in the LREW ranges from 146 m at the headwaters to 81 m at the outlet (Sheridan 1997). The climate is humid subtropical with a long growing season. Annual precipitation averages 1,200 mm. Mean annual temperature is approximately 18.7°C; January is the coldest month (average 10.6°C) and July the warmest (average 26.8°C).
The LREW is a mixed land use watershed that contains row crop agriculture, pasture and forage, upland forest, riparian forest, and wetlands. The LREW landscape is dominated by a dense dendritic network of stream channels bordered by riparian forest wetlands (Asmussen et al. 1979) with drainage densities varying from 1.6 to 2.0 km km−2 (Sheridan 1994). The agricultural fields are typically less than 40 ha in size and nested among forested drainages (Bosch et al. 2004). The surface soil textures are generally sands and sandy loams with high infiltration rates. Many of the upland soils of the LREW contain argillic horizons that include an accumulation of silicate clays that restrict vertical infiltration of water and cause shallow lateral subsurface interflow. Surface soils in the LREW are underlain by the upper part of the shallow and relatively impermeable Hawthorne formation, which restricts downward movement of infiltrated precipitation and directs groundwater flow to the stream channels (Sheridan 1997). Within the LREW, the depth to the Hawthorne can vary from 5 m below the land surface in the uplands to within a meter of the land surface in the lower landscape floodplain (Shirmohammadi et al. 1986; Bosch et al. 1996). Baseflow within this landscape is driven by flow within the surficial aquifer above the Hawthorne in combination with the portion of the shallow interflow that remains below the ground surface. Within the LREW, baseflow consists of the slower contributions to the streamflow, while stormflow represents the more rapid contributions (direct surface runoff and, in some cases, the seepage component of the shallow subsurface flow).
Research within the LREW indicates that direct surface runoff can vary between 5% and 40% of annual precipitation while shallow subsurface interflow can vary between 2% and 35% of annual precipitation (Asmussen and Ritchie 1969; Hubbard and Sheridan 1983; Bosch et al. 2012). Further research indicates that within the Coastal Plain landscape, saturation within the floodplain can lead to surface seepage by the shallow subsurface interflow in lower landscape positions (Bosch et al. 1996; Inamdar et al. 1999). Thus, streamflow is a complex process composed of direct surface runoff, interflow that seeps to the land surface, interflow that joins the surficial groundwater in the lower landscape position, and surficial groundwater. Because of the hydrologic characteristics of the watershed where surface and subsurface flow pass through the riparian buffers prior to entering the stream, the buffers play a critical water quality function in the LREW.
Figure 1. Little River Experimental Watershed streamflow measurement locations.
Hydrologic Monitoring. Daily watershed weighted precipitation and LRB streamflow data from 1974 through 2014 were obtained from the SEWRL database (Bosch and Sheridan 2007; Bosch et al. 2007b). Precipitation instrumentation within and immediately surrounding the LREW was installed in the late 1960s and early 1970s. Currently the rain gauge network consists of 46 stations (Bosch et al. 2007b). Temporally aggregated daily precipitation totals are used to calculate daily watershed weighted precipitation for LRB using the inverse distance, or reciprocal distance, weighting technique (Smith 1992; Dean and Snyder 1977). Daily data were further aggregated into monthly and annual totals. For this analysis, a hydrologic year beginning in December of the prior year and extending through November of the current year was used. This hydrologic year has been used to fit regional hydrologic patterns in prior research (Feyereisen et al. 2008). A Virginia V-notch weir is used as the primary flow structure at LRB (Ree and Gwinn 1959). The primary structure at LRB has a weir length of 69.4 m and a V-notch depth at the center of 0.93 m. LRB has a secondary overflow weir, which consists of a 22.9 m horizontal weir installed 0.15 m higher than the primary V-notch weir. At high flows, streamflow occurs through both structures, and data are collected from the secondary structure as well as the primary one. Recorded stage data are converted into streamflow head, and discharge is calculated from an established rating curve. Stage measurements yield subhourly estimates of streamflow rate and volume. These data were temporally aggregated to yield daily streamflow volume. Surface runoff was presented as area-mm, the volume of runoff (mm3) divided by the area of the watershed (mm2). Additional details are provided by Bosch and Sheridan (2007) and Bosch et al. (2007a, 2007b).
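To make the precipitation processing concrete, a minimal sketch (Python, with NumPy and pandas) of the reciprocal distance weighting and the December-through-November hydrologic year follows. It is an illustration under stated assumptions, not the SEWRL implementation; in particular, the distance exponent and the use of a regular grid of estimation points are assumptions.

```python
import numpy as np
import pandas as pd

def watershed_weighted_precip(gauge_xy, gauge_mm, grid_xy, power=1.0):
    """Reciprocal (inverse) distance weighting: estimate daily
    precipitation at each point of a watershed grid from the gauge
    totals, then average over the grid to obtain a single watershed
    weighted value (mm). The exponent `power` is an assumption."""
    estimates = []
    for gx, gy in grid_xy:
        d = np.hypot(gauge_xy[:, 0] - gx, gauge_xy[:, 1] - gy)
        if np.any(d == 0):  # grid point coincides with a gauge
            estimates.append(gauge_mm[np.argmin(d)])
            continue
        w = d ** -power  # reciprocal distance weights
        estimates.append(np.sum(w * gauge_mm) / np.sum(w))
    return float(np.mean(estimates))

def hydrologic_year(dates: pd.DatetimeIndex):
    """December of the prior calendar year through November of the
    current year: a December 1974 day falls in hydrologic year 1975."""
    return dates.year + (dates.month == 12).astype(int)
```

Daily watershed weighted values can then be grouped by `hydrologic_year` to produce the monthly and annual totals used in the analysis.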
Water Chemistry Measurements. Water quality data have been collected intermittently since the 1970s (Feyereisen et al. 2007). Over that period, several collection and analytical techniques have been used (Feyereisen et al. 2008). Grab, discrete, and composite streamflow samples have been collected in conjunction with the streamflow volume data. The temporal frequency of the streamflow samples has varied from daily to weekly to biweekly. Descriptions of the data from 1974 through 2003 are presented by Feyereisen et al. (2007, 2008). Since 2003, primarily automated flow composite samples have been collected. These composites were obtained by programming automated water quality samplers to draw sample quantities at intervals of equal streamflow volume (Feyereisen et al. 2007). Weekly composite samples were collected from January of 1995 until March of 2009, after which biweekly composites were collected. If no automated sample was collected during the sampling period and the stream was flowing, a grab sample was collected to replace the composite sample for that period. Prior to 2003 the samples were not refrigerated; since 2003 the samples have been refrigerated on-site. Each composite sample consisted of 14 to 180 subsamples depending on the flow rate. Within the sampling interval, pumping frequency increases as flow increases. Thus, the composite samples represent the mean concentration of various analytes over the sampling period (weekly or biweekly). Examination of samples collected from 1974 through 2003 by Feyereisen et al. (2008) indicated no differences due to sample collection technique (composite or grab). Here, all data from 1974 through 2014 are grouped by analyte.
All water samples were placed on ice in coolers and transported from the collection site to the laboratory for analysis of sample concentration for various analytes. Historically, concentrations of solids and various analytes have been measured. Here we examined dissolved NO3 + nitrite-nitrogen (NO2-N), ammonium nitrogen (NH4-N), TKN, Cl, DMRP, and total phosphorus (TP), the analytes with the longest period of record. In the lab, a portion of the collected sample was filtered and stored for further analysis. The filtered sample was analyzed for NO3 + NO2-N, NH4-N, Cl, and DMRP using USEPA approved colorimetric techniques (American Public Health Association 1976; Clesceri et al. 1998). An unfiltered sample was analyzed for TKN and TP using digestion and colorimetric techniques adapted from USEPA approved methods (Clesceri et al. 1998). From the beginning of the record period through 1986, NO3 + NO2-N, NH4-N, Cl, DMRP, TKN, and TP analyses were conducted on a Technicon Autoanalyzer II instrument (SEAL Analytical Inc., Mequon, Wisconsin) (Feyereisen et al. 2008). From 1987 through 2014, these analyses were conducted using Lachat flow injection analyzers (Hach Company, Loveland, Colorado) (Feyereisen et al. 2008). Analysis of TKN and TP was discontinued in 2010, when the laboratory switched to measuring total dissolved N (TDN) and total dissolved P (TDP); the TKN and TP records therefore extend through 2009. Total dissolved N and TDP were measured on a Lachat flow injection analyzer. Results for NO3 + NO2-N are reported as NO3-N, the dominant form in natural waters (Fawell 2011).
For the post-2004 data, minimum detection limits (MDLs) were calculated for each year. Annual MDLs were obtained by multiplying the standard deviation of each analyte for the annual lab blanks by the t-distribution critical value (p ≤ 0.05) (Oblinger Childress et al. 1999). Average annual MDLs for the 2004 to 2014 data were 0.04 mg L−1 for NO3 + NO2-N, 0.03 mg L−1 for NH4-N, 0.08 mg L−1 for TKN, 0.62 mg L−1 for Cl, 0.07 mg L−1 for TP, and 0.01 mg L−1 for DMRP. All samples were run in duplicate. Duplicate samples that exceeded a 10% difference were reanalyzed. Samples were run with check standards (midrange standard) after every 10 unknowns. If the check standard agreed within 10% of the expected value, runs continued. If it exceeded 10%, the instrument was automatically restandardized. Blanks were run with each set of standards. Minimum detection limits for samples prior to 2003 were not reported. However, since similar methods were used, similar MDLs would be expected.
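A minimal sketch of the annual MDL computation described above, assuming a one-sided t critical value at p ≤ 0.05 with n − 1 degrees of freedom; the blank concentrations in the usage line are hypothetical values for illustration.

```python
import numpy as np
from scipy import stats

def annual_mdl(blank_conc, alpha=0.05):
    """Minimum detection limit for one analyte and year: the standard
    deviation of the annual lab blanks multiplied by the t-distribution
    critical value (p <= alpha, n - 1 degrees of freedom)."""
    blanks = np.asarray(blank_conc, dtype=float)
    s = blanks.std(ddof=1)  # sample standard deviation of the blanks
    t_crit = stats.t.ppf(1.0 - alpha, df=blanks.size - 1)
    return s * t_crit

# Hypothetical NO3 + NO2-N blanks (mg/L) for one year:
print(annual_mdl([0.01, 0.02, 0.00, 0.01, 0.02, 0.01]))  # ~0.015 mg/L
```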
Monitoring of NO3-N, DMRP, and Cl began in 1974. The analysis of NH4-N, TKN, and TP began in 1978. Nitrate-N data from 1992 to 1997 and TP data from 1992 to June of 1996 were not usable because of an issue with laboratory analysis (Feyereisen et al. 2008). From 1997 to 1999, NO3-N, NH4-N, Cl, and DMRP samples were not filtered. However, Feyereisen et al. (2008) reported no observable differences in these samples, and they were left in the database. Additional details about the LREW water chemistry measurement system and the availability of concentration and load data can be found in Feyereisen et al. (2007) and Feyereisen et al. (2008).
Daily Concentration and Load Estimates. Because streamflow measurements and water samples were not collected at the same temporal frequency, the concentration data must be extrapolated to match the temporal frequency of the flow volume data. Although in many cases the water quality sampling interval was greater than one day, concentrations were estimated and loads calculated for each day in an effort to extend the usefulness of the data and to determine temporal loading. Assumptions were made to estimate the concentrations on nonsample dates based upon the sample type. For the purposes of this description, this process of associating a concentration with a nonsampling date will be termed “filling.” Daily concentration and load data for the 1974 to 2003 period were described and published by Feyereisen et al. (2007, 2008). Filling procedures for the 2004 to 2014 period are described here.
Analyte concentrations were measured for each sample. Rules for extrapolation to nonsample days were based upon the type of sample collected, following the procedure established in Feyereisen et al. (2007). Water quality samples taken from 2004 to 2014 were predominantly refrigerated automated flow composites. For all composite samples (refrigerated and nonrefrigerated), the observed concentration was used to represent the concentration for the entire period of collection. This period begins the day following the previous sample date and ends on the sample date. Concentrations for grab samples were assumed to apply for half the period between the previous and the next sample. For purposes of this study, a grab sample preceded and followed by composite samples was treated as if it were a composite sample, and its concentration was assumed to represent the period from the day after the previous sample through the day the grab was obtained. For the 2004 to 2014 period, no sequential grab samples were collected. For cases where flow was occurring and no sample was collected, or there was a problem with the sample (e.g., instrument errors or insufficient sample volume), concentrations were extrapolated either forward or backward a maximum of 90 days. Beyond 90 days, the filled data were marked as missing. Streamflow loads were calculated by summing the product of the estimated nutrient concentration and the volume of streamflow for each day.
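The composite filling rule and the daily load calculation can be sketched as follows. This is a simplification: the half-period rule for sequential grab samples, which did not occur in the 2004 to 2014 record, is omitted, and sample dates are assumed to be members of the daily index.

```python
import pandas as pd

MAX_FILL_DAYS = 90  # beyond this gap, filled values are marked missing

def fill_daily_concentration(sample_conc: pd.Series,
                             daily_index: pd.DatetimeIndex) -> pd.Series:
    """Composite rule: the observed concentration represents every day
    from the day after the previous sample through the sample date
    (a backward fill); remaining gaps are then extended forward.
    Both fills are capped at 90 days, leaving longer gaps missing."""
    conc = sample_conc.reindex(daily_index)  # NaN on nonsample days
    return conc.bfill(limit=MAX_FILL_DAYS).ffill(limit=MAX_FILL_DAYS)

def daily_load_kg(conc_mg_L: pd.Series, flow_m3: pd.Series) -> pd.Series:
    """Daily load (kg) = concentration (mg/L) x flow volume (m3) / 1,000,
    since 1 mg/L = 1 g/m3."""
    return conc_mg_L * flow_m3 / 1_000.0
```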
Monthly, annual, and total loads for the filled data were calculated for each analyte. Total N load was calculated as TKN plus NO3-N for samples collected through 2009, and only for years where both NO3-N and TKN data were available. The flow-weighted mean concentration was calculated by dividing the total load of the analyte, based on the filled data for the observation period, by the total streamflow volume for that period.
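Continuing the sketch above, the flow-weighted mean concentration follows directly from the filled daily series:

```python
def flow_weighted_mean_conc(daily_load_kg, daily_flow_m3):
    """Total filled load over total streamflow volume, converted from
    kg/m3 back to mg/L (1 kg/m3 = 1,000 mg/L)."""
    return 1_000.0 * daily_load_kg.sum() / daily_flow_m3.sum()
```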
Land Cover. Land cover from 1975 to 2014 was assessed for the LREW. Land cover data from 1975 to 2003 were obtained from Bosch et al. (2006), in which Landsat imagery was classified in roughly five-year increments. Since that initial publication, high-quality, frequent land cover data products have become publicly available, along with revised and streamlined methods for assessing the accuracy of those data products. For this study, data for 2006 were obtained from the USGS National Land Cover Database (NLCD) (Fry et al. 2011). Data for 2010 and 2014 were obtained from the National Cropland Data Layer (CDL) (USDA NASS 2011, 2015) for the State of Georgia. The NLCD and CDL follow very similar land cover classification systems such that, for noncropped areas, the classification categories are identical. The 2006, 2010, and 2014 data were combined and reclassified into five categories to match the classification system used in the previous analysis (figure 2).
Specific crop types were assembled for Tift, Turner, and Worth counties in Georgia, the three counties that overlap the LRB. USDA National Agricultural Statistics Service (NASS) data for these counties from 1974 to 2014 were assembled for corn, cotton, peanut, sorghum (Sorghum bicolor L.), soybean, and wheat area (USDA NASS 2018). Both the land cover and the cropping history of the region were used to provide insight into observed patterns in water quality.
Data Analysis. Hydrologic and chemistry data were grouped by the decade in which the samples were collected for comparison to long-term changes in land use. Due to climatic variability and the intermittent nature of the Little River, both flow and sample numbers over any given period are highly variable. To reduce the bias that numerous samples collected during short intervals would otherwise introduce, annual mean concentrations were calculated hierarchically, as sketched below: monthly means were calculated from samples taken within the month, bimonthly means from the monthly means, semiannual means from the bimonthly means, and annual means from the semiannual means. These annual means were then grouped into temporal periods, and analysis of variance (ANOVA) was conducted to determine differences among the period means, with residual error based on variability among years within periods. Pair-wise comparisons across periods were examined using Fisher's least significant difference (LSD) test (p ≤ 0.05). Statistical tests used hydrologic year totals.
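The nested averaging can be sketched as below, assuming a pandas Series of sample concentrations indexed by date; for brevity the sketch bins by calendar year rather than the December-through-November hydrologic year used in the study.

```python
import pandas as pd

def nested_annual_mean(samples: pd.Series) -> pd.Series:
    """Annual means built by nested averaging so that densely sampled
    intervals do not dominate: samples -> monthly means -> bimonthly
    means -> semiannual means -> annual means."""
    df = samples.to_frame("conc")
    df["year"] = df.index.year
    df["month"] = df.index.month
    monthly = df.groupby(["year", "month"], as_index=False)["conc"].mean()
    monthly["bimonth"] = (monthly["month"] - 1) // 2   # Jan-Feb = 0, ...
    bimonthly = monthly.groupby(["year", "bimonth"],
                                as_index=False)["conc"].mean()
    bimonthly["half"] = bimonthly["bimonth"] // 3      # Jan-Jun = 0
    semiannual = bimonthly.groupby(["year", "half"],
                                   as_index=False)["conc"].mean()
    return semiannual.groupby("year")["conc"].mean()
```

The resulting annual means are the inputs to the ANOVA and Fisher's LSD comparisons among periods.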
The accuracy of land cover for 2006 to 2014 was assessed by comparing classified land cover with high-resolution orthophoto mosaics of Tift and Turner counties, following the assessment and change “good practices” described by Olofsson et al. (2014). Aerial imagery was obtained from the National Agriculture Imagery Program (NAIP) for dates that corresponded as closely as possible to the land cover year. Within the LRB, 300 points were randomly located across the entire region. Technicians compared the land cover classification at each point with the land cover observed in the aerial photography. Consistent with Bosch et al. (2006), producer's accuracy values were used to provide error margins for land cover amounts, with the error equal to ±0.5(1 − ACi), where ACi is the producer's accuracy value for class i.
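The error margin rule is simple enough to state as a one-line helper; the 0.91 producer's accuracy in the usage comment is only an illustrative value, close to the row crop accuracy reported later.

```python
def class_error_margin(producers_accuracy: float) -> float:
    """Error margin on a mapped land cover class amount, following the
    +/- 0.5 * (1 - ACi) rule, where ACi is the producer's accuracy."""
    return 0.5 * (1.0 - producers_accuracy)

# A producer's accuracy of 0.91 implies roughly +/- 4.5 percentage
# points on the mapped fraction of that class:
print(class_error_margin(0.91))  # 0.045
```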
Figure 2. Land cover in the Little River Experimental Watershed for 2014, based on the 2014 Cropland Data Layer.
Results and Discussion
Precipitation and Streamflow. Annual watershed weighted precipitation for the hydrologic years from 1974 to 2014 varied from a maximum of 1,624 mm to a minimum of 789 mm, with a 41-year average of 1,194 mm (standard deviation = 206 mm) (table 1). Annual streamflow varied from a maximum of 732 area-mm to a minimum of 24 area-mm (figure 3), with an annual average of 315 area-mm. Annual streamflow as a percentage of annual watershed weighted precipitation averaged 25%. LRB is an intermittent stream with frequent periods of no flow. Over the 41-year observation period, LRB averaged 134 days of zero flow each year (table 1). No significant differences were found among the periods examined for precipitation or flow (ANOVA, p ≤ 0.05) (table 2). Although period differences were not statistically significant, some patterns were apparent. Extended drought conditions were observed from 1999 to 2008, with below average precipitation and streamflow (figure 3). Periods of high flow were observed from 1991 to 1998 (figure 3).
Sample Concentrations. Nitrogen concentrations were low for the entire period of record. Nitrate-N for all samples collected from 1974 to 2014 (n = 1,179) averaged 0.15 mg L−1 (sd = 0.45 mg L−1). No significant differences were found among the NO3-N period annual average concentrations (p ≤ 0.05) (table 2). While considerable variability was observed, average annual NO3-N values were typically less than 0.30 mg L−1. Ammonium-N concentrations for all samples over the entire period of record (n = 1,145) averaged 0.09 mg L−1 (sd = 0.18 mg L−1). ANOVA indicated significant differences among the NH4-N period average annual concentrations. Fisher's LSD indicated significantly greater concentrations from 1990 to 1999 when compared to either the 1974 to 1979 or the 2000 to 2014 period (table 2). TKN analysis ended in 2009. Between 1974 and 2009, 1,069 samples were analyzed, with an average TKN concentration of 1.67 mg L−1 (sd = 3.03 mg L−1). ANOVA indicated significant differences among the TKN period averages. As with NH4-N, Fisher's LSD indicated elevated TKN concentrations from 1990 to 1999 when compared to either the 1980 to 1989 or the 2000 to 2009 period (table 2).
Table 1. Hydrologic characteristics for the Little River Experimental Watershed Station B for the hydrologic years from 1974 through 2014.
Average annual Cl concentrations have increased since 1974 (table 2). Chloride concentrations for all samples from 1974 to 2014 (n = 1,350) averaged 11.52 mg L−1 (sd = 4.31 mg L−1). ANOVA indicated significant differences among the Cl period annual average concentrations (p ≤ 0.05). Fisher's LSD indicated increasing Cl concentration in the 2010 to 2014 period (table 2). The observed increasing Cl concentration trend agrees with results reported for the 1974 to 2003 period by Feyereisen et al. (2008). Concentrations of Cl for the 1974 to 2003 period were lower, 9.75 mg L−1, versus 12.88 mg L−1 for 2000 to 2009 and 17.50 mg L−1 for 2010 to 2014. Chloride is not subject to biological reactions in the riparian zone and stream channel, and Cl concentrations have been shown to decrease during stormflow and years with greater flow rates (Lowrance and Leonard 1988). Despite the drought influenced lower flow from 2000 to 2009, streamflow Cl average annual concentrations did not vary significantly during that period (table 2).
Total P analysis also ended in 2009. Between 1974 and 2009, 866 samples were analyzed, with an average concentration of 0.23 mg L−1 (sd = 0.81 mg L−1). ANOVA indicated no significant differences among the TP period annual average concentrations (table 2). DMRP sample concentrations over the entire period of record (n = 1,337) averaged 0.04 mg L−1 (sd = 0.08 mg L−1). When periods were compared, the 1980 to 1989 and 1990 to 1999 periods stood out, with statistically greater concentrations than the 2000 to 2009 and 2010 to 2014 periods (p ≤ 0.05) (table 2).
As in prior studies (Lowrance et al. 1984a, 1984b, 2007; Lowrance and Leonard 1988; Feyereisen et al. 2008), nutrient concentrations in the watershed were found to be low. Examination of the 1974 to 2003 concentrations indicated similar NO3-N, NH4-N, TKN, TP, and DMRP levels for the LRB (Feyereisen et al. 2008). Grab samples collected by the USGS from tributaries of the Upper Suwannee River Basin in Georgia from 1993 to 2010 revealed concentrations of NO3-N, NH4-N, Cl, and TP similar to those observed for the LRB (USGS 2018). Grab samples collected from the 5,491 km2 Withlacoochee River watershed, also a tributary of the Suwannee River, from 1989 through 1991 indicated NO3-N concentration ranges similar to those observed for the LRB (Suwannee River Water Management District 2018). Mean concentrations and loads of NO3-N obtained from samples collected from 1968 to 2007 from tributaries of the Altamaha River, which flows through central Georgia, were generally greater than those found for the LRB by a factor of 1.5 to 3, while mean concentrations and loads of NH4-N were similar (Weston et al. 2009). The reported percentage of agricultural land use in the Altamaha River basin in the 1970s was similar to that of the LRB, but decreased in the 1990s due to urbanization (Weston et al. 2009).
Figure 3. Hydrologic year watershed weighted annual precipitation and area weighted flow for the Little River Station B from 1972 through 2014.
Table 2. Average annual values and standard deviations (listed in parentheses), grouped by decade, for precipitation, streamflow, and analyte concentrations from 1974 through 2014.
Trench et al. (2012) reported nutrient concentrations for several watersheds in the northeastern United States. The studied watersheds ranged from less than 100 km2 to 70,000 km2 and included a diversity of land covers. Reported NO3-N concentrations were generally an order of magnitude greater than those found for the LRB, while TP concentrations were generally similar (Trench et al. 2012). Sohngen et al. (2015) reported data for five midwestern US agricultural watersheds indicating TN concentrations 3× the sum of the NO3-N and TKN concentrations observed for the LRB, and TP concentrations 2× those for the LRB. Data from five large Mississippi River tributaries indicated NO3-N concentrations from those watersheds were 10 to 20 times greater than those for the LRB (Murphy et al. 2013).
As reported by Feyereisen et al. (2008), no relationship was found between the observed concentrations and the analytical or on-site storage methods. In addition, no visual differences were found in the data from 1997 to 1999 for the unfiltered samples.
Sample Loads. Chemical loads measured in the LRB were primarily related to patterns in flow. With some exceptions in individual samples, concentrations were not highly variable; because of this, loads were proportional to streamflow. Streamflow in the LRB is greatest in the early months of the year, when precipitation is high and evapotranspiration is low (Bosch et al. 2017). Nitrate-N loads from 1974 to 2014 averaged 0.64 kg ha−1 y−1 (sd = 0.75 kg ha−1 y−1) (table 3). The flow-weighted mean NO3-N concentration, the total load divided by the total flow volume for the observation period, was 0.21 mg L−1. In contrast to the concentration data, significant differences were found among the NO3-N load period averages (p ≤ 0.05). Subsequent Fisher's LSD indicated elevated loads in the 1980 to 1989 period and reduced loads in 2000 to 2009 (table 3). While the differences were not significant (p ≤ 0.05), there was some indication that the elevated loads carried over into the 1990s (figure 4). The elevated loads observed in 1980 to 1989 coincided with greater than average concentrations and average flow (table 2). Flows were elevated from 1990 to 1999, but concentrations were near average (table 2). The reduced NO3-N loads observed from 2000 to 2014 coincided with reduced concentrations and below average flows (table 2). Ammonium-N loads from 1974 to 2014 averaged 0.25 kg ha−1 y−1 (sd = 0.33 kg ha−1 y−1) (table 3). The flow-weighted mean NH4-N concentration for the observation period was 0.08 mg L−1. Significant differences were found among the NH4-N period averages (p ≤ 0.05). Subsequent post hoc tests indicated significantly elevated NH4-N loads from 1990 to 1999 (p ≤ 0.05), which were associated with greater concentrations and flows (table 2). TKN loads from 1974 to 2009 averaged 4.47 kg ha−1 y−1 (sd = 5.95 kg ha−1 y−1) (table 3). The flow-weighted mean TKN concentration for the observation period was 1.41 mg L−1. Significant differences were found among the TKN period averages (p ≤ 0.05), with significantly greater loads from 1990 to 1999. Elevated TKN loads during this period were a function of significantly greater concentrations and greater than average flows (table 2). The flow-weighted mean TN concentration for periods when both NO3-N and TKN data were available was 1.34 mg L−1. Total N loads were elevated from 1990 to 1999 (figure 4). Total N loads were typically dominated by NH4-N and organic N, measured by TKN. Based on the long-term average, TKN loads were 7 times greater than NO3-N loads and 18 times greater than NH4-N loads.
Chloride loads from 1974 to 2014 averaged 29.54 kg ha−1 y−1 (sd = 14.70 kg ha−1 y−1) (table 3). The flow-weighted mean Cl concentration for the observation period was 9.30 mg L−1. Total P loads from 1974 to 2009 averaged 0.46 kg ha−1 y−1 (sd = 0.45 kg ha−1 y−1) (table 3). The flow-weighted mean TP concentration for the observation period was 0.15 mg L−1. No significant differences were found among the Cl or TP period average loads (p ≤ 0.05). DMRP loads from 1974 to 2014 averaged 0.10 kg ha−1 y−1 (sd = 0.10 kg ha−1 y−1) (table 3). The flow-weighted mean DMRP concentration for the observation period was 0.03 mg L−1. Significant differences were found among the DMRP period averages (p ≤ 0.05). Fisher's LSD indicated elevated DMRP loads from 1974 to 1979 and 1990 to 1999 and reduced loads from 2000 to 2009 and 2010 to 2014 (table 3). The elevated loads were associated with greater than average flows, while the reduced loads were associated with reduced concentrations and flows.
Table 3. Average annual loads and standard deviations (listed in parentheses) (kg ha−1 y−1), grouped by decade, from 1974 through 2014.
Figure 4. Nitrate-nitrogen (NO3-N) and total nitrogen (NO3-N + TKN) loading in Little River Station B, sorted by period, for the entire period of record; error bars are one standard deviation.
Nitrate-N loads from the LRB are around an order of magnitude less than those reported for similarly sized watersheds in the midwestern United States. McIsaac et al. (2016) analyzed data from the 69,264 km2 Illinois River and reported flow-weighted NO3-N concentrations from 2.9 to 6 mg L−1. Tomer et al. (2003) reported a flow-weighted NO3-N concentration of 9.2 mg L−1 from a 5,134 ha tile-drained watershed in central Iowa. Saad et al. (2018) reported SPARROW model estimates for watersheds across the upper midwestern United States. Their findings indicated that TP and TN loading for the LRB was on the low end of their reported spectrum, with some estimates for TN as high as 100 times the LRB loads.
Seasonality. As reported by Bosch et al. (2017), the majority of the streamflow in LRB occurs during the months from December through April (figure 5). Average Cl concentrations by month were greater in the months from September through January, while NO3-N and TKN had some increases in average monthly concentration during the summer and fall months (figure 6). Contrary to the findings of Lowrance and Leonard (1988), average monthly Cl concentrations decreased during the low flow months of June through August. Despite these monthly deviations in concentration, loads of these analytes tracked monthly flow patterns (figure 7). The highest loading in the LRB watershed occurs during the months of December through April, typically the period with the greatest flow in this watershed.
Land Cover Changes. Land cover data indicate row crop area in the LRB increased from 31% to 36% between 1975 and 2014 (figure 8). Forest cover decreased from 49% to 43% during the 1975 to 2014 period, but fluctuated between 37% (2006) and 56% (1980). Similarly, pasture cover fluctuated between 2% (1980) and 18% (2010) of the total LRB area. While large increases in row crop area were observed in 1985, overall the changes have been small. Field surveys of the upper third of the LRB conducted from 1982 to 1985 found cropland in 35% of the watershed versus the 51% found by the Landsat assessment for 1985 (Lowrance and Leonard 1988). While differences can be expected between the two methods and coverage areas sampled, the 1980 and 1990 Landsat assessments suggest the 1985 estimate was likely too high. It should be noted that the increase in row crop area reported for 1985 coincided with a decrease in upland forest area. This also coincided with a reported significant harvest of upland forest in the watershed in 1982 (Lowrance and Leonard 1988). It is possible that recently harvested upland forest could have been classified as fallow or row crop area. The dynamics of fluctuating forest, pasture, and row crop in the LRB are consistent with recent findings in the Southeastern Coastal Plains that 83% of the extent of change in the region was due to cyclical processes of fire, forest harvest, and replanting (Drummond et al. 2015). Although fire was not a major contributor to land cover change in the LRB, the recurrent forest growth and harvest cycle is typical of planted pine forests.
Overall accuracy of the 2006 NLCD and the 2010 and 2014 CDL land cover classifications for the LRB was similar to prior results, averaging 83%. While the accuracy values for the pasture, open water, and urban cover classes were low (mean producer's accuracy 59% to 71%), row crop areas and forests were correctly classified on average 91% and 92% of the time, respectively. By contrast, analysis conducted by Bosch et al. (2006) indicated that the accuracy of land cover classifications derived from 1975 to 2003 satellite imagery averaged 87%. However, for that set, resources for collecting a high-quality reference data set were unavailable, affecting the accuracy assessment of the pasture, open water, and urban land covers in particular. As such, some of the reported values of 100% accuracy were likely overestimates. For the entire period of record, land cover change dynamics were slow. In the 2006 to 2014 period, the changes in percentage values ranged from −0.1% (water) to 5% (forest) and were small enough that classification errors may have obscured the true land cover class transitions in the LRB.
As first reported by Feyereisen et al. (2008), large changes have occurred in the crop types grown in LRB from 1974 to 2014 (figure 9). Due largely to the eradication of the boll weevil (Anthonomus grandis), there was a dramatic resurgence of cotton grown in the region from 1990 to 1995 (figure 9). From 1974 to 1990, cotton was grown on less than 10% of the row crop land in the area. Since 1995, cotton has been grown on 50% to 60% of the row crop area. Corn area has dropped from 50% of the row crop area in 1975 to less than 10%. Peanut production has remained fairly stable at around 30%, with some increase in peanut production from 1985 to 1995 and a moderate decrease since the mid-1990s. Trends from 2000 to 2014 indicated an increase in cotton area and a decrease in peanut area.
Figure 5. Average monthly watershed weighted precipitation and area weighted discharge for the Little River Station B from 1972 to 2015. Error bars illustrate the monthly standard error of the means (Bosch et al. 2017).
These changes in land cover and crop type are significant because of the way these crops are managed. The growing season for corn in the watershed is typically from March through July, with typical fertilization rates of 200 kg ha−1 y−1 N applied over the growing season (University of Georgia 2018). This assumes a mixture of irrigated and nonirrigated crops and production goals. Fertilization rates have likely fluctuated over the 41-year period due to economic forces. The growing seasons for peanuts, cotton, and soybeans in the watershed are typically from May through October. Peanuts and soybeans do not require N fertilization, but cotton requires 85 kg ha−1 y−1 N over the growing season (University of Georgia 2018). For average wheat production, 100 kg ha−1 y−1 N would be applied over the growing season (University of Georgia 2018). Moreover, wheat is fertilized in the fall and winter, when surface and subsurface runoff losses are typically high. Because corn and wheat are fertilized during times of the year when the watershed is more likely to be saturated and surface and subsurface transport rates are typically greater, the timing of fertilization on these crops makes N losses during production more likely.
Land Cover Relationship to Loading. Estimates of fertilizer and precipitation N applied over the entire watershed for the period of record were developed based upon three factors: (1) the assumed rates of applied fertilizer N for the dominant crops within the watershed, 200 kg ha−1 y−1 for corn, 85 kg ha−1 y−1 for cotton, 90 kg ha−1 y−1 for sorghum, 100 kg ha−1 y−1 for wheat, and 0 kg ha−1 y−1 for peanuts and soybeans; (2) precipitation N inputs of 12 kg ha−1 y−1; and (3) estimates of cropped area from the land cover assessment (figure 10). Based upon these assumptions, N fertilization peaked in 1976, decreased from that period until the early 1980s, and has remained fairly stable since that time (figure 10). Higher production goals in the last 20 years have likely led to increased fertilization rates, which would make more recent loading rates greater. Data collected by the USDA Economic Research Service (2018) indicate that N fertilization rates on cotton increased by 25% from 1990 to 2005.
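A minimal sketch of this N input estimate follows; the per-crop rates and the 12 kg ha−1 y−1 precipitation term are those assumed above, while the cropped areas in the usage line are hypothetical values for the 33,400 ha LRB drainage.

```python
# Assumed fertilizer N rates from the text (kg ha-1 yr-1)
N_RATE = {"corn": 200.0, "cotton": 85.0, "sorghum": 90.0,
          "wheat": 100.0, "peanut": 0.0, "soybean": 0.0}
PRECIP_N = 12.0  # precipitation N input, kg ha-1 yr-1

def watershed_n_input(crop_area_ha: dict, watershed_area_ha: float) -> float:
    """Watershed-average N input (kg ha-1 yr-1): fertilizer N applied to
    each crop, spread over the whole watershed, plus precipitation N."""
    fertilizer_kg = sum(N_RATE.get(crop, 0.0) * area
                        for crop, area in crop_area_ha.items())
    return fertilizer_kg / watershed_area_ha + PRECIP_N

# Hypothetical cropped areas (ha) for the 33,400 ha LRB drainage:
print(watershed_n_input({"corn": 1_500, "cotton": 6_500,
                         "peanut": 3_500, "wheat": 800}, 33_400))
```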
Nitrate-N loading in the LRB had the largest increase from 1980 to 1999, while TN had the greatest load increase from 1990 to 1999 (figure 4). The 1980 to 1999 increases in NO3-N concentration and load may have been associated with the observed increase in row crop area, but did not appear to be influenced by the amount of fertilization, which has decreased since the 1970s (figure 10), or by its timing. It is also possible that the higher fertilization rates that accompanied corn production in the 1970s did not produce elevated nutrient loads until a decade later due to delayed release within the system. In addition, there is some evidence of an increase in winter wheat in the early 1980s, which may have affected NO3-N loading (figure 9). Winter wheat accompanied by fall fertilization has been shown to increase N loading from agricultural fields (Lowrance and Leonard 1988). Increased N fertilization associated with the increase in wheat area from 1982 to 1990 may have contributed to increases in N losses from 1980 to 1999 (figure 4). The increased TN loading from 1990 to 1999 occurred at a time when cotton area was increasing dramatically (figure 9). However, since cotton was replacing corn in many cases, overall N inputs likely remained stable or decreased (figure 10). The observed TN loading increase from 1990 to 1999 also occurred when flows in the LRB were greater than normal (figure 3). Since cropping practices and N inputs have remained largely unchanged since 1995, greater NO3-N and TN loading would be anticipated if flow rates increase once again.
Figure 6. Average monthly chloride (Cl), nitrate-nitrogen (NO3-N), and total Kjeldahl nitrogen (TKN) sample concentrations for the Little River Station B; error bars are one standard deviation.
While Cl concentrations have increased over time (table 2), no similar increase has occurred in Cl loads (table 3). Chloride loading appears to have been confounded by highly variable flow rates from 1974 to 2014 (figure 3). Chloride is found in several fertilizer formulations, primarily potassium chloride (KCl), and potassium requirements for both corn and cotton are high. Lowrance et al. (1985) reported combined Cl precipitation and fertilization inputs of 100 kg ha−1 y−1. Application of Cl has likely exceeded that of N in the watershed. Lowrance and Leonard (1988) reported that Cl concentrations can increase during periods of low flow. While flows have decreased somewhat since 2000, these changes were not significant (p ≤ 0.05). As was found with N, increases in Cl loads appear to be most closely tied to streamflow volume. The similarity of the NO3-N and Cl seasonal loading patterns (figure 7) would be expected given that both ions are poorly adsorbed in the soil column.
No significant changes were found in TP loading over the observation period, although loading rates were elevated from 1974 to 1989 (table 3). While DMRP concentrations decreased significantly from 2000 to 2014, concentrations have remained near the detection limit, and loads have been very small for the entire observation period (table 3).
Figure 7. Average monthly chloride (Cl), nitrate-nitrogen (NO3-N), and total Kjeldahl nitrogen (TKN) loads for the Little River Station B; error bars are one standard deviation.
Summary and Conclusions
Concentrations and loads for the LRB detailed here are in agreement with prior research conducted on the watershed. Average annual streamflow losses of NO3-N, DMRP, and Cl for the 1,665 ha Little River subwatershed K (figure 1) for 1975 to 1978 were 0.3, 0.15, and 37.1 kg ha−1 y−1, respectively (Sheridan et al. 1983). Lowrance et al. (1985) reported loading rates of 0.32 to 0.88 kg ha−1 y−1 for NO3-N, 3.21 to 3.95 kg ha−1 y−1 for TN, 0.07 to 0.08 kg ha−1 y−1 for NH4-N, 1.02 to 1.14 kg ha−1 y−1 for TP, 0.11 to 0.14 kg ha−1 y−1 for DMRP, and 28.03 to 30.86 kg ha−1 y−1 for Cl in subwatersheds N and K of the Little River (figure 1) from 1979 to 1980. Later research reported average annual loading rates of 0.84 kg ha−1 y−1 for NO3-N, 0.32 kg ha−1 y−1 for NH4-N, 0.59 kg ha−1 y−1 for TP, 0.12 kg ha−1 y−1 for DMRP, and 31.1 kg ha−1 y−1 for Cl in the LRB for the period from 1974 to 2003 (Feyereisen et al. 2008). It is apparent from these data and the 2004 to 2014 data that nutrient loading in the LRB has remained low since 1974.
Heavily buffered watersheds such as the LRB have a large capacity to buffer N and P but have little impact on Cl inputs. Research by Lowrance et al. (1983, 1984a, 1984b, 1985) indicated a 66% reduction in NO3-N loading due to riparian buffers within the watershed. Lowrance et al. (1984b) reported that within this watershed, riparian buffers act as a filter for NO3-N. Soluble N that enters the riparian buffer is transformed from inorganic to organic forms in the bottomland forests (Lowrance et al. 1984b). As the water flows through the riparian buffer, organic N becomes the dominant form in the water entering the stream. Chloride, which is biologically inactive, is not retained at high levels in the soil (Tullock et al. 1975). The increases in Cl concentration observed in this study, with no similar increases in N or P concentrations, indicate the riparian buffers within the watershed are continuing to remove excess N and P from agricultural fertilization.
Watersheds with a high percentage of woody vegetation riparian buffers are common throughout south-central Georgia. Geographic information system analysis of 2017 CDL data from the Alapaha, Little River, Middle Flint, and Withlacoochee HUC8 drainages in the region found riparian forest percentage cover ranging from 10% (Middle Flint) to 25% (Alapaha). The riparian forest percentage cover in the Little River HUC8, to which the LRB is a tributary, was 18%, whereas the neighboring Withlacoochee drainage contained 22% riparian forest cover. Similar buffering of NO3-N by riparian forests can thus be expected from many regional watersheds.
Figure 8. Changes in land cover over the Little River Experimental Watershed Station B from 1975 through 2014.
As reported by Lowrance and Leonard (1988) and Feyereisen et al. (2008), few clear relationships between cropped area and nutrient loads exist. Prior results have indicated that NO3-N loading in heavily buffered watersheds is not impacted by increasing fertilizer application (Gilliam and Terry 1973). Nutrient loads appear more strongly related to streamflow volume than to the small changes in cropped area or fertilization observed in the LRB. Monthly streamflow increases driven by El Niño/Southern Oscillation events have been reported to be a strong indicator of NO3-N loading in the LREW (Keener et al. 2010). Episodic increases in nutrient loading in the LREW can also be caused by forest harvest. Lowrance and Leonard (1988) reported an increase in streamflow on one of the subwatersheds of the LREW due to harvest of forest on nearly 25% of the subwatershed. The increase in streamflow was accompanied by an increase in nutrient loading, which could be the result of increased flow but may also be associated with presenescence defoliation in the riparian zone, as has been reported following forest ecosystem harvest (Likens et al. 1970) and insect defoliation outbreaks (Swank et al. 1981). Regional studies indicate that conversion to strip-tillage may increase subsurface flow and baseflow in the watershed (Bosch et al. 2015). As more area is converted to strip-tillage, loading may increase. However, it is also possible that the vigorous forested riparian buffers within this watershed could buffer greater N and P loading than currently exists.
Figure 9. Changes in crop type over the three-county Little River Station B area from 1975 through 2014.
Figure 10. Estimates of fertilizer and precipitation nitrogen (N) inputs over Little River Station B from 1974 through 2014.
Conservation practice data for the LRB are available for the period from 1980 to 2006 (Sullivan and Batten 2007). These data indicate that some form of conservation practice has been implemented on 25% of the total LRB area. The most prevalent conservation practices were nutrient management (13.1%), pest management (12.9%), grassed waterways (9.6%), contour farming (9.5%), seasonal residue management (8.9%), and terraces (8.8%) (Sullivan and Batten 2007). The greatest rate of adoption appeared to occur from 1990 to 2000 and then again from 2004 to 2006. The lower NO3-N and TN loads observed from 2000 to 2014 (figure 4) may in part have been due to the conservation practices implemented from 1990 to 2000. However, it is impossible to separate the effects of these practices from changes in land cover, fertilization, and streamflow. An analysis of conservation practice selection and placement within the LREW between 1980 and 2006 (Settimi et al. 2010) found that conservation practices were implemented on approximately 50% of the cropland area. These authors also reported that, although 40% of the practices implemented were not targeted at erosion or water quality protection, 60% of the fields at risk for surface runoff and erosion had implemented at least one erosion control practice, and 65% of the fields at risk for contaminant transport to water bodies via lateral subsurface flow had implemented nutrient and/or pest management plans. Simulated alternative conservation practice scenarios targeted at water quality improvement (Cho et al. 2010) indicate that full implementation of conservation practice suites targeting nutrient reduction has the potential to reduce N loads by 10.3%. These authors report that intact riparian forest buffers offer the most comprehensive potential for reducing nonpoint source loads: 20.5% for sediment, 19.5% for P, and 7.0% for N. Although riparian buffers can be enrolled in various USDA assistance programs, most of the riparian buffers within the Little River watershed are part of the natural floodplain and are not supported by USDA programs.
Water quality within the LRB continues to be maintained at a high level. Concentrations of N and P have remained low and stable, with some increases observed from 1980 to 1999. In contrast, Cl concentration appears to be steadily increasing. This may be an indication of higher fertilization loading in the LRB. Despite this, riparian buffers in the watershed appear to be continuing to remove excess N and P prior to entry into the stream. Loading rates of N and P, although somewhat influenced by changes in concentration, are largely dictated by changes in streamflow volume.
Acknowledgements
This research is a contribution of the USDA Agricultural Research Service (ARS) Gulf Atlantic Coastal Plain Long-Term Agroecosystem Research site. We gratefully acknowledge statistical and data analysis guidance provided by ARS Area Statistician Deborah Boykin. The authors are also grateful for the assistance of the many scientists and field and laboratory technicians who have supported the research. This research and assessment was supported by the USDA Natural Resources Conservation Service Conservation Effects Assessment Project Watershed Assessment Studies and Agricultural Research Service National Program 211.
Disclaimer
Mention of company or trade names is for description only and does not imply endorsement by the USDA. The USDA is an equal opportunity provider and employer.
© 2020 by the Soil and Water Conservation Society