Abstract
The Soil Vulnerability Index (SVI) uses widely available inputs from the SSURGO database to classify cropland into four levels of vulnerability to sediment and nutrient losses: Low, Moderate, Moderately High, and High. Previous work has identified inconsistencies in SVI assessments across the United States, possibly because neither precipitation amount nor intensity was included in the development of the SVI. This study aimed to determine if rainfall characteristics influence the SVI classification and which ones are most critical. The objectives were to (1) evaluate the impact of precipitation characteristics on land vulnerability to sediment loss, and (2) evaluate if rainfall characteristics alter the degree of agreement between the simulated sediment yield and SVI classification. The study focused on four Conservation Effects Assessment Project (CEAP) watersheds in Ohio, Missouri, Mississippi, and Pennsylvania for which sediment yields were simulated using previously calibrated models. The models were run with input precipitation data from these four watersheds. In addition, in order to examine a wider range of precipitation characteristics, model runs were made for the same four watersheds utilizing precipitation data from two CEAP areas in Georgia and Maryland. Sediment yields for all the cropland units in the four watersheds were simulated using the Soil and Water Assessment Tool or the Annualized Agricultural Nonpoint Source Pollution Model using 1985 to 2014 precipitation data from all six areas as inputs. Similarities and differences between precipitation characteristics such as precipitation amount, intensity, and rainfall erosivity R-factors were compared with the similarities and differences in simulated sediment loss. Results confirmed that the SVI is a useful tool for relative ranking of cropland at risk of erosion within a region, as the SVI and the model-based vulnerability classifications agreed for 55% to 100% of the watersheds' subunits. However, model-based classification of field vulnerability could shift due to changes in precipitation characteristics. Thus, the range of soil loss for each vulnerability class can shift from one region to another. The results suggest that precipitation intensity or annual R-factor may help improve the correspondence between vulnerability and the range of expected soil loss.
Introduction
Excessive sediment and pollution originating from upland areas are a serious problem for water bodies in the United States, with 71% of lakes, reservoirs, and ponds, and 53% of rivers and streams assessed by the US Environmental Protection Agency classified as impaired (Niraula et al. 2013). Cropland areas with certain combinations of soil, land use/cover, and slope are more vulnerable than others in terms of nutrient and sediment loss and are defined as critical areas (Niraula et al. 2013). Identifying these areas is important for cost-effective implementation of conservation practices. Models such as the Agricultural Policy Environmental eXtender (APEX), the Soil and Water Assessment Tool (SWAT), and the Annualized Agricultural Nonpoint Source Pollution model (AnnAGNPS) have helped to identify critical areas for sediment and nutrient runoff (Liu et al. 2016; Mudgal et al. 2012; Niraula et al. 2013; Pradhanang and Briggs 2014; Santhi et al. 2014). However, these simulation models are complex and require significant data resources to evaluate each area of interest. For federal, state, or other local resource management authorities, indices derived from models and experimental knowledge are often preferred since they are simpler to use and require less data and computational power (Thompson et al. 2020).
The Soil Vulnerability Index (SVI) was developed by the USDA Natural Resources Conservation Service (NRCS) to serve as an interpretive tool to classify cropland areas on the basis of surface runoff and leaching risk, and to identify their associated erosion and nutrient loss potential (Chan et al. 2017; Thompson et al. 2020; USDA NRCS 2012). It has been used as a conservation planning tool to assess the need for conservation practices in the Chesapeake Bay watershed, the Beasley Lake watershed, and other locations in the United States (USDA NRCS 2019; Yasarer et al. 2020). That need is assessed by relating the vulnerability class of cropland to the number and types of practices implemented on that land, under the assumption that more vulnerable land requires more, and more diverse, soil and water conservation practices. Currently, the SVI uses inputs from the US Soil Survey Geographic (SSURGO) database to rank cropland into four vulnerability classes: Low, Moderate, Moderately High, and High (Chan et al. 2017). Major inputs include slope, hydrologic soil group, and soil erodibility. Drainage and the presence of organic matter and rock fragments act as modifiers that increase or decrease the vulnerability classification when they occur within the area of interest. Detailed information on the development of the SVI can be found in Thompson et al. (2020) and the SVI user guide (USDA NRCS 2019).
Even though the SVI is intended to be used throughout the United States, it was developed based on APEX model results in the Upper Mississippi, Ohio, and Tennessee River basins using rainfall characteristics of these areas (Chan et al. 2017; Thompson et al. 2020). Rainfall, which varies throughout the United States, has a direct impact on soil detachment and on the transport of eroded particles via runoff, but rainfall characteristics such as amount and intensity are not currently included in the SVI (Yasarer et al. 2020). The Vulnerability Class structure of the SVI, similar to the NRCS Land Capability Classes, is intended to be relevant at the local level such that a Low SVI classification corresponds to soils less prone to runoff (and resulting sediment loss) than a Moderate classification, Moderate less than Moderately High, and so forth. When applied within a single region, the index helps to identify the relative vulnerability of cropland in that region. For regions where rainfall characteristics differ from those of the Upper Mississippi and Ohio-Tennessee River basins, differences in rainfall amount, intensity, and frequency may affect the interpretation of the SVI classification. For example, cropland areas with high annual precipitation may require more conservation practices to protect water and soil resources than indicated by the SVI (Thompson et al. 2020). In these areas, the SVI would fail to identify undertreated fields. Therefore, a study on the impact of differing rainfall characteristics is needed to further improve the SVI classification for regional interpretation.
The goal of this study is to determine how rainfall characteristics influence the SVI classification. The objectives were to (1) evaluate the impact of precipitation characteristics on land vulnerability to sediment loss, and (2) evaluate if rainfall characteristics alter the degree of agreement between the simulated sediment yield and SVI classification.
Materials and Methods
Study Areas. Four different Conservation Effects Assessment Project (CEAP) watersheds (Moriasi et al. 2020) were selected to represent a wide range of precipitation amount and intensity at locations both within and outside of the Upper Mississippi and Ohio-Tennessee River basins where the SVI was developed (figure 1). These watersheds include (1) Goodwater Creek Experimental Watershed, Missouri (MOGC); (2) Beasley Lake, Mississippi (MSBL); (3) Maumee River Basin, Ohio (OHMM); and (4) WE38 watershed, Pennsylvania (PAWE). They were selected based on the availability of a calibrated SWAT or AnnAGNPS watershed model (table 1). Among the four watersheds, Maumee is the flattest, with an area-weighted average cropland slope of 1.0%, while WE38 is the steepest, with a 10.9% average slope. We used these watershed models with each of six precipitation data sets and compared vulnerability distributions, as estimated by simulated sediment yields, with the monthly characteristics of the precipitation data sets (amounts, maximum 30-minute rainfall, and the Universal Soil Loss Equation [USLE] rainfall erosivity [R] factor). The following sections provide details on the models used, the precipitation characteristics, and each stage of the methodology.
The study relies on simulations of these watersheds using the SWAT and AnnAGNPS models and on comparisons of the SVI vulnerability classification with model-estimated runoff and sediment fluxes. Precipitation data were substituted from among the six precipitation data sets to evaluate the influence of precipitation on model-estimated runoff and sediment fluxes and the corresponding vulnerability. Figure 2 illustrates the overall process used to evaluate the precipitation impact on vulnerability classification. Calibrated watershed models from four of the locations (descriptions of the calibrated models can be found in the "Main Properties of Each Watershed Model" section) were used to determine sediment and runoff: a SWAT model in MOGC, OHMM, and PAWE, and an AnnAGNPS model in MSBL (table 1). Each of the six precipitation data sets was substituted into each of the four models to compare simulated precipitation-driven runoff and sediment losses for the six precipitation regimes under each watershed's combination of slopes, soils, and land management conditions.
Precipitation Characteristics. Because rainfall is a driving force of soil erosion, it is important to consider regions where rainfall characteristics differ from those in the Upper Mississippi and Ohio-Tennessee River basins for which the SVI was developed. Precipitation characteristics for six CEAP watersheds, including the four already mentioned, the Choptank River watershed in Maryland, and the Little River watershed in Georgia (figure 1), were summarized from 1985 to 2014 using a precipitation gage within each watershed or a nearby gage if the watershed had no station or limited precipitation data. The six precipitation data sets from Missouri (MO_pcp), Mississippi (MS_pcp), Ohio (OH_pcp), Pennsylvania (PA_pcp), Maryland (MD_pcp), and Georgia (GA_pcp) show distinct differences in precipitation amount and intensity, as shown in figures 3 and 4 and supplementary table S1. Average annual precipitation ranged from 940 mm to 1,293 mm, with the greatest amounts for MS_pcp and GA_pcp and the lowest for OH_pcp. Both MS_pcp and GA_pcp have the greatest winter precipitation (December to February) and up to a 60 mm difference between the wettest and driest months. Average monthly precipitation in OH_pcp and MO_pcp shows a bell-shaped distribution (low in winter, high in spring and early summer) (figure 3), with about a 40 mm difference between the wettest and driest months. In MD_pcp, precipitation is more evenly distributed throughout the year, with a maximum monthly difference of 30 mm.
The RAINHHMX parameter, representing the most extreme 30-minute rainfall over multiple years, is used in SWAT to calculate the maximum half-hour rainfall of an event that generates runoff, which is in turn used to calculate the peak runoff rate and the corresponding soil loss (Neitsch et al. 2011). RAINHHMX was obtained from the SWAT weather database and was used in this study as an indicator of rainfall intensity for each of the six precipitation data sets. RAINHHMX is greatest and nearly equal for GA_pcp (in May), MO_pcp (in June), and MS_pcp (in July) at approximately 60 mm (figure 4). The overall maximum for the other three watersheds is about half this amount. The January maximum 30-minute precipitation for GA_pcp and MS_pcp is nearly three times greater than for all other watersheds. These differences in precipitation amount, timing, and RAINHHMX would likely impact sediment yield and nutrient runoff and would be expected to affect the absolute soil losses and the vulnerability of the site.
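To illustrate how the maximum 30-minute rainfall propagates into simulated soil loss, the relationships below follow the form given in the SWAT theoretical documentation (Neitsch et al. 2011); the notation is simplified here, and readers should consult that documentation for the exact definitions and units:

$$\mathrm{sed} = 11.8\,\left(Q_{\mathrm{surf}} \cdot q_{\mathrm{peak}} \cdot \mathrm{area}_{hru}\right)^{0.56} K_{USLE}\, C_{USLE}\, P_{USLE}\, LS_{USLE}\, \mathrm{CFRG}$$

$$q_{\mathrm{peak}} = \frac{\alpha_{tc}\, Q_{\mathrm{surf}}\, \mathrm{Area}}{3.6\, t_{\mathrm{conc}}}$$

where sed is the daily sediment yield, Q_surf is the surface runoff depth, q_peak is the peak runoff rate, area_hru and Area are the contributing area in the units required by each equation, t_conc is the time of concentration, α_tc is the fraction of daily rainfall occurring during the time of concentration (derived from the maximum half-hour rainfall), and K, C, P, LS, and CFRG are the USLE soil erodibility, cover, support practice, topographic, and coarse fragment factors. Because α_tc is driven by the maximum half-hour rainfall, RAINHHMX directly influences the MUSLE-based sediment yield.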
A parameter that combines different rainfall characteristics (i.e., amount, intensity, energy, and seasonality) is needed to understand the overall impact of precipitation among the various watersheds. The Revised Universal Soil Loss Equation 2 (RUSLE2) rainfall erosivity factor (R-factor) was chosen for this project, as obtained from the RUSLE2 database, which used data from 1960 to 1999 (USDA ARS 2008). Rainfall erosivity combines rainfall kinetic energy (E) and its maximum 30-minute intensity (I30) for each individual storm to describe the effect of rainfall on sheet and rill erosion (Renard 1997; Wischmeier and Smith 1978). The R-factor accumulates the rainfall erosivity of individual rainstorm events over a year and averages them over multiple years (Abdulkadir et al. 2016; Petkovšek and Mikoš 2004). Figure 5 shows the cumulative R-factors by month for the six precipitation data sets included in this study. Note that there is a clear difference in R-factors among watersheds.
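In equation form (a standard formulation of the rainfall erosivity factor; the notation here is ours), the R-factor is the multiyear average of the annual sums of per-storm erosivity:

$$R = \frac{1}{n} \sum_{j=1}^{n} \sum_{k=1}^{m_j} \left(E \cdot I_{30}\right)_k$$

where E is the total kinetic energy of storm k, I30 is that storm's maximum 30-minute intensity, m_j is the number of erosive storms in year j, and n is the number of years of record.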
Soil Vulnerability Index Vulnerability Classification. The SVI includes several components, one of which is the runoff component, which addresses sediment and nutrient loss by surface runoff; this is the component evaluated in this study. The SVI developers used the APEX field-scale model to simulate edge-of-field sediment and nutrient losses resulting from surface runoff and leaching from croplands managed without conservation practices. The developers expressed soil vulnerability as one of four classes. They identified three thresholds of average annual soil loss that correspond with the vulnerability classes: 4.5 t ha−1, 11.2 t ha−1, and 17.9 t ha−1 (USDA NRCS 2019). Areas with average annual soil loss values less than 4.5 t ha−1 represented about 70% of the cropland areas under all conditions, including years with high precipitation (USDA NRCS 2012), and were assigned Low vulnerability. The smallest simulated soil loss value for National Resources Inventory points classified as Highly Erodible Land (HEL) was 17.9 t ha−1; this value became the threshold for the High vulnerability class. The Moderate and Moderately High vulnerabilities were assigned based on the midpoint value of 11.2 t ha−1 (i.e., 4.5 to 11.2 t ha−1 as Moderate and 11.2 to 17.9 t ha−1 as Moderately High). Additional information on assigning vulnerabilities to combinations of hydrologic soil group, slope, and the soil erodibility K-factor can be found in Thompson et al. (2020).
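As a minimal illustration of how these thresholds map average annual soil loss to a class (the function is ours and purely illustrative; the operational SVI assigns classes from soil and slope criteria rather than from simulated soil loss), the lookup can be sketched in Python as:

```python
def runoff_vulnerability_class(soil_loss_t_ha: float) -> str:
    """Map average annual soil loss (t ha-1) to a runoff vulnerability class
    using the thresholds reported by USDA NRCS (2019)."""
    if soil_loss_t_ha < 4.5:
        return "Low"
    elif soil_loss_t_ha < 11.2:
        return "Moderate"
    elif soil_loss_t_ha < 17.9:
        return "Moderately High"
    else:
        return "High"
```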
Main Properties of Each Watershed Model. The Goodwater Creek Experimental Watershed, Missouri, is dominated by claypan soils (Baffaut et al. 2015). The SWAT model (table 1) was calibrated and validated on a daily time step for stream flow and a monthly time step for loads of atrazine, sediment, and dissolved phosphorus (P) from 1993 to 2010 (Baffaut et al. 2015). The WE38 Watershed, Pennsylvania, was simulated using the SWAT model. The model was corroborated with observed stream flow at a daily time step, with long-term cumulative and event-based P loads, and with soil moisture and runoff frequency from hillslopes (Collick et al. 2015, 2016). Runoff, sediment, and pollutant transport processes in the Maumee River Basin in western Ohio were simulated with a SWAT model that accounts for subsurface drainage; among the four models used in this study, the Maumee model is the only one that includes artificial subsurface drainage. The model was previously calibrated at the daily time step for the period 1980 to 2020. The simulations were validated with observed monthly stream flows and fluxes of sediment, nitrogen (N) (nitrate N and total N), and P (dissolved reactive P and total P) at multiple flow gaging locations within the watershed. Further details about the model development and validation are provided in Gildow et al. (2016), Martin et al. (2021), and Scavia et al. (2017). The AnnAGNPS model for Beasley Lake, Mississippi (Yasarer et al. 2020), was validated with streamflow data at the event scale at two locations from 2000 to 2002. Additional details about the model setup and results can be found in Yasarer et al. (2020).
Model Adjustment for the No-Practice Scenario. Since the SVI was developed based on no conservation practices, the calibrated SWAT and AnnAGNPS models were adjusted to represent the "no-practice" scenario following the method outlined in the CEAP report (USDA NRCS 2012). The conversion to the no-practice scenario included adding tillage operations when the Soil Tillage Intensity Rating (STIR) was <100, with the assumption that all additional tillage would take place in the spring, before planting. Among these watersheds, only OHMM has fall tillage due to the watershed management practices; this typical fall tillage was retained for the no-practice scenario. The USLE practice factor (P-factor) was set equal to 1 (i.e., no credit for contouring), and the cropland soil condition was changed to poor. It should be noted that the SWAT model uses the Modified Universal Soil Loss Equation (MUSLE) and AnnAGNPS uses RUSLE2; both equations use the P-factor. These changes were accomplished by adjusting the NRCS runoff curve number (Kent 1973) of all cropland to poor hydrologic condition according to each land unit's hydrologic soil group, based on recommendations by Cronshey (1986), and by adjusting the management to the no-practice scenario, as described in supplementary table S2. The adjustments were made at the field level in the AnnAGNPS model, while the adjustments of the SWAT models were made at the Hydrologic Response Unit (HRU) level; each HRU represents a unique combination of land cover, slope, and soil characteristics within a subbasin. According to the USDA NRCS (2012) guideline, no modification of the models' subsurface drainage is needed for the "no-practice" scenario.
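As an illustration of the curve number adjustment (a hypothetical Python sketch; the values shown are the straight-row, poor-hydrologic-condition curve numbers for row crops from the standard NRCS tables [Cronshey 1986], while the values actually applied to each model are those described in supplementary table S2):

```python
# Curve numbers for row crops, straight row, poor hydrologic condition,
# by hydrologic soil group (standard NRCS tables; Cronshey 1986).
# Illustrative only; see supplementary table S2 for the adjustments used.
POOR_CONDITION_CN = {"A": 72, "B": 81, "C": 88, "D": 91}

def no_practice_curve_number(hydrologic_soil_group: str) -> int:
    """Return the poor-condition curve number assigned to a cropland
    HRU or field under the no-practice scenario."""
    return POOR_CONDITION_CN[hydrologic_soil_group.upper()]
```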
Model Runs and Model Results Classification. To simulate the different precipitation conditions, the models were run using the daily historical precipitation data from each region. The monthly precipitation parameters in the weather generator file, taken from the SWAT database, were changed to those of the corresponding region. These included the parameters used to generate any missing precipitation data and, more importantly, the precipitation intensity, described by the monthly maximum 30-minute precipitation, which is used to calculate the peak runoff rate and, in turn, the sediment yield in SWAT.
The SVI vulnerability was calculated for each HRU/field based on its specific soil and slope characteristics, and the average annual sediment yield for each HRU/field was estimated by the model. The APEX sediment yield used as the basis for SVI development was calculated at the edge of the field, and the sediment yield output from APEX is comparable to the field sediment yield calculated by the AnnAGNPS model. The SVI sediment thresholds were used to dynamically categorize HRU vulnerability independently of soil and slope characteristics. However, the SWAT HRU and subbasin framework does not allow a direct transfer between HRU and edge-of-field results because HRU results represent contributions to the stream and include the effects of sediment routing processes that may occur between the field and the stream (Neitsch et al. 2011). Therefore, the SWAT-simulated sediment yield thresholds defining each vulnerability class were revised using the decision tree method (Xia et al. 2008) to provide a meaningful comparison of the SVI classification with the vulnerability classification based on SWAT simulations (figure 2). Specifically, the new thresholds were defined to match as closely as possible the classification of an HRU based on sediment yield output from SWAT with the classification of that HRU based on the SVI criteria. By calculating threshold values that match model-based results with SVI calculations, we removed the influence of climate, global model parameters, and, to some extent, management on the vulnerability classification. Soil properties are considered by the SVI, and this study tested whether, once the influence of management and climate is removed, similar vulnerability classifications are obtained.
For each of the SWAT models, the precipitation data set utilized for the model calibration was used as input to determine the sediment thresholds for that location. The decision tree uses a flowchart-like structure that starts from a root node, branches to internal nodes, and ends at leaf nodes (supplementary figure S1). The paths from the root node to the leaf nodes represent decision rules used to divide data sets into various predefined classes. Sediment yield and the SVI vulnerability class of the HRUs were used to split data points at the root node and at subsequent internal nodes into two or more categories or "bins" that best separate the target class values. The decision to split at each node is based on the degree of "purity," which is characterized by an index that quantifies the probability that a randomly selected data point would be classified incorrectly. When a node is pure, all the elements contained in the node have the same class. The splitting process continues until a predetermined homogeneity of the leaf nodes or other stopping criteria are met (Song and Ying 2015; Yu et al. 2010). In this study, the thresholds were calculated using a Python program and the machine learning package Scikit-learn (Pedregosa et al. 2011), using the Gini index (one of the most widely used tests for decision trees) to measure impurity (Kim 2016). The Gini impurity index is calculated by summing the probability of each item being chosen multiplied by the probability of a misclassification of that item (Kim 2016; Kumar 2013). Once sediment yields and the SVI classification based on soil and slope are known for each HRU, the decision tree determines the optimal sediment yield cut-off value for each SVI class, such that within that range of sediment yield, the cut-off value maximizes the number of HRUs with the same SVI classification and minimizes the mixing of HRUs of different classes (impurity). After determining the thresholds for vulnerability for each of the calibrated models, the six precipitation data sets were substituted among the models to evaluate the influence of precipitation on matching SVI vulnerability classifications with those identified using SWAT or AnnAGNPS.
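A minimal sketch of this threshold derivation with Scikit-learn is shown below; synthetic example data stand in for the actual HRU sediment yields and SVI classes, and the variable names are ours:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study's inputs: simulated average annual
# sediment yield per HRU (t ha-1) and each HRU's SVI class encoded as
# 0 = Low, 1 = Moderate, 2 = Moderately High, 3 = High.
sediment_yield = rng.gamma(shape=2.0, scale=4.0, size=500).reshape(-1, 1)
svi_class = np.digitize(sediment_yield.ravel() + rng.normal(0.0, 2.0, 500),
                        bins=[4.5, 11.2, 17.9])

# A single-feature tree limited to four leaves searches for the three
# sediment-yield cut-offs that minimize Gini impurity, i.e., that best
# separate the four SVI classes.
tree = DecisionTreeClassifier(criterion="gini", max_leaf_nodes=4, random_state=0)
tree.fit(sediment_yield, svi_class)

# The internal-node thresholds are the revised class boundaries
# (leaf nodes carry a sentinel threshold of -2 and are filtered out).
thresholds = sorted(t for t in tree.tree_.threshold if t != -2)
print(thresholds)
```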
Statistical Analysis. The precipitation characteristics, the SVI vulnerability classification, and the HRU/field classifications based on the SWAT and AnnAGNPS model results were statistically analyzed. The results were used to determine whether there were statistical differences in monthly precipitation characteristics among the watersheds, and statistical differences in soil vulnerability classification when precipitation patterns from the six watersheds were substituted into the SWAT/AnnAGNPS models. Significant differences in precipitation characteristics were then compared to significant differences in model-simulated sediment loss.
The statistical test selected to assess significant differences in vulnerability classification was Cohen's Kappa (Viera and Garrett 2005; McHugh 2012). A Kappa statistic quantifies the similarity between two data sets on a scale from −1 (total dissimilarity) through 0 (no similarity) to 1 (perfect agreement) (Viera and Garrett 2005; McHugh 2012; Jayasinghe and Kumar 2019). For each SWAT or AnnAGNPS model, Kappa coefficients were calculated between the sets of HRU/field vulnerability classifications obtained with each pair of precipitation inputs. For example, in the SWAT model for WE38, there are 10,000 crop-HRUs. When this model is run with each of the six precipitation data sets, each of these HRUs is assigned its own vulnerability based on its sediment yield output, thus creating six data sets with 10,000 vulnerability values in each. Each data set is then compared with each of the five others using the Kappa coefficient to determine whether the two data sets are similar beyond what would be expected by chance. This helps to separate the results into multiple groups and to determine whether precipitation parameters affect these groupings. Cohen suggested interpreting the Kappa result as follows: values ≤0 as less than chance agreement, 0.01 to 0.20 as slight, 0.21 to 0.40 as fair, 0.41 to 0.60 as moderate, 0.61 to 0.80 as substantial, and 0.81 to 1.00 as almost perfect agreement (Viera and Garrett 2005; McHugh 2012).
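A minimal sketch of this pairwise comparison using Scikit-learn's implementation of Cohen's Kappa is shown below; the class labels are hypothetical examples rather than actual study output:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical vulnerability classes assigned to the same HRUs under two
# different precipitation inputs (e.g., MO_pcp versus MS_pcp).
classes_a = ["Low", "Moderate", "Moderate", "Moderately High", "Moderate", "High"]
classes_b = ["Moderate", "Moderate", "Moderately High", "High", "Moderate", "High"]

kappa = cohen_kappa_score(classes_a, classes_b)
print(f"Kappa = {kappa:.2f}")  # interpreted on Cohen's scale described above
```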
Since the R-factor and maximum 30-minute precipitation were available only as monthly average values, we used the monthly average precipitation amount as well in order to have consistent sample sizes for all precipitation characteristics. Viera and Garrett (2005) noted that the sample size for Kappa should consist of more than 30 comparisons. Since the precipitation characteristics have a sample size of 12, a paired t-test was applied instead; De Winter (2013) showed that a paired t-test is feasible with extremely small sample sizes, as small as two.
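A minimal sketch of such a paired t-test on monthly values is shown below; the two monthly series are hypothetical examples standing in for two of the precipitation data sets:

```python
import numpy as np
from scipy import stats

# Hypothetical average monthly precipitation (mm), paired by month (n = 12),
# standing in for two of the six precipitation data sets.
monthly_pcp_a = np.array([55, 52, 80, 98, 120, 110, 95, 90, 85, 78, 70, 60], float)
monthly_pcp_b = np.array([118, 105, 128, 112, 102, 95, 92, 88, 86, 93, 100, 114], float)

t_stat, p_value = stats.ttest_rel(monthly_pcp_a, monthly_pcp_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```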
Results and Discussion
Precipitation Patterns. The t-test values for precipitation amount highlight the difference of the OH_pcp amount compared with the other watersheds (supplementary table S3). Additionally, the Student's t-test on rainfall intensity (supplementary table S4) indicated that GA_pcp and MS_pcp had similar intensity characteristics, and that MO_pcp, MD_pcp, and PA_pcp could be grouped together, while OH_pcp fell into a separate group by itself. From figure 5 and table 2, the R-factors in these watersheds can be divided into three distinct groups: the group with the greatest R-factor comprised MS_pcp and GA_pcp; next were MO_pcp and MD_pcp; and lastly PA_pcp and OH_pcp. Even though the annual precipitation values of these watersheds were within 30% of each other, the annual R-factors of GA_pcp and MS_pcp were nearly double those of MO_pcp and MD_pcp. The PA_pcp amount was greater than that of MO_pcp, but the intensity of PA_pcp was much lower, resulting in an overall lower annual R-factor. Based on the three parameters considered (amount, intensity, and R-factor), precipitation characteristics were moderately to substantially similar in Mississippi and Georgia. They were fairly similar in Missouri and Maryland for precipitation intensity and R-factor, but only slightly similar for precipitation amount. Ohio stood apart from the other locations in terms of monthly precipitation amount and intensity. Pennsylvania precipitation was unique in that its grouping with other locations depended on the precipitation parameter considered: Maryland for precipitation amount, Missouri and Maryland for precipitation intensity, and Ohio for R-factor.
Vulnerability Classification in Goodwater Creek, Missouri, Model. Figure 6 shows the number of HRUs in each vulnerability class based on the sediment yield outputs of the MOGC SWAT model with precipitation from each of the watersheds, as well as with the SVI, and shows that the SVI classification was in agreement with the classification obtained from model results with MO_pcp. Both figures 6 and 7 illustrate the influence of precipitation on sediment loss; figure 7 shows the distributions of simulated sediment yield for MOGC using the precipitation characteristics of each of the six watersheds. Based on the sediment thresholds, the number of HRUs in each vulnerability class for MO_pcp was the same as the distribution based on the SVI criteria. Both indicated that 77.5% of the HRUs within the watershed were classified as having Moderate vulnerability to runoff and only 2.5% of the HRUs were highly vulnerable. When MD_pcp precipitation was substituted, there was no shift in the vulnerability classification of the HRUs. Note that the annual precipitation of MD_pcp is 17% (+167 mm) greater than that of MO_pcp; however, the rainfall intensity of MD_pcp is lower than that of MO_pcp, especially from March to June, such that the annual R-factor for MO_pcp is 9% greater than that for MD_pcp.
When considering precipitation from a region with greater rainfall amount and intensity, such as MS_pcp or GA_pcp, there is a clear increase in the vulnerability of the HRUs (figure 7). Using MS_pcp in the MOGC SWAT model, 76% of the HRUs were classified as having a higher risk because of increased sediment yield output compared with MO_pcp (table 3). There were large increases in the number of HRUs in the Moderately High (+92 HRUs) and High (+30 HRUs) classes. The average annual HRU sediment yields increased by 68% with GA_pcp, and more than doubled with MS_pcp, compared with those from MO_pcp. The shift in the vulnerability class of the HRUs showed a similar trend as the precipitation trend. The annual MS_pcp is the greatest among the simulated watersheds (+29% compared to MO_pcp) and the annual GA_pcp is 21% greater than MO_pcp. Precipitation during the nongrowing season (October to March) from MS_pcp and GA_pcp is 88% and 66% greater than MO_pcp, respectively. In addition, the maximum 30-minute precipitation during the nongrowing season in MS_pcp and GA_pcp is about 50% greater than in MO_pcp. Greater amounts and intensities of precipitation during the nongrowing season lead to greater sediment loss because there is limited soil cover during this season (Ouyang et al. 2010). Under the GA_pcp data, the change in class was similar in direction to that under MS_pcp but of lesser magnitude. This result was expected, as annual GA_pcp is less than annual MS_pcp and also less during the nongrowing season (603 mm compared with 687 mm).
In contrast, the results obtained with PA_pcp and OH_pcp showed that the assigned vulnerability of HRUs in the MOGC watershed was reduced, with most of the HRUs in the Low or Moderate class. Even though annual PA_pcp is greater, it has much lower intensity compared with MO_pcp. Based on OH_pcp precipitation, there was a 78% reduction in the number of HRUs classified as Moderately High or High, and 20 HRUs shifted to Low vulnerability. Not only does OH_pcp have the lowest annual precipitation, but it also has only one month in which the maximum 30-minute rainfall depth exceeds 30 mm, namely August, when maximum cropland vegetative cover is expected. Although this maximum 30-minute precipitation is similar to that of both MD_pcp and PA_pcp, both of those sites have at least three consecutive months with that level of rainfall intensity. From these results and table 3, we can distinguish three groupings based on precipitation resulting in different levels of soil vulnerability: GA_pcp and MS_pcp in one group; MD_pcp, MO_pcp, and PA_pcp in a second; and OH_pcp by itself. The greater the Kappa agreement between two scenarios, the more similar were the model sediment results based on the precipitation inputs; when there was less similarity between two scenarios, the differences in precipitation inputs resulted in statistically different vulnerability classifications. The numbers of HRUs in each vulnerability class for PA_pcp are closer to the MO_pcp and MD_pcp numbers than they are to the OH_pcp numbers; however, only with the PA_pcp and OH_pcp precipitation are any HRUs in this watershed assigned to the Low vulnerability class. These groups mirror the groups of R-factors and maximum 30-minute rainfall for the six precipitation data sets, and indicate that rainfall amount, intensity, and timing all have an impact on the vulnerability classification of the soil.
Vulnerability Classification in Beasley Lake, Mississippi, Model. Figure 8 and supplementary figure S2 show how the vulnerability classification in the Beasley Lake watershed changes as a function of the precipitation input to the AnnAGNPS model. Figure 8 shows 55% agreement for MSBL between the SVI classification and the classification obtained from AnnAGNPS model results with MS_pcp. While the classification discrepancies were in both directions, the SVI classification was lower than the model classification more frequently. The SVI classification based on soil and slope assigned Low and Moderate vulnerability to the majority of the fields in the Beasley Lake watershed (68%), Moderately High vulnerability to 23% of the fields, and High vulnerability to all others (less than 10%). The precipitation patterns that produced results in closest agreement with the SVI classification were those from MS_pcp and GA_pcp, for which both the annual and April through July precipitation were the greatest among all the watersheds. However, there were only one-fourth as many fields classified in the High vulnerability class with both GA_pcp and MS_pcp compared to the SVI classification. Results based on precipitation from the other regions all led to even greater reductions in the number of fields in the High and Moderately High vulnerability classes. Overall, compared to the model sediment yield outputs, it appears that the SVI overestimates vulnerability when slope is minimal, even when rainfall amounts and R-factors are greater. The results from this model were consistent with those from the MOGC SWAT model, where greater precipitation amount and intensity resulted in more areas assigned to the Moderately High and High vulnerability classes.
The Kappa values (table 4) indicate the degree of similarity of results among the scenarios based on the differing precipitation inputs. The values from the MSBL model indicated that the results can be separated into three groups similar to those for the MOGC model, with MS_pcp and GA_pcp in the first group; MO_pcp, MD_pcp, and PA_pcp in the second group; and OH_pcp in a separate group.
Vulnerability Classification in WE38, Pennsylvania, Model. Similarly, figure 9 shows the soil classification and thresholds obtained with the WE38 SWAT model, with 95% agreement between the SVI and the model-derived classification, in large part because of the very high fraction of cropland on steep slopes, all classified in the High vulnerability class. For 125 of the remaining 158 HRUs, the SVI assigned a higher vulnerability class than the model-based classification.
The overall shift in sediment yield with precipitation can be observed in supplementary figure S3. As expected, the SWAT simulation using the PA_pcp precipitation resulted in the best agreement with the SVI classification based on soil characteristics. The MD_pcp and PA_pcp produced similar results in PAWE. The OH_pcp, which has lower precipitation in both the growing and nongrowing seasons, produced a shift toward less vulnerable HRUs compared with the PA_pcp. The MO_pcp has lower annual precipitation than the PA_pcp and resulted in a shift toward lower soil vulnerability in the PAWE watershed than with the PA_pcp, in spite of a greater R-factor. This result is different from those obtained with the MOGC and MSBL models, perhaps due to the steeper average slope found in the PAWE model. As shown in table 1, PAWE has the steepest slope of the watersheds examined in this study (its weighted average slope is more than 10 times greater than that of any other). It is possible that for a steep watershed such as PAWE, rain intensity is less important than precipitation amount. Both MS_pcp and GA_pcp resulted in the greatest number of HRUs assigned to High vulnerability, likely for reasons similar to the greater precipitation amount and intensity noted for the AnnAGNPS MSBL model. Kappa values (table 5) showed patterns similar to those obtained with the previous models; however, GA_pcp and MD_pcp showed substantial agreement. This can be linked to the similar annual precipitation amounts of these two areas: 1,169 mm and 1,207 mm for MD_pcp and GA_pcp, respectively, about a 3% difference.
Vulnerability Classification in Maumee Watershed, Ohio. For the OHMM SWAT model (figure 10 and supplementary figure S4), the SVI classification was most similar to that obtained with the OH_pcp precipitation. Figure 10 shows a high level of agreement (88%) in classification for the watershed's cropland HRUs. For 549 of the 864 misclassified HRUs (or 8% of the cropland HRUs), the SVI assigned a lower vulnerability than the model-derived classification. However, when grouping together Low and Moderate, and Moderately High and High, the agreement increased to 96%.
Kappa values (table 6) showed moderate agreement between the vulnerability classifications obtained with MO_pcp and OH_pcp. Since fall tillage is practiced in the OHMM region, the nongrowing season (October to March) precipitation had a greater impact on cropland sediment yield. The lowest nongrowing season precipitation is that of MO_pcp at 363 mm, followed by OH_pcp at 397 mm, while GA_pcp and MS_pcp have the greatest amounts at 603 mm and 687 mm, respectively. The R-factors of MO_pcp and MD_pcp are similar; however, the nongrowing season MD_pcp is 34% greater than MO_pcp. As a result, the model-derived classification shows more vulnerability with the MD_pcp than with the MO_pcp (620 HRUs in the High and Moderately High classes compared with 403 HRUs). Similarly, the R-factor of PA_pcp is lower than that of MO_pcp, but its nongrowing season precipitation is 23% greater. This difference resulted in a greater number of HRUs classified in the High vulnerability class with PA_pcp than with MO_pcp (269 HRUs versus 248 HRUs). This suggests that the timing of precipitation, which is not included in the SVI, may play an important role in proper vulnerability classification. Similarly, both GA_pcp and MS_pcp, which have greater annual and nongrowing season precipitation than OH_pcp and greater maximum monthly 30-minute rainfall, resulted in the greatest number of HRUs classified as Highly vulnerable. Over half of the HRUs in the Low SVI classification shifted to the Moderate vulnerability class when using the GA_pcp or MS_pcp precipitation, and a smaller number shifted from the Moderately High to the High vulnerability class.
Interaction of Site and Precipitation Characteristics on Vulnerability Classification and Conservation Needs. Because different regions have their own sets of characteristics influencing the range of management practices controlling runoff and sediment and nutrient losses, an analysis of the site characteristics affecting the interpretation of the SVI is needed. In all the cases considered here, agreement between the SVI and the model-based classifications ranged from 55% to 100% of the cropland in each watershed. Lower agreement was observed for the flatter MSBL and OHMM watersheds (55% and 88%, respectively). When there was a discrepancy, the results pointed to more frequent lower vulnerability ratings with the SVI compared to the model-based classification. However, the largest discrepancy occurred between the Low and Moderate classes, with more HRUs classified as Low with the SVI and as Moderate with the model-based classification. When we grouped the Low and Moderate classes together and the Moderately High and High classes together, agreement for these two watersheds jumped to 80% and 96%, respectively. Thus, we can conclude that the SVI is useful to classify vulnerability within a watershed or a region.
Vulnerability of HRUs and fields and the vulnerability distributions did shift up or down depending on the precipitation data set used to drive the model (figures 6, 8, 9, and 10). This relationship was especially pronounced for the PAWE watershed, as shown by the larger spacing of the sediment yield distribution curves for that watershed (supplementary figure S3) compared with the other watersheds (figure 7 and supplementary figures S2 and S4). The Kappa values calculated between vulnerability classifications obtained with different precipitation inputs for different slope ranges helped to identify the effect of slope on the sensitivity of the vulnerability classification to precipitation (supplementary table S5 for the PAWE watershed). For all watersheds, the Kappa values decreased as slopes increased, implying that the vulnerability classifications corresponding to two precipitation patterns diverged from each other as the slope increased. A possible interpretation is that precipitation characteristics have a greater impact on the vulnerability classification when slopes are greater.
The implication is that vulnerability, in terms of risk to receiving waters, is affected by the precipitation characteristics of the site. A Low vulnerability field in Ohio, as defined by the SVI, does not carry the same risk of sediment loss as the same field in Mississippi. While this still allows vulnerability classification (as seen by the good agreement between the two methodologies), targeting of the most vulnerable areas for conservation efforts, and prioritization of conservation dollars, the amount of conservation indicated by the SVI may not correspond to the amount of conservation needed to achieve a target level of sediment loss.
This is not problematic if the SVI is used locally or regionally. However, in addition to targeting and prioritization, another use of the SVI is to guide conservation needs at the field or regional scale by comparing cropland vulnerability, and the corresponding conservation needs, with the conservation practices already implemented (NRCS 2021). The goal of these conservation practices is to minimize the transport of sediment and nutrients to the receiving streams. For losses by surface runoff, the guidance assumes ranges of soil loss for each vulnerability class. Since soil loss varies with precipitation, it follows that accounting for precipitation is necessary if the goal is to achieve a given amount of soil loss. When it comes to water quality, the acceptable amounts of sediment and nutrients caused by land cultivation and entering a water body (stream or lake) are a function of the receiving water body characteristics, the beneficial uses of that water body (e.g., ecosystem sustenance, recreation, and water supply), the amount of cropland in the watershed, and other contributions in the watershed. To address this, the US Environmental Protection Agency developed the concept of the Total Maximum Daily Load so that watershed management plans and desired pollution abatements can be tailored to the watershed and water body characteristics. A vulnerability ranking may be a useful tool to inform these watershed management plans, but depending on the intended stream or lake water use, a low vulnerability ranking may not always imply that no conservation effort is needed.
If the SVI is to infer an approximate risk of soil loss, the results of this study imply that it should include a precipitation term. Tables 3, 4, 5, and 6 show that the greatest Kappa values are consistently obtained for the following groups in each case: GA_pcp and MS_pcp together; MD_pcp, PA_pcp, and MO_pcp together; and OH_pcp on its own. These groupings are closest to the grouping based on the t-test values for the maximum 30-minute rainfall (I30, supplementary table S4). This is not surprising, as soil loss is sensitive to the peak runoff rate, which varies with I30, as indicated by the USLE (Tibebe and Bewket 2011). However, nutrients, and especially dissolved nutrients, are more sensitive to runoff volume, and therefore to total rainfall volume (Wang et al. 2018). Since the SVI is also intended to address nutrient losses, a precipitation parameter that relates to both rainfall amount and rainfall intensity, such as the R-factor, seems a good choice. It also has the advantage of being already calculated and part of the metrics commonly used by conservationists.
Summary and Conclusions
The SVI was developed by the USDA NRCS to rapidly classify cropland vulnerability to sediment and nutrient losses by runoff and leaching into four categorical classes: Low, Moderate, Moderately High, and High. The SVI uses soil and slope characteristics of the cropland for its classification but does not use precipitation characteristics. In this study, soil runoff vulnerability classifications based on the SVI criteria were compared with classifications based on model-simulated sediment yields from four calibrated watershed models using precipitation inputs from six different regions with distinct differences in precipitation characteristics, including amount, intensity, seasonal pattern, and R-factor. The outputs from the models with the six precipitation data sets were used to study the impact of precipitation on the soil vulnerability classification.
The SVI and the model-based classifications agreed for 55% to 100% of the watersheds' subunits (supplementary tables S6, S7, S8, and S9), confirming that the SVI is a useful vulnerability assessment tool that is consistent with modeling technologies. However, the range of sediment yields within each vulnerability class varied among the watersheds, which implies that caution is needed when associating an approximate range of soil loss with each vulnerability class for users who are not familiar with interpreting the SVI within a specific region.
Higher (or lower) precipitation amount and/or intensity than that of the site's own precipitation resulted in an increase (or decrease) in the risk of soil loss by runoff. Other factors, such as the slope of the agricultural fields and the seasonality of the precipitation, also affect this risk. As anticipated, the magnitude of the change with precipitation was most closely linked to the changes in maximum 30-minute rainfall between the regions. This makes that parameter, or perhaps the more frequently used R-factor, a good candidate for adjusting the SVI to account for the influence of precipitation on expected sediment loss.
Overall, the results from this study indicate a need to add a precipitation characteristics modifier to the SVI classification system, similar to the existing organic matter and drainage modifiers. Given the low number of sites used in this study, further work including additional data sets that span a wider range of precipitation characteristics is needed before modifiers to adjust the SVI based on precipitation characteristics can be proposed.
Supplemental Material
The supplementary material for this article is available in the online journal at https://doi.org/10.2489/jswc.2023.00065.
Acknowledgements
All authors would like to recognize the USDA Natural Resources Conservation Service (NRCS) Conservation Effects Assessment Project (CEAP) and the NP211 ARS National Program for providing the hydrological data, models, and watershed information for this study. The Maumee River Basin modeling was supported by NSF grant GRT00022685 and an Ohio Sea Grant to Ohio State University. Funding for this project was provided by the USDA NRCS under NRCS Agreement Number NR193A750023C006, USDA-ARS-NP211, and the University of Missouri, Columbia.
- Received April 26, 2022.
- Revision received November 19, 2022.
- Accepted December 30, 2022.
- © 2023 by the Soil and Water Conservation Society