Abstract
Among the benefits of crop residues is their role in reducing soil wind erosion. Residue height, diameter, and soil surface cover influence wind speeds and soil susceptibility to wind erosion events. Understanding the role of crop residue type in maximizing residue coverage through time can inform management for improved residue retention, and wind erosion models for better simulation of the residue decomposition process. We used the Dryland Agroecosystem Project (DAP), a long-term, dryland, no-till systems experiment at multiple locations in eastern Colorado, to examine differences between winter annual grain and summer annual forage crop residue dynamics. The DAP utilizes wheat-based rotations plus continuously cropped grain-forage and forage-only rotations. For this study, we focused on residue dynamics of winter wheat (Triticum aestivum) and forage crop (Sorghum bicolor and Setaria italica) residues at two locations in eastern Colorado and following two harvest seasons (2014 and 2015). Decomposition days (DD), a calculation that factors in temperature and rainfall to estimate cumulative conditions that favor decomposition, were used to normalize climate conditions across sites and years. Counts of postharvest standing stems, stem diameters, and residue heights were measured, as was soil surface coverage. Soil cover measurements were used to estimate the length of time before soil surface cover fell below a 30% coverage threshold and to model residue persistence. Results showed that winter wheat consistently produced more residue cover immediately after harvest, and its cover persisted almost twice as long as that of forage crops. The hypothesis that residue cover could be represented using an exponential decay model was supported for forage sorghum and forage millet, while wheat residue maintained postharvest coverage of the soil for a period of time before beginning to decline and followed a quadratic decay model. The combined effects of standing stem density, initial residue cover, and coverage longevity point to wheat as a valuable protector against wind erosion in these systems. The different residue trajectories by crop type suggest that shifts in crop rotations within no-till management systems can have important implications for wind erosion control in the semiarid Great Plains.
Introduction
Globally, over 400 million ha are susceptible to the damaging effects of wind erosion, and wind erosion is one of the primary contributors to land degradation in arid and semiarid regions (Ravi et al. 2011). Approximately 38 million ha are susceptible to wind erosion in North America, and, while cropland soil losses have decreased over time, the United States still loses around 700 billion kg of soil from cropland per year (USDA 2020). Wind erosion contributes to irreversible land degradation, and its control is a major challenge for agricultural systems of the Great Plains, which lose more than 6,000 kg of soil ha–1 y–1 (Dregne 2002; Ravi et al. 2010; Hansen et al. 2012). Wind erosion can be forestalled by three key factors: limiting wind speed at the soil surface, maximizing soil surface cover, and having wind-resistant soils (Nordstrom and Hotta 2004; Borrelli et al. 2014). Within annual cropping systems, tillage and crop residues are two major management factors that influence wind erosion potential.
In the semiarid Great Plains of the United States, traditional dryland (nonirrigated) winter wheat (Triticum aestivum)–fallow crop rotations incorporate fallow years to store water for the following crop. During the fallow period, fields are maintained to limit the growth of live plants (i.e., crops and weeds) for 14 months following harvest, resulting in crop production for only 10 months out of every 24. Historically, tillage was used to control weeds during the fallow period. Tillage incorporates crop residues into the soil and exposes the soil surface, resulting in less soil protection (Lopez et al. 2003). Traditional tillage and summer-fallow management of the Great Plains, similar to that practiced during the US Dust Bowl era of the 1930s, exposes the soil surface and creates a highly wind erodible state (Lee and Gill 2015). These practices still persist to a limited extent today, contributing to wind erosion events in the region, especially during low rainfall periods.
The development of improved herbicides and herbicide-tolerant crops made reductions in tillage for weed control possible by the 1980s (Unger and Skidmore 1994; Derpsch et al. 2010). Due to reduced soil disturbance and increased surface residue cover, no-till systems have reduced wind erosion susceptibility relative to tilled systems (Merrill et al. 1999; Triplett and Dick 2008; Gao et al. 2016). The adoption of no-till management in the western Great Plains has often been accompanied by an increase in cropping intensity and the addition of summer annual crops such as corn (Zea mays) and forages. Diversification of crops beyond the strict wheat–fallow system has reduced financial risk for producers, increased water use efficiency in these water-limited systems, and increased soil carbon (C) (Sherrod et al. 2003; Rosenzweig et al. 2018); however, the lower amount of crop biomass remaining in the field following harvest of summer annual crops may not provide adequate wind erosion protection.
While the benefits of reduced tillage for wind erosion prevention are well known, less research has focused on the effects of different crop residue types within no-till systems. This is particularly relevant in dryland cropping systems of the semiarid Great Plains where the adoption of no-till practices has allowed producers to diversify and intensify rotations, reducing the frequency of fallow. Rotations such as winter wheat–corn–fallow or winter wheat–corn/millet (Panicum miliaceum)–fallow reduce fallow periods from 14 months out of 24 to 10 to 13 months out of 36 or 48 (Farahani et al. 1998; Hansen et al. 2012). The combination of no-till management and cropping intensification has shifted the physical structure and quantity of crop residues within dryland systems by not physically destroying residues and burying them with tillage, by incorporating different crops into the system, and by reducing the frequency of summer fallow periods (Ortega et al. 2002).
Residue is a physical deterrent to wind erosion: it acts as a barrier, raises the threshold wind speeds at which erosion begins, and changes wind energy dynamics to reduce transport capacity (Hagen 1996). Maintaining a soil surface cover of 30% has been shown to reduce soil erosion by 70% relative to bare soil, with only marginal improvements in soil loss prevention achieved at higher percentages of cover (Fryrear 1985). Thus, 30% is the threshold used by the USDA Natural Resources Conservation Service (NRCS), the US Environmental Protection Agency (USEPA), and most state extension services as the minimum amount of cover needed for a soil conservation practice, particularly during the periods when wind magnitude is greatest (USEPA 2003; Lyon and Smith 2010). In addition to surface residues maximizing protective cover, standing residue characteristics of stem height, stem diameter, and stand density also reduce wind erosion by absorbing wind energy and thus slowing wind speeds at the surface. The integrated “silhouette” area created by stalk diameter, height, and number of standing stalks per measured unit of surface area reduces near-surface wind speeds (Hagen 1996). Standing residue initially decomposes at a slower rate than residues in contact with the soil surface (Lyles and Allison 1981), but standing residue eventually becomes part of the pool of surface residues as the stems fall. In no-till systems, crop residues can persist on and above the soil surface following harvest and into the following year’s crop growth.
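As a rough numeric illustration of the silhouette concept, the sketch below computes a WEPS-style stem area index in R (standing stems per unit ground area multiplied by stem diameter and height); the function name and the example values are ours, not measurements from this study.

```r
# A minimal sketch of the "silhouette" concept: a WEPS-style stem area index
# (SAI) = standing stems per m^2 x stem diameter (m) x stem height (m).
# Function name and example values are illustrative, not from the study.
stem_area_index <- function(stems_m2, diameter_m, height_m) {
  stems_m2 * diameter_m * height_m
}

# Example: 100 stems m^-2 of 3 mm diameter stubble standing 0.3 m tall
stem_area_index(100, 0.003, 0.3)  # 0.09 m^2 of silhouette per m^2 of ground
```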
Fundamental to understanding the role of residues in reducing wind erosion is the accurate measurement of residue cover. Changes in surface residue cover are often estimated indirectly: residue mass is measured at a single time point, and decomposition is then simulated over time with models rather than tracked through empirical measurements. Past studies generally rely on a one-time measurement of residue mass following harvest, with a smaller number of studies measuring residue mass throughout the growing season (Ruffo and Bollero 2003; Steiner et al. 1999). These past studies have informed the Wind Erosion Prediction System (WEPS), which uses an exponential decay model to simulate residue decomposition dynamics (USDA ARS 2020). Field measurements of surface residue cover through time are less common. Crop residue cover is a direct barrier to wind erosion, protecting the soil surface from wind contact and slowing wind speeds. A one-time sampling of residue mass may not correlate with soil protection over time due to temporal variability in residue characteristics and persistence. Frequent visual assessments are time and labor intensive but are nonetheless important for understanding the dynamics of residue persistence and decay.
In addition to residue quantity and standing residue characteristics, crop residue biochemical quality can also influence residue decomposition dynamics. For example, it is well established that residues with lower nitrogen (N) concentrations such as wheat straw tend to decompose more slowly than legume residues that tend to have higher N content (Xu et al. 2017). However, there are multiple biochemical quality indicators in addition to N content that can influence litter decomposition rates, including residue lignin, cellulose, and hemicellulose composition (Johnson et al. 2007; Xu et al. 2017). Xu et al. (2017) attributed the faster decomposition of corn litter relative to spring wheat litter to these differences in biochemical composition. However, overwintering cool season and summer warm season grass crops are harvested at different times of the year and, thus, are exposed to different climate conditions that may interact with residue quality to influence residue decomposition dynamics. There have been relatively few direct in-field comparisons of residue dynamics between winter annual and summer annual grass crops, which represent the major crop types in semiarid cropping systems of the Central Great Plains.
The objectives of this study were to (1) quantify the soil surface cover by crop residue type and (2) quantify the persistence of crop residue cover to the 30% threshold by crop type. We hypothesized that residue cover would fit an exponential decay model, and that soil cover would be higher and persist longer for winter wheat than summer annual crops.
Materials and Methods
Site Descriptions. The Dryland Agroecosystem Project (DAP) was initiated as a long-term cropping systems study in eastern Colorado in 1985 to investigate the economic and agronomic implications of increasing crop rotation intensity under no-till systems (Peterson et al. 1993). The rotation systems range along an intensity spectrum from winter wheat–fallow to continuous cropping. This study utilized the DAP crop rotation systems to quantify and compare the temporal residue cover dynamics of winter wheat and forage crops. The DAP comprises three field sites in eastern Colorado representing a north to south evapotranspiration gradient, and the sites are referred to by the names of the nearest towns: Stratton, Sterling, and Walsh. For this study, only the Stratton and Sterling sites were included due to crop failures at the drier Walsh site in 2015. Table 1 contains site information, including location, precipitation, evapotranspiration, and soil properties of the two sites.
Table 1. Site characteristics and average annual precipitation (30 years), total precipitation for study years, average growing season open pan evaporation (March to October), and soil type, texture (0 to 10 cm), and classification for the summit slopes of the Dryland Agroecosystem Project (DAP) sites near Sterling and Stratton, Colorado. Adapted from Peterson et al. (2001) and Cantero-Martinez et al. (2006).
At each site, cropping systems are replicated across strips that are 6.1 m wide and at least 150 m long. Each strip runs across a catena with summit, sideslope, and toeslope positions. Only the summit areas at each site were included in this study since these areas are likely to be the most susceptible to wind erosion. The summit area within a cropping system strip was at least 46 m long at both sites.
The wheat and forage crop residues included in this study were in one of the following rotations: winter wheat–fallow; winter wheat–corn–fallow; a continuously cropped grain–forage rotation (winter wheat–forage sorghum [Sorghum bicolor] or winter wheat–forage millet [Setaria italica]); or a forage-only rotation (forage sorghum–forage millet). Each phase (i.e., rotation year) of each rotation was represented every year and replicated twice at each location.
Management practices were consistent within each crop type across all rotation systems and sites. Winter wheat and summer forage crops were all planted with a Sunflower no-till grain drill (AGCO Corporation, Beloit, Kansas). The wheat variety ‘Byrd’ was planted in October of all years at 67 kg ha–1 on 30.5 cm row spacings. Forage sorghum variety ‘Honey Sweet’ and forage millet variety ‘Golden German’ were planted in early June in all years at 13.5 and 11 kg seeds ha–1, respectively, on 30.5 cm row spacings. Fertilizer was applied at a rate of 45 kg ha–1 N and 22 kg ha–1 phosphorus pentoxide (P2O5) with wheat planting each year. Sorghum and millet plots received 45 kg N ha–1 and 45 kg P2O5 ha–1 with planting each year. Weeds were controlled with herbicides throughout the growing season in both cropped and fallow strips. Crop harvest dates are presented in figure 1. Wheat was harvested with a standard combine grain head, and forage crops were cut, swathed, and baled within two weeks of cutting.
Figure 1. (a) Daily precipitation and mean temperature data; and (b) cumulative decomposition days, calculated as the lesser of the temperature coefficient or moisture coefficient for each day (equations 1 through 4), beginning after wheat harvest in 2013 at Sterling and Stratton, Colorado. Vertical lines correspond to wheat or forage harvest dates at each location (lines overlap in some years), and gray shading corresponds to the time periods when residue data were collected in 2015 and 2016.
Residue measurements were taken between March and October in 2015 and 2016 from all wheat and forage residues. We selected these sampling months to capture the growing season as well as time periods of greatest wind speeds and erosion susceptibility at these sites. The highest 20-year average wind magnitudes at the two sites occurred during the spring (March to May) months (Iowa Environmental Mesonet 2021). We included cropping strips that contained residues from crops harvested in 2014 and 2015. In addition, two strips of wheat residue were included that were harvested in 2013 followed by fallow in 2014 and wheat in 2015. Due to the multiple rotation systems and crop entry points at each site, our approach resulted in measurements of residue dynamics for a total of 23 strips of forage and 17 strips of wheat residues over the 2015 and 2016 growing seasons.
Residue Quantification. To track residue changes over time, flags were placed at two random locations on the summit portion of each study strip in March of 2015. Every month from March through October of both 2015 and 2016, a quadrat measuring 1 m × 0.8 m was aligned with the flag and photographed with a Panasonic DMC-FZ70 16.1-megapixel digital camera mounted on a tripod directly above the quadrat, looking straight down from a height of 1.4 m.
Photographs were overlain with a 4 × 5 square grid (20 squares). Each square was visually assessed for the percentage cover by crop residue, weed residue, live crop, and live weeds. The individual portions of the grid were scored for percentage of surface covered in 5% increments from 0% to 100%, with each residue or live plant type receiving an individual score. All photos for the study were analyzed by one person to ensure uniform application of the method. While there have been advancements in software for digital photograph analysis (Chen et al. 2010; Vanha-Majamaa et al. 2000), we selected manual image analysis due to the subtle color differences and shifting shadows between time points. Manual analysis also allowed for greater certainty in the distinct identification of crops and weeds.
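For concreteness, the sketch below shows how the 20 grid-square scores from one photograph reduce to plot-level percentage cover in R; the scores and column names here are hypothetical.

```r
# A minimal sketch: converting the 20 grid-square scores from one photograph
# into plot-level percentage cover. Scores and column names are hypothetical.
grid_scores <- data.frame(
  square       = 1:20,
  crop_residue = c(80, 75, 90, 60, 55, 70, 85, 65, 50, 75,
                   80, 70, 60, 90, 85, 55, 65, 70, 75, 60),  # 5% increments
  live_weeds   = rep(5, 20)
)

# Each square covers an equal area, so plot-level cover is the simple mean.
colMeans(grid_scores[, c("crop_residue", "live_weeds")])
```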
The average height of standing residue was measured monthly through the two seasons. Only pieces of crop residue at greater than a 10-degree angle from the ground were considered “standing” (Steiner et al. 1994). Within each quadrat, four standing crop residues were selected at random and measured for height. In March and September of 2015 and March of 2016, the number of all standing residue pieces was counted, and the diameters of four stems were measured at about 10 cm above the soil surface.
Tracking declines in crop residue cover in active agricultural systems is complicated by the growth of new crops and weeds. One difficulty of the overhead picture method of assessing residue cover is that surface crop residue can be concealed by live weeds, dead weeds, and growing crops. Analysis of the photo data showed that once live weeds or weed residues exceeded 20% cover in any plot, the measured crop residue surface cover would artificially decline, then increase again once the weeds were terminated. To minimize the effect of other covers obscuring crop residue, data from pictures with more than 20% cover by any other material were excluded from the analysis of residue cover decomposition dynamics. This resulted in the exclusion of 30% of residue cover images: 20% were excluded because residue was obscured by the next live crop and 10% because of live weeds and weed residues. After these exclusions, an average of 7 forage images per site per month for 2014 residues and 11 forage strips per site per month for 2015 residues were included in the analysis. For both 2014 and 2015, 6 strips of wheat residue per site per month were included.
Decomposition Days. To normalize the timescale across sites and years, decomposition days (DD) were calculated as proposed by Schomberg et al. (1996) and Steiner et al. (1999) and as used by USDA ARS (2020) to calculate decomposition in the WEPS model. DD are similar to growing degree days as an approach to standardizing decomposition conditions across time and location as a function of cumulative temperature and moisture. The DD equation relies on average daily air temperature and daily precipitation as factors that influence the microbial activity that governs decomposition. Residue composition is another factor that can influence decomposition rate, but we did not analyze residue quality in this study. Gilmour et al. (1998) found that the C:N composition of residue was a major influence on decomposition rate only during the two weeks following harvest. Out of 520 total pictures used, only 15 were taken within the two-week period following harvest.
Using the DD normalization approach, ideal conditions for decomposition were assumed to occur at 32°C with at least 4 mm of precipitation. These ideal conditions result in one DD, and conditions other than ideal result in a fraction of a DD being added to the cumulative total (Schomberg et al. 1996; Steiner et al. 1999). Daily weather data for 2013 through 2016 were taken from the on-site CoAgMet stations (http://www.coagmet.colostate.edu/index.php). The weather station at Stratton had periodic missing data. In these instances, data from the weather station at Kirk, 35.4 km to the north, were used to supplement missing days.
The DD were calculated as the lesser of a temperature coefficient (TC) and a moisture coefficient (MC), neither of which can be greater than 1. The TC was calculated as equation 1:

$$TC_i = \frac{2T_i^2 T_{opt}^2 - T_i^4}{T_{opt}^4}, \qquad 0 \le TC_i \le 1, \tag{1}$$

where $T_{opt}$ = 32°C and $T_i$ = daily mean temperature on day $i$. A precipitation coefficient (PC) was calculated based on an assumption of a minimum of 4 mm precipitation necessary to wet a layer of residue (equations 2 and 3):

$$PC_i = \frac{P_i}{4} \quad \text{for } P_i < 4 \text{ mm}, \tag{2}$$

$$PC_i = 1 \quad \text{for } P_i \ge 4 \text{ mm}, \tag{3}$$

where $P_i$ = precipitation (mm) on day $i$. The MC was constrained to ≤1 and carries over residual moisture from the previous day (equation 4):

$$MC_i = \min\left(1,\ \max\left(PC_i,\ 0.5\,MC_{i-1}\right)\right), \tag{4}$$

where $MC_i$ = the moisture coefficient for the current day and $MC_{i-1}$ = the moisture coefficient of the previous day. The DD has a limiting factor of either moisture or temperature; thus, the lesser of the MC or TC each day is used.
DD were accumulated following each harvest date until the planting of the next crop. Therefore, the earliest DD calculations began following 2013 crop harvests for cropping strips with fallow in 2014 and no harvested crop until 2015.
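As a compact illustration of equations 1 through 4 and the accumulation step, the R sketch below implements the DD calculation as we read it; the function and argument names are ours, and the half-value moisture carry-over in equation 4 reflects our interpretation of the Steiner et al. (1999) approach.

```r
# Sketch of the decomposition-day (DD) calculation (equations 1 through 4),
# with Topt = 32 C and the 4 mm wetting threshold from the text. The 0.5
# carry-over of the previous day's MC is our reading of equation 4.
decomposition_days <- function(temp_c, precip_mm, t_opt = 32) {
  tc <- (2 * temp_c^2 * t_opt^2 - temp_c^4) / t_opt^4  # equation 1
  tc <- pmin(pmax(tc, 0), 1)                           # constrain 0 <= TC <= 1
  tc[temp_c <= 0] <- 0                                 # no decomposition below freezing
  pc <- ifelse(precip_mm >= 4, 1, precip_mm / 4)       # equations 2 and 3
  mc <- numeric(length(temp_c))
  mc_prev <- 0
  for (i in seq_along(temp_c)) {
    mc[i] <- min(1, max(pc[i], 0.5 * mc_prev))         # equation 4, MC <= 1
    mc_prev <- mc[i]
  }
  cumsum(pmin(tc, mc))  # daily DD = lesser of TC and MC, accumulated over time
}

# Example: five days of hypothetical postharvest weather
decomposition_days(temp_c = c(25, 30, 32, 15, 5), precip_mm = c(0, 6, 0, 0, 2))
```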
Data Analysis. To compare crop type effects on residue cover and DD at the first measurement point, we used general analysis of variance (ANOVA) to test for the main effects and interactions of site, year, and crop type. All ANOVA and mixed effect model analyses were conducted using JMP software v. 8.1 (SAS Institute, Cary, North Carolina). We included crop rotation as a fixed factor in initial models and found that rotation system did not influence residue dynamics (p > 0.5); therefore, we analyzed all wheat residues together. Forage sorghum and forage millet, which were planted and harvested at the same time and with the same equipment, were grouped and analyzed together as forage crops. We compared residue cover by crop (wheat and forage) and mean DD from harvest to initial residue measurement using pairwise t-tests of least square means. Throughout our analyses, p-values less than 0.05 were considered significant when reporting means comparisons.
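For readers working in R, an equivalent factorial specification is sketched below (the study's ANOVAs were run in JMP); the data frame is simulated solely to make the example self-contained.

```r
# Equivalent factorial ANOVA specification in R (the study used JMP); the
# data are simulated only so the example runs as written.
set.seed(1)
dat <- expand.grid(site = c("Sterling", "Stratton"),
                   year = factor(c(2014, 2015)),
                   crop = c("wheat", "forage"),
                   rep  = 1:6)
dat$initial_cover <- ifelse(dat$crop == "wheat", 82, 56) +
  rnorm(nrow(dat), sd = 5)

# Main effects and all interactions of site, year, and crop type
summary(aov(initial_cover ~ site * year * crop, data = dat))
```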
To model the decline of crop residue cover over time in each of the study site strips and to extract the mean DD until residue reached the 30% cover threshold, we created loess curves using the statistical software R (version 3.2.4) and the plyr package. The loess curves allow estimation of the point at which each strip reaches the 30% threshold within the existing data. The loess method does not assume a fixed model and therefore cannot predict outside the data set. If a strip did not start with greater than 30% residue cover after harvest, or if its cover never fell to 30%, it was removed from this portion of the analysis to meet the assumptions of the loess fitting method. While this removed some strips from the calculation, it kept the model within the bounds of what the data demonstrate. The loess curves were created for each crop strip, providing a single estimate of DD to reach 30% cover for each replicate crop strip. These values were then used to test for differences in DD to reach 30% by crop using the ANOVA model described above, where the fixed factors were site, year, crop type, and their interactions.
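A minimal sketch of this per-strip procedure is shown below with simulated data; in the study, the fit was repeated for each replicate crop strip (e.g., via the plyr package).

```r
# Sketch of the per-strip loess approach: fit a local regression of cover on
# DD, then find the first DD at which the fitted curve drops to 30%.
# Data are simulated for illustration.
set.seed(2)
dd    <- seq(0, 60, by = 3)
cover <- 80 * exp(-0.04 * dd) + rnorm(length(dd), sd = 3)
fit   <- loess(cover ~ dd)

dd_grid <- seq(min(dd), max(dd), by = 0.1)  # interpolate only within the data
pred    <- predict(fit, newdata = data.frame(dd = dd_grid))
dd_30   <- dd_grid[which(pred <= 30)[1]]    # first crossing of the 30% threshold
dd_30
```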
Stem counts and stem diameter measurements of each crop at each site were grouped by harvest year and by the length of time after harvest that the measurement was taken. Stem count and stem diameter means and standard errors were calculated by site, year, and crop type. Exponential curves were used to model the decline of wheat and forage crop residue stem heights for each harvest year at each site. Stem heights were analyzed using a repeated measures mixed effects model where strip was a random factor and site, year, crop type, and DD were fixed factors. Due to the nonlinear relationship between stem heights and DD, stem heights were log transformed before analysis.
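The sketch below illustrates a simplified version of this mixed model in R using lme4, with simulated data and with site and year omitted for brevity; the column names are our assumptions.

```r
# Simplified sketch of the repeated-measures mixed model for stem heights:
# strip as a random intercept, log-transformed heights. Data are simulated,
# and site and year are omitted for brevity.
library(lme4)
set.seed(3)
stem_df <- expand.grid(strip = factor(1:16), dd = seq(0, 50, by = 10))
stem_df$crop <- ifelse(as.integer(stem_df$strip) <= 8, "wheat", "forage")
stem_df$height_cm <- exp(4 - 0.03 * stem_df$dd -
                         0.01 * stem_df$dd * (stem_df$crop == "wheat") +
                         rnorm(nrow(stem_df), sd = 0.2))

# The log transform linearizes the exponential decline in height with DD
fit <- lmer(log(height_cm) ~ crop * dd + (1 | strip), data = stem_df)
summary(fit)
```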
To model the decline of crop residue cover, we created model curves for each site, strip, harvest year, and crop combination using the lsmeans, car, and lme4 packages in the statistical software R. We tested exponential decay and quadratic functions and compared R2 and Akaike Information Criterion (AIC) values to determine the best model fit.
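As an illustration of this comparison, the sketch below fits both candidate forms to simulated cover data for one crop and compares AIC; the data and starting values are illustrative only.

```r
# Sketch of the candidate-model comparison: an exponential decay fit via nls()
# against a quadratic fit via lm(), compared by AIC. Data are simulated.
set.seed(4)
dd    <- rep(seq(0, 50, by = 5), each = 3)
cover <- 75 * exp(-0.05 * dd) + rnorm(length(dd), sd = 4)

exp_fit  <- nls(cover ~ b0 * exp(b1 * dd), start = list(b0 = 70, b1 = -0.04))
quad_fit <- lm(cover ~ dd + I(dd^2))

AIC(exp_fit, quad_fit)  # lower AIC indicates the better-fitting form
```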
Results and Discussion
Differences in crop growing seasons as well as weather patterns in each year resulted in different patterns of DD accumulation following harvest for each crop at each site. For example, wheat was harvested in July, when DD were accumulating more quickly, as compared with forage crops harvested in late August and early September (figure 1). Weather conditions also varied over the growth period of the crops in the study. Winter wheat harvested in 2014 received 266 mm of precipitation over its growing season (late September of 2013 to July of 2014), whereas winter wheat harvested in 2015 received 409 mm over its growing season (late September of 2014 to July of 2015).
Initial Amount and Persistence of Crop Residue Cover. Wheat consistently produced more residue cover immediately after harvest. After harvest in both growing seasons, wheat had 1.5 times higher initial residue cover than forage (tables 2 and 3, p < 0.0001). Averaged over both sites and study years, wheat residue covered 82% of the soil surface at the initial measurement after harvest, while forage residue covered an average of 56%.
Table 2. Initial crop residue cover after harvest and decomposition days (DD) at initial measurement by crop type (summer annual forage or winter wheat).
Table 3. Analysis of variance (ANOVA) results of the effects of harvest year, site, crop, and their interactions on the initial crop residue cover data presented in table 2.
Both initial residue cover and differences in harvest timing between these crops influenced the persistence of residue soil coverage. Wheat residue cover persisted almost twofold longer than forage crop residues. After harvest, wheat took 3.8 times more DD than forage to decline from its initial state to 30% residue cover (table 4). Wheat cover endured 63 DD, which corresponded to about 14 to 15 months after the July wheat harvest in the eastern Colorado climate, before falling to the 30% threshold. Forage residue cover remained for 17 DD, which corresponded to approximately 8 to 9 months after September forage harvest, before falling to the 30% threshold.
Table 4. Mean decomposition days (DD) for crop residue cover to fall to 30% following harvest, by harvest year and site, for forage sorghum and millet and for winter wheat, with least squares means comparisons by crop. Different letters represent significant differences within a column (p < 0.05). Standard errors are in parentheses.
These results suggest that, if grown in rotation with another summer crop, forage residue cover is highly likely to fall below 30% before the following year's crop could reach a stage of growth at which its canopy provides adequate surface cover. If the forage crop is instead followed by a summer fallow preceding wheat, as in a common three-year wheat–summer crop–fallow rotation, the soil would be left vulnerable to erosion events during the fallow. In contrast, our data suggest that wheat residue would remain protective of the soil through the next growing season in continuously planted systems. In common wheat–fallow systems, wheat residue cover would fall to less than 30% in approximately 14 months, near the planting of the following wheat crop. Thus, wheat residue would provide sufficient cover during the vulnerable spring wind erosion season of the fallow phase.
However, these are underestimates of residue persistence, particularly for wheat. Over the study period, many wheat strips did not fall below the 30% threshold before the next crop was established, and a few forage strips had initial residue cover of less than 30%. To estimate the length of the decline, the data for each included crop strip must span the descent from above the threshold to below it. Of the 17 wheat strips in the study, only 8 decreased below 30% cover; the other strips remained above the 30% threshold throughout the study or until the following crop canopy was established. Of the 23 forage strips evaluated, 18 were used in the threshold analysis: two strips were excluded because their residue cover began below 30%, and two because their cover never fell below 30%. These differences provide further support that winter wheat residue cover is far more persistent than summer forage residues.
These results are particularly relevant as cropping systems have been intensifying across the region, with decreasing frequency of summer fallow and increased adoption of summer annual crops (Rosenzweig et al. 2018). While these intensified systems have other benefits, such as improved soil aggregation and soil C (Rosenzweig et al. 2018; Sherrod et al. 2003), they may have periods of greater wind erosion susceptibility following summer annual crop phases.
Initial Amount and Persistence of Standing Residue. Stem count means for forage and wheat differed by crop at each site. The number of forage stems remaining after the 2015 harvest declined 20% at Sterling and 25% at Stratton between the 0 to 3 month measurement period and the 6 to 8 month period after harvest (table 5). In the same period, the number of wheat stems had larger percentage declines of 35% and 57% at Sterling and Stratton, respectively, between the 0 to 3 and 6 to 8 month postharvest periods. However, due to the lower starting number of forage stems, no forage stems remained standing by the end of the next year's growing season. Standing wheat stems declined 92% at Sterling and 94% at Stratton between the 6 to 8 month postharvest measurement and the end of the following growing season, but wheat still retained an average of about 28 stems m–2 more than 12 months after harvest.
Table 5. Stem count and standing stem diameter means by months after harvest. Measurements were taken in March of 2015, September of 2015, and March of 2016 for forage and wheat crops harvested in 2014 and 2015. Standard errors are in parentheses.
As expected, forage stem diameters were larger than wheat stem diameters at both sites in the period immediately following the 2015 harvest (table 5). For crops harvested in 2015, forage stem diameters decreased 50% at Sterling and 32% at Stratton between the 0 to 3 month postharvest period and 12 months or more after harvest. Wheat stem diameters from the 2015 harvest declined 37% at Sterling and 27% at Stratton over the same timescale.
Wheat harvest produced taller initial residues compared to forage at both Sterling and Stratton (figure 2). This is not surprising considering the differences in standard cutting heights for wheat and forage crops. Wheat had a more rapid decline in standing stem heights over time, though both wheat and forage stem heights fit an exponential decay model. Combining data across years and sites, the exponential decay model with DD explained 70% and 55% of the variability in wheat and forage stem heights, respectively (figure 2). At Sterling in 2014 and 2015, the regression coefficients of the exponential decline rate of stem height with DD were similar across years and crop types, ranging from R2 = 0.73 to 0.81. The rate of decline did not differ by crop type at Sterling. At Stratton, wheat stem height declines were represented by an exponential curve (R2 = 0.86 and 0.94) better than forage (R2 = 0.50 and 0.57) in 2014 and 2015, respectively, and wheat stem heights declined at a faster rate than forage.
Figure 2. Decline of standing residue heights at the Sterling and Stratton locations for harvest years 2014 and 2015, with fitted exponential decay lines by crop for wheat and forage. Measurements for residues of harvest year 2014 began in March of 2015.
The exponential decline patterns of standing stem height are similar to previous studies of standing stem fractions. Steiner et al. (1994) found that the fraction of initial number of standing stems decreased exponentially across multiple small grains, including winter wheat. Our results suggest that summer crops follow similar patterns, despite different stem diameters and planting densities.
Modeling Crop Residue Cover Decline. The monthly measurements of residue cover from both sites were analyzed together. We hypothesized that residue cover over time would fit an exponential decay model. This hypothesis was supported for forage residue cover at both sites (R2 = 0.64), which showed a rapid initial decline from the postharvest amount and the concave shape characteristic of exponential decay (figure 3). While the exponential decay model was the best fit for forage, it was not a good fit for wheat residues, so an alternative model with a better fit was derived (R2 = 0.70). A quadratic model better represented the endurance of wheat residue cover after harvest at both sites, and its convex shape best described the delayed, slow decline (figure 4).
Figure 3. Exponential decay model for the decline of forage residue cover at Stratton and Sterling. Each point represents the data from one photograph of soil surface cover. Mean model parameters: B0 = 74.3; B1 = –0.18.
Figure 4. Quadratic decay model for the decline of wheat residue cover at Stratton and Sterling. Each point represents the data from one photograph of soil surface cover. Mean model parameters: B0 = 80.12; B1 = 1.87; B2 = 0.04.
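For intuition, the reported mean parameters can be used to trace the two fitted forms. The sign conventions and exact parameterizations in the sketch below are our assumptions, and the per-strip threshold estimates in table 4 were derived from loess fits rather than from these pooled curves.

```r
# Tracing the reported mean model parameters (figures 3 and 4); the functional
# forms and signs below are our assumptions about how B0, B1, and B2 enter.
forage_cover <- function(dd) 74.3 * exp(-0.18 * dd)            # exponential decay
wheat_cover  <- function(dd) 80.12 + 1.87 * dd - 0.04 * dd^2   # quadratic decay

dd <- 0:70
plot(dd, wheat_cover(dd), type = "l", ylim = c(0, 110),
     xlab = "Decomposition days", ylab = "Residue cover (%)")
lines(dd, forage_cover(dd), lty = 2)
legend("topright", c("wheat (quadratic)", "forage (exponential)"), lty = 1:2)
```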
Residue mass at harvest is often used to estimate erosion control potential using exponential decay models. Surface cover decline may not follow the same patterns as residue mass loss for several reasons. At high initial residue biomass levels, residues may decrease in mass for a period of time while retaining surface area, resulting in a lag period before an exponential decay pattern emerges (Steiner et al. 2000). Also, the initial residue can consist of several layers of material, such that layers in contact with the soil surface decay exponentially while the amount of soil covered remains unchanged. Thus, residue mass does not always provide sufficient information to estimate vulnerability to wind erosion.
Methodological Considerations. The use of visual assessment of monthly photographs of surface cover and standing stem heights, while time and labor intensive, provided a consistent method for assessing the decline of residue cover and stem heights. Automated analysis of field photographs is possible, but variations in natural light can be a hindrance (Yu et al. 2017). Vegetative cover in multiple layers and with multiple species can also reduce the accuracy of automated image analysis (Vanha-Majamaa et al. 2000). Visual assessment rather than software-based image analysis of the photographs allowed for distinctions in shadows and shading. Further, first-hand knowledge of the field conditions at each sampling made identification of weeds and crops more accurate, which could have been an obstacle for automated assessment approaches. Current plant image analysis software is not designed to identify residues, which can be difficult to distinguish from soil (Lobet et al. 2013).
Summary and Conclusions
Wind erosion remains an issue affecting the arid and semiarid regions of the world, causing irreversible soil degradation. An understanding of the manner and timescale in which crop residue cover declines can provide practical knowledge for erosion control management in such areas. The hypothesis that soil residue cover would be higher and persist longer for winter wheat than summer annual crops was supported in this study, with wheat providing greater cover immediately after harvest and for a longer duration. The hypothesis that residue cover would fit an exponential decay model was supported for forage sorghum and forage millet, but not for wheat over the course of this study.
Our results indicate that the most vulnerable point in intensified rotations is likely the shortened fallow period after a summer crop and preceding the next winter wheat crop. More multiyear studies of the decline of residue cover, both while new crops are growing and during fallow periods, are needed. Given the postharvest amount of wheat residue cover and its persistence over time, wheat-based rotations may provide more enduring wind erosion protection for the soil than rotations based on summer annual crops, such as corn or forages, especially on soils highly prone to wind erosion.
- Received January 5, 2021.
- Revision received September 2, 2021.
- Accepted October 13, 2021.
- © 2022 by the Soil and Water Conservation Society