1 Introduction

The timing of spring governs the activity of ecosystems and agriculture (e.g., Lieth 1974; Schwartz et al. 2006; Parmesan 2006; IPCC 2007; Schwartz et al. 2012), as well as the dynamics of the physical climate system (e.g., Schwartz 1992; Pielke 2001) across much of the Northern Hemisphere. Recent work has demonstrated that variations in the timing of spring occur at interannual to multidecadal timescales, are coherent across large spatial scales, and can outpace long-term trends from climate change (Ault et al. 2011; McCabe et al. 2011; Schwartz et al. 2013; Ault et al. 2015a).

Changes in the timing of spring due to climate change are likely to alter ecosystem function and agricultural activities. Such changes will present new opportunities and challenges to natural resource managers, growers, and farmers. For example, a longer growing season at higher latitudes could mean that new varieties of crops could be planted in those areas. The year 2012 presents a salient example of some potential consequences of an earlier spring. Unusually warm temperatures early in the spring of 2012 led to the warmest March for which instrumental data are available, breaking records at over 15,000 sites across the continental United States (Blunden and Arndt 2013). The warmest temperature anomalies occurred across the Great Lakes. Locations such as Chicago, IL recorded nine consecutive days exceeding or tying high temperature records, with an overall \(+8.7\,^{\circ }\hbox {C}\) mean temperature departure for the entire month (US Department of Commerce, NOAA 2012). The summer-like weather brought many agricultural crops out of dormancy, including fruit trees and other plants (Knudson 2012; Ellwood et al. 2013). Meteorological indices and remote sensing products characterized March 2012 as the earliest spring since at least 1900 (Ault et al. 2013). However, having been brought out of winter dormancy prematurely by this “false spring,” many crops faced severe risks of frost damage when temperatures returned to their climatological values in April. Economic losses occurred across a large expanse of the Midwest and Northeast in response to the early season warmth. The state of Michigan, for example, reported nearly $500 million in agricultural damages (Knudson 2012).

Phenological events have been highlighted as a potentially useful integrator of climate information with high societal relevance (Schwartz et al. 2012; Ault et al. 2013; IPCC 2007; Rosemartin et al. 2015), but they are seldom available across large spatial scales or over long periods of time. Data from remote sensing provide extensive spatial coverage of land surface phenology, but are limited by the short length of the satellite record. In situ observations are available from a few select sites and in some cases cover many decades, but are not spatially complete or homogeneous (White et al. 2009; Cook et al. 2012; Zurita-Milla et al. 2013). Characterizing the risk of events like the false spring of 2012 into the future requires a definition of “spring onset” that is consistent from one year to the next, and from one location to another. A number of weather-based indices of the start of the growing season have been introduced and used to study continental and global scale environmental change (e.g., Jolly et al. 2005; Schwartz et al. 2006). One such product, the “extended spring indices” (Schwartz et al. 2013; Ault et al. 2015b), has become particularly widely used in this regard, with recent studies employing it to understand sources of potential predictability on interannual to decadal timescales (McCabe and Ma 2006), long-term historical trends (Schwartz et al. 2006, 2013; McCabe and Ma 2006; McCabe et al. 2013; Ault et al. 2015a), and future climate model projections (Allstadt et al. 2015). These indices have also recently been adopted as part of the national indicators efforts to understand and monitor climate change in North America (http://www.globalchange.gov/sites/globalchange/files/Pilot-Indicator-System-Report_final).

A deeper understanding of the competing roles of natural climatic variability and forced change is required to make meaningful interpretations about how spring transitions will play out in coming decades. With the Coupled Model Intercomparison Project phase 5 (CMIP5) archive, for instance, the limited number of ensemble members available per model makes it difficult to distinguish spread arising from model-dependent forced responses from spread produced by internal climatic fluctuations within the individual simulations. We therefore focus the present study on a recently developed single-model “large ensemble” (LENS) experiment, allowing us to rigorously investigate the competing roles of natural variability and climate change in modulating the likelihood of events like the spring of 2012 in unforced, historical, and anthropogenic warming scenarios.

The analysis and results in this study are organized as follows. Section 3.1 describes the observational spring onset record. Section 3.2 describes the LENS control run with pre-industrial forcings. Sections 3.3 and 3.4 describe spring onset under the LENS historical and future simulations, respectively. We focus our analysis on the Great Lakes region because we use March 2012 as an example of an anomalously early spring in the historical record.

2 Data and methods

2.1 Data

The extended spring indices (SI-x) were computed from both observational data and climate model output. The underlying assumptions, code, and algorithm are documented elsewhere (Ault et al. 2015b), but briefly, the SI-x are derived by averaging outputs from phenological models for three different plant species using only \(T_{min}\), \(T_{max}\), and latitude as input. For a given location and year, the SI-x provide a day of year (DOY) estimate for the timing of “first leaf,” “first bloom,” and “last freeze.” Because the same algorithm is used across space and through time to model these events, the SI-x present a consistent metric of the timing of spring across a wide range of climate zones, even under future changes. Moreover, a number of other recent studies have linked these dates to both observer-based phenological changes (Schwartz et al. 1997, 2006, 2013) and large-scale fluctuations in the timing of spring green-up observable from space (e.g., White et al. 2009). Phenological coefficients have been optimized from the dates of lilac and honeysuckle leaf-out and bloom (Schwartz 1993) and provide a proxy for spring phenology due to their close similarity to other plant species (Schwartz et al. 2006, 2013).

Three different experiments performed as part of the LENS project (Kay et al. 2014) by the National Center for Atmospheric Research (NCAR) using the Community Earth System Model (CESM) were also used to calculate SI-x (Table 1). These included: (1) a 1500-year control simulation with no externally varying forcings; (2) 30 historical simulations from 1920–2005; and (3) 30 climate change simulations from 2006–2080 following the representative concentration pathway 8.5 (RCP8.5) scenario (Moss et al. 2010). As described in Kay et al. (2014), the fully coupled CESM control run was spun up assuming 1850 land, ocean, and atmosphere boundary conditions. A single ensemble member was initialized from the pre-industrial conditions of an arbitrary date in the control (January 1, 402) and run through January 1, 1920; thereafter, thirty ensemble members were branched off using identical initial conditions except for a small round-off perturbation to air temperature, which generates the ensemble spread (Kay et al. 2014). These members were run through 2005 under the historical forcings of the post-industrial era. From 2006 to 2100, each historical ensemble member was forced using the Representative Concentration Pathway 8.5 (RCP8.5) (Meinshausen et al. 2011) to represent modeled twenty-first century warming, with average global surface temperature increases of around \(5\,^{\circ }\hbox {C}\) in the ensembles by the end of the simulation (Kay et al. 2014).

The Pacific Decadal Oscillation (PDO), a global climate mode, was examined in the historical and future LENS experiments to assess the influence of natural climate variability. PDO indices are computed from sea surface temperature variability in the Pacific Ocean north of \(20^{\circ }\hbox {N}\) (Mantua et al. 1997; Mantua and Hare 2002). These indices were obtained from the Climate Variability Diagnostics Package (CVDP) (Phillips et al. 2014) for each run of the CESM LENS project, including the control and future simulations. Although other indices have similar spatial patterns and correlations with the spring indices [e.g., ENSO and the PNA (Ault et al. 2011)], we focus on the PDO because its long timescale of variability could allow it to mask short-term (30 year) trends. Indeed, this appears to have been the case for at least part of the twentieth century (Ault et al. 2015a).

2.2 Methods

For the purposes of this study, the “first leaf” index was selected as the primary metric to characterize the onset of spring. The “last freeze” index is defined as the final day of any given year when \(T_{min}\) values are less than \(-2.2\,^{\circ }\hbox {C}\) (Schwartz 1993; Schwartz et al. 2006; Ault et al. 2015b). This temperature threshold has been used in a variety of ecological studies to determine the date of the last potentially damaging frost (Schwartz et al. 2006; Marino et al. 2011; Peterson and Abatzoglou 2014). To resolve the possible ecological risks from an early spring, the “damage index” was calculated, which tracks the anomalous amount of time between the first leaf and last freeze indices. In this method, negative values indicate a higher risk of frost damage because an anomalously long period of time will have passed between when a plant starts growing and when the final freeze event occurs (Schwartz 1993; Schwartz et al. 2006). The timing of last freeze is important for spring phenology; however, we have only preliminarily analyzed changes in future last freezes, largely because last freeze dates are noisier and more stochastic and exhibit large regional variation (Schwartz 1993; Schwartz et al. 2006; Allstadt et al. 2015).
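The damage index described above can be sketched as the anomaly of the interval between first leaf and last freeze. A minimal illustration follows; the function name, input arrays, and sign convention (first leaf minus last freeze, so earlier-than-usual leaf-out relative to the final freeze yields negative values) are our own reading of the description, not the authors' code:

```python
import numpy as np

def damage_index(first_leaf, last_freeze):
    """Anomalous gap (in days) between first leaf and last freeze dates.

    Negative values flag years with an anomalously long window between
    leaf-out and the final freeze, i.e., elevated frost-damage risk.
    """
    first_leaf = np.asarray(first_leaf, dtype=float)
    last_freeze = np.asarray(last_freeze, dtype=float)
    gap = first_leaf - last_freeze   # more negative when freeze lags leaf-out
    return gap - gap.mean()          # anomaly relative to the series mean
```

By construction the anomalies sum to zero over the record, so a year like 2012, with leaf-out far ahead of the final freeze, stands out as a strongly negative value.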

Mean onset dates and trends in SI-x were computed from observational \(T_{min}\)/\(T_{max}\) gridded data for comparison with climate model experiments. As in Ault et al. (2015a), this temperature data originates from the new \(1^{\circ }\times 1^{\circ }\) Berkeley Earth Surface Temperature (BEST) project (Mueller 2013). The underlying \(T_{min}\)/\(T_{max}\) product is derived from the Global Historical Climate Network (GHCN) and other observational temperature records. Although most of our analysis encompasses the coterminous US, we focus on the region that was most severely affected by the March 2012 spring anomalies (\(37.5^{\circ }\hbox {N}\)–\(50.5^{\circ }\hbox {N}\) and \(101.5^{\circ }\hbox {W}\)–\(75.5^{\circ }\hbox {W}\)) (Dole et al. 2014). This area includes the upper Great Lakes, particularly portions of Wisconsin, Minnesota, and the upper peninsula of Michigan. Thirty-year averages for SI-x (Ault et al. 2015a) were calculated from the BEST dataset over the period 1981–2010 for comparison with the LENS experiments. Trends in spring onset, reported in days/decade, were calculated over the entire North American domain as well as for this particular region. We focused on short-term trends from 1981–2010 because of their relevance to phenological observations made from remote sensing over the satellite era, and on longer-term trends (1920–2013) to characterize long-term changes.
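The trends reported in days/decade amount to a least-squares linear fit of an annual day-of-year series against calendar year, scaled by ten. A minimal sketch (the function name and sample series are hypothetical, not the study's code):

```python
import numpy as np

def trend_days_per_decade(years, doy):
    """Least-squares linear trend of an annual day-of-year series,
    expressed in days per decade (negative = advancing / earlier onset)."""
    slope_per_year = np.polyfit(years, doy, 1)[0]
    return slope_per_year * 10.0

# Example: a series advancing 0.1 day/year yields a -1 day/decade trend.
years = np.arange(1981, 2011)
first_leaf_doy = 100.0 - 0.1 * (years - 1981)
trend = trend_days_per_decade(years, first_leaf_doy)
```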

Two different approaches were used to characterize similarities between March 2012 and springs in the LENS experiments. Pattern correlations were used to identify “2012-like” years in the LENS control and members. Specifically, we regridded the BEST data to the CESM grid, then correlated the 2012 anomalies with the SI-x anomaly years in the control. The nine earliest springs over the central United States were highlighted for further analysis. We also identified springs in the CESM control with similar relative magnitude to 2012 by averaging the first leaf index values over the focus region, then normalizing this time series. That is, the average first leaf index time series saw a −3 standard deviation anomaly during 2012 over the Great Lakes. We therefore take “−3” as a threshold for identifying an extremely early spring in the time series of LENS experiments averaged over the same region. With stationary, Gaussian statistics, years with spring onset at or below this threshold would only have a \(0.135\,\%\) likelihood of occurring. If we had multiple realizations of 1000 year control runs we might therefore expect to see, on average, one or two such events per run. Therefore, we extend the terminology of “March 2012-like” springs to indicate our threshold for anomalously early leaf-out years in the CESM ensemble simulations.
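The quoted \(0.135\,\%\) likelihood is the one-sided tail probability of a standard normal variate falling at or below \(-3\sigma\); it can be checked with the complementary error function (a sketch, using only the standard library):

```python
import math

def gaussian_tail_below(z):
    """One-sided probability that a standard normal variate falls below z."""
    return 0.5 * math.erfc(-z / math.sqrt(2.0))

p = gaussian_tail_below(-3.0)   # ~0.00135, i.e. the 0.135 % quoted above
```

Multiplying this probability by the length of a control run gives the expected number of threshold-crossing years; for a 1000-year run that is about 1.35 events, consistent with the "one or two such events per run" expectation stated above.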

For comparing relative anomalies in the Great Lakes region from LENS future experiments to the historical observations (and runs), we construct modified z-scores for the first leaf index. These are developed by first estimating the control mean (\(\mu _{control}\)) and standard deviation (\(\sigma _{control}\)) for the averages from the study region. Modified z-scores for the Great Lakes region are then calculated from each ensemble member in the historical and RCP8.5 experiments by subtracting this reference mean (\(\mu _{control}\)) and dividing by the reference standard deviation (\(\sigma _{control}\)). That is, for a given ensemble member of a given experiment (\(x_{LENS}\)), the modified z-score is defined as:

$$\begin{aligned} z_{mod} = \frac{x_{LENS}-\mu _{control}}{\sigma _{control}}. \end{aligned}$$
(2.1)
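Equation (2.1) can be applied to each ensemble member's regional-average series as follows. This is a sketch with hypothetical array names; note we use the sample standard deviation, a detail the text does not specify:

```python
import numpy as np

def modified_z(x_lens, control_series):
    """Modified z-score of Eq. (2.1): anomalies of an ensemble-member
    series relative to the control-run mean and standard deviation."""
    mu_control = np.mean(control_series)
    sigma_control = np.std(control_series, ddof=1)  # sample std (assumption)
    return (np.asarray(x_lens, dtype=float) - mu_control) / sigma_control
```

Because every historical and RCP8.5 member is standardized against the same control statistics, a value of \(-3\) in any member is directly comparable to the March 2012 threshold.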

Using these modified z-scores, early spring onset frequency was analyzed through the LENS historical (1920–2005) and future (2006–2080) periods by identifying years with values below the −2 and −3 standard deviation thresholds. Pearson correlations were computed between each LENS member’s first leaf and the overall ensemble mean for the select time periods to understand the variability in spring onset for each run.

To investigate the advancement of spring onset in the CESM control, short-term (30-year) and long-term (100-year) trends were computed for first leaf and last freeze beginning with model year 402. Additionally, for the historical and future LENS scenarios we computed the trends in first leaf and last freeze over short (1970–2005, 2006–2040) and long (1920–2005, 2006–2080) time horizons. Damage index values were calculated for all control and LENS members to account for potentially damaging freezes during early spring years. A longer time period between first leaf and last freeze is expected to increase the damage threat for plants (Schwartz et al. 2006; Marino et al. 2011; Peterson and Abatzoglou 2014).

Lastly, gridded first leaf values in the LENS scenarios were correlated with the PDO index time series to understand the influence of PDO variability on decadal to multidecadal trends in spring onset. Previous studies (e.g., Ault et al. 2011, 2015a; McCabe et al. 2013) have identified the PDO as an important mechanism for explaining features of the historical record such as the “warming hole” in the Southeast (Ga et al. 2012) and uneven short-term trends in spring onset. We therefore focus on the PDO as a potential climate driver modulating early springs over the Great Lakes region in the LENS historical and future simulations.
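Correlating a gridded field with a single climate index amounts to an independent Pearson correlation over time at each grid point. A minimal vectorized sketch (array shapes and names are hypothetical; the actual study presumably operates on the CESM grid):

```python
import numpy as np

def gridpoint_correlation(field, index):
    """Pearson correlation between a (time, lat, lon) field and a
    (time,) climate index, computed independently at each grid point."""
    f = field - field.mean(axis=0)            # center in time
    i = index - index.mean()
    cov = (f * i[:, None, None]).mean(axis=0) # temporal covariance per point
    return cov / (f.std(axis=0) * i.std())
```

A map of these coefficients (cf. Fig. 12) then shows where, for example, the positive PDO phase is associated with earlier leaf-out (negative correlations).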

3 Results

3.1 BEST observational record

Average SI-x values were calculated from the BEST gridded data for the Great Lakes over the 1981–2010 reference period. Indices are generally elevation and latitude dependent across the United States as a result of mean climatic conditions (Ault et al. 2015a). Trends in first leaf from 1920–2013 are approximately 0.5 days/decade toward earlier onset across the Great Lakes; this trend increases to nearly 1 day/decade over the 1981–2010 reference period. For last freeze dates, the mean trend is approximately 0.3 days/decade earlier from 1920–2013, increasing to nearly 2 days/decade earlier over the shorter 1981–2010 horizon. These averaged trends and SI-x for the Great Lakes are plotted in Fig. 1.

Fig. 1
figure 1

The first leaf and last freeze indices from 1920–2013 are calculated from the SI-x gridded indices in the BEST data set (Mueller 2013) over the Great Lakes region. A simple least-squares regression is additionally plotted to show the trend toward earlier than normal spring onset over the post-industrialization era

March 2012 first leaf index anomalies are largest across the Great Lakes and upper Midwest, with values nearly 40 days earlier than normal in this region (Fig. 2). Z-scores for first leaf in March 2012 are approximately \(-3\sigma\). However, earlier than normal first leaf dates were not uniform over North America as areas across the Pacific northwest averaged later than usual.

Fig. 2
figure 2

(Top) The average first leaf index composite map is plotted using the BEST gridded data set (Mueller 2013) over the 1981–2010 climatological period. (Bottom) The 2012 first leaf index anomalies are calculated as the difference from the climatological mean in units of days, with negative values denoting earlier onset. The outlined black box denotes the restricted geographic domain used in further calculations (\(37.5^{\circ }\hbox {N}\)–\(50.5^{\circ }\hbox {N}\) and \(101.5^{\circ }\hbox {W}\)–\(75.5^{\circ }\hbox {W}\))

NCEP-DOE Reanalysis II data (Kanamitsu et al. 2002) for March 2012 (Fig. 3) hint at the dynamics linked to the early season warmth, showing the average 500 mb geopotential heights and zonal mean winds for the three days preceding the average leaf out (\(\hbox {DOY} = 86\)) across the Great Lakes and Midwest. A deep trough axis is positioned along the West Coast with an amplified ridge across the center of the continental United States, allowing for an anomalous poleward transfer of heat (Dole et al. 2014).

Fig. 3
figure 3

Mean leaf out for March 2012 across the central United States was approximately March 26 (\(\hbox {DOY}=86\)). Average 200 mb geopotential heights and the zonal mean wind are plotted for the three preceding days. (Top) Vectors are plotted for u/v wind components and filled contours represent the zonal mean wind speed. (Bottom) Contours indicate geopotential heights in meters. Maps are derived from the NCEP-DOE Reanalysis II project (Kanamitsu et al. 2002)

3.2 CESM pre-industrial control

First leaf dates for the Great Lakes region time series in the CESM LENS control run are shown in Fig. 4. Although there is no significant long-term trend spanning the entirety of this simulation (consistent with an unforced climate), short-term 30 year trends for a randomly selected century display spatial variations in first leaf and last freeze across the United States. Somewhat longer-term trends (100 years) indicate little to no spatial variability in spring onset or last freeze (Fig. 4).

Fig. 4
figure 4

Thirty-year trends in the first leaf and last freeze (LSTFRZ) indices are calculated from the CESM control simulation for three periods within years 500–599. A long-term trend is also calculated for each index over the entire century. Least-squares regressions are calculated in days/decade

There are several March 2012-like early springs (Fig. 5) in the control run. Two instances of first leaf index dates below the 2012 threshold (\(-3\sigma\)) were found in the control time series (Fig. 6). Both of these years produced springs occurring nearly 40 days earlier than normal across the central United States, while an area of later than normal leaf out was found across the Southwest. Most of the CESM control’s closest corresponding patterns (i.e., best spatial correlations with 2012) contain this apparent dipole pattern, which has also been documented in observational analyses of trends in spring onset (Schwartz et al. 2013; Ault et al. 2015a).

Fig. 5
figure 5

The nine closest March 2012-like matches are plotted from the CESM control run for years 402–999. Control years are listed per spatial grid. Anomalies are calculated from leaf index climatologies calculated over the entire control simulation. Negative values again indicate earlier than average spring onset

Fig. 6
figure 6

Time series of first leaf indices computed over CESM control (years 402–999) against the normalized 2012 threshold of minus three standard deviations. Units again assume the day of the year beginning with January 1. Indices are averaged over the defined Great Lakes region

Further analysis of these two cases suggests a series of high amplitude waves propagated eastward across the Pacific Ocean into the contiguous United States during the two weeks prior to the leaf out period. A large but diminishing Aleutian low was evident south of Alaska, with an anticyclone across southern Canada and the northeastern United States. This pattern is similar to those (Fig. 3) identified as responsible for the March 2012 early season warmth (Dole et al. 2014). All nine CESM matches contained significantly positive temperature departures across the central and eastern United States in association with strong warm air advection, with a spatial pattern similar to March 2012.

3.3 LENS historical 1920–2005

Long-term trends over the 1920–2005 period indicate spring advancement of approximately one to three days per decade across most of the United States, particularly in the Great Lakes (Fig. 7). However, from 1970–2005 individual LENS members express greater variability in leaf onset trends, with run-to-run disagreement in spatial pattern. The overall ensemble mean is generally consistent with BEST observational trends toward earlier onset dates across the Great Lakes, with a slight trend toward later dates across the Southeast (Fig. 8). The mean first leaf index trend across all ensemble members is approximately 0.23 days/decade earlier, though some member-to-member variability is evident in the regional trends themselves (Figs. 7, 8). Last freeze trends (Fig. 9) over the same time horizons exhibit high run-to-run variability.

Fig. 7
figure 7

First leaf index trends are calculated from 1920–2005 from the LENS project (Kay et al. 2014). The average first leaf index trends (1880–2013) from the BEST data set (Mueller 2013) are shown in the top left subplot for comparison. Least-squares regressions are calculated in days/decade

Fig. 8
figure 8

Mean ensemble first leaf and last freeze (LSTFRZ) index trends are calculated from 1920–2005 from the LENS project (Kay et al. 2014). Least-squares regressions are calculated in days/decade

Fig. 9
figure 9

Last freeze index (LSTFRZ) trends are calculated over 1920–2005 from the LENS project (Kay et al. 2014). The average last freeze index trends (1880–2013) from the BEST data set (Mueller 2013) are shown in the top left subplot for comparison. Least-squares regressions are calculated in days/decade

Histograms were produced for the −2 and −3 standard deviation thresholds for each ensemble member (Fig. 10). Results are mostly consistent with the pre-industrial control run. We see no increased frequency of earlier than normal springs across the central United States through the simulated historical period, nor any similar spatial patterns from one run to the next.

Fig. 10
figure 10

Frequency of March 2012-like early springs (per century) are plotted in the CESM control (a) and historical (b) and future LENS members 2–30 (c). Thresholds for early springs assume a −2 and −3 standard deviation. Z-scores are normalized for the historical and future ensemble members. The dates of early springs are identified from the SI-x first leaf index

Upper-air analyses of the −3\(\sigma\) years are characterized by a high amplitude long wave pattern with a ridge in the west and a trough in the central or eastern United States. Peak warmth periods in the Midwest are consistent with a trough axis along the eastern Pacific and an anomalous jet streak out of the southwestern United States (Fig. 11). In these events, heat is often advected strongly poleward through south-central Canada.

Fig. 11
figure 11

A mean 500 mb zonal height anomaly was calculated for 3 days prior to the earliest first leaf in each LENS member. The top left subplot displays the mean 500 mb zonal height anomaly for three days prior to March 2012’s first leaf. First leaf dates are defined by the mean leaf out across the Great Lakes. Heights were used from the NCEP-DOE Reanalysis II project (Kanamitsu et al. 2002) with 500 mb height climatologies calculated from 1981 to 2010

The Pacific Decadal Oscillation and first leaf correlations are strongly negative across the Pacific Northwest, while slightly positive correlations are noted across our geographic domain (Fig. 12). Negative correlations are indicative of the positive phase of the PDO favoring early springs (McCabe et al. 2013). These results are consistent with earlier studies on the correlations between observational SI-x data and the PDO (Ault et al. 2015a).

Fig. 12
figure 12

LENS (1920–2005) correlation coefficients between −1 and 1 are plotted for the PDO time series (Phillips et al. 2014) and first leaf index

3.4 LENS future 2006–2080

Analyzing spring onset from the RCP8.5 future LENS members, we see a significant advancement in leaf out with a mean ensemble trend occurring approximately 2.6 days/decade earlier (Table 2). Over 2006–2080 spring arrives approximately 2–4 days/decade earlier across the central United States with close agreement among all ensemble members (Fig. 13).

Fig. 13
figure 13

First leaf index trends are calculated over 2006–2080 from the LENS project (Kay et al. 2014). Least-squares regressions are calculated in days/decade. Note the color bar bounds have been extended from the previous figures (e.g., 7, 8, 9)

Last freeze trends are not as spatially uniform across the contiguous United States. A longitudinal region of little to no advancement in last freeze dates exists across the central United States into the Mississippi River Valley (Fig. 14). Again, comparisons among individual ensemble members and the overall mean indicate high correlation and close agreement. Trends over the 2006–2040 period also indicate a mean advancement of first leaf and first bloom; however, there is larger spatial disagreement among ensemble members. We also observed a similar longitudinal pattern in last freeze trends across central portions of the United States. Because first leaf trends outpace those of last freeze, this unequal advancement produces an increase in the damage index (Fig. 15).

Fig. 14
figure 14

Last freeze index (LSTFRZ) trends are calculated from 2006 to 2080 from the LENS project (Kay et al. 2014). Least-squares regressions are again calculated in days/decade

Fig. 15
figure 15

Damage index trends are calculated from 2006 to 2080 from the LENS project (Kay et al. 2014). Least-squares regressions are again calculated in days/decade

Results further support a large increase in both the frequency and magnitude of early spring onset across this region. Strong agreement among ensemble members supports a return period for March 2012-like (\(-3\sigma\)) leaf out events of roughly 20.8 years per run (2006–2080). We even see multiple occurrences of −4 and −5 standard deviation springs in individual runs, with the earliest leaf date across all members at approximately day 66 (March 7th). Most ensemble members have a minimum of around day 72 (March 13th), which is considerably earlier than March 2012 or any spring in the control and historical ensembles.

Analysis of the PDO time series for the future LENS (available through the CVDP at http://webext.cgd.ucar.edu/Multi-Case/CVDP_repository/cesm1.lens.2015-2100/) indicates a substantial negative bias in the future PDO under climate warming. Recent studies (e.g., Zhang and Delworth 2016) have also suggested a change in the amplitude and variability of the PDO in response to future climate warming. As a result, future LENS first leaf correlations and the PDO show negative correlations across all of North America (not shown).

4 Discussion

Examining historical spring onset in both the BEST observational record and the LENS experiments indicates that spring onset is modulated by internal climate variability and by regional short-term and long-term trends (Ault et al. 2015a). Over some periods the historical LENS leaf trends closely reproduce the magnitude of the observations, but differ in spatial pattern variability. However, we do find a slightly more negative overall trend in the observations, suggesting that the ensemble members may be too insensitive to greenhouse gas forcing. It is also possible that regridding techniques (Oyler et al. 2014) in the higher elevations of the western United States are responsible for a bias toward later spring onset (Ault et al. 2015a). Despite this potential bias in the ensemble (or perhaps the observations), the model responds to both forced and natural climate variability in the short and long term and therefore acts as an important proxy for multidecadal climate variability in the twenty-first century.

The frequency of March 2012-like springs in the CESM control is similar to the observational record; however, many of the historical LENS members fail to produce any early springs as anomalous as that of 2012. Historically, SI-x from observational records indicate that anomalously early springs, such as March 2012, are likely enhanced by high amplitude long waves (Ault et al. 2013), with the role of climate change being weaker than interannual variability. Early springs through the modeled and historical record indicate that March 2012-like springs are often associated with a high amplitude ridge across the central United States and a trough axis along the West Coast. This setup is consistent with a strong poleward transport of heat as warm air intrudes to northern latitudes.

By midcentury, the LENS future members show an increased frequency of March 2012-like springs, to nearly one in every three years. Overall trends in the advancement of spring onset double relative to the observational (Fig. 1) and modeled historical records (Fig. 10). Across the Great Lakes, we see first leaf-out arriving in early March by the end of the century, in contrast to the previously observed beginning of April (Fig. 16).

Fig. 16
figure 16

Mean first leaf (bottom line) and last freeze (top line) indices are plotted for the CESM control and the historical and future ensembles in the Great Lakes region. Individual LENS historical and future runs are plotted in gray for first leaf to indicate member spread. Time series values are recorded with respect to the day of the year beginning January 1st (\(\hbox {DOY}=1\)). The horizontal dashed line indicates the \(-3\sigma \) threshold for March 2012-like springs, while the vertical dashed line indicates the year the mean spring onset surpasses this threshold

Last freeze trends in the control and historical ensembles are also consistent with historical analyses of twentieth century station data across the Great Lakes, with high year-to-year variability of nearly −8 to +13 days for last freeze dates (Schwartz 1993). However, the overall mean trends are slightly more positive than the trend in the BEST gridded data set (Fig. 1). As expected, the future LENS members also display a gradual trend toward earlier last freeze dates across much of the United States. This warming trend is not uniform, however: an axis of little to no decadal trend in last freeze dates stretches across the central United States and lower Midwest through both the 2006–2040 and the full 2006–2080 periods. These results are also consistent with other studies using CMIP5 to determine changes in false springs through the twenty-first century (Allstadt et al. 2015). While leaf and bloom appearances may extend the growing season, frosts and freezes occurring near climatological averages may now increase the damage to plants and agriculture across this area (Allstadt et al. 2015). The risk of false springs thus remains present or even increases across portions of the southern Great Lakes and Midwest, where the damage index marginally increases as a result of little to no change in the timing of last freezes. False springs create the risk of significant ecological changes for plants and other species adapting to the transition period between winter and spring (Schwartz et al. 2006; Marino et al. 2011; Peterson and Abatzoglou 2014).

Comparing the first leaf correlation coefficients between the ensemble mean and individual members indicates the highest correlations for the LENS future period of 2006–2080, while the lowest correlations and greatest range are found over 1970–2005 (Fig. 17). This suggests that internal climate variability plays a greater role in early spring onset during the historical period. By the middle to end of the twenty-first century, however, the influence of such dynamics decreases, so that the trend toward earlier spring onset is likely a result of overall surface warming. Temperature anomalies during the earliest spring years across the central United States reach nearly 5–10\(\,^{\circ }\hbox {C}\) above normal, relative to 30-year climatologies calculated from the historical LENS mean. Correlation coefficients between LENS members and the ensemble mean for last freeze are similar to those in Fig. 17 and are likewise highest for the 2006–2040 and 2006–2080 periods.

Fig. 17

Correlation coefficients are calculated between each LENS member and the overall ensemble mean for the first leaf index across all of North America
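The member-versus-ensemble-mean correlations of this kind can be computed along the following lines. This is a minimal sketch with synthetic data, assuming each member's first leaf series is correlated against the mean over all members (whether a member is excluded from the mean it is compared to is an implementation detail not specified in the text); the ensemble here is fabricated for illustration.

```python
import numpy as np

def member_mean_correlations(ensemble):
    """Pearson correlation of each member's first leaf series with the
    ensemble-mean series. ensemble: array of shape (n_members, n_years)."""
    mean_series = ensemble.mean(axis=0)
    return np.array([np.corrcoef(member, mean_series)[0, 1]
                     for member in ensemble])

# Synthetic ensemble: a common forced trend plus member-specific noise;
# stronger forcing relative to noise yields higher correlations
rng = np.random.default_rng(0)
trend = -0.26 * np.arange(75)                   # forced advance, in days
ensemble = trend + rng.normal(0, 3, (30, 75))   # 30 members, 75 years
r = member_mean_correlations(ensemble)
```

In this framing, the rising correlations toward 2080 reflect the forced trend increasingly dominating each member's series, while the wide 1970–2005 spread reflects internal variability decorrelating members from the mean.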

5 Conclusions

Understanding changes in the timing of seasonal transitions is critical for a wide range of natural resource management strategies to cope with climate change as it unfolds this century. However, for such strategies to be successful, it is important to disentangle the competing roles of internal climate variability and long-term forced change on regional to continental scales. In our analysis, we have examined the competing roles these two types of influence play in creating 2012-like years in the continental US and Canada, using a new “large ensemble” of climate change simulations (Kay et al. 2014) and an index of spring onset based on plant phenology, the extended spring indices model (Schwartz 1985; Schwartz et al. 2006, 2013; Ault et al. 2015b). Overall, our findings suggest that CESM realistically simulates the historical range of extremely early spring onset, as well as the influence of at least one important large-scale mode of variability (the PDO) on spring onset. While natural climate variability has been the dominant control on spring onset in the past, our results argue that this could change considerably by mid-century if greenhouse gas emissions continue unchecked. In fact, 2012 could become the new norm, not the anomaly it once was. Using a single case study as a benchmark allows us to better understand changes in SI-x in a global climate model, and the relative contributions of natural variability and forced change to making March 2012-like events more common in the future.

The main conclusions of our analysis are:

  1.

    The range of spring onset trends in the CESM control and historical experiments is similar to that in the BEST observations, suggesting that natural variability has been the dominant factor in pacing regional patterns of spring onset. This is further supported by the BEST historical record, where long-term trends in spring onset are modulated by shorter-term regional differences in leaf out and last freezes. On average, however, a small advance on the order of one day per decade is seen across the United States. This compares well with the estimates of Ault et al. (2015a) and Schwartz et al. (2006).

  2.

    CESM realistically simulates observed correlations between the PDO and spring onset, and the PDO in turn plays an important role in modulating spring onset in the historical simulations. The influence of the PDO on spring onset has been widely explored (e.g., Ault et al. 2015a; McCabe et al. 2013), particularly across the Pacific Northwest, where negative PDO regimes are often responsible for later springs (McCabe et al. 2013). This is consistent with later SI-x dates at western locations and correspondingly earlier dates in eastern areas, patterns exhibited in most of the modeled and observed case studies. However, possible changes in the frequency and magnitude of the PDO in response to climate change (Zhang and Delworth 2016) increase the uncertainty of the PDO's regional influence on spring onset in the future.

  3.

    Years like 2012, the earliest spring in observational records, are extremely rare in the control and historical simulations, but become commonplace by mid-century. Historical climate records since 1920, confirmed by the BEST gridded data set, suggest such early springs are infrequent outliers. The CESM pre-industrial control supports this result, with early springs occurring at roughly once per century, and the historical LENS simulations indicate a similar frequency. Composites of the synoptic and long-wave patterns responsible for these early springs, constructed from NCEP-DOE Reanalysis II (Kanamitsu et al. 2002) and LENS output, suggest similar dynamical regimes and synoptic patterns in observations and the model. Exceptional early season warmth across the region was often a direct response to a high-amplitude trough in the eastern Pacific and a ridge across the central to eastern continental United States, resulting in anomalous poleward transport. Similarly, a low pressure axis south of Alaska implies a southward shift and weakening of the Aleutian Low, which is fairly common during negative PDO phases (Mantua and Hare 2002). In contrast, advancing the LENS members through 2080 under RCP8.5 forcing reveals a striking result with regard to both the frequency and the magnitude of early spring onset. The overall timing of spring advances by nearly a month across the Great Lakes by 2080. Likewise, an increase in the number of anomalously early springs is noted across all of the RCP8.5-forced ensemble members, with outliers indicating spring onset in the Great Lakes as early as the beginning of March by the end of the twenty-first century.

  4.

    While all of the United States experiences surface warming and an advance in the timing of spring by mid-century, the southern Great Lakes are projected to see the advance in spring onset outpace the advance in last freeze dates. Hence, the window of time during which plants may be exposed to frost damage in that region may widen, resulting in greater risk to plant growth.

  5.

    In the future, the role of natural variability weakens substantially relative to the forced changes, a fundamental departure from the past. Trends toward earlier springs nearly double relative to the historical record, with mean onset advancing at nearly 2.6 days/decade by the middle of the century. Kay et al. (2014) document that all thirty ensemble members undergo surface warming of nearly \(5\,^{\circ }\hbox {C}\). This outcome suggests the timing of spring is increasingly governed by the overall trend in surface temperatures and less by internal climate variability. In contrast to the historical experiment and observations, we find that forced long-term warming exceeds decadal-to-multidecadal natural variability, such as that from the PDO, in both the trends in spring onset and the frequency of March 2012-like springs.
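The days-per-decade trends quoted throughout the conclusions (and listed per member in Table 2) amount to a least-squares slope of the onset series scaled by ten. A minimal sketch, using a hypothetical series constructed to advance at exactly 2.6 days/decade rather than actual LENS output:

```python
import numpy as np

def trend_days_per_decade(years, doy):
    """Least-squares linear trend of an onset series, in days per decade.
    Negative values indicate an advancing (earlier) spring."""
    slope = np.polyfit(years, doy, 1)[0]  # days per year
    return 10.0 * slope

years = np.arange(2006, 2081)
onset = 110.0 - 0.26 * (years - 2006)  # hypothetical forced advance
print(round(trend_days_per_decade(years, onset), 2))  # → -2.6
```

Applying the same calculation to each ensemble member separately, rather than to the ensemble mean, yields the member-to-member spread that separates forced change from internal variability.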

The implications of these results may prove important for our understanding and modeling of natural climate variability as we move through the twenty-first century. Moreover, these results open the possibility of testing the predictability of spring onset on long-lead (seasonal to decadal) time horizons, because they suggest that the steadier long-term forcing from anthropogenic activity might generate a certain type of “built-in” predictability. At the same time, the historical dominance of natural variability in observations, as well as in the control and historical simulations, cautions against simply extrapolating recent trends to plan for the decades to come. This argument is further supported by the pattern correlations between the ensemble mean and each ensemble member (Fig. 17), which show that during the first part of this century the individual ensemble members still exhibit considerable spread around the forced pattern. Nonetheless, our results argue that years like 2012 will become commonplace just a few decades from now, with important implications for resource managers and planners in the context of future changes to agricultural growing seasons and the timing of phenological spring.

Table 1 Summaries of the selected gridded data sets and model experiments utilized in this project
Table 2 LENS RCP8.5 forced future trends (days/decade) are listed for each ensemble member over the 2006–2080 period