After several detours and dead-ends, it looks as if I have locked onto a plausible ENSO model, parsimonious with recent research. The sticky wicket almost from day 1 was the odd behavior in the ENSO time-series that started around 1980 and lasted for 16 years. This has turned out to be a good news/bad news/good news opportunity. Good in that applying a phase inversion during that interval allowed a continuous fit across the entire span. Bad in that it gives the impression of applying a patch, without a full justification for that patch. But then again good in that it explains why other researchers never found the deterministic behavior underlying ENSO: conventional tools such as the Fourier transform aren't much help in isolating a phase shift (excepting Astudillo's approach).
Having had success with the QBO model, I wasn't completely satisfied with the ENSO model as it stood at the beginning of this year. It didn't quite click into place the way the QBO model did. I had been attributing my difficulties to the greater amount of noise in the ENSO data, but I realize now that's just a convenient excuse; the signal is still in there. By battling through the inversion-interval issue, the model has improved significantly. And once the correct forcing and Mathieu modulation are applied, the model locks onto the data with the potential to work as well as a deterministic tidal prediction algorithm.
As a rewind, there were several essential ingredients that I originally thought would play into the ENSO model. From the ENSO’s vague similarity to a Bloch wave, I expected a Mathieu modulation to the wave equation to play a role. And this was reinforced by my discovering that all current sloshing models feature a Mathieu modulating factor. That insight occurred about two years ago in the timeline.
Around that time, I also postulated that the Chandler wobble, the QBO of atmospheric winds, and TSI variations would provide a first-order set of forcing inputs. The Chandler wobble has continued to be an important factor, but the QBO caused some frustration, as it was tantalizingly close to providing the right frequency, but not quite. The TSI was the one factor that turned out to be a bust. I needed a period of over 10 years to adequately model a required forcing frequency, and while TSI variations were the only factor close to that value, I never could reconcile how so slight a solar-heating modulation could play that significant a role.
The use of long-period tidal factors came into the picture when I started having good success with modeling the QBO. The caveat to this QBO/ENSO relationship is that QBO is forced directly by gravitational pull, while ENSO is mainly the result of angular momentum changes, with the understanding that the lunar gravitational pull can modulate the angular momentum.
As I indicated at the beginning, the phase transition of 1980 was both a curse and a blessing. When I realized that a biennial modulation could provide just the metastable initial conditions required to cause a phase inversion, that's when the model started to click into place. The Wang paper describing the triaxial wobble terms solidified the idea that a 14-year forcing term could replace the placeholder TSI term. Modulating the 14-year term with a biennial factor produces a period of around 2.33 years that effectively mimics the QBO forcing as well. This shows up clearly in the frequency spectrum as biennially modulated side-bands, as shown below, confirming the work of Kim.

Fig 1: Power spectra centered around the 2-year biennial modulation frequency.
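To make that sideband arithmetic explicit, here is a quick numerical check. This is a minimal sketch: the product-to-sum identity is standard trigonometry and the amplitudes are arbitrary.

```python
import numpy as np

# A biennial modulation of the 14-year term is a product of sinusoids, and
# sin(A)*sin(B) = 0.5*[cos(A-B) - cos(A+B)], so sidebands appear at the
# difference and sum frequencies:
f_lo = 1.0/2 - 1.0/14   # cycles/year -> period 7/3 ~ 2.33 years
f_hi = 1.0/2 + 1.0/14   # cycles/year -> period 7/4 = 1.75 years
print(1/f_lo, 1/f_hi)   # 2.333..., 1.75

# confirm the identity numerically over a century of monthly samples
t = np.arange(0, 100, 1.0/12)
product = np.sin(2*np.pi*t/14) * np.sin(2*np.pi*t/2)
sidebands = 0.5*(np.cos(2*np.pi*f_lo*t) - np.cos(2*np.pi*f_hi*t))
assert np.allclose(product, sidebands)
```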
So at this point, more than two years after I started looking at modeling ENSO, the ingredients are (listed in order of decreasing importance, with a quick numerical check in the sketch after the list):
- A strict biennial modulation in the forcing and in the Mathieu modulation factor of the wave equation.
- A Chandler wobble envelope of period ~6.5 years to provide the angular momentum changes that force ENSO.
- Another triaxial wobble term of ~14 years.
- A long-period lunar nodal tidal factor corresponding to 18.6 years.
- A pair of aliased lunar anomalistic tidal factors corresponding to 3.91 and 1.34 years (aliased from the 27.5545-day lunar period).
- Second-order harmonics and cross-terms of the above.
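The aliased periods in the anomalistic item can be checked directly, and a toy version of the composite forcing can be assembled from the listed ingredients, as in the sketch below. The unit amplitudes and zero phases are placeholders rather than fitted values, and the aliasing against the 13th and 14th annual harmonics is my reading of where 3.91 and 1.34 come from:

```python
import numpy as np

# aliasing the 27.5545-day anomalistic month against nearby annual harmonics
f_anom = 365.242 / 27.5545        # ~13.255 cycles/year
print(1/(f_anom - 13))            # ~3.91 years
print(1/(14 - f_anom))            # ~1.34 years

# toy composite forcing over the instrumental record, sampled monthly
t = np.arange(1880, 2017, 1.0/12)
periods = [6.5, 14.0, 18.6, 3.91, 1.34]          # years, per the list above
forcing = sum(np.cos(2*np.pi*t/P) for P in periods)
forcing *= np.cos(np.pi*t)        # the strict biennial modulation (2-year period)
```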
I apply either a DiffEq solver or a wave-equation transform for fitting the model to data. The latter has a faster turnaround and can produce fits such as that shown in Figure 2 below. That's essentially two years in the making, following all the detours, circling back from dead-ends, and working the largely quiet zone known as the Azimuth Project trying to drum up some interest in the project (no peep on the latest posts 😦 at least not yet).

Fig. 2: Fit model to ENSO data (1/3 Darwin, 1/3 Tahiti, and 1/3 Nino3.4). This uses the wave-equation transform approach, and has a phase inversion between 1981-1996.
What's still exciting is that now that the model has locked in place, all the second-order factors can be applied. I have noticed that the correlation coefficient is not even close to a maximum. That often happens with models that get close to the truth. Below is a scatter plot of the +/- excursions for a fit trained after 1940, the data with the least amount of noise.

Fig 3: Model versus data scatter plot
Below is the training interval, showing that even though the fit was optimized after 1940, the model works remarkably well in the years prior.

Fig. 4: Training interval for Figure 3.
There are a few cases where the fit produces phantom peaks that do not show strongly in the data, such as a phantom La Nina (cold) excursion in 1936. The reason for this is definitely worth looking into.
As a comparison, a fit via a free tidal-analysis program is shown below. It uses a similar multiple-regression fitting technique, allowing between 5 and 35 tidal periods in the fit. Note below that there are also excursions beyond what is predicted, in this case due to occasional intense storm surges. Similar to tide measurements being impacted by sporadic events, could it be that the ENSO of 1936 was an anomalous year, starting out with record cold in the Dakotas and continuing with one of the hottest summers on record in the Midwest? We may be able to figure that out with the help of this approach.
World Tides is a general purpose program for the analysis and prediction of tides. Using least squares harmonic analysis, it allows the user to decompose a water level record into its tidal and non-tidal components by fitting between 5 and 35 user-selectable tidal frequencies (tidal harmonic constituents). The constituents can then be saved to allow future prediction of tides. Non-tidal water level displayed in graphs provide a complete depiction of storm surge and storm tide (storm surge plus astronomic tide) during tropical and extratropical events.
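For reference, the core of a least-squares harmonic analysis like this is just a linear regression against sine/cosine pairs at the chosen constituent frequencies. Below is a minimal sketch; the constituent periods and the synthetic data are illustrative stand-ins, not World Tides internals:

```python
import numpy as np

def harmonic_fit(t, y, freqs):
    """Least-squares fit of sinusoids at fixed frequencies (cycles per unit of t)."""
    cols = [np.ones_like(t)]                      # mean water level
    for f in freqs:
        cols += [np.sin(2*np.pi*f*t), np.cos(2*np.pi*f*t)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coeffs     # tidal component; y minus this is the non-tidal residual

# usage on synthetic hourly data (t in days): two semidiurnal-like constituents plus noise
t = np.arange(0, 365, 1.0/24)
y = 1.2*np.sin(2*np.pi*t/0.5175) + 0.4*np.cos(2*np.pi*t/0.5) + 0.1*np.random.randn(t.size)
tide = harmonic_fit(t, y, freqs=[1/0.5175, 1/0.5])
surge = y - tide          # the storm-surge / non-tidal part
```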
As the ENSO model gets cleaned up, I will place it on the ContextEarth site as an interactive chart. This will feature both the wave-equation transform approach and a DiffEq solution approach, shown below with a hand optimization of the factors and a 180-degree flip of the biennial forcing starting at 1981. Although not optimal, it does exactly what it is supposed to do with a forcing phase inversion: the response is phase-inverted as well.

Fig. 6: DiffEq solution to the wave equation, using factors similar to those used in the wave-equation transform.
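A minimal sketch of what that DiffEq solution looks like in code, using scipy's integrator. The natural frequency, Mathieu coefficient q, and forcing amplitudes here are illustrative stand-ins for the hand-optimized factors; the 180-degree flip enters through the sign change at 1981:

```python
import numpy as np
from scipy.integrate import solve_ivp

def biennial(t):
    # strict biennial factor, phase-inverted from 1981 onward
    return (-1.0 if t >= 1981 else 1.0) * np.cos(np.pi * t)

def rhs(t, y, w0=2*np.pi/4.0, q=0.2):
    # wave equation reduced to time alone: a Mathieu-modulated, forced oscillator
    forcing = biennial(t) * (np.cos(2*np.pi*t/6.5) + np.cos(2*np.pi*t/14))
    return [y[1], -w0**2 * (1 + q*biennial(t)) * y[0] + forcing]

sol = solve_ivp(rhs, (1880, 2016), y0=[0.0, 0.1], max_step=0.01)
# sol.t, sol.y[0] give the modeled ENSO response over the record
```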
As it stands, this ENSO model is very close to the QBO model in terms of my confidence that they reflect the real geophysical processes occurring in the ocean and the atmosphere.
In the last post I mentioned the research of Sonechkin, which is quite intriguing. I will place a couple of Sonechkin's papers in the comments. Although Sonechkin's research on this topic is spotty and not very comprehensive, he has been suggesting over the last few decades that something just like what I have described is the fundamental driver behind ENSO. Sonechkin is also not a crank, with over 1000 citations to one of his co-authored papers (in Nature).
First paper by Sonechkin, from 2001, suggesting quasiperiodic forcing to ENSO
Second paper by Sonechkin, with the idea of period doubling playing a role. Even though Sonechkin has been at this for a while, there is nothing involving model fitting as far as I can tell.
Sonechkin, D. M., and R. Brojewski. "ENSO: a quasiperiodic forced dynamical system." International Workshop on the Low-Frequency Modulation of ENSO, Toulouse (2003): 23-25.
The most recent paper by Sonechkin is an abstract:
"It means there are no limits to forecast ENSO, in principle. In practice, it opens a possibility to forecast ENSO for several years ahead."
He also has this paper:
A confirmation of the oceanic pole tide influence on El Niño
The main text is in Russian, with a couple of the usual ocean contour maps but no quantitative graphs.
I still think the real question is: can this behavior be incorporated into a GCM? Can the physical effects on individual grid cells be duplicated in code, or are these insights good for predicting ENSO but intractable at the grid-cell level?
Maybe starting with the simplest atmospheric models and seeing if the behavior can be simulated there would be a good first step.
Yup Kevin, thinking about this some more: at the GCM level the more rapid cycles of the Madden-Julian Oscillation (MJO) will show up. Those are mostly filtered out using a 3-month window. But I am not sure whether they would realize that fitting to the MJO is not part of this model.
It's kind of like modeling the aerodynamics of a wing. First-order physics is not going to capture the turbulence at the trailing edge of the wing, but it will show that there is enough lift. Can you imagine if, in a freshman physics course, the prof insisted on turbulence modeling instead of Bernoulli's principle? Yet that may be where climate science is at with respect to ENSO.
But I don’t think there is a problem with trying an atmospheric model of QBO, because the faster oscillations of an MJO are not there. Nothing to get in the way, and they will have to seriously consider that it is all lunisolar forcing.
Thanks for the insight!
It is always a good decision to refer back to Spencer Weart's work "The Discovery of Global Warming" as often as possible. The section on the development of GCMs and early models, General Circulation Models of the Atmosphere, might be of some assistance.
It would be terribly ironic to find out that a model more than 50 or 60 years old could reproduce ENSO if the right equations were included. Yet I know James Hansen is of the belief that more complex models are not necessarily better at teasing out real insight, and Isaac Held often uses rudimentary models to examine specific effects.
Nice idea. I wonder, do these models actually include the sloshing of the thermocline? Do they take the trade winds as a forcing and pile up water on one side of the ocean? I am almost certain they don’t treat it geophysically as an inertial forcing from lunisolar and wobble stimulus.
"Using the Regional Ocean Modeling System (ROMS) to improve the ocean circulation from a GCM 20th century simulation," Melsom et al., Ocean Dynamics, December 2009, 59:969, DOI: 10.1007/s10236-009-0222-5
From previous reading and this paper, it appears that GCMs may not yet be capable of producing oceanic tide information, but high-resolution regional models can.
Thanks KO,
Do you think there is a significant distinction between incorporating tides (as in the physical dilation of the water volume) and incorporating the tidal forces impacting the thermocline (as in angular momentum variations)?
If they just do the former, I am not sure they are capturing the right dynamics.
"I have noticed that the correlation coefficient is not even close to a maximum. That often happens with models that get close to the truth." Given that observational data has its own uncertainties, there is a limit to how high we should even expect correlation coefficients to be. That is, a perfect model will never perfectly match observational data, because the observational data isn't perfect. I think this is where many people waste time (and fool themselves): constantly striving to increase correlation, resulting in over-fitting or models that are more complex than they need to be.
Regarding the oceanic tides in ROMS: the regional models have the ability to generate oceanic tides, so this is probably where you'd have to start, modifying their present equations with replacements built from your models and likely introducing some new equations.
That's true. For example, the Tahiti and Darwin SOI signals are said to be nicely anti-correlated, yet the correlation coefficient on the "as is" data is only about -0.58. Because the SOI data has more high-frequency noise, if you filter both, the CC gets closer to -0.7.
The Darwin data is considered more representative of ENSO, which you can tell by looking at the comparison to NINO3.4 temperature data. So if you filter the SOI so that it matches the less noisy NINO3.4, the CC goes up close to 0.8.
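That filtering effect is easy to reproduce; a minimal sketch, assuming the Tahiti and Darwin pressure series are already loaded as aligned monthly numpy arrays (the 12-month window is a placeholder choice):

```python
import numpy as np

def smooth(x, months=12):
    # simple moving average to knock down the high-frequency noise
    kernel = np.ones(months) / months
    return np.convolve(x, kernel, mode='valid')

cc_raw = np.corrcoef(tahiti, darwin)[0, 1]                       # ~ -0.58 on as-is data
cc_filtered = np.corrcoef(smooth(tahiti), smooth(darwin))[0, 1]  # closer to -0.7
```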
Then you have to consider that early historical data isn’t as high quality as later data.
So in fact significant improvements can be made if you can identify bad sections of data. Consider a transformed ENSO model that is grossly overfit to an interval from 1990 to 2013. The CC in this interval is nearly 0.98 (lower curve below), which is higher than should be expected. The out-of-band validation is still 0.7, which is not bad.
Yet, some areas of disagreement may be attributable to bad data, as you can see from the comparison to tidal gauge data (upper curve). Note the yellow highlighted intervals, especially the one at 1906. This shows a significant El Nino event according to SOI, but the tide data and the model both capture it as a La Nina. That is really the only place that the sign is reversed substantially. A smaller sign reversal occurs at 1964.
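The in-band versus out-of-band bookkeeping behind those numbers is simple to set up; a sketch, assuming model and data are aligned arrays over a common time axis t (all names hypothetical):

```python
import numpy as np

in_band = (t >= 1990) & (t <= 2013)    # the deliberately overfit interval
cc_in = np.corrcoef(model[in_band], data[in_band])[0, 1]      # ~0.98 in-band
cc_out = np.corrcoef(model[~in_band], data[~in_band])[0, 1]   # ~0.7 validation
```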
I am a big fan of over-fitting — but tempered with checks — because that is the only way one will find new research results. In other words, to find something interesting you have to bend the data without breaking it.
The original SOI data has gaps:
ftp://ftp.cpc.ncep.noaa.gov/wd52dg/data/indices/tahiti.his
For example, note that the years around 1906 are filled with -999 entries indicating missing data. Those numbers are interpolated for the final data set. Is that the right thing to do?
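For what it's worth, here is the operation being questioned, in pandas form; a sketch assuming the record has been parsed into a DataFrame df with a monthly series (the column name is hypothetical):

```python
import numpy as np
import pandas as pd

# treat the -999 sentinels as missing, then interpolate across the gaps,
# which reproduces the filled-in final data set
soi = df['tahiti'].replace(-999, np.nan)
print(soi.isna().sum(), "missing months")
soi_filled = soi.interpolate(method='linear')
```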
There is usually going to be a grey area: do we need to go any further? As Isaac Held says, using a quote often attributed to Einstein, "Everything should be made as simple as possible, but not simpler."
Of course Held points out that simplistic models can yield insight, but should not be mistaken for reality.
The problem again is that there are no controlled experiments for testing climate models. All you have is the data from measurements of the phenomena as they are captured. So all you can really do is go back and re-evaluate the data that is available.
"Held says, using a quote often attributed to Einstein"
For example, tidal models often contain more than 30 different sinusoidal factors, which some would consider overfitting until you realize how simple the basic model is.
I posted a comment to Held’s latest blog entry and it is being held in moderation, I assume. This is what I asked:
"Is it generally agreed that along the equator the Coriolis forces cancel out, and so the wave equation model undergoes a vast simplification? And that this has significant implications for how longitudinal equatorial behaviors such as ENSO and QBO can be more easily modeled?
And also that the clearly observable standing wave mode of ENSO implies that the temporal and spatial modes of the partial differential equation can be separated out. That means that a challenging partial differential equation can be transformed into a more easily solvable (and potentially numerically stable) ordinary differential equation.”
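In equation form, the separation argument in that question reads roughly as follows; a sketch, with the Mathieu modulation term written in per the model described in the post:

```latex
% separate the standing-wave mode: f(x,t) = X(x)\,T(t)
% the wave equation f_{tt} = c^2 f_{xx} then reduces, for each spatial
% mode, to an ordinary (Mathieu-type) differential equation in time:
\frac{d^2 T}{dt^2} + \omega^2 \left[ 1 + q \cos(\omega_m t) \right] T(t) = F(t)
```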
And someone is saying that this PRL paper is significant in simplifying behavior.
http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.116.214501
May be related to mean-value analysis.
A former NCAR numerical climate modeler, Gerald Browning, is visiting blogs and making claims that the GCMs are not solved correctly. Based on his statements, he doesn't trust them at all.
I am usually just amused by comments by people like Browning.
Over at ClimateAudit he says, “the unrealistic nature of the ECMWF model (and climate models use the same numerics).”
Combined with the quote you provide, “Any competent numerical analyst or applied mathematician wouldn’t touch climate modeling with a 10 foot pole”
That leaves a bit of a paradox: ECMWF is the best weather model available, yet it uses this same horrendous, awful, fraudulent trickery!!!!
Oh wait – wasn't his point that no competent analyst would touch it? Color me confused.
Actually it’s more likely Browning is following the ‘gone emeritus’ path. His obsession hasn’t been handled to his satisfaction so everything is WRONG!!!
Yawn.
Yes, it's mostly an assertion-based argument on his part. But unless he can back up his assertions with some contrary evidence, it's all whine from sour grapes.
The reason I am following his stuff is because he is using the shallow-water equations to introduce the topic.
If the Coriolis effect cancels out at the equator, this will reduce to the ordinary second-order differential wave equation, which is what I have been leaning on heavily. Yet, I doubt Browning would take kindly to what I am doing either.
I believe Held deleted my comment/question at his blog after being in moderation for a few days:
http://www.gfdl.noaa.gov/blog/isaac-held/2016/06/03/70-spherical-rotating-radiative-convective-equilibrium/