Vostok Ice Cores

As the partial pressure of CO2 in sea water goes like
$$ c = c_0 \, e^{-E/kT} $$
and the climate sensitivity is
$$ T = \alpha \ln(c/c_1) $$
where c is the mean concentration of CO2, then it seems that one could estimate a quasi-equilibrium for the planet’s temperature. Even though they look nasty, these two equations actually solve to a quadratic equation, and one real non-negative value of T will drop out if the coefficients are reasonable.
$$ T = \alpha \ln(c_0/c_1) - \frac{\alpha E}{kT} $$
For CO2 in fresh water, the activation energy is about 0.23 electron volts. From “Global sea–air CO2 flux based on climatological surface ocean pCO2, and seasonal biological and temperature effects” by Taro Takahashi:

The pCO2 in surface ocean waters doubles for every 16°C temperature increase
(d ln pCO2/dT = 0.0423 °C^-1).

Converting this slope into an activation energy gives about 0.354 eV for sea water.
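As a rough sanity check on that conversion (a sketch, not the original calculation): for an Arrhenius-type dependence pCO2 ~ exp(-E/kT), the logarithmic slope is d ln p/dT = E/(kT^2), so E = kT^2 times the slope. The reference temperature is an assumption here, and the result shifts noticeably with it.

```python
# Sanity-check sketch: for p ~ exp(-E/kT), d(ln p)/dT = E/(k*T^2), so
# E = k * T^2 * slope. The reference temperature is an assumption, and the
# resulting activation energy shifts with it.
k_eV = 8.617e-5     # Boltzmann constant, eV/K
slope = 0.0423      # d(ln pCO2)/dT, per deg C (equivalent to per K)

for T in (280.0, 290.0, 300.0, 310.0):
    print(f"T = {T:.0f} K  ->  E = {k_eV * T**2 * slope:.3f} eV")
```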

We don’t know what c_0 or c_1 are, but we can use estimates of climate sensitivity for α (between 1.5/ln(2) and 4.5/ln(2)).

When solving the quadratic, the two exponential coefficients can be combined as
$$ \ln(w) = \ln(c_0/c_1) $$
Multiplying the combined equation through by T gives the quadratic T^2 − α ln(w) T + αE/k = 0; expanding its larger root for small αE/k, the quasi-equilibrium temperature is approximated by
$$ T = \alpha \ln(w) - \frac{E}{k \ln(w)} $$
The term “w” is the ratio of bulk CO2 to the amount that can affect the sensitivity as a GHG.
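To make the algebra concrete, here is a minimal numerical sketch that solves the quadratic exactly and compares it against the expansion above. The parameter values are arbitrary placeholders chosen only to exercise the formula, not fitted or physical values.

```python
import math

# A minimal sketch comparing the exact quadratic root for the quasi-equilibrium
# temperature with the expansion T ~ alpha*ln(w) - E/(k*ln(w)). The parameter
# values below are arbitrary placeholders to exercise the algebra, not fitted
# or physical values.
k = 8.617e-5                                  # Boltzmann constant, eV/K

def quasi_equilibrium_T(alpha, E, ln_w):
    # From T = alpha*ln(w) - alpha*E/(k*T), multiplying through by T gives
    #   T^2 - alpha*ln(w)*T + alpha*E/k = 0
    b = alpha * ln_w
    c = alpha * E / k
    disc = b * b - 4.0 * c
    if disc < 0:
        return None, None                     # no real root for these coefficients
    exact = 0.5 * (b + math.sqrt(disc))       # larger, physically relevant root
    approx = alpha * ln_w - E / (k * ln_w)    # the expansion quoted above
    return exact, approx

print(quasi_equilibrium_T(alpha=3.0 / math.log(2), E=0.354, ln_w=100.0))
```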

As a graphical solution of the quadratic, consider the following figure. The positive feedback of warming is given by the shallow-sloped violet curve, while the climate sensitivity is given by the strongly exponentially increasing curve. Where the two curves intersect, not enough outgassed CO2 is being produced for the asymptotically saturating GHG effect to act on further. The positive feedback has essentially “hit a rail” due to the diminishing return of GHG heat retention.

We can use the Vostok ice core data to map out the rail-to-rail variations. The red curves are rails for the temperature response of CO2 outgassing, given +/- 5% of a nominal coefficient, using the activation energy of 0.354 eV. The green curves are rails for climate sensitivity curves for small variations in α.

This may be an interesting way to look at the problem in the absence of CO2 forcing. The points outside of the slanted parallelogram box are possibly hysteresis terms caused by latencies in either CO2 sequestering or heat retention. On the upper rail, the concentration drops below the expected value, while as it drops to the lower rail, the concentration remains high for a while.

The cross-correlation of Vostok CO2 with Temperature:

Temperature : ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/deutnat.txt
CO2 core : ftp://ftp.ncdc.noaa.gov/pub/data/paleo/icecore/antarctica/vostok/co2nat.txt

The CO2 data comes in approximately 1500-year intervals while the temperature data is sampled more finely. The ordering of the data runs backwards from the present, so the small lead that CO2 shows in the above graph is actually a small lag once the direction of time is considered.
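A sketch of how that cross-correlation can be computed is below. The column layouts are assumptions about the NOAA files, and the records have to be interpolated onto a common grid first since they are unevenly sampled.

```python
import numpy as np

# A sketch of the Vostok CO2 vs. temperature cross-correlation. The column
# layouts are assumptions about the NOAA files (deutnat.txt: ice age, deltaTS;
# co2nat.txt: gas age, CO2 ppm), and header lines may need skiprows=... added.
# Both records are unevenly sampled, so they are interpolated onto a common
# 1500-year grid. Remember the files run backwards in time (age before
# present), which flips the sign of any apparent lead or lag.
age_T, dT = np.loadtxt("deutnat.txt", usecols=(1, 3), unpack=True)
age_C, co2 = np.loadtxt("co2nat.txt", usecols=(0, 1), unpack=True)

grid = np.arange(0.0, 400000.0, 1500.0)       # years before present
T = np.interp(grid, age_T, dT)
C = np.interp(grid, age_C, co2)
T -= T.mean()
C -= C.mean()

def lag_corr(x, y, k):
    # correlation of x(t + k) against y(t), k in 1500-year grid steps
    if k > 0:
        return np.corrcoef(x[k:], y[:-k])[0, 1]
    if k < 0:
        return np.corrcoef(x[:k], y[-k:])[0, 1]
    return np.corrcoef(x, y)[0, 1]

lags = np.arange(-10, 11)
cc = [lag_corr(C, T, k) for k in lags]
print("peak correlation at lag (x 1500 yr):", lags[int(np.argmax(cc))])
```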

  

The top chart shows the direction of the CO2:Temperature movements. There is a lot of noise, but a lagged chart shows hints of Lissajous figures, which are somewhat noticeable as CCW rotations for a lag. On a temperature increase, more of the CO2 is low than high, as you can see it occupying the bottom half of the top curve.

The middle chart shows where both CO2 and T are heading in the same direction. The lower half is more sparsely populated because temperature shoots up more sharply than it cools down.

The bottom chart shows where the CO2 and Temperature are out of phase. Again, T leads CO2 based on the numbers you see on the high edge versus the low edge. The Lissajous CCW rotations are more obvious as well.

Bottom line is that Temperature will likely lead CO2, because I can’t think of any paleo events that would spontaneously create 10 to 100 PPM of CO2 quickly, yet Temperature forcings likely do occur. Once set in motion, the huge adjustment time of CO2 and the positive-feedback outgassing from the oceans will allow it to hit the climate sensitivity rail at the top.

So what is the big deal? We don’t have a historical forcing of CO2 to compare with, yet we have one today that is 100 PPM.

That, people, is a significant event, and whether or not it is important, we can rely on the models to help.


This is what the changes in temperature look like over different intervals.

The changes follow the MaxEnt estimator of a double-sided damped exponential. A 0.2 degree C change per decade (2 degrees C per century) is very rare, as you can see from the cumulative.

That curve that runs through the cumulative density function (CDF) data is a maximum entropy estimate. The following constraint generated the double-sided exponential or Laplace probability density function (PDF) shown below the cumulative:
$$\int_{I}{|x| p(x)\ dx}=w$$
which when variationally optimized gives
$$p(x)={\beta\over 2}e^{-\beta|x|},\ x\ \in I=(-\infty,\infty)$$
where I fit it to:
$$\beta = 1/0.27$$
which gives a half-width of about +/- 0.27 degrees C.
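Below is a sketch of how that fit can be checked. Note that under the |x| constraint the MaxEnt solution makes beta simply the inverse of the mean absolute change; the sample drawn here is a placeholder standing in for the actual Vostok changes.

```python
import numpy as np

# A sketch of the MaxEnt (Laplace) fit quoted above. dT stands for the array of
# Vostok temperature changes over the chosen interval; a placeholder sample is
# drawn here only so the snippet runs end to end.
rng = np.random.default_rng(0)
dT = rng.laplace(0.0, 0.27, size=5000)        # placeholder for the real changes

# Under the constraint E[|x|] = w, the MaxEnt solution is the Laplace PDF with
# beta = 1/w, so beta can be estimated from the mean absolute change.
beta = 1.0 / np.mean(np.abs(dT))
print("beta ~", round(beta, 2), "  half-width ~", round(1.0 / beta, 2), "deg C")

def laplace_cdf(x, beta):
    # CDF of p(x) = (beta/2) exp(-beta |x|), for overlaying on the empirical CDF
    x = np.asarray(x, dtype=float)
    return np.where(x < 0.0, 0.5 * np.exp(beta * x), 1.0 - 0.5 * np.exp(-beta * x))

xs = np.sort(dT)
ecdf = np.arange(1, len(xs) + 1) / len(xs)
print("max CDF mismatch:", round(float(np.max(np.abs(ecdf - laplace_cdf(xs, beta)))), 3))
```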

The Berkeley Earth temperature study shows this kind of dispersion in the spatially separated stations.

Another way to look at the Vostok data is as a random up and down walk of temperature changes. These will occasionally reach high and low excursions corresponding to the interglacial extremes. The following is a Monte Carlo simulation of steps corresponding to 0.0004 deg^2/year.
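A minimal sketch of that kind of simulation is below, assuming Gaussian steps sized so the spread follows the square-root law with D = 0.0004 deg^2/year; the century step size, span, and number of walks are assumptions.

```python
import numpy as np

# A minimal sketch of the random-walk picture: Gaussian temperature steps sized
# so the spread follows Delta T ~ sqrt(D*t) with D = 0.0004 deg^2/year. The
# century step size, span, and number of walks are assumptions.
rng = np.random.default_rng(1)
D = 0.0004                        # deg^2 per year
dt = 100.0                        # years per step
n_steps, n_walks = 4000, 200      # ~400,000 years, 200 independent walks

steps = rng.normal(0.0, np.sqrt(D * dt), size=(n_walks, n_steps))
walks = np.cumsum(steps, axis=1)
t_end = dt * n_steps
print("spread at 400 kyr:", round(walks[:, -1].std(), 1),
      " vs sqrt(D*t):", round(np.sqrt(D * t_end), 1))
```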

The trend goes as:
$$ \Delta T \sim \sqrt{Dt}$$
Under maximum entropy this retains its shape:

This can be mapped out with the actual data via a Detrended Fluctuation Analysis.
$$ F(L) = \left[ \frac{1}{L}\sum_{j=1}^{L} ( Y_j - aj - b)^2 \right]^{\frac{1}{2}} $$
There is no trend in this data, so the a coefficient was set to 0. This essentially takes all the pairs of points, similar to an autocorrelation function, but it shows the Fickian spread in the random-walk excursions as opposed to a probability of maintaining the same value.

The intervals are a century apart. Clearly it shows random-walk behavior, as the square-root fit goes through the data until it hits the long-range correlations.
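A sketch of that kind of fluctuation analysis follows. With the trend coefficient set to 0, one way to read the description above ("takes all the pairs of points") is as the RMS difference between points a lag L apart; the synthetic series stands in for the century-resampled record.

```python
import numpy as np

# A sketch of the fluctuation analysis described above. With the trend
# coefficient a set to 0, one way to compute it is the RMS difference between
# all pairs of points a lag L apart, which grows like sqrt(D*L) for a random
# walk. The synthetic series below stands in for the century-resampled record.
def fluctuation(Y, lags):
    Y = np.asarray(Y, dtype=float)
    return np.array([np.sqrt(np.mean((Y[L:] - Y[:-L]) ** 2)) for L in lags])

rng = np.random.default_rng(2)
Y = np.cumsum(rng.normal(0.0, 0.2, size=4000))      # 0.2 deg per century step
lags = np.unique(np.logspace(0, 3, 30).astype(int)) # 1 to 1000 centuries
F = fluctuation(Y, lags)
print(np.round(F / np.sqrt(lags), 2))               # ~constant => sqrt(L) scaling
```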

Sea temperature correlation

This is a continuation from the Thermal Diffusion and Missing Heat post.

If one takes several months’ worth of data from a location, the sea surface temperature (SST) and subsurface time series look like this over a span of a few days.

After a cross-correlation function is applied between the surface and the level 75 meters down, one sees a small but obvious lag as the surface layer mixes with the lower layers.

The lag is longer the farther the reach extends downward. If one assumes a typical James Hansen diffusion coefficient of 1.5 cm^2/s, the diffusion is actually very slow over long distances. A Fickian diffusion would only displace 100 meters after 3 years at that rate. So the effect is fairly subtle, and detecting it requires some sophisticated data processing.
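A quick check of that displacement, using the Fickian scaling with the 1.5 cm^2/s coefficient quoted above (the set of durations is arbitrary):

```python
# A quick check of the displacement quoted above, using the Fickian scaling
# L ~ sqrt(D*t) with D = 1.5 cm^2/s (the diffusion coefficient from the text).
D = 1.5                      # cm^2/s
seconds_per_year = 3.156e7

for years in (1, 3, 10):
    L_cm = (D * years * seconds_per_year) ** 0.5
    print(f"{years:>2} yr  ->  ~{L_cm / 100.0:.0f} m")
```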

Bottom-line is that the surface water is heated and it does mix with the deeper waters. Otherwise, one would not see the obvious thermal mixing as is obtained from the TAO sites.

Some other correlations using R. These show longer-range correlations. Note the strong correlation oscillations, which indicate that temperature changes happen in unison and in a coherent fashion, given a lag that is less apparent at this scale. Note that the lag is in the opposite direction due to the convention that R uses for defining the correlation (x(t+k)*y(t) instead of x(t)*y(t+k)).
 

Thermal mixing with an effective diffusivity is occurring. The only premise required is that an energy imbalance is occurring and that it will continue to occur as long as CO2 is above the historical atmospheric average. This imbalance shows up to a large extent in the ocean waters and is reflected as a temporal lag in global warming.


Notes: 
One has to be careful about missing data at certain sites. Any time you see a discontinuity in a correlation function, bad data (values of -9.999, typically) is usually responsible.

Temperature Induced CO2 Release Adds to the Problem

As a variable amount of CO2 gets released by decadal global temperature changes, it makes sense that any excess amount would have to follow the same behavior as excess CO2 due to fossil fuel emissions.

From a previous post (Sensitivity of Global Temperature), I was able to detect the differential CO2 sensitivity to global temperature variations. The correlation of temperature anomaly against d[CO2] is very strong with zero lag and a ratio of about 1 PPM change in CO2 per degree temperature change detected per month.

Now, this does not seem like much of a problem, as naively a 1 degree change over a long time span should only add one PPM during the interval. However, two special considerations are involved here. First, the measure being detected is a differential rate of CO2 production, and we all know that sustained rates can accumulate into significant quantities of a substance over time. Second, atmospheric CO2 has a significant adjustment time and the excess isn’t immediately reincorporated into sequestering sites. To check this, consider that a slow linear rate of 0.01 degrees of change per year, when accumulated over 100 years, will lead to a 50 PPM accumulation if the excess CO2 is not removed from the system. This is a simple integration, where f(T(t)) is the integration function:
$$ [CO_2] = f_{CO_2}(T(t)) = \int_0^{100} 0.01\, t \, dt = \frac{1}{2} \cdot 0.01 \cdot 100^2 = 50 $$
The sanity check on this is to consider that a temperature anomaly of 1 degree held over 100 years would release 100 PPM into the atmosphere. This is simply a result of Henry’s Law applied to the ocean. The ocean has a large heat capacity and so will continue outgassing CO2 at a constant partial-pressure rate as long as the temperature has not reached the new thermal equilibrium. (The CO2 doesn’t want to stay in an opened Coke can, and it really doesn’t want to stay there when it gets warmed up.)
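A sketch of both accumulation checks, assuming the ~1 PPM per degree per year differential sensitivity discussed above (the units are an assumption) and no removal of the excess:

```python
import numpy as np

# A sketch of the accumulation sanity check: integrate the outgassing rate
# implied by the temperature history, assuming ~1 PPM per degree per year as
# the differential sensitivity (units are an assumption) and no removal of the
# excess CO2.
dt = 0.1                               # years
years = np.arange(0.0, 100.0, dt)
rate = 1.0                             # PPM per degree per year (assumed)

T_ramp = 0.01 * years                  # slow 0.01 deg/year warming
T_step = np.ones_like(years)           # 1 deg anomaly held for a century

ramp_ppm = np.sum(rate * T_ramp) * dt  # ~50 PPM
step_ppm = np.sum(rate * T_step) * dt  # ~100 PPM
print(round(ramp_ppm, 1), round(step_ppm, 1))
```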

So, if we apply the impulse response we derived earlier (Derivation of MaxEnt Diffusion) to this problem, with a characteristic time that matches the IPCC model for Bern CC/TAR, standard:

As another sanity check, the convolution of this with a slow 1 degree change over the course of 100 years will lead to at least a 23 PPM CO2 increase.

Again, this occurs because we are far from any kind of equilibrium, with the ocean releasing the CO2 and the atmosphere retaining what has been released. The slow diffusion into the deep sequestering stores is just too gradual while the biotic carbon cycle is doing just that, cycling the carbon back and forth.

So now we are ready to redo the model of CO2 response to fossil-fuel emissions (Fat-Tail Impulse Response of CO2) with the extra positive feedback term due to temperature changes. This is not too hard as we just need to get temperature data that goes back far enough (the HADCRUT3 series goes back to 1850). So when we do the full combined convolution, we add in the integrated CO2 rate term f(T), which adds in the correction as the earth warms.

$$ [CO_2] = FF(t) \otimes R(t) + f_{CO_2}(T(t)) \otimes R(t) $$
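A sketch of that combined convolution is below. The emissions series, temperature anomaly, and impulse response are all crude placeholders here, standing in for the actual emissions record, HADCRUT3, and the MaxEnt/Bern-like response from the earlier posts.

```python
import numpy as np

# A sketch of the combined convolution above. FF(t) and T(t) would come from
# the emissions and HADCRUT records; here they are crude placeholders, and R(t)
# is a generic fat-tailed impulse response standing in for the MaxEnt/Bern-like
# response discussed in the text (not the author's fitted one).
years = np.arange(1850, 2011)
n = len(years)

FF = np.linspace(0.1, 8.0, n)            # placeholder fossil-fuel input, PPM/yr
T = np.linspace(0.0, 0.8, n)             # placeholder temperature anomaly, deg C
f_T = 1.0 * T                            # temperature-driven outgassing rate term

t = np.arange(n, dtype=float)
R = 1.0 / np.sqrt(1.0 + t)               # placeholder fat-tail impulse response, R(0) = 1

co2_excess = np.convolve(FF + f_T, R)[:n]  # both source terms share the same R(t)
print("CO2 in 2010 ~", round(290.0 + co2_excess[-1], 1), "PPM (290 PPM baseline)")
```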

When we compute the full convolution, the result looks like the following curve (baseline 290 PPM):

The extra CO2 addition is almost 20 PPM, just as we had predicted from the sanity check. The other interesting feature is that it nearly recreates the cusp around the year 1940. The previous response curve did not pick that up because the cusp is entirely caused by the positive-feedback warming during that time period. The effect is not strong but discernible.

We will continue to watch how this plays out. What is worth looking into is the catastrophic increase of CO2 that will occur as long as the temperature stays elevated and the oceans haven’t equilibrated yet.

TOTAL Discovery Data

From Figure 1 of Laherrere’s recent discovery data, he suggests a crude oil URR of 2200 GB.

The oil company Total S.A. also has an accounting of yearly discoveries. I overlaid their data with Laherrere’s data in the figure below:

Total must do their backdating a bit differently, because their data is consistently above Laherrere’s for the majority of the years. As of about 2005, their cumulative is at 2310 GB while Laherrere is at 1930 GB.

This leads to the following asymptotic graph for cumulative oil according to the Dispersive Discovery model. In this case I assigned a URR of 2800 GB to the model, with a model value of 2450 GB as of 2005. In other words, the Total discovery data may hit an asymptote of 2800 GB, which may be a bit generous:

This is really for comparative purposes as I next plotted what Laherrere’s discovery data looks like against the same model.

You can see that Laherrere’s data likely won’t hit that asymptote.

Discovery data for crude oil is hard to come by. Perhaps Laherrere is using 2P probabilities and Total is applying possible reserve growth so that it is >2P? Or perhaps Total is using barrels of oil equivalent (BOE) to inflate the numbers (which Shell oil does)? In the greater scheme of things, does this really matter?

The following chart is the Shock Model applied to the Total discovery data, whereby I tried to follow the historical crude oil (not All Liquids) production rates by varying the extraction rate until the year 2000; after that I kept the extraction rate constant at about 3.1% of reserves. This is lower than the currently accepted 4% to 5% extraction rate from reserves.

If Total does use BOE maybe this should actually fit an All Liquids curve, in which case the extraction rates would need to get increased to match the higher levels of All Liquids production.

Bottom line is that the peak plateau might extend for a couple of years and we will have a fatter decline tail if we believe the Total numbers. If it is an All Liquids discovery model, then it is a wash. As if everyone didn’t know this by now, peak oil is not about the cumulative, it is about the extractive flow rates, and this is a good example of that.

In general, the URR is incrementally getting pushed up with time. Laherrere had used 2000 GB for a crude oil URR for some time (see this curve from 2005) and now, likely because of the deep water oil, it is at 2200 GB.

As for going through the trouble of evaluating the Gulf Of Mexico data, that is just noise on the overall curve IMO. It’s getting to the point that we have enough historical data that global predictions for crude oil are really starting to stabilize. And the Dispersive Discovery model will anticipate any future discoveries. The longer we wait, the closer all the estimates will start to converge, and any new data will have little effect on the projected asymptotic cumulative.

The following is a curve that takes the Total S.A. discovery data and extrapolates the future discoveries with a dispersive discovery model. The final discovery URR is 2700 billion barrels, which is quite a bit higher than the one Laherrere plots. This is higher because I am making the assumption that Total S.A. is including backdated NGPL and other liquids along with the crude oil, which means I had to fit against production data that also includes these liquids.

To model the perturbations in production levels, which is necessary to accumulate the reserves properly, I used the Oil Shock Model. In the inset, you can see the changes in extraction rate that occurred over the years. The extraction rate is essentially the same as the Production/Reserve ratio. Notice that the extraction rate was steady until the 1960’s, at which point it ramped up. It started to level off and drop down during the 1970’s oil crisis and didn’t really start to rise again until the 1990’s. I am extrapolating the extraction rate from today to match the peak extraction rate of the 1960’s by the year 2050.
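For readers unfamiliar with the Shock Model bookkeeping, here is a minimal sketch of its extraction stage as described above: production each year is the extraction rate (the P/R ratio) applied to current reserves, with reserves replenished by matured discoveries. The discovery and extraction-rate series below are placeholders, not the Total S.A. or Laherrere data.

```python
import numpy as np

# A sketch of the extraction stage of the Oil Shock Model as described above:
# yearly production is the extraction rate (the P/R ratio) applied to current
# reserves, with reserves replenished by discoveries that have already been
# shifted through the fallow/build/maturation delays. The discovery and
# extraction-rate series here are placeholders, not the Total S.A. data.
def shock_model(discoveries, extraction_rate):
    reserves = 0.0
    production = []
    for d, e in zip(discoveries, extraction_rate):
        reserves += d                # matured discoveries feed the reserve base
        p = e * reserves             # production = extraction rate x reserves
        reserves -= p
        production.append(p)
    return np.array(production)

years = np.arange(1900, 2051)
disc = 60.0 * np.exp(-0.5 * ((years - 1965) / 25.0) ** 2)        # placeholder, GB/yr
extr = np.clip(0.010 + 0.0004 * (years - 1900), 0.010, 0.050)    # placeholder ramp
prod = shock_model(disc, extr)
print("peak year:", years[int(np.argmax(prod))], " peak rate:", round(prod.max(), 1), "GB/yr")
```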

This is largely a descriptive working model, which essentially takes the data that Total S.A. is providing and reflects it in terms of what we are seeing in the production numbers. The current plateau could be extended if we try to extract even faster (in Rockman’s words, “PO is about oil flow rate”) or we can start including other types of fuels in the mix. The latter will happen if the EIA and IEA add biofuels and other sources to the yearly production.

The bottom-line is that it is hard to come up with any scenarios, based on the data that Total and IHS supply, that can extend this plateau the way Total suggests it will, with a peak out to 2020. That is what is maddening about this whole business, and you wonder why drivel such as what Yergin continues to pump out gets published.