Not that Certain

A new paper ("A probabilistic analysis of human influence on recent record global mean temperature changes" by Kokic, Crimp, and Howden) claims 99.999% certainty that humans are causing the warming of the planet, IF the model contains all factors with a significant (i.e., measurable and large enough to affect the outcome) influence on climate.

The model has only four factors: CO2 (GHG as measured under the Kyoto Protocol), ENSO, TSI, and volcanoes. It is highly unlikely that there are no other factors: oceans storing heat, the albedo of Arctic and Antarctic ice, back radiation, and convection currents, to name just a few I have read about on various sites. If any of these has a large effect, the model does not match reality, and any outcome or prediction may be useful by chance but is most probably useless for anything other than grabbing headlines.

Also, if the measurements of any of the factors are not accurate, the conclusion is void. That does not mean the conclusion is not true; it means the models and statistics used to create the model and its certainty are invalid. In other words, the model is back to an unproven hypothesis. An incomplete model might still be useful in some ways, but the 99.999% certainty is most certainly exaggerated and should be scrapped. A four-factor model of climate that shows this kind of "certainty" is very unlikely to be accurate or even useful.

The modelers use a bootstrap calculation, something that seems to turn up more and more in the studies I have been reading. In essence, the bootstrap resamples the original data with replacement to produce many synthetic data sets, which lets you estimate how much a statistic would vary without collecting any new observations. (Correct me if my explanation is poorly stated; I am sometimes not very good at explaining statistics so it makes sense to readers.) They ran the bootstrap 100,000 times, both with GHG left in and with GHG left out. From this they reached the incredible (or perhaps not-so-credible) 99.999% number.
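
To make the mechanics concrete, here is a minimal sketch of that kind of bootstrap, assuming a simple linear model of monthly temperature on the four factors. The data series, the effect sizes, and the choice to count resamples where the GHG coefficient comes out positive are all my illustrative assumptions, not the paper's actual method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly series standing in for the observed data.
# Effect sizes and noise are chosen only so the result comes out
# high but not trivially 100%.
n = 360
ghg  = np.linspace(0.0, 1.0, n)            # greenhouse gas index
enso = rng.normal(0.0, 1.0, n)             # ENSO index
tsi  = rng.normal(0.0, 0.1, n)             # solar irradiance anomaly
volc = -np.abs(rng.normal(0.0, 0.1, n))    # volcanic forcing
temp = (0.2 * ghg + 0.1 * enso + 0.05 * tsi + 0.3 * volc
        + rng.normal(0.0, 0.3, n))

X = np.column_stack([np.ones(n), ghg, enso, tsi, volc])

B = 100_000   # the paper reports 100,000 resamples; reduce for a quick run
positive = 0
for _ in range(B):
    idx = rng.integers(0, n, size=n)       # resample months with replacement
    beta, *_ = np.linalg.lstsq(X[idx], temp[idx], rcond=None)
    if beta[1] > 0:                        # coefficient on GHG
        positive += 1

print(f"GHG effect positive in {positive / B:.3%} of resamples")
```

Notice what the exercise does and does not show: the resampling quantifies how stable the GHG coefficient is within this model, but it says nothing about whether the model itself is missing factors.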

There is no indication of whether the model was also run with the other factors eliminated one at a time, in the same fashion as GHG. That test is vital for gauging whether something else has just as strong an effect.
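
If I were checking that, a leave-one-out loop along these lines would do it. Again, the data and the residual-sum-of-squares yardstick are stand-in assumptions for illustration, not anything taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical series again; stand-ins for the paper's data.
n = 360
factors = {
    "ghg":  np.linspace(0.0, 1.0, n),
    "enso": rng.normal(0.0, 1.0, n),
    "tsi":  rng.normal(0.0, 0.1, n),
    "volc": -np.abs(rng.normal(0.0, 0.1, n)),
}
temp = (0.2 * factors["ghg"] + 0.1 * factors["enso"]
        + 0.05 * factors["tsi"] + 0.3 * factors["volc"]
        + rng.normal(0.0, 0.3, n))

def rss(names):
    """Residual sum of squares for a model using only the named factors."""
    X = np.column_stack([np.ones(n)] + [factors[f] for f in names])
    beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
    r = temp - X @ beta
    return r @ r

full = rss(list(factors))
for dropped in factors:
    kept = [f for f in factors if f != dropped]
    print(f"drop {dropped:4s}: RSS rises from {full:.2f} to {rss(kept):.2f}")
```

If dropping some other factor degrades the fit as much as dropping GHG does, the 99.999% figure loses its special status.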

Model B also indicated only about a 25% chance of the 304 months of continuous record-breaking temperatures, yet that was one of the original questions the model was built to answer: how likely are 304 months of record-breaking temperatures without human influence? That would seem to indicate the model missed the mark. Model E showed only about a 53% chance of this temperature streak happening. Why can't the model reproduce the 304 months of record-setting temperatures? With 99.999% certainty, one would expect nothing less.
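
A rough way to see why those low percentages are troubling is to simulate a fitted model many times and count how often it produces the streak. Everything below (the trend, the noise level, the baseline that defines a "record" month) is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

n, trials, streak_len = 600, 10_000, 304
trend = 0.0015      # assumed warming per month (deg C)
noise = 0.15        # assumed monthly noise (deg C)
baseline = 0.2      # assumed threshold defining a "record" month

hits = 0
for _ in range(trials):
    temp = trend * np.arange(n) + rng.normal(0.0, noise, n)
    above = temp > baseline
    # find the longest run of consecutive months above the baseline
    longest = run = 0
    for a in above:
        run = run + 1 if a else 0
        longest = max(longest, run)
    if longest >= streak_len:
        hits += 1

print(f"streaks of {streak_len}+ months in {hits / trials:.1%} of simulations")
```

If a model assigns the observed streak a 25% or 53% chance, then most of its simulated histories look nothing like the record we actually have, and that mismatch should temper any headline certainty.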

One interesting result that did show up in the study was the prediction of periods of flat or cooling temperatures: the number of cooling periods was closer to what was observed in the runs with GHG left in than in those without. The number still did not match the actual recorded data, but it was closer with GHG included.

What does this tell us about humans, GHG, and certainty? IF the models are sufficiently accurate, there could be a strong case for humans causing warming. However, the small number of variables in the model calls into question whether all significant factors have been included. Without 99.999% certainty that these are the only factors needed, the conclusion is not valid. And if the measurements of the input variables are even slightly off, the conclusion does not hold.

All in all, the study, while it addressed some interesting points, fell far short of definitive proof that humans are causing climate change. The certainty is far overstated when one compares reality to the model and its conclusions.