Applying Big Data Techniques, New Study Finds Most Recent Warming Could Be Natural

John Abbot and Jennifer Marohasy have published a study in GeoResJ titled “The application of machine learning for evaluating anthropogenic versus natural climate change.” The study, which is available here and in PDF form here, is highly technical. Fortunately, Marohasy has summarized its findings on her blog:

After deconstructing 2,000-year-old proxy-temperature series back to their most basic components, and then rebuilding them using the latest big data techniques, John Abbot and I show what global temperatures might have done in the absence of an industrial revolution. The results from this novel technique, just published in GeoResJ, accord with climate sensitivity estimates from experimental spectroscopy but are at odds with output from General Circulation Models.
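
The paper's code is not reproduced in her post, but “deconstructing and rebuilding” a series is, in broad strokes, a signal-decomposition exercise. The sketch below is only a rough illustration of that general technique, not the authors' actual pipeline: the FFT approach, the synthetic series, and the cycle lengths are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: split a (synthetic) proxy-temperature series into
# its dominant periodic components via FFT, then rebuild it from them. The
# cycle lengths and noise level are invented for the example.
rng = np.random.default_rng(0)
years = np.arange(2000)                            # a 2,000-year annual series
proxy = (0.4 * np.sin(2 * np.pi * years / 60)      # multidecadal cycle
         + 0.3 * np.sin(2 * np.pi * years / 210)   # centennial-scale cycle
         + 0.1 * rng.standard_normal(years.size))  # proxy noise

spectrum = np.fft.rfft(proxy)

# Keep only the k strongest components: the series' "most basic components".
k = 5
keep = np.argsort(np.abs(spectrum))[-k:]
filtered = np.zeros_like(spectrum)
filtered[keep] = spectrum[keep]

rebuilt = np.fft.irfft(filtered, n=proxy.size)     # the "rebuilt" series
print(f"residual std after rebuild: {np.std(proxy - rebuilt):.3f}")
```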

According to mainstream climate science, most of the recent global warming is our fault, caused by human emissions of carbon dioxide. The rationale for this is a speculative theory about the absorption and emission of infrared radiation by carbon dioxide that dates back to 1896. It’s not disputed that carbon dioxide absorbs infrared radiation; what is uncertain is the sensitivity of the climate to increasing atmospheric concentrations.

This sensitivity may have been grossly overestimated by Svante Arrhenius more than 120 years ago, with these overestimations persisting in the computer-simulation models that underpin modern climate science. We just don’t know, in part because the key experiments have never been undertaken.

What I do have are whizz-bang gaming computers that can run artificial neural networks (ANNs), which are a form of machine learning: think big data and artificial intelligence….

We figured that if we could apply the latest data mining techniques to mimic natural cycles of warming and cooling (specifically, to forecast twentieth-century temperatures in the absence of an industrial revolution), then the difference between the temperature profile forecast by the models and actual temperatures would give an estimate of the human contribution from industrialisation.
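
That passage is the core of the method. As a minimal, hypothetical illustration of the same logic (not the authors' code), the sketch below trains a small neural network on the pre-industrial portion of a synthetic proxy series, rolls it forward to produce a “natural cycles only” forecast, and treats the gap between the observed series and that forecast as the industrial-era contribution. The split year, the 30-value lag window, and the use of scikit-learn's MLPRegressor are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Build a synthetic 2,000-year proxy series: natural cycles plus noise, with a
# small warming trend added after "1900" to stand in for the industrial era.
rng = np.random.default_rng(1)
years = np.arange(2000)
observed = (0.4 * np.sin(2 * np.pi * years / 60)
            + 0.3 * np.sin(2 * np.pi * years / 210)
            + 0.05 * rng.standard_normal(years.size))
observed[years >= 1900] += 0.002 * (years[years >= 1900] - 1900)

def lagged(series, n_lags=30):
    """Turn a 1-D series into (30-value window, next value) training pairs."""
    X = np.stack([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    return X, series[n_lags:]

# Train the ANN only on pre-industrial values, then roll it forward to get a
# forecast of what the series would have done without the added trend.
split = 1880
X, y = lagged(observed[years < split])
ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X, y)

window = list(observed[split - 30:split])
forecast = []
for _ in range(2000 - split):
    nxt = ann.predict(np.array(window[-30:]).reshape(1, -1))[0]
    forecast.append(nxt)
    window.append(nxt)

# Observed minus "natural only" forecast: the estimated human contribution.
human = observed[split:] - np.array(forecast)
print(f"estimated industrial-era signal by year 1999: {human[-1]:.2f} (toy units)")
```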

After explaining how they conducted their analysis, Marohasy then shares their conclusions:

Considering the results from all six geographic regions as reported in our new paper, output from the ANN models suggests that warming from natural climate cycles over the twentieth century would be in the order of 0.6 to 1 °C, depending on the geographical location. The difference between output from the ANN models and the proxy records is at most 0.2 °C; this was the situation for the studies from Switzerland and New Zealand. So, we suggest that, at most, the contribution of industrialisation to warming over the twentieth century would be in the order of 0.2 °C.
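
The arithmetic behind that attribution is a straightforward subtraction. Restated with stand-in numbers (chosen to be consistent with the quoted ranges, not taken from the paper):

```python
# Attribution by subtraction: proxy-record warming minus the ANN's
# "natural cycles only" warming. The regional values are illustrative
# stand-ins, not figures from the paper.
regions = {
    "example region A": {"proxy_warming_c": 1.0, "ann_natural_c": 0.8},
    "example region B": {"proxy_warming_c": 0.7, "ann_natural_c": 0.6},
}
for name, t in regions.items():
    human_c = t["proxy_warming_c"] - t["ann_natural_c"]
    print(f"{name}: implied industrial contribution = {human_c:.1f} °C")
```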

[However:] The Intergovernmental Panel on Climate Change (IPCC) estimates warming of approximately 1 °C, but attributes this all to industrialisation….

In our new paper in GeoResJ, we not only use the latest techniques in big data to show that there would very likely have been significant warming to at least 1980 in the absence of industrialisation, we also calculate an Equilibrium Climate Sensitivity (ECS) of 0.6 °C. This is the temperature increase expected from a doubling of carbon dioxide concentrations in the atmosphere. It is an order of magnitude less than estimates from General Circulation Models, but in accordance with values generated from experimental spectroscopic studies and other approaches reported in the scientific literature.
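
By definition, ECS is the equilibrium warming expected from a doubling of atmospheric CO2, and under the standard logarithmic forcing relation the same number fixes the expected warming for any other concentration change. A minimal sketch of that arithmetic (the 280 ppm baseline and 410 ppm present-day figure are common round numbers, not values from the paper):

```python
import math

def equilibrium_warming(ecs_c, c_ppm, c0_ppm=280.0):
    """Equilibrium warming for a CO2 rise from c0_ppm to c_ppm, using the
    standard logarithmic relation: delta_T = ECS * log2(C / C0)."""
    return ecs_c * math.log2(c_ppm / c0_ppm)

ecs = 0.6  # the paper's reported ECS, in °C
print(f"warming at a full doubling (560 ppm): {equilibrium_warming(ecs, 560):.2f} °C")
print(f"warming at roughly present-day CO2 (410 ppm): {equilibrium_warming(ecs, 410):.2f} °C")
```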