
Computers Incapable of Modeling Climate: Billions Wasted To Perpetuate Deception

by DR. TIM BALL on JANUARY 3, 2012


Billions of taxpayers' dollars are wasted on politically motivated climate science, most of it on buying and running computers incapable of modeling global climate. They're programmed so that a CO2 increase causes a temperature increase, fulfilling the saying "Garbage In, Garbage Out" (GIGO). In the real world, temperature increases before CO2, but the programmers need a political result. Naturally the temperature forecasts are consistently wrong, but that doesn't matter. The modelers claim they're getting better and all they need are bigger, faster computers. Bigger machines won't and can't make any difference, but they continue to waste money.

Recently Cray produced the Gaea supercomputer for climate research at the National Oceanic and Atmospheric Administration (NOAA). More commonly spelt Gaia, she is the Greek Earth goddess of the religion of Environmentalism. They reinforced the image with a landscape panorama.

Cray Gaea supercomputer with landscape panorama (NOAA). Original photo: ORNL photos/Jay Nave

It puts a false face on the waste of money. It will produce meaningless results, like all the others, despite being one of the biggest and fastest machines. It has a capacity of 1.1 petaflops. FLOPS means Floating-Point Operations per Second, and peta is 10^15, so 1.1 petaflops is roughly 1.1 thousand trillion floating-point operations per second.
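To put those prefixes in perspective, here is a quick sanity check of the quoted figures (a simple Python sketch; the numbers are from the article, the arithmetic is mine):

```python
# Back-of-envelope check of the computing figures quoted above.
# "tera" is 10**12 and "peta" is 10**15 (a thousand trillion).
TERA = 10**12
PETA = 10**15

gaea_flops = 1.1 * PETA      # Gaea's quoted capacity
baseline_flops = 50 * TERA   # the "current generation" machines cited below

# Gaea is only about 22 times faster than a 50-teraflop machine.
print(f"{gaea_flops / baseline_flops:.0f}x")  # → 22x
```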

This sounds impressive, but is totally inadequate. At a 2008 climate computer conference in Reading, England, Jagadish Shukla reported that,

The current generation high-end computers for climate research have a capability of about 50 teraflops, which makes it possible to integrate a typical climate model with about 100 km horizontal resolution for 20 years in one day.

So at 50 trillion floating-point calculations per second, a model covers only 20 years of record per day of machine time. Worse, each run using identical input yields different results, so the modelers average several runs.

This confirms what Caspar Ammann told Steve McIntyre:

GCMs (General Circulation Models) took about 1 day of machine time to cover 25 years. On this basis, it is obviously impossible to model the Pliocene-Pleistocene transition (say the last 2 million years) using a GCM as this would take about 219 years of computer time.
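McIntyre's figure is easy to verify with simple arithmetic (a quick sketch using only the numbers quoted above):

```python
# At 25 model-years per day of machine time, simulating the last
# 2 million years would take an absurd amount of wall-clock time.
model_years = 2_000_000
model_years_per_day = 25

days_needed = model_years / model_years_per_day  # 80,000 days
years_needed = days_needed / 365                 # ≈ 219 years
print(round(years_needed))  # → 219
```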

This is with a grossly simplified model whose grid cells are so large that each one covers very different climate regions. Shukla challenges,

We must be able to run climate models at the same resolution as weather prediction models, which may have horizontal resolutions of 3-5 km within the next 5 years. This will require computers with peak capability of about 100 petaflops.
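Shukla's 100-petaflop figure can be roughly cross-checked. A common rule of thumb (my assumption here, not a figure from the article) is that refining the horizontal grid multiplies the cost by about the cube of the refinement factor: the number of grid points grows with the square, and numerical stability forces a proportionally shorter timestep.

```python
# Crude scaling estimate: cost grows roughly as the cube of the
# horizontal refinement factor (assumed rule of thumb, not from the text).
baseline_flops = 50e12   # 50 teraflops, the 2008 baseline quoted above
refinement = 100 / 5     # 100 km grid refined to 5 km → factor of 20

required_flops = baseline_flops * refinement**3  # 20**3 = 8000x the cost
print(required_flops / 1e15)  # in petaflops → 400.0
```

Even this crude estimate lands in the hundreds of petaflops at the coarser end of the 3-5 km target, the same order of magnitude as the 100 petaflops Shukla cites.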

It makes no difference; weather prediction models don't work either. Proponents argue that weather predictions are different from climate predictions. They're not, because climate is the average of weather. Wrong weather yields wrong climate.

Capacity and speed are only one reason computer models can't work. Models are built on data, but there is no surface data for most of the world. Using smaller grids doesn't increase the available data. Worse, models are three-dimensional and there is virtually no data above the surface. They also leave out most variables and mechanisms. It doesn't matter how big or fast the computer is; the data needed to build or test the models doesn't exist.

I watched climate science hijacked by computer modelers in the 1980s. As keynote speakers at climate conferences, they dominated proceedings with the new technology. They fought among themselves for dominance, not on the basis of results, because predictions were consistently wrong, but over who had the biggest and fastest computer. The real challenge was to get funding.

Maurice Strong set up the Intergovernmental Panel on Climate Change (IPCC) through the World Meteorological Organization (WMO), which provided access to national funding for expensive machinery. It also meant the appointment of climate modelers, of whom Donna Laframboise wrote:

…often called scientists, their work has little in common with traditional science.


Nor has the IPCC subjected climate models to rigorous evaluation by neutral, disinterested parties. Instead it recruits the same people who work with these models on a daily basis to write the section of the Climate Bible that passes judgment on them. This is like asking parents to rate their own children’s attractiveness.

The relationship between one country’s climate modelers and the IPCC illustrates this point. George Boer is considered the architect of Canada’s climate modeling efforts. As an employee of Environment Canada, he has spent much of his career attempting to convince the powers-that-be that climate models are a legitimate use of public money.

They are not. Canada's Auditor General identifies $6.36 billion in "climate change funding announcements between 1997 and 2005," but at what price? A December 13, 2011 story provides an answer. Environment and Sustainable Development Commissioner Scott Vaughan reports,

Environment Canada has failed to implement a strategic plan to improve its internal scientific research in areas ranging from managing air and water pollution to toxic chemicals.

Billions are spent on useless computers and climate change while not dealing with real problems. They’re not alone. It’s happening in national weather agencies around the world.

Computers don’t have the capacity, and we don’t have the data or understanding, to create climate models that can recreate global climate or make accurate forecasts. Climate change provided a vehicle for computer modelers to play their games. If you don’t believe me, read what modeler Shukla wrote:

It is imperative that the summit recommends a realistic roadmap to enable and accelerate progress in climate modelling and prediction and to provide substantial and sustained support for enhanced workforce and computing resources. This is our moment! The problem of climate change has riveted the attention of the peoples of the Earth.

Shukla’s rationale is based on a false assumption: “Now we have before us, thanks to IPCC, a third discovery: humans are affecting the Earth’s climate.”

This 2008 meeting concluded that "the scientists had agreed that massive investment in computer and research resources was critical for revolutionising modelling capabilities so that predictions could capture the detail required to inform policy." It was too late. The IPCC had already informed policy in its 2007 Report that models proved with 90+ percent certainty that human CO2 was causing climate change. But their models are the only place where CO2 precedes a temperature increase. So GIGO provides the greatest deception in scientific history. Stop deceiving and stop wasting our money.


