Now here's my point. Here we have three college professors pursuing one of the most boring and unglamorous subjects I can imagine. But Steve, for whatever reason, this is their passion. And who knows? Maybe their study, along with thousands of other little bits and pieces, will ultimately feed into the larger picture of climate change which scientists are steadily building. Here's how:
UCAR administers NCAR, the National Center for Atmospheric Research, which uses some of the most powerful supercomputers in existence for climate modeling. The history of NCAR and these amazing machines is fascinating:
"On July 11, 1977, the CRAY-1A, serial number 3, was delivered to NCAR. The system cost was $8.86 million ($7.9 million plus $1 million for the disks).
The supercomputer weighed 5-1/2 tons, arrived in two refrigerated electronic vans, and needed more than 30 construction workers, engineers, and helpers to move it into the computer room. NCAR accepted the CRAY-1A in December. It was the first CRAY-1A to go into production, and upon its acceptance, Cray Research became a profitable company."
Since the CRAY-1A, NCAR has gone through a succession of supercomputers. The latest model is the IBM p575:
"Named "bluefire," the new supercomputer has a peak speed of more than 76 teraflops (76 trillion floating-point operations per second). When fully operational, it is expected to rank among the 25 most powerful supercomputers in the world and will more than triple NCAR's sustained computing capacity."
And they're going to need every one of those teraflops. NCAR's overall climate model is the CCSM, the Community Climate System Model.
"CCSM is unique among powerful models. Funded by the National Science Foundation and the Department of Energy, it belongs to the entire community of climate scientists, rather than to a single institution. The hundreds of specialists at various institutions in the United States and overseas who collaborate on improvements to CCSM make the model’s underlying computer code freely available on the Web. As a result, scientists throughout the world can use CCSM for their climate experiments.
...CCSM-2 recreates climate by dividing the world’s water and land surface into rectangular grid points that extend upward into the atmosphere in 26 vertical layers. Its resolution varies from 2.8 degrees longitude by 2.8 degrees latitude to an even finer resolution, for oceans and sea ice, of 1 degree by 1 degree–meaning that each cell of the grid at peak resolution corresponds to approximately 10,000 square kilometers (about 3,900 square miles).
...For every grid point, the model uses equations to solve such physical processes as the formation of clouds and the movement of heat and moisture. Scientists also input chemical components such as ozone and carbon dioxide that can affect cloud formation or trap solar heat.
...Such complex calculations demand an extraordinary amount of computer power. To recreate a single day of the world’s climate, the model must perform 700 billion calculations. Although this means producing a picture of the atmosphere takes a long time, the payoff is that CCSM-2 can simulate Earth’s climate patterns in considerable detail." (my emphasis)
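The quoted figures can be sanity-checked with a little arithmetic. The sketch below is my own back-of-envelope calculation, not anything from the CCSM code: it estimates the area of a 1-degree-by-1-degree grid cell on a sphere (to compare against the quoted "approximately 10,000 square kilometers") and divides the quoted 700 billion calculations per simulated day by bluefire's quoted 76-teraflop peak. The Earth radius constant is standard geography; everything else comes from the excerpts above.

```python
import math

# Back-of-envelope checks of the quoted figures. All inputs are either
# standard geography or numbers quoted in the text above; this is an
# illustration, not actual CCSM code.

EARTH_RADIUS_KM = 6371.0  # mean radius of the Earth

def grid_cell_area_km2(lat_deg, dlat_deg=1.0, dlon_deg=1.0):
    """Approximate area of a dlat x dlon grid cell centered at lat_deg."""
    lat = math.radians(lat_deg)
    dlat = math.radians(dlat_deg)
    dlon = math.radians(dlon_deg)
    # Area of a spherical rectangle:
    # R^2 * dlon * (sin(lat + dlat/2) - sin(lat - dlat/2))
    return EARTH_RADIUS_KM ** 2 * dlon * (
        math.sin(lat + dlat / 2) - math.sin(lat - dlat / 2)
    )

# A 1 x 1 degree cell at the equator comes out around 12,300 km^2,
# the same order of magnitude as the quoted "approximately 10,000 km^2"
# (cells shrink toward the poles, which lowers the average).
equator_cell = grid_cell_area_km2(0.0)

# 700 billion calculations per simulated day, at 76 teraflops peak:
calcs_per_day = 700e9
peak_flops = 76e12
seconds_at_peak = calcs_per_day / peak_flops  # under a hundredth of a second
# Real climate code sustains only a small fraction of peak, and a run
# covers tens of thousands of simulated days, which is why producing a
# picture of the atmosphere still "takes a long time."
```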
Now how do you know if CCSM-2 is an accurate representation of objective reality?
"One way to check a model is to see whether it can recreate known climate patterns. When scientists tested CCSM-2, they aimed to reproduce Earth’s climate from 1870 to 2000. They also recreated the irregular cycle of the El Niño phenomenon in the Pacific Ocean and the ebb and flow of sea ice in the polar regions. In each instance, the model produced simulations that closely resembled known climate data. In the case of sea ice, for example, CCSM-2 matched satellite observations of ice pack movements over the cycle of seasons—a major achievement because of the many forces that drive the formation of sea ice, including temperatures, ocean currents, and precipitation.
Another way to check a model is to examine whether it can simulate Earth’s climate over centuries without drifting from actual world conditions. To conduct this test, scientists ran a 1,000-year simulation with hypothetical conditions based on the current-day atmosphere remaining unchanged. The results: CCSM-2 produced realistic climate patterns without requiring scientists to correct for any drift. This is known as "flux-free" modeling, and it is unique to CCSM-2.
Now that the model has passed its tests with flying colors, scientists are using it to explore major climate issues. If carbon dioxide levels in the atmosphere continue to increase, for example, should certain farmers construct more irrigation systems or should coastal residents brace for more storms? Another line of research will explore whether past climate patterns, such as the Little Ice Age that cooled temperatures in the 17th, 18th, and early 19th centuries, are likely to reoccur in some form."
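The drift test described in the excerpt has a simple logic worth making concrete: run the model for centuries with fixed present-day conditions, then check that the long-term trend of a quantity like global-mean temperature stays near zero. Here is a toy illustration of that check, assuming a made-up 1,000-year temperature series standing in for model output; none of this is CCSM code, and the numbers are invented for the sketch.

```python
import random
import statistics

# Toy illustration of the "no drift" test described above: a stable
# control run should show essentially zero long-term trend. The series
# below is a hypothetical stand-in for model output (constant climate
# plus year-to-year noise); a drifting model would add a slow trend.
random.seed(42)
years = 1000
stable_run = [14.0 + random.gauss(0, 0.1) for _ in range(years)]

def linear_trend(series):
    """Ordinary least-squares slope per time step."""
    n = len(series)
    x_mean = (n - 1) / 2
    y_mean = statistics.fmean(series)
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(series))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

# Degrees per century; for a model that needs no flux correction,
# this should be indistinguishable from zero.
drift_per_century = linear_trend(stable_run) * 100
```

The real test is of course far richer (many fields, full ocean-atmosphere coupling), but the pass/fail criterion is the same idea: no spurious long-term trend when nothing in the forcing changes.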
Steve, I followed the link you sent me. In his post, IowaHawk breaks out a spreadsheet, then goes shopping around for variables to plug in. The whole exercise (I think) is meant to demonstrate how climate scientists might be fudging the data used to construct climate histories - and ultimately, climate models. Now I mean no disrespect to IowaHawk, but it seems to me laypersons, most with no training in climate science, can select from the massive amount of freely available climate research and use simple tools to create the impression that some of it is suspect.
But the truth is, a climate model like the CCSM, developed on the state-of-the-art supercomputers at NCAR, involves thousands, if not millions, of input variables as diverse as temperature and ocean chemistry. And these models work.
Once again, if you have time, go here to see how the CCSM figured in the IPCC 4th assessment report. An excerpt:
"As one of the world's leading climate modeling and research centers, NCAR is a strong supporter of the IPCC scientific assessment process. NCAR scientists have served as lead and contributing authors in each of the four full IPCC assessment reports (1990, 1995, 2001, and 2007) as well as a number of special reports and technical papers that have focused on more specific issues. NCAR climate modeling and process study research has contributed to the peer-reviewed scientific literature that forms the basis of the IPCC's work."
Steve, the University Corporation for Atmospheric Research and the National Center for Atmospheric Research are American institutions which, as far as I can see, have no political agenda of any kind. The research and research tools furnished by UCAR and NCAR form much of the backbone of the IPCC assessment reports.
Now I don't think it is wrong for private citizens to ask questions about the stability of the science behind an assessment which will have important consequences for just about everyone on the planet. But to insinuate that a few largely unproven flaws negate the whole process is a little presumptuous, don't you think?