CONTACT: Stanford University News Service (650) 723-2558
Technological push and pull: The history of the global climate change concept
STANFORD -- The idea that human activities could rapidly change the earth's climate is more than a century old, but it is relatively new as an international environmental and political issue, historians Paul Edwards and Pamela Mack said in a Feb. 10 symposium on global climate change.
In fact, the technology that has made global climate science possible was often pushed forward not by scientific curiosity or environmental concerns but by Cold War politics and internecine competition between U.S. government agencies. In at least one case, scientists had their doubts that a new technology would provide much information. That technology - the weather satellite - is now a staple of short-term weather prediction and television newscasts.
This jostling of technology, politics and science is often the blend that leads eventually to new knowledge, Mack and Edwards said during the symposium, “Global Climate Change: The History and Politics of Global Warming.”
Edwards, a lecturer in Stanford's Program in Science, Technology and Society, is an expert on the history of computers and their impact on politics and culture. He currently is working on a study of the role of computers in the science and politics of climate change. Mack, associate professor in the history of technology at Clemson University, is an expert on space history and a member of the Earth Studies subcommittee of the National Academy of Sciences' Space Studies Board. She is researching the history of experimental weather satellites.
Edwards said that French physicist Joseph Fourier was the first to understand the greenhouse effect. Fourier suggested in 1824 that the earth stays warm at night because its atmosphere traps the sun's heat in much the same way that a greenhouse holds in warmed air.
In 1896, Swedish physical chemist Svante Arrhenius predicted that if levels of carbon dioxide in the atmosphere doubled, the average temperature of the earth would rise between 1.5 and 4 degrees Celsius - close to the prediction shared by most climatologists today.
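The doubling figure reflects a relationship Arrhenius himself noted: warming grows roughly logarithmically with concentration, so each doubling of carbon dioxide adds about the same temperature increment. As a sketch, writing $S$ for the warming per doubling (what climatologists now call climate sensitivity) and $C_0$ for a reference concentration, the rule reads:

```latex
\[
\Delta T \;\approx\; S \,\log_2\!\frac{C}{C_0}
\]
```

Setting $C = 2C_0$ gives $\Delta T \approx S$, the per-doubling warming quoted above; a quadrupling would add roughly twice that.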
At the time, no one felt much sense of alarm. World population was one-third the modern level. Internal combustion engines were rare. No one expected that a doubling of atmospheric carbon dioxide would ever occur. Now, most atmospheric chemists predict that this is inevitable over the next two centuries, even if world use of coal, gasoline and other fossil fuels can be held at 1990 levels.
When scientists began a serious study of climate change using the first computers, it was with the idea that climate control might be a Cold War weapon, a way to starve the Russians by destroying their crops. The complexity - and the global nature - of climate change was not yet understood.
In 1946, computer pioneer John von Neumann chose weather prediction as the second big task (after calculations for the hydrogen bomb) to be tackled by ENIAC, the first American digital computer. Since then, some of the world's largest supercomputers have been used to model the atmosphere. The chance to predict the weather and to understand (and perhaps manipulate) climate was one of the major motivators of early supercomputer development, Edwards said. Most other uses were military-related, with early computers used to analyze intelligence data and to develop command and control systems.
The first computer weather forecast, run on ENIAC in 1950, wasn't very accurate, Edwards said, but it was good enough to justify continuing the research. Since then, computers have been used to develop forecasts that can fairly accurately tell the weather several days in advance, and with a modest degree of accuracy several weeks in advance.
While weather prediction models attempt to forecast snow, sun, storms and temperature - what happens on a daily basis - climate models use measurements of weather plus physical principles to calculate what could happen over seasons, years, decades and centuries. “Climate is the average weather over very long periods,” Edwards said.
By 1965, the first general circulation model of the entire earth's atmosphere had been published.
Modern general climate models use vast amounts of data about the behavior of the atmosphere, ocean currents and other earth systems. The models crunch numbers to predict how climate will change in response to a change in a variable like atmospheric carbon dioxide. The task, even with supercomputers, is massive: A general circulation model takes 1,200 hours to calculate the potential climate for a 100-year period.
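A full general circulation model is far beyond a few lines of code, but the basic pattern - step a physical state forward in time under an energy budget, then compare runs with different greenhouse assumptions - can be sketched with a zero-dimensional energy-balance toy. All constants below are illustrative assumptions, not values from any real model:

```python
# A minimal zero-dimensional energy-balance sketch (not a general
# circulation model): it steps a single global-mean temperature
# forward in time until absorbed sunlight and emitted heat balance.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.30     # planetary reflectivity
EPSILON = 0.61    # effective emissivity; crudely encodes the greenhouse effect
HEAT_CAP = 4.0e8  # assumed effective heat capacity, J m^-2 K^-1

def equilibrium_temp(emissivity, years=500, dt=86400.0):
    """March global-mean temperature (kelvin) toward energy balance."""
    temp = 273.0                           # initial guess
    absorbed = S0 * (1.0 - ALBEDO) / 4.0   # sphere-averaged absorbed sunlight
    steps = int(years * 365 * 86400.0 / dt)
    for _ in range(steps):
        emitted = emissivity * SIGMA * temp ** 4
        temp += dt * (absorbed - emitted) / HEAT_CAP
    return temp

t_now = equilibrium_temp(EPSILON)
# A stronger greenhouse effect means a lower effective emissivity:
t_more_co2 = equilibrium_temp(EPSILON - 0.02)
print(round(t_now, 1), round(t_more_co2, 1))
```

Lowering the effective emissivity stands in for added carbon dioxide; the second run equilibrates a couple of degrees warmer. That comparison of runs is the qualitative experiment the real models perform, with vastly more physics and at vastly greater computational cost.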
All such models are only simulations of what happens in nature. Increasingly, the data that ground them in reality are collected from the atmosphere, the ocean and the earth's surface by satellites. However, scientists at first had little interest in using satellites to observe weather and climate. Like the first computers, satellite sensing was a military innovation.
“I call that technology push,” said Mack. “Some person pushes a new technology that nobody knew they needed.”
The push in this case came from competition within the Defense Department. In 1958, when the Air Force was assigned to take over satellite spying, the Army promptly renamed the satellites it had been developing independently. “The Army said they were weather satellites,” said Mack.
When the National Aeronautics and Space Administration was created later the same year, taking over the Army's satellites, it was effectively in the weather-sensing business.
Meteorological scientists had almost nothing to do with the plans for the first weather satellites, Mack said. In fact, they expected that an eye in the sky would do very little except help with storm warnings. When one meteorologist saw how much he could learn from the first weather pictures from space, Mack said, he declared, “We've gone from rags to riches overnight.”
There followed years of tug-of-war between the U.S. Weather Bureau and NASA over what kinds of satellites should be built and how to develop sensors that could collect accurate data on the atmosphere at various levels and on ocean surface temperatures. The difficulties continue to the present day, as NASA budget cutbacks and delayed shuttle launches push back the development of a new generation of earth-sensing satellites.
Once counts of temperature, water vapor and ozone content could be made, technology pushed climate science in another way, Mack said. “The use of the new data revealed significant problems with the approach used by meteorologists interested in numerical forecasting. Scientists had underestimated the complexity of atmospheric patterns, and the weather forecasting models had mathematical weaknesses.

“In addition, the large-scale view of weather patterns provided by satellite images revealed significant phenomena that scientists had left out of their models entirely.”
Over time, scientists improved the accuracy of both the models and the data, but the question of how to account for every possible influence on climate remains open today, Edwards said. Increasingly, though, computer models have shown that human activity could influence climate, and computer graphics make these changes visually understandable.
The public's perception of climate change has shifted, from the “can do” idea of climate control in the 1950s to a more recent concern that climate change could cause disasters.
In 1957, when scientists around the world teamed up to study the physics of the earth for the International Geophysical Year, oceanographer Roger Revelle was concerned enough about carbon dioxide buildup to suggest a permanent monitoring station high on Mauna Loa, the volcano on the island of Hawaii. Edwards said that this station is still the source of what is probably the single undisputed fact in climate research: that the atmospheric concentration of carbon dioxide has risen steadily since the station began measurements, and other evidence shows it has been rising since the beginning of the Industrial Revolution.
Revelle said that the rise in carbon dioxide, caused mostly by increased burning of fossil fuels, was “a great geophysical experiment being conducted by the human race.” However, Edwards said, in the 1950s it was assumed that if global warming got troublesome, it could be solved with some technological “fix.” Even Revelle recommended nothing but more monitoring.
When environmental concerns emerged in the 1970s, the government response was to fund a Climate Impact Assessment Program that increased support for climate modeling. In the early 1980s, scientists including astronomer Carl Sagan and climatologist Stephen Schneider proposed the idea that smoke from the fires of a nuclear war would lead to years of “nuclear winter” - or at least to a less catastrophic but still serious “nuclear autumn.”
In the 1980s, scientists also recorded the first proof that the earth's protective ozone layer is thinning.
“Note that these two metaphors - nuclear winter and the ozone hole - are significant in the debate, because they have overtones of danger and emergency, unlike the term 'greenhouse,' which has a kind of benign ring,” Edwards said.
Though debate still rages over the issue, public awareness has shifted to a concern that global climate might go drastically out of control. When the largest gathering of world leaders in history met in Rio de Janeiro in 1992 for the United Nations Conference on Environment and Development, the chief diplomatic document to emerge from the conference was an international treaty establishing a framework to prevent climate change. This month in Berlin, the signers of that treaty are meeting to hammer out the details of how nations will cut carbon dioxide emissions to keep the global greenhouse from getting hotter.