Behind the “Contributors” curtain at RedState, we often toss things we’ve found over to others who may be more qualified to use them. For example, your humble correspondent found that Pravda story about Hillary(TM) meaning four years of war – and tossed it over to Moe since he is the master of snark.
Yesterday, Josh (Painter) found a new piece on “global warming” and tossed it over to me.
I generally don’t see much value in “channeling” an article unless I can either make worthwhile added comments of my own, or add something to it.
In this case, I’ll link to the article, but it provides a great chance to discuss a couple of critical scientific aspects of “global warming” (or whatever it’s being called this week) that simply get far too little attention.
More below the fold….
The entire article comes from Senator Inhofe’s continual battle to bring some sanity to Capitol Hill on this issue; the whole article is here.
But from here, your humble correspondent wants to note two critical-but-badly-neglected things.
John Christy is a pretty solid non-believer in the whole AGW thing; he’s known for that. But this devastating quote from him is worth further discussion:
Climatologist Dr. John Christy of the University of Alabama in Huntsville questioned the study. “One must be very cautious with such results because they have no real way to be validated,” Christy told the AP. “In other words, we will never know what the temperature was over the very large missing areas that this technique attempts to fill in so that it can be tested back through time,” Christy added.
This is something I’ve continually tried to point out, since there’s something that gets lost in the shuffling of attempts to study “temperature” over large ranges of space (and time). “Temperature” can only exist as a definable quantity at a particular location in space at a particular point in time. That’s just basic physics – and once you go beyond that, you are entering a realm of murk that is much less enlightening and much less exact.
I’ve occasionally used the notion of pondering how one would define the “average” temperature for, say, the state of Virginia – right now. How many locations would you have to use? How dense would the “grid” have to be in space? How do you scale the densities for places that have more variability in a smaller amount of space? And given the climatic extent of Virginia…. what would some “average” temperature even mean? And could we assign data-like-exactitude to that number?
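To make that concrete, here’s a toy sketch in Python – the station names are real Virginia places, but the readings and the area weights are entirely made up for illustration. The point is just that the “average temperature of Virginia” depends on how you sample and how you weight:

```python
# Hypothetical station readings (degrees F) -- invented numbers, for illustration only.
stations = {
    "Norfolk (coast)":      48.0,
    "Richmond (piedmont)":  44.0,
    "Roanoke (valley)":     40.0,
    "Wise (mountains)":     33.0,
}

# Simple unweighted mean of whatever stations happen to exist:
unweighted = sum(stations.values()) / len(stations)

# Area-weighted mean, with made-up weights for each region's share of the state:
weights = {
    "Norfolk (coast)":     0.20,
    "Richmond (piedmont)": 0.40,
    "Roanoke (valley)":    0.25,
    "Wise (mountains)":    0.15,
}
weighted = sum(stations[k] * weights[k] for k in stations)

print(f"unweighted 'average':    {unweighted:.2f} F")   # 41.25 F
print(f"area-weighted 'average': {weighted:.2f} F")     # 42.15 F
# Two defensible procedures, two different numbers -- and neither one is
# "the" temperature of Virginia.
```

Change the station list or the weights and the “average” moves again – which is exactly the problem with treating such a number as hard data.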
I guess another analogy for those of you in multi-person households would be the following. You can more-or-less readily (within the constraints of the accuracy of measuring tools and other uncertainties) measure the height or weight of every member of the household. That gives you real data on each specific person. You could then compute an average height or an average weight for your household – and produce a number. But exactly what that number means or what it tells us is much less clear: raw data (which has clear meaning) has been “processed” into something that produces a number, but the actual informational “content” of that number is much lower.
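A quick sketch makes the information loss visible – the weights below are invented for illustration, chosen so that two very different households produce the identical average:

```python
from statistics import mean, stdev

# Two hypothetical households with the same average weight (kg) -- made-up data.
household_a = [70, 72, 74, 76]     # four similar adults
household_b = [20, 25, 110, 137]   # two small children, two heavy adults

print(mean(household_a), mean(household_b))    # both average 73 kg
print(stdev(household_a), stdev(household_b))  # wildly different spreads
# The average alone cannot distinguish these two very different households;
# the "processing" step threw that information away.
```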
One core problem always faced in science and technology is actually partially illuminated by that analogy. It’s always important to carefully evaluate what you actually can “know” and what you are not really able to “know.” The mere ability to compute numbers does not mean that those numbers have solid meaning.
A further problem is that thinking unfortunately often flows in this fashion: “We can’t really know that, but it would be very nice if we could; ergo, we’ll just pretend that we can, and act accordingly.” This kind of thinking also explains a lot about the present financial crises, but that’s another story.
But it leads into….
The Inhofe entry cited above links to the main NASA data graph that is being cited for indicating “warming” in Antarctica. Here it is:
Because of my background and experience, I always cringe when I see a graph like that being used to draw deterministic conclusions – or to provide the basis for a claim of a secular shift in the underlying system under study. If you go to the NASA page that contains this graph (here), there’s a link at the bottom where you can download a text file of the “data” used in the chart.
That’s a bit interesting for two reasons. The first is that the “annual average” is an average of averages. Each “annual average” is computed by averaging together the monthly averages – or perhaps from quarterly averages computed from the monthly averages. How the monthly averages are computed isn’t stated, but that’s probably done by using the daily averages. Historically, daily averages are computed by averaging the measured high temperature and the measured low temperature for each particular midnight-delineated 24-hour day. (At the South Pole, some 24-hour boundary would have to be chosen, since “midnight” doesn’t exist.) The National Weather Service has been compiling hourly temperature data at domestic stations for a bit more than a decade, but I don’t think that those are used in computing “daily averages.”
This is a somewhat inexact approach to trying to reach an exact result. Note the multiple layers of “averages” (three or four) that are used; this is not exactly a precise method of approaching the problem, even from a purely-numerical point of view. This is on top of the obvious “experimental” uncertainty introduced by (apparently) defining a “daily average” merely from the 24-hour measured high temperature and measured low temperature – that rather loose definition will introduce uncertainties that will propagate upward into the larger calculations.
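Here’s a minimal demonstration of why stacked averages are numerically slippery – a toy “year” of three months with invented daily temperatures, where averaging the monthly averages gives a different answer than averaging the raw daily data:

```python
from statistics import mean

# Hypothetical daily average temperatures (degrees C) for a toy three-month
# "year" -- the numbers are invented purely to illustrate the arithmetic.
jan = [-5.0] * 31     # 31 cold days
feb = [0.0] * 28      # 28 mild days
mar = [10.0] * 31     # 31 warm days

all_days = jan + feb + mar

direct = mean(all_days)                      # average of the raw daily values
avg_of_avgs = mean([mean(jan), mean(feb), mean(mar)])  # average of averages

print(f"direct average of all days:  {direct:.4f}")       # ~1.7222
print(f"average of monthly averages: {avg_of_avgs:.4f}")  # ~1.6667
# The two differ because months have unequal lengths -- every extra layer
# of averaging quietly re-weights the underlying data.
```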
The resulting computed “annual averages” are reported in degrees-C and to two figures past the decimal place; that level of reporting implies (by basic error analysis) that the level of knowledge is within +/- 0.01 degrees-C – something that is basically impossible (either metrologically or numerically).
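A back-of-envelope error-propagation sketch shows the problem – the ±0.5 °C per-reading uncertainty below is an assumption, and the independence assumptions are deliberately generous to the data:

```python
import math

# Assumed uncertainty of a single thermometer reading (degrees C):
sigma_reading = 0.5

# The daily "average" is (high + low) / 2; treating the two readings as
# independent, its uncertainty propagates as sqrt(2) * sigma / 2:
sigma_daily = math.sqrt(2 * sigma_reading**2) / 2      # ~0.354 C

# Uncertainty of an annual average of 365 independent daily averages:
sigma_annual = sigma_daily / math.sqrt(365)            # ~0.019 C

print(f"daily-average uncertainty:  +/- {sigma_daily:.3f} C")
print(f"annual-average uncertainty: +/- {sigma_annual:.3f} C")
# Even under these generous assumptions the annual figure is not known to
# +/- 0.01 C -- and real measurement errors are correlated, not independent,
# so the true uncertainty is larger still.
```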
The other interesting note about the downloadable data set is that very large chunks of it – even in rather recent years – are missing. Of the several lines visible on the earlier graph, the downloaded data appears to correspond to the blue line – which has significant gaps matching the gaps in the data file.
Still, attempts to construct graphs like this for the purpose of drawing conclusions bother me – as stated earlier, because of my background and experience. Here’s why.
With all the gaps in that data set, I went out and managed to find a better set of Antarctic temperature data for the same years – 1957 to 2008. Here is my plot of that data:
This is a very interesting graph, since it appears to show the 1970s cooling trend, the sharp rise in temperatures that peaked with the 1998 “El Nino” event, and then the modest cooling trend that’s evolved since then. In fact….
Oh, sorry, wait a minute. That graph got clipped a little bit during editing. Here’s the full display that includes the y-axis:
Strange temperature axis? Well, that’s the fun of this stuff. I actually produced this “temperature data” by rolling a pair of dice 52 times in sequence, compiling the roll-results in order, and numbering them (rather than 1 – 52) 1957 – 2008. By fortuity, it simply happened to produce the apparent “trends” that roughly correspond to the kinds of “trend” data (such as the NASA chart) produced earlier.
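For anyone who wants to repeat the experiment without hunting down a pair of dice, here’s a sketch of the same exercise in code – the seed is arbitrary, chosen only so the run is repeatable:

```python
import random

# The dice experiment described above: 52 rolls of a pair of dice,
# labeled as "annual" values for 1957 through 2008.
random.seed(1957)   # arbitrary seed, just for repeatability

years = range(1957, 2009)    # 52 "years"
rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in years]

# Crude text plot: pure dice noise still shows runs that look like "trends".
for year, value in zip(years, rolls):
    print(f"{year}  {'*' * value}")
```

Run it a few times with different seeds and you’ll routinely see multi-“year” runs that, with a suggestive y-axis label, would pass for climatic “trends.”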
Einstein, annoyed at the non-determinism inherent in quantum mechanics, harrumphed that “God does not play at dice with the universe.” Einstein turns out to have been wrong about quantum mechanics of course.
But we can see that when we look at climate, God playing at dice provides a pretty good explanation of what’s going on.
“Climate” and “weather” behave statistically. That’s something you learn just by living – but if you bother to look at the problem properly, that reality asserts itself pretty dramatically….