Update 6 Sep 2014 Yet another another another study has claimed “statistical significance”, this one by Philip Kokic, Steven Crimp, and Mark Howden: “A probabilistic analysis of human influence on recent record global mean temperature changes.” I haven’t had a chance to look at that study in any detail yet, but I imagine many of the remarks below hold. If I find the new paper needs a new post, I’ll do one in the future.
Update 12 Apr 2014 This originally ran 28 May 2013, but given Shaun Lovejoy’s latest effort in Climate Dynamics to square the statistical circle, it’s necessary to reissue. See the Lovejoy update at the bottom.
My Personal Consensus
I, a professional statistician, PhD certified from one of the top universities in the land—nay, the world—a man of over twenty years hard-bitten numerical experience, a published researcher in the very Journal of Climate, have determined that global temperatures have significantly declined.
You read that right: what has gone up has come back down, and significantly. Statistically significantly. Temperatures, he said again, have plunged significantly.
This is so important a scientific result that it bears repeating. And there is another reason for a recapitulation: I don’t believe that you believe me. There may be a few of you who are suspicious that old Briggs, well known for his internet hilarity, might be trying to pull a fast one. I neither josh nor jest.
Anyway, it is true. Global warming, by dint of a wee p-value, has been refuted.
Which is to say that according to my real, genuine, mathematically legitimate, scientifically fabricated scientific statistical scientific model (calculated on a computer), I was able to produce statistical significance and reject the “null” hypothesis of no cooling. Therefore there has been cooling. And since cooling is the opposite of warming, there is no more global warming. Quod ipso facto. Or something.
I was led to this result because many (many) readers alerted me to a fellow named Lord Donoughue, who asked Parliament a question which produced the answer that “the temperature rise since about 1880 is statistically significant.” Is this right?
Not according to my model. So whose model, the Met Office’s or mine, is right?
Well, that’s the beauty of statistics. Neither model has to be right; plus, anybody can create their own.
Here’s the recipe. Grab, off the shelf or concoct your own with sweat and integrals, a model. The more scientific sounding the better. Walk into a party with “Autoregressive heteroscedastic GARCH process” or “Coupled GCM with Kalman-filtering cloud parameterization” on your lips and you simply cannot fail to be a hit.
Don’t despair of finding a model. They are as dollars to a bureaucracy: they are infinite! Thing is, all models, as long as they are not fully deterministic, have some uncertainty in them. This uncertainty is parameterized by a lot of knobs and switches which can be thrown into any number of configurations.
Statistical “significance” works by tossing some data at your model and hoping that, via one of a multitude of mathematical incantations, one of these many parameters turns out to be associated with a wee p-value (defined as less than the magic number; only adepts know this figure, so if you don’t already have it, I cannot tell you).
If you don’t get a wee p-value the first time, you keep the model but change the incantation. There are several, which practically guarantees you’ll find joy. Statisticians call this process “hypothesis testing.” But you can think of it as providing “proof” that your hypothesis is true.
Funny thing about statistics is that you can always find a model with just the right set of parameters so that one, in the presence of data, is associated with a wee p-value. This is why, for example, one scientist will report that chocolate is good for your ticker, while another will claim chocolate is “linked to” heart disease. Both argue from a different statistical model.
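The “keep the model, change the incantation” point can be demonstrated with a toy simulation (this is my illustration, not anything from the Met Office or any cited paper): fit a trend to many series of pure noise, and “significance” turns up anyway.

```python
import math
import random

def p_value_slope(y):
    """Two-sided p-value for the null 'slope = 0' in a simple OLS line fit.

    Uses the normal approximation to the t distribution, which is fine
    for the series lengths used below.
    """
    n = len(y)
    x = list(range(n))
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    resid = [yi - my - slope * (xi - mx) for xi, yi in zip(x, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    t = abs(slope / se)
    return 2 * (1 - 0.5 * (1 + math.erf(t / math.sqrt(2))))

random.seed(1)
# 100 "studies", each fitting a trend to pure noise: no real signal anywhere.
p_values = [p_value_slope([random.gauss(0, 1) for _ in range(50)])
            for _ in range(100)]
wee = sum(p < 0.05 for p in p_values)
print(f"{wee} of 100 noise-only series came out 'statistically significant'")
```

With enough series (or enough incantations applied to one series), wee p-values appear by construction, signal or no signal.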
Same thing holds in global warming. One model will “confirm” there has been statistically significant cooling, another will say statistically significant warming.
The global temperature (as measured operationally) has certainly changed since the 1800s. Something, or some things, caused it to change. It is impossible—as in impossible—that the cause was “natural random variation”, “chance” or anything like that. Chance and randomness are not causes; they are not real, not physical entities, and therefore cannot be causes.
They are instead measures of our ignorance. All physical and probability models (or their combinations) are encapsulations of our knowledge; they quantify the certainty and uncertainty that temperature takes the values it does. Models are uncertainty engines.
This includes physical and statistical models, GCMs and GARCHes. The only difference between the two is that physical models tie our uncertainty about temperature to knowledge of other physical processes, while statistical models wed uncertainty to mysterious math and parameterizations.
A dirty, actually filthy, open secret in statistics is that for any set of data you can always find a model which fits that data arbitrarily closely. Finding “statistical significance” is as difficult as the San Francisco City Council discovering something new to ban. The only evidence weaker than hypothesis tests are raw assertions and fallacies of appeal to authority.
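The “fits arbitrarily closely” claim is not rhetoric: a polynomial of degree n−1 passes exactly through any n data points. A minimal sketch (the “anomaly” numbers are invented for illustration):

```python
def lagrange_fit(xs, ys):
    """Return the degree-(n-1) polynomial passing exactly through n points,
    built by Lagrange interpolation."""
    def p(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return p

# Any data at all -- here, a made-up "temperature anomaly" series.
xs = list(range(8))
ys = [0.1, -0.3, 0.2, 0.0, 0.5, -0.1, 0.4, 0.2]
model = lagrange_fit(xs, ys)

max_err = max(abs(model(x) - y) for x, y in zip(xs, ys))
print(f"maximum in-sample error: {max_err:.2e}")  # zero at every data point
# But the perfect in-sample fit says nothing about skill: between and
# beyond the data, the polynomial is free to swing wildly.
print(f"one step past the data: {model(8):.1f}")
```

A perfect fit, and a worthless forecaster: which is the whole point about demanding skill on new data.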
The exclusive, or lone, or only, or single, solitary, sole way to check whether any model is good is if it can skillfully predict new data, where “new” means as yet unknown to the model in any way—as in any way. The reason skeptics exist is because no known model has been able to do this with temperatures past a couple of months ahead.
The Dramatic Conclusion
There isn’t a soul alive or dead who doesn’t acknowledge that temperatures have changed. Since it cannot be that the observed changes are due to “natural variation” or “chance,” that means something real and physical, possibly many different real and physical things, caused temperature to take the values it did.
If we seek to understand this physics, it’s not likely that statistics will play much of a role. Thus, climate modelers have the right instinct by thinking thermodynamically. But this goes both directions. If we have a working physical model (by “working” I mean “that which makes skillful predictions”) there is no reason in the world to point to “statistical significance” to claim temperatures in this period are greater than temperatures in that period.
Why abandon the physical model and switch to statistics to claim significance when we know that any fool can find a model which is “significant”, even models which “prove” temperatures have declined? This is as nonsensical as it is suspicious. Skeptics see this shift of proof and rightly speculate that the physics aren’t as solid as claimed.
If a statistical model has skillfully predicted new temperatures, and of course this is possible, then it is rational to trust the model to continue to do so (for the near horizon; who trusts a statistics model for a century hence?). But there is not a lot that can be learned from the model about the physics, unless the parameters of the model can be married to physical concepts. And if we can do that, we should be able to create skillful physical models. Good statistical models of physical processes thus work toward their own retirement.
Ready for the punch line? It is shocking and deeply perplexing why anybody would point to statistical significance to claim that temperatures have gone up, down, or wiggled about. If we really want to know whether temperatures have increased, then just look. Logic demands that if they have gone up, then they have gone up. Logic also proves that if they have gone down, then they have gone down. Statistical significance is an absurd addition to absolute certainty.
The only questions we have left are—not whether there have been changes—but why these changes occurred and what the changes will be in the future.
Lovejoy Update To show you how low climatological discourse has sunk, in the new paper in Climate Dynamics Shaun Lovejoy (a name which we are now entitled to doubt) wrote out a trivially simple model of global temperature change and after which inserted the parenthetical words “skeptics may be assured that this hypothesis will be tested and indeed quantified in the following analysis”. In published comments he also fixated on the word “deniers.” If there is anybody left who says climate science is no different than politics, raise his hand. Anybody? Anybody?
His model, which is frankly absurd, is to say the change in global temperatures is a straight linear combination of the change in “anthropogenic contributions” to temperature plus the change in “natural variability” of temperature plus the change in “measurement error” of temperature. (Hilariously, he claims measurement error is of the order +/- 0.03 degrees Celsius; yes, three-hundredths of a degree: I despair, I despair.)
His conclusion is to “reject”, at the gosh-oh-gee level of 99.9%, that the change of “anthropogenic contributions” to temperature is 0.
Can you see it? The gross error, I mean. His model assumes the changes in “anthropogenic contributions” to temperature and then he had to supply those changes via the data he used (fossil fuel use was implanted as a proxy for actual temperature change; I weep, I weep). Was there thus any chance of rejecting the data he added as “non-significant”?
Is there any proof that his model is a useful representation of the actual atmosphere? None at all. But, hey, I may be wrong. I therefore challenge Lovejoy to use his model to predict future temperatures. If it’s any good, it will be able to skillfully do so. I’m willing to bet good money it can’t.
Excellent post, thanks for pointing out what should be obvious, that data is data and statistical analyses do not change the data. Kudos.
Under a popular idea, to determine whether Earth is “warming” one fits a straight line to global temperature data in a specified period of time by minimization of the sum of the squared errors. If the slope of this line is positive, it was “warming” in this period. If the slope is negative, it was “cooling.” Popularizers of this idea, including the noted climatologist Phil Jones, have not come to grips with logical shortcomings in it. One of these is a paradox. At each point at which two periods of time are adjacent, there are two different slopes. Hence there are times at which Earth was “warming” and “cooling” at the same time!
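The paradox the comment describes is easy to exhibit numerically: take a series that rises and then falls, fit least-squares lines to two windows that share an endpoint, and the shared point is “warming” under one window and “cooling” under the other.

```python
def ols_slope(y):
    """Least-squares slope of y against its index 0, 1, ..., n-1."""
    n = len(y)
    mx, my = (n - 1) / 2, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in range(n))
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(range(n), y))
    return sxy / sxx

# A series that rises then falls; the two windows share the peak point.
series = [0, 1, 2, 3, 4, 3, 2, 1, 0]
first, second = series[:5], series[4:]

print(ols_slope(first), ols_slope(second))  # 1.0 -1.0
# At the shared point, Earth was "warming" and "cooling" simultaneously,
# depending on which window the popularizer happens to choose.
```

The slopes are well defined; it is the inference from their signs to “warming”/“cooling” that generates the contradiction.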
This is clearly the Rutherford approach:
“If your experiment needs statistics, you ought to have done a better experiment.”
Or maybe the Yogi Berra approach:
“In theory there is no difference between theory and practice. In practice there is.”
There is a utility for statistics in climate models, but it is not used.
Start with the observation that different models give different climate sensitivities and therefore give differing projections of future temperature. The reason for this is largely due to differing physics contained in the different models. Normally we would test (statistically) each model against observations to see which do better than others. We would then be inclined to throw out the ones that appear (statistically) unlikely and keep for now the ones that do better. This should not be a stark accept/reject process, rather a sliding scale of likelihood to be used to teach us about what is working and what is not. Such guidance from statistics is therefore useful in helping us to distinguish good physics from bad physics (notice I said “good/bad” not “right/wrong”).
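The “sliding scale of likelihood” idea can be sketched very simply. Everything here is invented for illustration (the model outputs, the observations, and the weighting scale are all assumptions, not real climate data): score each model against observations and down-weight, rather than discard, the poor performers.

```python
import math

# Hypothetical hindcasts from three toy "models", and some observations.
observations = [0.10, 0.20, 0.15, 0.30, 0.25]
models = {
    "model_a": [0.12, 0.18, 0.20, 0.28, 0.27],
    "model_b": [0.40, 0.45, 0.50, 0.55, 0.60],  # runs consistently hot
    "model_c": [0.05, 0.22, 0.10, 0.33, 0.20],
}

def mse(pred, obs):
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

# Sliding scale, not accept/reject: weight ~ exp(-MSE / scale), normalised.
# 'scale' is an arbitrary tuning choice in this sketch.
scale = 0.01
raw = {name: math.exp(-mse(pred, observations) / scale)
       for name, pred in models.items()}
total = sum(raw.values())
weights = {name: w / total for name, w in raw.items()}

for name, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{name}: weight {w:.3f}")
```

The hot-running model gets a tiny weight instead of being averaged in at full strength, which is precisely the discipline the multi-model ensemble forgoes.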
But this is not what is done by the climate bureaucracy. Instead it keeps all the models and forms a multi-model ensemble, as if the models were all trying to “measure” the same thing and all the variations are just noise. This enables the climate bureaucracy to keep all the models in play, but it freezes progress in understanding what is really going on. And in the long run the entire modeling project will be discredited as it becomes more and more clear that — as an ensemble — it has greatly overestimated warming from greenhouse gases. This is already happening. Alas that is the price to be paid for the climate bureaucracy’s particular brand of statistical corruption.
Plus, the climate bureaucracy fosters the creation of models which make no predictions, thus being untestable. Through fallacious argumentation they make them sound as though they were testable.
Even when a model skillfully predicts external facts, there is no assurance that it has done so using the same mechanisms as the Real World™. The underdetermination of scientific theories guarantees that there will always be more than one model that adequately matches the data. This difficulty is compounded when the parameters of the model are not the measured input x’s, but combinations of fragments of these x’s “found” (“constructed”) via orthogonal factor analysis or some such method. These factors need not correspond with any real world entity.
Another reason to withhold full trust from even a skillful model is the crossing of regime boundaries. You may find a nice model Y=f(X1,…,Xn) that makes skillful predictions within a domain, then fails outside the range not because the X’s change but because the f changes. For example: the length of a rubber band increases with the weight suspended from it. Until it crosses a boundary and snaps instead of stretches.
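The rubber-band example can be made concrete with a toy simulation (the numbers are invented; the point is the structure): inside the stretching regime a fitted linear model is indistinguishable from the truth, and past the snapping boundary it is not merely inaccurate but answering a question that no longer exists.

```python
def rubber_band_length(weight_g, rest_cm=10.0, k=0.05, snap_g=500):
    """Toy 'real world': length grows linearly with load until the band
    snaps, at which point the functional form f itself changes."""
    if weight_g >= snap_g:
        return None  # snapped: no length to report
    return rest_cm + k * weight_g

# A linear model fitted to observations made entirely inside the
# stretching regime (same coefficients, since the toy truth is linear there).
def fitted(weight_g):
    return 10.0 + 0.05 * weight_g

# In-domain: skillful.
print(f"{fitted(100):.1f} vs {rubber_band_length(100):.1f}")  # 15.0 vs 15.0
# Past the regime boundary: the X hasn't changed kind, but f has.
print(fitted(600), rubber_band_length(600))
```

The fitted model happily reports a 40 cm band; the band is in two pieces on the floor.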
Having proved global cooling, could you enlighten us as to which obvious period of global cooling you had evaluated?
Was that was from the Emmian interglacial period, the Holocene climatic optimum, the Medieval Warm Period, or the global or stratospheric temperature since 2001?
And how wee was the pee value?
Amen, brother Ye Olde; just so.
Why it was weer than the weest freckle on the weest of the wee folk; it was magically wee.
The underdetermination problem is solved when steps are taken to ensure that the model reflects all of the available information but not more than this information. Usually, builders of models do not take these steps. In building its models, the global warming bureaucracy does not. Instead, it seems to select those models from among the infinite possibilities that produce the scariest projections, thus ensuring the most job security for climatological researchers.
I always like to point out that the data does not exist to calculate the global average temperature. Weather stations do not provide average temperature, but rather the maximum and minimum temperatures. The temperature halfway between the max and min is not the average but the midrange. Next, 70% of the earth is covered by water so the surface temperature is unknown over most of the earth. Also, in thermodynamics, temperature is an intensive parameter and doesn’t scale. Averaging temperature is physically meaningless.
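The midrange-versus-mean gap is easy to show with a synthetic daily cycle (these hourly values are illustrative, not station data): when the cycle is asymmetric, as real days are, (Tmax + Tmin)/2 is biased relative to the true time average.

```python
import math

# Hourly temperatures over one day with an asymmetric cycle: a sharp
# afternoon peak and a long, flat, cool night (illustrative values only).
hours = range(24)
temps = [10 + 8 * max(0.0, math.sin(math.pi * (h - 6) / 12)) ** 3
         for h in hours]

true_mean = sum(temps) / len(temps)        # the actual time average
midrange = (max(temps) + min(temps)) / 2   # what (Tmax + Tmin)/2 reports

print(f"true mean {true_mean:.2f}, midrange {midrange:.2f}")
```

Here the sharp peak drags the midrange well above the true mean; a differently shaped day biases it the other way. The two only agree when the cycle is symmetric.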
“data is data”: not in Climate Science; there NEWdata is OLDdata plus adjustments. At East Anglia they famously lost their OLDdata altogether.
P.S. “adjustment” is a technical term meaning something like ‘that which added to what I’ve got gives me what I want’.
Taking a few of your points in turn…
In the model, the cause was indeed random variation. In reality, of course that is untrue, but I am trying to explain this to a lay audience. This is like saying that the toss of a coin is random (almost everyone thinks of tossing a coin as random, even though it is in fact deterministic). Is there harm in that, in the context?
Yes, we agree.
We disagree. If temperatures have gone up about 0.01 degree each year for the last three years, that is not significant (under any plausible model). If temperatures have gone up about 0.1 degree each year for the last 50 years, that is extremely significant (under any plausible model). In between, there is a grey area. People reasonably want to know the shade of grey for the observed data, i.e. whether the observed data lies within what would be expected due to natural variation. The claim has been that it lies outside what would be expected naturally.
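The contrast the comment draws, a small trend over three years versus a larger one over fifty, can be checked directly under one simple assumed model (a linear trend plus independent Gaussian noise of 0.1 degree; the noise level and the normal approximation are my assumptions, chosen for illustration):

```python
import math
import random

def slope_p_value(y):
    """Normal-approximation p-value for 'no trend' in an OLS line fit."""
    n = len(y)
    mx, my = (n - 1) / 2, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in range(n))
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(range(n), y)) / sxx
    resid = [yi - (my + b * (xi - mx)) for xi, yi in zip(range(n), y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    z = abs(b) / se
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

random.seed(0)
noise = 0.1  # assumed year-to-year scatter, degrees
short = [0.01 * t + random.gauss(0, noise) for t in range(3)]   # 0.01/yr, 3 yr
long_ = [0.10 * t + random.gauss(0, noise) for t in range(50)]  # 0.10/yr, 50 yr

print(f"3 years at 0.01/yr:  p = {slope_p_value(short):.3f}")
print(f"50 years at 0.10/yr: p = {slope_p_value(long_):.2e}")
```

Under this model the fifty-year trend is overwhelmingly distinguishable from the noise while the three-year one is not, which is exactly the comment's shade-of-grey point; the catch, per the post, is that the verdict is conditional on the assumed model of "natural variation."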
Additionally, please note that the Met Office, and the IPCC, started this. They are the ones that claimed that the increase in global temperatures since the late 1800s is statistically significant, and they are the ones who based that claim on a straight line with AR(1) residuals.
Finally, most of this is really just a retake on the op-ed piece that I published in the Wall Street Journal. The news here is that the Met Office is effectively admitting that the op-ed piece is valid—and that they tried extremely hard to avoid admitting it. You reviewed the op-ed piece, very positively, at
It would seem the question “is the increase since the late 1800s statistically significant?” should have been phrased “are the temperatures today much different than the temperatures in the late 1800s?”. Nice to know, perhaps, but one has to wonder what anyone would do with the answer to either question. It’s right up there with “is today’s temperature different than it was last Sunday?”
You would think a governing body would be more interested in what the future holds instead of whether the temperatures today are different than those a hundred+ or even a thousand years ago. The key questions should be centered on the predictions and how much confidence we should have in them.
Even then, one still has to wonder what would be done with the answers. Unfortunately, the question has arisen because of the rather dubious claim we can do something about the weather (or rather the average weather) and a claim that it can be driven back to “normal” (whatever that is) through legislation.
An understanding of what’s wrong with models comes from an understanding of the Uranus dilemma
Consideration of the planet Uranus very clearly indicates that radiative models (and any type of “Energy Budget” similar to those produced by the IPCC) can never be used to explain observed temperatures on Uranus. We can deduce that there must be some other physical process which transfers some of the energy absorbed in the upper levels of the Uranus atmosphere from the meagre 3W/m^2 of Solar radiation down into its depths, and that same mechanism must “work” on all planets with significant atmospheres.
Uranus is an unusual planet in that there is no evidence of any internal heat generation. Yet, as we read in this Wikipedia article, the temperature at the base of its (theoretical) troposphere is about 320K – quite a hot day on Earth. But it gets hotter still as we go further down in an atmosphere that is nearly 20,000Km in depth. Somewhere down there it is thought that there is indeed a solid core with about half the mass of Earth. The surface of that mini Earth is literally thousands of degrees. And of course there’s no Solar radiation reaching anywhere near that depth.
So how does the necessary energy get down there, or even as far as the 320K base of the troposphere? An explanation of this requires an understanding of the spontaneous process described in the Second Law of Thermodynamics, which is stated here as …
“The second law of thermodynamics: An isolated system, if not already in its state of thermodynamic equilibrium, spontaneously evolves towards it. Thermodynamic equilibrium has the greatest entropy amongst the states accessible to the system”
Think about it, and I’ll be happy to answer any questions – and explain what actually happens, not only on Uranus, Venus, Jupiter etc, but also on Earth.
If the model used for Earth is a good enough approximation, then it doesn’t matter if that model doesn’t work for Uranus.
Uranus is a big and heavy gas giant with a deep atmosphere. The only way for a gas to support the weight above it and stay a gas is to exert lots of counter pressure, and for that you need it to be hot.
Earth’s atmosphere is not that deep, nor is it that heavy.
Uranus’ core is therefore not comparable to Earth, even though it is of similar size and weight. The circumstances are very different indeed.
Of course the difference between Earth, Venus and Uranus is in the height of their atmospheres. But just saying “you need it to be hot” is hardly explaining where the energy comes from to make it hot. My point is that, if the surface of Venus and the base of the troposphere of Uranus can become so hot, then there must be another mechanism at work which is able to transfer energy from the Sun down to such depths, because it is not radiation. And why should that non-radiative physical mechanism not be able to function in Earth’s atmosphere as well?
Now, you think the models work to a “good enough approximation” on Earth, but in fact they predict the exact opposite of what is observed – they predict that water vapour (the main greenhouse gas) will produce warmer mean surface temperatures. But my study of real world data in the Appendix here shows that it cools.
The models can be “fiddled” to fit Earth’s data, but if they were really representing what is actually maintaining surface temperatures, then, because physics is universal, they should be able to explain what is observed on other planets. But they come nowhere near doing so. So I don’t buy your argument at all. I have explained the physics of what is really happening on all these planets, including Earth, and my explanation gels with all observations, as well as laboratory experiments and other data from the outer crust of Earth.
“If we really want to know whether temperatures have increased, then just look.”
So where do we look? If we want to know if temperatures as reported by the University of East Anglia have gone up we can look at what they say. But that’s not the same as “Earth’s global average temperature”. So then we’re asking, “How close is what UEA say to the real thing because it’s the real thing I want to know?” and then we’ve got some uncertainty. Then we start doing statistics.
The Uranus Dilemma (continued from above)
The issue is not whether “climate denial” is right or wrong, but rather that the explanations pertaining to the greenhouse conjecture just simply don’t adhere to well known physics, and ignore the physics which explains the warming of the surface by non-radiative processes.
The IPCC (in its Glossary of Terms under “Greenhouse Effect”) refers to a “radiative forcing” effect which is by no means adequately explained in terms of physics. The concept of all radiation coming from a certain altitude is pure fiction. In fact the peak radiation comes from where water vapour is most prolific, somewhere around an altitude of 3Km. Radiative flux is quantified with the Stefan-Boltzmann Law, and it is nothing like a nice linear function declining with altitude. Quite a bit comes from the surface straight to space anyway. The altitude at which equal amounts of outward radiation come from above and below (including the surface) can be shown to be about 3.0Km to 3.5Km. The whole plot rotates around this pivoting altitude such that it has a less steep gradient in moist regions, and thus intersects the surface at a lower temperature.
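For reference, the Stefan-Boltzmann Law the comment invokes is j = σT⁴, and the strong nonlinearity is easy to see numerically (this is the standard law, not an endorsement of the comment's wider argument; the two temperatures are conventional round figures for Earth's effective emission and mean surface temperatures):

```python
# Stefan-Boltzmann law: radiative flux j = sigma * T^4.
SIGMA = 5.670e-8  # W/(m^2 K^4), Stefan-Boltzmann constant

def flux(temp_kelvin):
    """Blackbody radiative flux in W/m^2."""
    return SIGMA * temp_kelvin ** 4

print(f"{flux(255):.0f} W/m^2 at 255 K")  # near Earth's effective emission T
print(f"{flux(288):.0f} W/m^2 at 288 K")  # near Earth's mean surface T
```

A 13% rise in temperature produces roughly a 63% rise in flux: nothing like linear.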
Now, it is obvious that a planet’s atmosphere does indeed lead to the surface being hotter than it would have been if it only received the same amount of incident Solar radiation but had no atmosphere. Venus would receive only 10W/m^2 and would thus be far colder without an atmosphere. Uranus would receive nothing from the Sun, and would thus be colder than 3K. Even if it received all of the Solar radiation reaching its TOA (about 3W/m^2) it would be colder than 60K.
So it is very clear that the concept of energy budgets supposedly balancing energy and, in effect, instantaneously determining surface temperatures is fictitious. The energy required to maintain these surface temperatures has built up over the life of the planet from just a small amount of the daily dose of Solar radiation, most of which, but not quite all, was radiated back to Space.
It is not a day to day balancing act, and so all radiative forcing and all energy budgets, even for Earth, are totally irrelevant.
The Sun is not heating the atmosphere and outer crust from zero K each day. Thermal energy has built up over many years and is trapped by the gravity effect which keeps more of it closer to the surface than to the top of the troposphere. It does so by the spontaneous evolving process described in the Second Law of Thermodynamics. Heat can and does flow up the very shallow thermal gradient which represents the thermodynamic equilibrium described in the Second Law of Thermodynamics. In doing so, flowing away from a new source of energy which is disturbing the thermodynamic equilibrium, it is merely acting with a propensity to restore the thermodynamic equilibrium, just as the Second Law says it will.
That is how some of the incident Solar radiation absorbed by the atmosphere as night becomes day makes its way towards the surface, maintaining the thermal gradient determined by gravity (and reduced a little by inter-molecular radiation) so that the base of the troposphere is kept warm and thus “supports” surface temperatures. This is a non-radiative convection process which has nothing to do with radiative forcing.
Radiative models are simply not relevant.
Your point is nowhere better illustrated than in the financial world. Many try to model some asset, and come up with statistically significant models. The ultimate test is always the capacity to make money with it (forecast accurately). If it fails, reality sets in very quickly. That’s what is missing in the climate science world: some mechanism that brings discipline to those who come up with models that are supposed to represent reality.
There is a lot of potential energy in a column of gas in a gravity well. The gas compresses under its own weight and the potential energy changes into kinetic energy, i.e. the atoms and molecules go faster, i.e. the temperature rises, until the gas column stabilizes. Each gas particle ends up a bit lower than it was, and that height difference, a loss in potential energy, has been turned into more speed, i.e. kinetic energy.
And this is potential energy in the mechanical sense, like in the Lagrangian or the Hamiltonian. Not in the philosophical sense, actualities versus potentialities.
Sorry Sander but you are wrong as described clearly in this link:
Doug Cotton, if you want this debate try WUWT as you are wrong too.
‘Fraid not. Consider this. Initially, all the atoms of the gas cloud are hardly moving. As the gas cloud collapses, the atoms are moving faster and faster, because they are falling inwards. They are also getting closer together. At some point they are so close to each other that they cannot get closer. At that moment the collapse must stop. And all the potential energy that was in the large distances between the atoms is turned into kinetic energy of the gas particles. It has to be, because of conservation of energy.
This is a very different model compared to the model on the WUWT site. It is also not a very good description of a real imploding gas cloud because an imploding gas cloud is losing part of that potential energy in the form of radiation. Atoms and molecules can bounce off each other so violently that their internal structure changes a bit. And when that structure returns to its original state it sends out a photon. And if that photon is leaving the cloud, the cloud has lost energy. There is now not enough energy to keep the cloud stable, and it collapses some more.
So you have this process of a cloud collapsing and losing energy. The cloud is hot, but not hot enough to support its own weight. If it was hot enough, as soon as it lost another photon it would not be hot enough and would collapse a bit.
Particles in the cloud have a size; you cannot push them closer together than that size. If the distances between the particles become comparable to their size, the cloud stops contracting. It will still send out photons, but now there is no more potential energy to convert, which means that the cloud will start to lose its kinetic energy. It will cool.
In the WUWT article, there is a static situation, which looks very much like the final situation in my story of a collapsing cloud. And if the heat losses can be neglected, the resulting atmosphere is probably easier to reason about than the more general picture. The exact nature of the resulting collapsed cloud will have a very high impact on that atmosphere. Like on Earth, where the heat in the Earth’s core is hardly reaching the atmosphere. Which is completely different from a gas giant, where the atmosphere is not isolated from the heat in the core of the planet.
Sander is correct and the WUWT article has a major flaw which is explained in the last paragraph of Section 14 of my paper “Planetary Core and Surface Temperatures.” Basically, the wire also develops a temperature gradient due to gravity, preventing any endless flow of energy.
Where is your explanation for the thermal gradient (wrongly called a “lapse rate”) on all planets with significant atmospheres? Start by explaining the Uranus dilemma (above) in any other way.
And, by the way, high pressure does not maintain high temperatures. When your car has been in the garage all night the air in the tyres is at the same temperature as the air outside, despite the big difference in pressure. The Ideal Gas Law says pressure is proportional to the product of density and temperature. So temperature can remain constant with pressure increasing with density.
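The tyre claim follows directly from the ideal gas law, P = ρ·R·T for a specific gas constant R. A quick check (the factor of three for the tyre's air density is an assumed round number, roughly corresponding to ~3 bar absolute):

```python
# Ideal gas law in density form: P = rho * R_specific * T.
# Same temperature, different pressures, because density differs.
R_AIR = 287.05   # J/(kg K), specific gas constant of dry air
T = 288.15       # 15 C; garage air and tyre air alike after a night

rho_outside = 1.225        # kg/m^3, standard sea-level air density
rho_tyre = 1.225 * 3.0     # assumed: tyre air roughly 3x denser

p_outside = rho_outside * R_AIR * T
p_tyre = rho_tyre * R_AIR * T

print(f"outside: {p_outside / 1e5:.2f} bar, tyre: {p_tyre / 1e5:.2f} bar, "
      f"same temperature")
```

Triple the density at the same temperature gives triple the pressure, with no heating required: exactly the comment's point that pressure alone does not maintain temperature.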
You are switching back and forth from two very different cases. A collapsing interstellar gas cloud is a very different beast compared to the planetary atmosphere of your first comment. Planetary atmospheres are not undergoing gravitational collapse nor are they the product of gravitational collapse. An equilibrium atmosphere is an isothermal atmosphere. This is standard textbook physics. The gravitational gradient of a planet can not produce a stable temperature gradient. This would violate the second law and make possible a perpetual motion machine.
If the WUWT article has a flaw about this point then you must claim that every thermodynamic textbook has the same flaw. I’ve already heard the claim that the wire in Robert Brown’s thought experiment must have a thermal gradient. A magical claim that would never be made under any other circumstances and is made here only in a desperate attempt to save a failed theory. Lapse rates occur in non-equilibrium atmospheres due to an external heat source and aided by the greenhouse effect that you have tried to rule out of bounds. In the case of gas giants internal sources of heat may also play a role as mentioned by Sander. The only dilemma about Uranus is that it is colder than expected. Your statement about pressure seems to come out of left field.
In any case the nonsense promoted by the PSI organization that you are affiliated with has been debunked extensively and devastatingly by WUWT and particularly by Robert Brown of Duke University. He is not only much more articulate than I am but has a lot more patience. I will not waste any more time on this topic.
William Sears (and others who follow WUWT etc)
You claimed that “An equilibrium atmosphere is an isothermal atmosphere.”
This statement of yours is simply incorrect. You can either study Sections 4 to 9 of my paper, or my article “Roy Spencer’s Misunderstood Misunderstandings ..”
The Second Law of Thermodynamics explains how thermodynamic equilibrium evolves spontaneously, acquiring a state of maximum accessible entropy. Thermodynamic equilibrium in a vertical plane in a gravitational field can only be acquired when isentropic conditions are achieved. Otherwise work could be done between a region with higher total energy and a region with lower total energy, and so it would not be a state of maximum accessible entropy. Hence, seeing that total energy includes gravitational potential energy (which is not reflected in temperature) we simply cannot have isothermal conditions and thermodynamic equilibrium at the same time in a vertical plane, because potential energy varies.
It does not matter whether the atmosphere is very gradually collapsing as on Jupiter, or probably not doing so, as on Uranus. Either way, we still observe a thermal gradient which is based on -g/Cp but reduced in magnitude by up to about one third, due to inter-molecular radiation which has an opposing propensity towards isothermal conditions. Hence water vapour reduces the gradient on Earth, and carbon dioxide does on Venus, and methane does to a small extent on Uranus. Reducing the magnitude of the gradient causes the thermal plot to rotate (in order to maintain radiative balance with the Sun) and this leads to a lower supported surface temperature, as my study in the Appendix showed is the case.
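The -g/Cp figure the comment above invokes can at least be checked as arithmetic, using standard textbook values for Earth's dry air. The "reduced by about one third" claim is the commenter's; the numbers below only show that the observed mean tropospheric lapse rate is indeed roughly a third smaller than g/Cp, not that his explanation for the difference is right.

```python
# Dry adiabatic lapse rate g/c_p for Earth, standard textbook values.
g = 9.81        # m/s^2, surface gravitational acceleration
cp = 1004.0     # J/(kg K), specific heat of dry air at constant pressure

dry_lapse = g / cp * 1000.0   # convert K/m to K/km
print(f"dry adiabatic lapse rate: {dry_lapse:.2f} K/km")   # ~9.77 K/km

# The observed mean tropospheric lapse rate is about 6.5 K/km,
# i.e. roughly one third less than g/c_p:
observed = 6.5
print(f"reduction: {(1 - observed / dry_lapse) * 100:.0f}%")   # ~33%
```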
The brilliant physicist Loschmidt was well ahead of his time (in the 19th century) when he postulated that gravity would cause a spontaneous thermal gradient in all matter, solid, liquid and gas. We see such a gradient in the Earth's solid outer crust for example, and this gradient is also based on the quotient of the acceleration due to gravity and the weighted mean specific heat. All this is in my paper in far more detail, and with empirical support.
I really don’t care how many textbooks or people of “authority” choose to argue that the Second Law of Thermodynamics does not lead to thermodynamic equilibrium, but instead to thermal equilibrium. I’m afraid physics texts don’t say that.
There is no mention of any greenhouse effect or back radiation or convection or pressure in the simple two line calculation of the thermal gradient (using Kinetic Theory) in my paper.
I repeat: you don’t need to introduce pressure into the calculations, nor convection, nor any air movement, nor expansion, nor contraction – just a simple diffusion process that can and does happen in a sealed cylinder in a lab – as shown in over 800 painstaking experiments spread over more than 10 years – see cited reference in my paper.
Uranus radiates at a low temperature, <60K, yes. And that's the point, because it only receives about 3W/m^2 and re-emits all that back to space from the uppermost layers of its atmosphere. You can read here how the base of its (theoretical) troposphere is about 320K, and it is thousands of degrees hotter further down, and yet has no evidence of any internal heat generation. Collapsing gas planets (without solid cores – as Uranus probably has) will generate some energy due to the net conversion of gravitational potential energy to kinetic energy during molecular free path motion between collisions – because more molecules move towards the centre than away from it.
If you think Roy Spencer and WUWT have rubbished PSI, it is partly because each now runs comment threads about PSI, but prevents PSI members from responding thereon. For example, totally incorrect statements are made there about what we say. I’ll leave you to read PSI’s response here.
Regarding these adverse and incorrect comments about Principia Scientific International on WUWT, consider the last paragraph here, to which I would reply (if Anthony had not blocked both my ISPs) …
We at PSI do not say what you claim at all. We say radiation from a cooler atmosphere slows that portion of surface cooling which is itself by radiation, but it cannot slow the greater portion of surface cooling which is by non-radiative processes. That non-radiative cooling is free to accelerate and compensate for any slowing of radiative cooling.
Carbon dioxide and water vapour do indeed slow radiative cooling, but the slowing by all the water vapour is about 100 (maybe 1,000) times as effective as that by carbon dioxide. This means any effect of carbon dioxide is absolutely minuscule. All this is explained in this paper published on our site in March 2012.
Furthermore, just as the body can only heat itself to normal body temperature, and a blanket won’t give you a fever, so too the Sun can only heat the surface to a temperature well under 288K (perhaps approximately 255K) using that portion of its radiation that reaches the surface. Hence any slowing of surface cooling can only slow the cooling from a somewhat lower temperature than what is observed. Hence the slowing of surface cooling is not the reason why the mean surface temperature is as high as 288K.
Finally, some are now realising that there is a need for a paradigm shift (as in this article) because it is clear that surface temperatures are not controlled by radiative forcing, but rather by non-radiative transfer of heat which supports the surface temperature.
If you believe that planetary surface temperatures are all to do with radiative forcing rather than non-radiative heat transfers, then you are implicitly agreeing with IPCC authors (and Roy Spencer) that a column of air in the troposphere would have been isothermal but for the assumed greenhouse effect. You are believing this because you are believing the 19th century simplification of the Second Law of Thermodynamics which said heat only transfers from hot to cold – a “law” which is indeed true for all radiation, but only strictly true in a horizontal plane for non-radiative heat transfer by conduction.
The Second Law of Thermodynamics in its modern form explains a process in which thermodynamic equilibrium “spontaneously evolves” and that thermodynamic equilibrium will be the state of greatest accessible entropy.
Now, thermodynamic equilibrium is not just about temperature, which is determined by the mean kinetic energy of molecules, and nothing else. Pressure, for example, does not control temperature. Thermodynamic equilibrium is a state in which total energy (including potential energy) is homogeneous, because if it were not homogeneous, then work could be done and so entropy could still increase.
When such a state of thermodynamic equilibrium evolves in a vertical plane in any solid, liquid or gas, molecules at the top of a column will have more gravitational potential energy (PE), and so they must have less kinetic energy (KE), and so a lower temperature, than molecules at the bottom of the column. This state evolves spontaneously as molecules interchange PE and KE in free flight between collisions, and then share the adjusted KE during the next collision.
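The bookkeeping in the comment above can be written out explicitly. This is a sketch of the commenter's premise (total specific energy uniform with height), not a statement of accepted thermodynamics; standard texts derive the same ratio as the adiabatic lapse rate of a convecting parcel, not as an equilibrium state:

```latex
% Commenter's premise: total specific energy uniform with height
c_p\,T(z) + g\,z = \text{const}
\quad\Longrightarrow\quad
c_p\,\frac{dT}{dz} + g = 0
\quad\Longrightarrow\quad
\frac{dT}{dz} = -\frac{g}{c_p}.
```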
This postulate was put forward by the brilliant physicist Loschmidt in the 19th century, but has been swept under the carpet by those advocating that radiative forcing is necessary to explain the observed surface temperatures. Radiative forcing could never explain the mean temperature of the Venus surface, or that at the base of the troposphere of Uranus – or that at the surface of Earth.
The gravitationally induced temperature gradient in every planetary troposphere is fully sufficient to explain all planetary surface temperatures. All the weak attempts to disprove it, such as a thought experiment with a wire outside a cylinder of gas, are flawed, simply because they neglect the temperature gradient in the wire itself, or other similar oversights.
The gravity effect is a reality, and the dispute is not a matter of acceptable disagreement. The issue is easy to resolve with a straightforward, correct understanding of the implications of the spontaneous process described in statements of the Second Law of Thermodynamics.
Hence radiative forcing is not what causes the warming, and so carbon dioxide has nothing to do with what is just natural climate change.
Thanks for this interesting and useful post. The Met Office’s response to Doug Keenan’s claims is here, and there is a more detailed article here and a briefing paper here.
An atmosphere model needs a context. It has to be derivable from the model that explains a planet’s birth, for instance.
this originally ran 28 Mary 2013,
The RC church perhaps regards May the month of Mary but my calendar doesn’t.
Dear Dr. Briggs,
You are wrong, very wrong, absolutely wrong, indeed mucho wrongola.
You state (dramatically): There isn’t a soul alive or dead who doesn’t acknowledge that temperatures have changed.
Sorry, Bub, but I am a soul (living) who does NOT acknowledge such nonsense. Where I live (Willamette Valley, Oregon), the climate (i.e. average weather/temperatures) has in no way changed at all in my lifetime. Cloudy and dreary today, just as it has been every April in memory.
I study forests forensically. That is, I seek clues about how forests used to be by gathering evidence — real empirical evidence from soils and mud cores and stumps and ancient living trees. My findings: for at least the last 6,000 years the exact same species have been growing here. Over those hoary millennia (back to the Pyramids and then back that far again) there have been no palm trees or sassafras. There has only been the same old Dougfir and same old Oregon oak and same old wild hazel.
And I defy you to find any (even a speck of) evidence to the contrary. Which you MUST do to support your wildly erroneous contention that the climate has changed here.
Although it hardly matters, because your stated proposition is that every soul knuckles under to your wild, unsupported, jackbooted “belief” (a nice way of saying your mass-insanity superstition), and I am living proof that you are wrong.
Lovejoy is a kook. No argument from me on that. But so are you if you continue to make patently false claims backed up solely by your alleged “degrees” of “education”. Meaningless crap. The most credentialed person in the world can be wrong and often is, as is so plain in this case.
The problem, my friend, is that Science is Dead. Lovejoy, Briggs, and all the amateur “experts” arguing above are faking it. You all are not doing science, you are counting angels on pinheads. Every one of you is a braying ass and a pinhead yourself. You make me sick with your false pretensions. You are no more scientists than the chickens I am going out to feed right now, because attempting to converse with the utterly deluded is a worthless waste of my precious time. So there.
I mean to say that every year the temperature is not the same. Simple as that. Everybody agrees with that.
“Ready for the punch line? It is shocking and deeply perplexing why anybody would point to statistical significance to claim that temperatures have gone up, down, or wiggled about. If we really want to know whether temperatures have increased, then just look. Logic demands that if they have gone up, then they have gone up. Logic also proves that if they have gone down, then they have gone down. Statistical significance is an absurd addition to absolute certainty.”
I’m happy to read that, since it’s what I’ve always thought, but, being a statistics naif, was afraid to say. When I hear, “The negative trend isn’t significant if it’s not 17 years long,” I think that ought to mean that the chance of a 17-year trend is less than [pick your percentage] if assumption X is true–but I don’t ever hear people who say such things tell what X is.
Maybe everyone but me knows already, so they don’t need to say it. But I’m guessing that a lot of the time they don’t.
Today it was 75 degF. Turned into a nice day. As I recall, last year it was also 75 degF here on some day in Spring. And the year before that and the year before that and every year ever recorded. So your new corrected contention is also wrongola, and I don’t agree with it, making your universality claim again false.
It seems like you are struggling to make a declarative statement that sounds scientific, but for some reason cannot.
What’s your point? Are you trying to find some sort of quasi-agreement with the pseudo fake fraudsters of (tragically deceased) “science”? So they won’t despise and shun you anymore? Because you are a generally agreeable nice guy just trying to find some sort of income in the World of Science?
Maybe you could create a model for “non-radiative heat transfer” on the planet Uranus? That seems to be a popular pastime. It sounds like science, but it’s not. It’s just babble. And it doesn’t pay, or at least it shouldn’t, but then again these days egregious pseudoscientific babble is big bucks for some fraudsters.
“Science” is a con game. Hucksterism abounds. Babble rules. There is no real science being done in any so-called “scientific” field anymore. The hucksters killed it, and the poor deluded sots who think they are doing science are just useful idiots. Look at your commenters. They don’t get any point you make.
Maybe it was always thus. Maybe science has always been something done by a handful of rare individuals while the masses looked on with shock and awe, and wished they too could be scientists, and so they earn degrees in utter crapola with no clue about what real science is, and lurk around their labs in disguise hoping they are never exposed for the frauds they really are.
Try it again. Try to make some declarative statement about reality that stands up to inspection and logic. I think you can do it. But quite possibly you may have to sacrifice all hope of economic reward, because real science is dead and there is no market for truth.
PS — “Everybody agrees” is the most unscientific and illogical argument there is. But you know that. So why go there? Why seek universal agreement? It’s a really dangerous and undesirable thing.
The thing that really matters more than any other thing is whether there is a substantial consensus of scientists who voted that a wee pea of significance demonstrates that “chocolates are good for my ticker”. The rest of you can carry on discussing the weather for a while.
The Lovejoy paper is so fraudulent that at first I thought it was satire. But no, it’s in a peer-reviewed journal. Therefore it must be the case that a goodly number of “scientists” are okay with absolute crap being promoted as “science”.
What does that tell you about the state of “science”? It tells me that most “scientists” can’t tell the real thing from Uranus.
The bell tolls. Science is dead. Weep for all of us.
Well, I come at this as a rude mechanical whose experience of stats is mostly founded on operating controls in the auto industry. My apologies if I offend on any philosophical points.
A quick look at the AWOS standard shows that the instrument accuracy should be 0.3 C and resolve to 1 F, giving an overall error of about 0.6 C. That’s bench test, unassembled mind you, not field test. Chuck in manufacturing and field assembly errors, and that probably translates to a real performance of +/- 1.0 C. You get a similar result looking at the satellite sensor specs.
So our best, most recent instrument record is within 1 degree C, and we are concerned about a warming of 0.6 C. In industry, to monitor a specification of 0.6 C we’d want a first-class system to have, by commonly accepted rule of thumb, reliability and reproducibility to 0.06 C. We poor artisans wouldn’t buy anything from a company whose quality was run by Lovejoy.
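The error budget above can be sketched in a few lines. The AWOS figures (0.3 C accuracy, 1 F resolution) are the commenter's; the worst-case linear sum and the root-sum-square combination shown below are two common conventions for combining error sources, and neither is claimed to be the official AWOS method.

```python
import math

# Commenter's figures: bench accuracy 0.3 C, display resolution 1 F.
accuracy = 0.3                # C, stated instrument accuracy
resolution_c = 5.0 / 9.0      # 1 F expressed in C (~0.56 C)

# Worst-case linear sum: accuracy plus half a resolution step.
worst = accuracy + resolution_c / 2
print(f"worst-case error: {worst:.2f} C")   # ~0.58 C, near the 0.6 C quoted

# Root-sum-square of independent sources, treating quantization as a
# uniform distribution with standard uncertainty step/sqrt(12):
quant = resolution_c / math.sqrt(12)
combined = math.sqrt(accuracy**2 + quant**2)
print(f"RSS standard uncertainty: {combined:.2f} C")   # ~0.34 C
```

Either way, the instrument uncertainty is of the same order as the 0.6 C signal under discussion, which is the commenter's point.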
Similarly, when BEST admit that 70% of the climate stations in the USA have instrument errors of between 2 and 5 centigrade, but then draw a trend of -0.06 C/century from the data, you know that it can’t be right.
Lovejoy has posted a defense of his paper.
“Lovejoy has posted a defense of his paper.”
Well yes he has, and not far in we get this:
“A. The short answer is that the issue of accuracy of the surface measurements is overblown, misunderstood and – unless it is far larger than any estimates give – it is nearly irrelevant to the final conclusion”
Which, to those of us who’ve had to run instrument systems for a living, leaves us this reference:
Lovejoy has posted a Q&A:
Any comments? Thanks
I’d add that we have a theoretically compelling and well-tested procedure for narrowing the number of models from infinity down to one, and that while meteorologists have used it, climatologists don’t. I address this topic at http://judithcurry.com/2010/11/22/principles-of-reasoning-part-i-abstraction/ and http://judithcurry.com/2010/11/25/the-principles-of-reasoning-part-ii-solving-the-problem-of-induction/ .
I took a quick glance at the posts and (maybe because it was quick) it seems you reversed the definition of Shannon’s information entropy. As I use it, entropy is at a maximum between X and Y when X tells you nothing about Y. It would seem a model with maximum entropy wrt X and Y would be useless. Perhaps you really meant mutual information? That would reach a maximum when X completely describes Y.
Also, much of what you said sounds like E. T. Jaynes.
This is OT for this blog post. Perhaps Matt will give you some space to repost or summarize the posts you made at Curry’s; then we could discuss this further.
Linked to this on my blog – a very good explanation of why statistics is just as creative as accounting! 🙂
Your understandings regarding the entropy and mutual information are correct. Maximization of the entropy is under constraints expressing the available information. The effect is to assign values to probabilities. Thermodynamics is a well known application. Here the constraint is energy conservation.
Maximization of the mutual information is equivalent to minimization of the conditional entropy. The effect is to discover the optimal definitions of the conditions. Modus ponens is a well known application.
Jaynes is one of those who generalized the principle of entropy maximization from its roots in thermodynamics and the theory of communication.
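A tiny sketch of the identities the two commenters are discussing, using only the standard definitions (nothing here is specific to either commenter's application). It shows mutual information I(X;Y) maximal when X completely determines Y, and zero when X tells you nothing about Y:

```python
from collections import Counter
from math import log2

def entropy(counts):
    """Shannon entropy in bits of a distribution given as value -> count."""
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values() if c)

def mutual_information(pairs):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a list of (x, y) samples."""
    xs = Counter(x for x, _ in pairs)
    ys = Counter(y for _, y in pairs)
    xy = Counter(pairs)
    return entropy(xs) + entropy(ys) - entropy(xy)

# X completely determines Y: mutual information is maximal, I = H(Y).
determined = [(0, 'a'), (0, 'a'), (1, 'b'), (1, 'b')]
print(mutual_information(determined))   # 1.0 bit

# X tells you nothing about Y: I = 0, and the conditional entropy
# H(Y|X) = H(Y) - I(X;Y) sits at its maximum H(Y).
independent = [(0, 'a'), (0, 'b'), (1, 'a'), (1, 'b')]
print(mutual_information(independent))  # 0.0 bits
```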
OK I see now what you were getting at. My reading was too hasty.
Has anyone looking at the new paper noticed how well an inversion of the SOI graph they use fits the temperature anomaly graph they use?
WM Briggs Re: “I therefore challenge Lovejoy to use his model to predict future temperatures. If it’s any good, it will be able to skillfully do so.”
J. Scott Armstrong observes:
Armstrong, J.S. (2011). Illusions in Regression Analysis. Forthcoming in International Journal of Forecasting, 2012.
Followup: JS Armstrong et al. Golden Rule of Forecasting: Be Conservative
Armstrong’s no-change bet is doing better than Gore’s 3C/century.
I think you are making a safe bet!
Yes, of course. I’ve also noticed that this is not the first paper to notice a correlation between SOI and surface temperatures.
Notably (to me) absent from this paper is the Atlantic Multidecadal Oscillation (AMO). The Pacific Decadal Oscillation (PDO) and Quasi Biennial Oscillation (QBO) were considered but left out of the model either because they didn’t contribute much to a better fit, or only went back to 1950. AMO has been calculated back to 1880 and improves fit a great deal, so it’s curious that it didn’t even get a mention.
“David L Hagen
September 7, 2014 at 7:02 pm
WM Briggs Re: “I therefore challenge Lovejoy to use his model to predict future temperatures. If it’s any good, it will be able to skillfully do so.”
He did make predictions that are testable. From the McGill Press Release.
“His study predicts, with 95% confidence, that a doubling of carbon-dioxide levels in the atmosphere would cause the climate to warm by between 1.9 and 4.2 degrees Celsius. That range is more precise than – but in line with – the IPCC’s prediction that temperatures would rise by 1.5 to 4.5 degrees Celsius if CO2 concentrations double.”
Regarding the use of anomalies, Lindzen puts them into perspective here.