by Tomas Milanovic
This essay has been motivated by Isaac Held’s paper [link] arguing for possible emerging simplicity or even linearity in climate dynamics.
Indeed a cursory observation of climatic phenomena shows a staggering complexity on all spatial and temporal scales. Tornadoes, storms, hurricanes, precipitation, clouds, oceanic oscillations, rivers and currents, ice dynamics and much more – they all show infinite diversity in extent, duration and intensity.
Therefore it is a natural question to ask: “Facing this complexity, is there any chance that all or parts of this system can be deterministically predicted with a reasonable accuracy?”
It is safe to say that the answer can at best be only a partial yes and even then only if the system can be simplified.
We will not attempt to define simplicity, which is a subjective notion best expressed by the famous quote: “In every field of inquiry, it is true that all things should be made as simple as possible – but no simpler. And for every problem that is muddled by over-complexity, a dozen are muddled by over-simplifying.” Instead, we will focus on the relevance of simplifications by analogy.
There are several arguments that systematically appear as “proof” of underlying/emerging simplicity in climate dynamics, and all are based on analogies.
The purpose of this essay is to make a critical review of the most used arguments in order to assess the scientific validity of such analogies.
Before starting the review, it is necessary to make a preliminary remark which deals with the problem of scales and reads: All laws of nature are local.
It then necessarily follows that all dynamical laws of nature are expressed as differential or partial differential equations. Combined with translational time and space symmetries, this brings Noether’s theorem into play: any differentiable symmetry of the action of a physical system has a corresponding conservation law. Noether’s theorem implies the existence of invariants like energy and momentum, which are the fundamental building blocks of all physical theories.
This observation has a very important consequence. No global property, correlation or relationship of a system represents a first principle. All global, i.e. macroscopic or averaged, properties are always derived from the local and microscopic laws. In other words it is impossible to prove a statement using global or averaged quantities otherwise than by deriving them from the local microscopic laws of nature.
This can be illustrated by the RANS (Reynolds-Averaged Navier-Stokes) equations, which are obtained by substituting the instantaneous values with averaged values. This yields a system of partial differential equations that the globally averaged quantities must obey. However this change of variables has a price – the appearance of unphysical stresses, ill-defined initial conditions, and non-closure of the equations. Several closure schemes can then be chosen empirically, but the fact remains that none of them is grounded in a natural law, so that the solutions are empirically useful only in a limited number of strongly constrained cases. As soon as the imposed constraints stop being respected, the RANS equations become useless.
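The origin of these unphysical stresses can be seen in one line of algebra: decomposing the instantaneous velocity into a mean and a fluctuation, u = &lt;u&gt; + u′, and averaging the nonlinear term u·u leaves behind the covariance of the fluctuations – the Reynolds stress &lt;u′u′&gt; – which the averaged equations cannot express in terms of the averaged variables alone. A minimal numerical sketch (synthetic data, not a flow solver):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "instantaneous" velocity: a mean flow plus fluctuations.
u = 2.0 + 0.5 * rng.standard_normal(100_000)

u_mean = u.mean()            # the averaged (RANS) variable
u_prime = u - u_mean         # fluctuating part, mean(u') = 0

# Averaging the nonlinear term u*u does NOT give u_mean**2; the extra
# term <u'u'> is the (here one-dimensional) Reynolds stress that the
# averaged equations cannot close without an empirical scheme.
lhs = np.mean(u * u)
rhs = u_mean**2 + np.mean(u_prime**2)
print(lhs, rhs)              # identical: <uu> = <u><u> + <u'u'>
```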
The lesson is that even if global may often seem simpler than local, the laws of nature always work in the other direction – from local to global.
That is why the often repeated statement “Climate is not weather” is misleading. The right statement is “Climate is uniquely dependent on weather because its properties can only be derived from a known weather by averaging it over some arbitrary space or time domain.”
1) The argument of statistical thermodynamics analogy
This argument is based on the observation that even if the individual trajectories of gaseous molecules are complex, chaotic and unpredictable, simplicity emerges by considering the averages of vast ensembles of molecules. Well defined, simple and deterministic global variables like temperature, specific heat capacity, pressure appear and simple relations among them allow deterministic predictions of these global variables even if the individual molecular trajectories stay unpredictable.
A fluid is then considered, by analogy, as an ensemble of molecules, in the expectation that a deterministic, preferably linear, simplicity relating global (averaged) variables will emerge, making the chaotic dynamics irrelevant for macroscopic predictability.
This argument reveals a deep misunderstanding of the origins of chaos in ensembles of molecules on one side and in fluid dynamics on the other side.
Indeed the foundation of statistical thermodynamics is the hypothesis that kinetic energy is conserved and that the molecules’ motions are independent and uncorrelated. Under this hypothesis the ensemble of molecules becomes equivalent to an ensemble of hard spheres interacting only through elastic collisions, and it is easy to derive the global statistical moments of the kinetic energy, which allows us to define a global temperature and pressure and to relate them to each other.
With the additional hypothesis of thermodynamic equilibrium (or at least LTE, Local Thermodynamic Equilibrium) it is then possible to derive the statistical distribution of velocities and energies by using the energy equipartition theorem.
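Under these hypotheses, the emergence of a simple global variable from chaotic individual motions can be illustrated in a few lines. The sketch below samples a large ensemble of independent molecular velocities from the equipartition distribution (the molecular mass is an argon-like illustrative value) and shows that the averaged kinetic energy recovers the imposed temperature sharply, even though each individual trajectory remains unpredictable:

```python
import numpy as np

rng = np.random.default_rng(1)

k_B = 1.380649e-23      # Boltzmann constant, J/K
m = 6.63e-26            # molecular mass, kg (argon-like, illustrative)
T = 300.0               # imposed temperature, K

# Independence hypothesis: each velocity component is an independent
# Gaussian with variance k_B*T/m (equipartition: (1/2)m<v_x^2> = (1/2)k_B*T).
sigma = np.sqrt(k_B * T / m)
v = sigma * rng.standard_normal((1_000_000, 3))   # N molecules, 3 components

# Individual trajectories are unpredictable, but the ensemble average of
# the kinetic energy is a sharp, deterministic global variable:
E_mean = np.mean(0.5 * m * np.sum(v**2, axis=1))
T_recovered = 2.0 * E_mean / (3.0 * k_B)
print(T_recovered)      # ~300 K, despite the chaos of individual molecules
```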
The purpose here is not to discuss in what cases statistical thermodynamics fails, but to observe that over a vast volume of parameter space all gases obey quite well the simple laws of statistical thermodynamics. Amidst chaos and complexity, a certain simplicity emerges for some cases.
As for fluid systems, none of the necessary hypotheses of statistical thermodynamics is valid. Fluid systems are dissipative and do not conserve mechanical energy, let alone kinetic energy.
Fluid systems’ phase spaces are infinite dimensional while statistical thermodynamics’ phase spaces are finite dimensional even if the dimension is large.
Parcels of fluids cannot be considered independent and uncorrelated because they are part of a continuum described by the Navier Stokes equations.
Therefore there is no analogy between a fluid system and an ensemble of molecules, and it is impossible for a “simplicity” to emerge in a way analogous to the case of statistical thermodynamics.
Chaos theory illustrates this absence of any analogy in a much crisper way.
The dynamics of an ensemble of molecules is a particular case of Hamiltonian chaos. Hamiltonian chaos describes the behavior of nonlinear systems that conserve mechanical energy. It also governs the orbits of bodies in the solar system and was studied by Poincaré in the frame of the famous 3-body problem more than 100 years ago. http://en.wikipedia.org/wiki/Three-body_problem
The dynamics, attractors and orbit stabilities are relatively well understood and are formulated in the KAM theory http://en.wikipedia.org/wiki/Kolmogorov–Arnold–Moser_theorem of invariant tori. The invariance of the tori is guaranteed by the constraint that the system must stay on hypersurfaces of constant mechanical energy. Interested and mathematically advanced readers may see ftp://ace1.ma.utexas.edu/pub/papers/llave/tutorial.pdf
On the other hand the dynamics of fluid systems is governed by non-Hamiltonian spatio-temporal chaos driven by a cascade of energy from inertial scales to dissipative scales [see my previous post Spatio-temporal chaos http://judithcurry.com/2011/02/10/spatio-temporal-chaos/]. This domain is recent and still poorly understood. Turbulence, for example, is a particular case of spatio-temporal chaos related to small dissipative scales, while ENSO is a particular case of spatio-temporal chaos related to large inertial scales.
From the above it follows that there is absolutely no analogy between statistical thermodynamics and fluid dynamics, so that no simplicity can emerge in the latter for reasons analogous to those in the former.
2) The argument of seasonal analogy
This argument is based on the observation that despite the chaotic properties of the weather, it is possible to statistically predict that the temperature in Montreal will be higher in summer than in winter. Another variant of the same analogy on a different time scale is that it is possible to statistically predict that the temperature during the day in Montreal will be higher than during the night.
From that observation it is then concluded, by analogy, that because it is possible to predict some property of a chaotic and fundamentally unpredictable variable, one may generalize to all predictions concerning this or other chaotic variables, so that the intrinsic chaos is irrelevant.
In this case, chaos theory explains the observation and invalidates the analogy too.
We will restrict ourselves to the Navier-Stokes equations as a proxy for the weather dynamics, without loss of generality.
It has been proven (see for example Foias and Temam [link]) that the N-S equations possess a global attractor. A global attractor is defined as an invariant subspace of the functional space (here the Hilbert space of square integrable functions) where the N-S solutions live asymptotically.
Further it has been proven that the global attractor is finite dimensional. The dimension of the attractor can be considered as the finite number of different spatial and temporal Fourier modes that fully define the asymptotic dynamics. This means in practice that only N Fourier modes are necessary to describe the dynamics; the remaining (infinity – N) modes are “enslaved” to the dominant N modes.
There exists an abundant literature estimating N and the properties of the global attractor. Unfortunately the upper bound of the dimension is given as a function of the Grashof number which is in practical cases O(10^10) or O(10^20), which puts the calculation of the attractor out of reach.
Now if we fix the spatial point for example to Montreal, then the observed dynamics will apparently depend only on time so that only the temporal Fourier modes remain with, of course, unknown weightings because the attractor’s structure at this point is unknown.
Yet the local oscillator may be postulated to be a case of simple, merely temporal chaos with no spatial correlations to neighboring points. Such an oscillator will possess an attractor too. Any such attractor can be thought of as a skeleton of an infinity of unstable periodic orbits: the system follows an unstable periodic orbit for a time, then drifts to another unstable periodic orbit, and so on.
The behavior of chaotic systems submitted to periodic forcings has also been studied and often leads to frequency or phase locking. This means that the unforced chaotic system which has a continuous spectrum with no significant peaks partially synchronizes with the forcing and a peak at the forcing frequency appears.
For visualization purposes one can then represent the topology of Montreal’s attractor like the numeral 8, where the lower loop represents the winter (or night) states and the upper loop the summer (or day) states. A forcing with a period of 1 year will then cause the chaotic orbit to synchronize in a sequence U-L-U-L-U …, where the system spends approximately 6 months moving in the upper loop of the 8 and 6 months moving in the lower loop.
If one adds the hypothesis that the system is ergodic [see my previous post Ergodicity http://judithcurry.com/2012/02/15/ergodicity/], then according to the ergodic theorem and thanks to the U-L-U-L … periodicity, the time average of the winters will converge to the phase space average of the winters, i.e. to the middle of the lower loop of the 8. The same applies for the summer and the upper loop of the 8.
From there it is trivial to conclude that in the infinite time limit, the winter time averages are lower than the summer averages because the phase space averages of the upper and lower loops of an 8 are different. This trivial prediction is the result of the topology of the attractor and of chaos synchronization with a particular strong periodic forcing with 1 year period.
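This mechanism can be caricatured numerically. In the sketch below the chaotic oscillator is a logistic map (a stand-in for chaos, not a weather model) and the figure-8 separation is produced by a single strong forcing of period 365; the “summer” average trivially exceeds the “winter” average, while the chaotic component itself shows no year-to-year predictability:

```python
import numpy as np

# Cartoon of the "figure 8": a chaotic oscillator (logistic map) whose
# state is shifted by one strong periodic forcing of period 365 days.
n_days = 365 * 200
x = 0.4
chaos = np.empty(n_days)
for t in range(n_days):
    x = 4.0 * x * (1.0 - x)          # fully chaotic logistic map on [0, 1]
    chaos[t] = x

day = np.arange(n_days)
forcing = 10.0 * np.sin(2 * np.pi * day / 365.0)   # the seasonal forcing
temp = 10.0 * (chaos - 0.5) + forcing              # chaotic "temperature"

# "Summer" = days when the forcing is positive, "winter" = negative.
summer = temp[forcing > 0].mean()
winter = temp[forcing < 0].mean()
print(summer, winter)    # summer mean > winter mean, trivially

# But the chaotic component carries no predictability of its own:
# the correlation between one year's mean and the next is ~0.
yearly = chaos[: 365 * 199].reshape(199, 365).mean(axis=1)
print(np.corrcoef(yearly[:-1], yearly[1:])[0, 1])
```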
What can we say about the topology of this simple attractor in general? We know that the topology will strongly vary spatially. At the equator it will be more like an O where the winter and summer states do not separate strongly. At the pole the separation between the two loops of the 8 will be maximum, especially because both periodic forcings (day and year) are in phase and have the strongest amplitude. This implies that this kind of trivial “predictability” is extremely variable, with high confidence in Montreal but low in Singapore, which forbids any generalization for the whole spatially extended system.
If the spatial consideration above destroys any hope of generalization, the frequency consideration is even more damning. The chaos synchronization only happens for 2 particular frequencies of 1 year and 1 day, corresponding to the 2 modes of rotation of the Earth. From this follows that no other unstable periodic orbit can be stabilized and no prediction of the kind “summers in Montreal are generally warmer than winters” can be made for any time scale longer than 1 year. There is simply no other strong periodical forcing beside the 2 already considered.
For the sake of completeness, the periodic forcing of 11 years due to the solar cycle must also have an effect on the unstable orbits of the same period and their harmonics within the attractor, but the weakness of this forcing probably leads to subtler frequency effects than the “simple” strong synchronization seen in the 1-day and 1-year cases.
The seasonal argument is thus revealed as being merely a very particular case of chaos synchronization at a single frequency at high latitude locations and has no relevance to any other time and space scale. The “seasonal analogy” is no analogy at all and more specifically doesn’t shed any light on the multi-decadal time scales of interest.
3) The Boeing analogy
This primitive (pseudo) analogy doesn’t deserve to be discussed seriously, but it appears often in the general media and on climate blogs, so a fast invalidation is in order. The argument takes turbulence as an example of chaos and submits that since we can design planes that fly, the system must be deterministic and chaotic dynamics irrelevant for all practical purposes.
Here the fastest invalidation is provided by the following picture.
We see here not only an airplane, which is obviously flying, but also a large-scale complex circulation structure extending behind the plane on 100 m scales. The fluid dynamics shown here exhibits typically strong qualitative differences on space scales going from the sub-millimeter scales at the wing level to 100 m scales behind the plane. If one is interested in the microscopic properties of the turbulence in the neighborhood of the wings, which define the drag and lift that decide whether and how the plane flies, then 2 methods are possible:
- CFD (computational fluid dynamics) which will solve the Navier Stokes equations with an extremely high spatial resolution
- spectral methods like von Kármán’s, which basically consider that the velocity and momentum distributions at microscopic scales are described by empirical spectral functions.
At microscopic scales, and only at microscopic scales, both methods (which can be further refined) give reasonable results for the purpose of computing drag and lift, so that the plane flies indeed. Yet as everybody who has learned to fly knows, even these models are only valid for a small range of plane velocities and attitudes. At velocities and attitudes that provoke stalling, the trajectory of the plane becomes unpredictable and chaotic and the plane no longer flies.
Both methods fail at the large space scales and are unable to predict the complex spatial structures extending behind the plane. CFD fails because the computing time scales like d^3 and there are some 7 orders of magnitude between the wing level turbulence and the large scale turbulence. The computing time is then multiplied by 10^21, which puts the numerical solving of N-S out of reach forever.
As for the spectral methods, they fail because they postulate stochastic laws for velocity and momentum distribution while the large scale structures have clearly nothing to do with randomness. If anything this picture illustrates a ‘simple’ analogy of the global attractor in that it shows how a chaotic system selects some particular spatial Fourier modes and suppresses others.
It then becomes obvious that the reasons for which planes fly and why there exists a limited computability for drag and lift on microscopical scales have absolutely no relevance for the predictability of the dynamics at larger space scales.
Symmetrically, methods that would help to describe the large-scale structures would fail to understand and model the sub-millimeter scales necessary to compute drag and lift.
For climate, the situation is demonstrably worse than what can be seen here because any valid climate theory has to be able to predict and explain dynamics at space scales that extend to the planetary size, i.e. a further 4 orders of magnitude above the scale of the picture above.
This translates to the constraint that climate models use a resolution of O(100 km), which doesn’t allow for solving N-S or any other PDE for that matter, so that it is unclear what the computed numbers may represent.
4) How simple is simple?
We have been discussing the invalidity of the most frequent analogies in the frame of fluid dynamics and more specifically in the case of Navier Stokes equations. The simplifying choice of Navier-Stokes is taken because even if the problem of turbulence and of the existence and regularity of solutions to the N-S equations is an open problem, there are still many robust results on which we can build and there are no unknown unknowns.
The first natural question to ask is how the complexity appearing in fluid dynamics relates to the complexity appearing in weather and climate.
In weather it is safe to say that the complexity is identical. Weather models are fundamentally governed by Navier Stokes only. Therefore almost everything we know about fluid dynamics can be immediately transported to weather problems.
For the climate the situation is quite different. It has been extensively analyzed in R. Pielke’s paper here. The conclusion is that by integrating the biosphere and the cryosphere, and extending the time scales to centuries and the space scales to the planet, the system becomes much more complex, nonlinear and chaotic than the simpler weather case. Because we don’t have a theory of this more complex dynamical system that is the climate, we will restrain ourselves to the “simpler” weather case to draw some conclusions about simplicity.
As we have seen above, there exists a finite dimensional global attractor which defines the asymptotic dynamics of the system.
Considering an invariant attractor is already a first important simplification. Indeed the necessary and sufficient condition to predict the system’s dynamics is to know the large number of Fourier modes defining the global attractor. Yet the global attractor is only invariant if the coefficients (weightings) called control parameters are constant. If they vary with time because the forcings and/or the boundary conditions vary with time, then the topology and dimension of the attractor may change dramatically, regardless of whether the variation of the control parameters is small or large. This effect has been studied in the frame of many toy models like the Lorenz system and shows an unexpected birth of new dynamical regimes, for example here.
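The point can be reproduced on the Lorenz system itself. In this sketch (a toy illustration, with a crude Euler integration) the same equations are run for two values of the control parameter rho: a modest change moves the system across a bifurcation, from orbits that settle onto a stable fixed point to the familiar chaotic butterfly, i.e. the topology of the attractor changes dramatically:

```python
import numpy as np

def lorenz_step(state, rho, dt=0.002, sigma=10.0, beta=8.0/3.0):
    """One forward-Euler step of the Lorenz system (toy accuracy only)."""
    x, y, z = state
    return np.array([x + dt * sigma * (y - x),
                     y + dt * (x * (rho - z) - y),
                     z + dt * (x * y - beta * z)])

def run(rho, n=60_000):
    s = np.array([1.0, 1.0, 1.0])
    traj = np.empty(n)
    for i in range(n):
        s = lorenz_step(s, rho)
        traj[i] = s[0]
    return traj

# Below rho ~ 24.74 orbits spiral into a stable fixed point; above it
# the dynamics is chaotic. Changing one control parameter changes the
# attractor's topology, not just the numbers on it.
quiet = run(rho=14.0)    # late-time x is essentially constant
stormy = run(rho=28.0)   # late-time x keeps oscillating chaotically
print(np.std(quiet[-5000:]), np.std(stormy[-5000:]))
```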
The second simplification is to try to reduce the dimension of the attractor.
One way is to use models called coupled map lattices. This paradigm treats spatio-temporal chaos as a collection of N² identical chaotic oscillators situated on an NxN spatial lattice. Each oscillator is coupled to P neighbors and the dynamics of the system is then studied. The obvious limitation is that, as we have seen, the chaotic oscillator situated in Montreal is not identical to the chaotic oscillators at the pole or in Singapore. Despite the limitations of this simple model due to the variability of the local oscillators with space in the real world, it enables us to understand how complexity and spatial patterns arise when local oscillators are coupled. An excellent example of how this approach is useful is here.
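A minimal coupled map lattice fits in a few lines. The sketch below (a 1-D lattice for brevity, rather than the NxN case described above) uses the fully chaotic logistic map as the identical local oscillator with diffusive coupling to the two nearest neighbors; non-uniform spatial structure persists even though every site obeys the same local law:

```python
import numpy as np

def f(x):
    """Local chaotic oscillator: the fully chaotic logistic map."""
    return 4.0 * x * (1.0 - x)

def cml_step(x, eps):
    """Diffusively coupled map lattice with periodic boundaries.

    Each site is updated by its own chaotic map, then mixed with its two
    neighbors with coupling strength eps (a convex combination, so the
    lattice stays confined to [0, 1]).
    """
    fx = f(x)
    return (1.0 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))

rng = np.random.default_rng(2)
x = rng.uniform(0.1, 0.9, size=256)   # 1-D lattice of identical oscillators

for _ in range(1000):
    x = cml_step(x, eps=0.3)

print(x.min(), x.max())   # still confined to [0, 1]
print(np.std(x))          # nonzero: the lattice does not become uniform
```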
Another way is simply averaging spatially and/or temporally. Indeed, since averaging filters out Fourier modes, it reduces the volume and the dimension of the global attractor.
However this simplification is an illusion because applying an averaging operator is a one-way road. Applying the averaging operator A on the global attractor Eg defines an averaged attractor Ea(A). Ea(A) has a lower dimension and a lower volume than Eg.
However for A1 ≠ A2 we have Ea(A1) ≠ Ea(A2), which destroys the universality of the averaged attractor because the topology of Ea(A) depends on the averaging operator used.
Therefore it is only possible to construct Ea(A) if Eg is known; the converse, i.e. constructing Eg when Ea(A) is supposed to be known, is impossible. Furthermore it is neither possible to construct Ea(A) from first principles nor to find universal metrics in which some A would be objectively privileged relative to all others.
The latter shows that the oft debated question “At what scales does the climate start?” is meaningless.
If we classify the averaging operators Ai by the scale Si on which they average, then Si>Sj => Vol(Ea(Ai)) < Vol(Ea(Aj)). The information about the dynamics encoded in the attractor decreases monotonically when the averaging scale increases – the climate “starts” at no particular scale and the effect of averaging is to falsely decrease the variability (underestimate uncertainty) whose true value is defined only by the topology of the global attractor.
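The monotone loss of variability under averaging is easy to demonstrate on a synthetic signal: block averaging acts as a low-pass filter, and the variance of the averaged series (a crude stand-in for the volume of Ea(A)) shrinks as the averaging scale grows:

```python
import numpy as np

# Averaging as a low-pass filter: the larger the averaging scale, the
# more Fourier modes (and hence variability) are removed.
t = np.arange(5000)
signal = (np.sin(2 * np.pi * t / 5)       # fast mode
          + np.sin(2 * np.pi * t / 50)    # intermediate mode
          + np.sin(2 * np.pi * t / 500))  # slow mode

def block_average(x, scale):
    """Average over non-overlapping blocks of the given scale."""
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

for scale in (1, 10, 100):
    print(scale, np.var(block_average(signal, scale)))
# The variance shrinks monotonically as the averaging scale grows:
# each larger scale suppresses one more of the Fourier modes above.
```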
The third and last simplification is to focus on observed dominant Fourier modes and to study their dynamics, hoping that at least at shorter time scales these Fourier modes will continue to dominate. Formally this method also reduces the dimension and the volume of the attractor to a very low number. The difference from the averaging method is that averaging reduces the dimension in an arbitrary way with no justification, while the dominant-mode method is based on evidence and doesn’t privilege any averaging scale.
Therefore this approach is superior even to the serious averaging methods. In order to avoid misunderstandings:
Here by serious averaging methods we do not mean very low dimensional deterministic models like the 1-box … N-box models, which all belong to the “muddling by oversimplifying” category. Indeed this category of toy models always uses a notion of equilibrium. But an equilibrium is a stable point on the global attractor, and we know that the attractor contains only a large number of unstable, pseudo-equilibrium points. Therefore the system’s orbit cannot be asymptotically directed towards any stable point (i.e. equilibrium) because there are none.
All naïve 1-box … N-box models governing d&lt;T&gt;/dt, where &lt;T&gt; is some spatial average of a dynamical parameter like the GMT, contain no relevant information about the system’s dynamics and can be considered useless.
The dominant Fourier mode method has the advantage that we know these modes. ENSO is the strongest and we have an idea about its spatial and temporal Fourier modes (i.e. its dimensions and pseudo-periods). Then follow the PDO, the AMO, and more generally all observed oceanic oscillations.
Tsonis, Swanson and Wyatt with the Stadium Wave are examples of this approach, which shows promising results. These methods have in common the choice of a finite and small number of observed dominant spatio-temporal patterns; their Fourier modes are then determined, and finally one analyzes how these modes interact.
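The first step of such an approach, identifying the dominant modes, can be sketched as follows. The “index” below is synthetic (two imposed pseudo-periodic modes plus noise), not a real ENSO or PDO record; the point is only that dominant Fourier modes stand out cleanly above a broadband noise floor:

```python
import numpy as np

rng = np.random.default_rng(3)
months = np.arange(1200)                          # 100 years, monthly data
index = (1.5 * np.sin(2 * np.pi * months / 48)    # ~4-year pseudo-period
         + 0.8 * np.sin(2 * np.pi * months / 240) # ~20-year pseudo-period
         + 0.5 * rng.standard_normal(1200))       # "weather" noise

# Power spectrum of the demeaned index.
spectrum = np.abs(np.fft.rfft(index - index.mean()))**2
freqs = np.fft.rfftfreq(len(index), d=1.0)        # cycles per month

# The two dominant modes stand far above the noise floor:
top = np.argsort(spectrum)[-2:]
periods = np.sort(1.0 / freqs[top])
print(periods)   # ~[48, 240] months: the imposed dominant modes
```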
The weather/climate system is arguably one of the most complex systems we know and this complexity is with us to stay. No analogy with statistical thermodynamics applies to this dissipative non equilibrium system so that an analogous simplification will not take place.
Eliminating as oversimplified and irrelevant all low dimensional deterministic toy models, the simplest approximations are probably low dimensional chaotic empirical models based on a selection of observed dominant Fourier modes and analysis of their non linear interactions (Tsonis like models). A breakthrough in coupled map lattices theory generalizing to variable local oscillators would bring us even farther.
The vulnerability of low dimensional chaotic simple models is that their predictive skill crucially depends on the validity of the founding hypothesis that the spatial and temporal Fourier modes used in the models are and remain dominant. As the time scale over which this hypothesis stays valid is unknown, the statistical predictability of the system via such models is necessarily limited in time.
JC comments: This post was submitted via email, and I did some light editing. As with all guest posts, keep your comments relevant and civil.
Yes, the most important factor for regulating earth temperature is really this simple.
When the oceans are warm and wet, it snows more and that bounds the upper limits of temperature and sea level.
When the oceans are cold and frozen, it snows less and that bounds the lower limits of temperature and sea level.
CO2 just makes green things grow better, while using less water.
What is your time scale between stopping emissions and restarting them if they are found to be safe?
Rambling. Incoherent. Did Curry lose a bet?
“Yet as everybody who learned flying knows, even these models are only valid for a small range of plane velocities and attitudes. ”
I learned to fly in a Cessna 172 circa 1991 and used Microsoft Flight Simulator’s Cessna 172 to practice instrument flight (the view out the virtual windows was pretty deficient for visual flight rules) but the simulated Cessna 172 plane flew quite accurately through its entire operational envelope.
So your statement is false because I learned to fly using both a real plane and a toy model of a real plane on a personal computer. What is your experience again?
“the simulated Cessna 172 plane flew quite accurately through its entire operational envelope.”
Exactly! The “envelope” is the multidimensional space within which the flow around the aircraft (and particularly the flight surfaces) stays simple and predictable. Now you just try to go a bit outside it…..
I once flew upside down in a Boeing 747. Crashed and burned but what the heck.
When you’re tumbling through the air uncontrollably it doesn’t really matter any longer how faithful it is to reality. This would be like a medical simulation that predicts you’re going to die and you protesting because it doesn’t predict the eulogy. Spare me.
It’s a misunderstanding that’s built into all the pre-chaos-theory paradigms. This highlights the importance of considering paradigms (e.g. Kuhnian) in any discussion of “proofs” “falsification” or other such concepts WRT any field that studies hyper-complex non-linear systems.
You are needed in the CE room stat! ;-)
You can always draw an analogy. The question is how far the similarity extends, and whether it helps more than it harms (the latter by suggesting false equivalences).
Think of ancient maps: at some point, far enough away from the center of map-making, we find “Here there be tygers” or some equivalent statement of non-similarity. In many such maps, between regions which were mapped correctly (according to the assumptions expected for the readers) and “Here there be tygers” were regions that were mapped vaguely and incorrectly, with no documentation of the lack of knowledge. Such maps were often worse than useless due to their tendency to give false confidence to “facts” that were nothing but fanciful rumours. But, within stricter limits, they remained useful.
Here there be dragons seems more apt.
The Heat Death of the Universe is a prediction from Thermodynamics for the ultimate end of every particle of everything. A century ago, that was pretty much as complex as we could get on the birth and demise of the Universe. This was the simple picture we had of all the complexity of everything ever.
Nowadays, even cartoons like http://www.youtube.com/watch?v=oZE-WMy513I give us simple pictures of much greater detail of all the complexity of everything ever.
We’re never going to make the truly complex on its own scales of time and size less complex; we are however not interested always only in simplicity but sometimes in understanding the principles guiding complex histories, and resolving the details of big pictures to a scale meaningful to ourselves.
Where the future is involved, we know we cannot take any but a Risk-based approach, of chances or probabilities or expectations built from the principles we have to date. And the principles we have to date tell us to expect for acts we take there to be consequences, perhaps unknowable in detail but attributable in cause to those of us acting in certain ways. We know we can place a price on certain of those future expectations that are negative, by the Law of Supply and Demand. We know we can identify those who cause the costs of future negative expectations, and we can require them to pay the price or stop being the cause.
In the end, Capitalism is a great simplifier.
“We know we can identify those who cause the costs of future negative expectations, and we can require them to pay the price or stop being the cause.”
The problem is that we can’t even begin to quantify the size/cost of future consequences. You draw conclusions based on a set of firmly held policy views…. and use those conclusions to justify your policy views. Circular logic at its worst. The first step is to accurately quantify what future warming will actually be. The second step is to accurately predict the consequences (potentially both good and bad) for that warming. The third step is to rationally quantify future costs and future benefits for those consequences. The fourth step is (if costs outweigh benefits) to apply a reasonable discount rate to estimate the present net value of those costs. The final step is to impose additional costs on those who emit more CO2, in proportion to the present net value of future costs. You have the cart far ahead of the horse; heck, you have a cart in detailed design stages before the invention of the wheel. I very much doubt policies like you want will be implemented any time in the next few decades, and maybe never. So if I were you, I would not get my hopes up.
Steve Fitzpatrick | May 23, 2014 at 11:55 am |
That’s just senseless babble, barty. What you want is for the authorities to impose carbon taxes on a public that doesn’t want it. Stop the diversionary BS about capitalism. You want taxation without representation. You are not fooling anybody.
Your reply strikes me as being just as disconnected from reality as was your original comment. You are proposing to impose costs on the emission of CO2 which appear without reasoned justification. I am not. I note again, you said:
“We know we can identify those who cause the costs of future negative expectations, and we can require them to pay the price or stop being the cause.”
Which implies clearly that you have an idea of how “the price” would be determined… so how do you determine that price? My proposed steps lay out a reasoned path to evaluate the ‘external costs’ for fossil fuel use and find a suitable ‘price for carbon’. What is your proposed path?
Like I said before, I would not get my hopes up if I were you.
stevefitzpatrick | May 23, 2014 at 1:04 pm |
Again, you err, with politburo-style rationalization.
Capitalism prices some things, and leaves prices off some things. That is, some things are in the Market, and some are treated as Commons or otherwise.
Have you never wondered how that distinction comes about?
It isn’t an arbitrary process, where one day some government regulator says, “Hey, let’s sell radio bandwidth,” for example.
There are six minimal requirements before a good or service is priced, either by government fiat or natural law.
1. Scarcity: does the good or service have some quality where lucrative use diminishes its availability, such as does the CO2 level go up in ppmv if human industry burns more carbon?
2. Capitalizability: is there a means to induce investment, to bring sellers to the Market, such as giving to all citizens per capita the revenues of carbon pricing through payroll tax deductions and repayment mechanisms, or inducing entrepreneurs to come up with more carbon-efficient processes to reduce their carbon costs by pricing carbon emissions?
3. Rivalry: does the good or service demand a choice between one consumer and another as rivals, such as moving the CO2 level from 300 ppmv to 301 ppmv cannot be regained within a reasonable wait time, except by costly processes?
4. Excludability: do we have the means to prevent lucrative use of the good or service, such as can we prevent dumping of CO2 into the carbon cycle’s service of CO2 waste disposal by preventing the sale and monitoring the process emissions and leaks of carbon-based volatiles?
5. Administrability: can we ensure, as Weights and Measures ensure, that a fair Market can be maintained in this good or service, for instance by repurposing the retail tax system to add a carbon-content component to the price of all carbon products for sale?
6. Marketability: can we induce buyers to the table, for example by putting a fee on the carbon component of carbon-based volatiles that will give the buyers the power of the democracy of the marketplace to choose which budget decisions are best for them as individuals?
If you missed how Capitalism works in grade four Civics, thanks for giving me the opportunity to clear that up for you.
Durning and Bauman might give you a hand in this regard on one alternative way this is being done: http://daily.sightline.org/2014/05/22/17-things-to-know-about-californias-carbon-cap/
How simple is that?
Now, if you want to complicate it, there’s analyses and introductions such as http://wealthofthecommons.org/essay/common-goods-don%E2%80%99t-simply-exist-%E2%80%93-they-are-created that depend on Elinor Ostrom’s work, but really, a good Capitalist only needs to know that if you aren’t paying for what you get, you’re likely a freeloader, thief or communist.
That’s nice, barty. The government Capitalists on the left coast are going to cap carbon and squeeze it out of the market. We have already started to feel the squeeze in California. It’s squeezing a lot of money out of consumers and it’s squeezing businesses and jobs out of the state. They will burn their carbon elsewhere. But the world will be saved. Thanks to barty’s cartoon version of Capitalism.
Don Monfort | May 23, 2014 at 2:49 pm |
See now, if California had followed BC’s example, it’d be pouring that money into the pockets of consumers and bringing jobs and investment into your little cartoon economy through lowering taxes, lowering tax churn, lowering tax inefficiency and achieving results determined by the democracy of the Market, not by politicians.
“The first step is to accurately quantify what future warming will actually be. The second step is to accurately predict the consequences (potentially both good and bad) for that warming.”
No, the first step is to reduce carbon emissions.
The second step is to accurately quantify what future warming will actually be.
The third step is to accurately predict the consequences.
If and only if steps 2 and 3 show carbon emissions are safe we can restart emissions.
@ Steve Fitzpatrick
“The problem is that we can’t even begin to quantify the size/cost of future consequences. ”
Or, more critically (as you noted) the SIGN of the consequences.
Bart et al simply decree, ex cathedra, that the future consequences of emitting CO2 are negative and demand that we pay taxes to them NOW to pay for their postulated future damages.
Neat gig, if you have the police power to ‘bring it home’.
Unfortunately for us and our ilk, it appears that they in fact HAVE the power and intend to exercise it. And to criminally prosecute us for objecting, never mind whether we actually pay or not.
This is for you but it landed in the wrong place.
they should be reduced very slowly, but I think the increase can be fast
I could almost think that you were trying to parody the rants of the green left with your comments, but your references are fair summaries of those rants, so I am pretty sure you are not trying to offer a clever parody. Setting a high price artificially on carbon by (fiat) limiting its availability is based on the implicit assumption that future negative consequences will be more costly at net present value than the present day economic cost of that artificial scarcity. It is a conclusion without a defensible rationale…. nothing but political philosophy substituting for reasoned analysis. Don’t hold your breath waiting for carbon rationing; it is just not going to happen, at least not any time soon.
Same baseless assumptions as Bart R, same political philosophy substituted for reasoned analysis. Don’t count on it happening.
That is a little vague. It is said we can’t reduce temperature for 1000 years, as a proportion of CO2 remains in the atmosphere for at least that time.
Surely it is impractical to say curb CO2 in the next ten years – with all that implies – and then restart it 20 years after that?
Bob Ludwick | May 23, 2014 at 4:54 pm |
I’m a Capitalist. If you could show me that giving out shoes for free to everyone would result in the solution to all the world’s problems, I would object to the giving away of free shoes. Pay for your freaking footwear. You aren’t willing to pay for your shoes? Well then perhaps those solutions to those problems aren’t worth it to you.
If you could show me that CO2 emissions would solve all the world’s problems, I would still object to free CO2 emissions. Pay for your freaking CO2 dumping. You aren’t willing to pay for your CO2 dumping? Well perhaps that solution to those problems isn’t worth it to you.
What do you have against Capitalism, the foundation upon which America is built?
Why do you hate America?
Steve Fitzpatrick | May 23, 2014 at 5:08 pm |
Price by FIAT?! WHAT THE FREAK ARE YOU TALKING ABOUT?!
I said Law of Supply and Demand.
You understand how that works, right?
The price of emitting CO2 is raised until the point the next penny of rise in price results in less total revenue collected. That’s the Law of Supply and Demand. That’s how goods are priced in Capitalism. That’s how the Market works.
Not by fiat, but by the democracy of the individual buying and selling decisions of everyone in the Market.
HOW DO YOU NOT GRASP THIS?
Have you never shopped in America?
bart does indeed fail Econ 101
No, that is the law of diminishing returns; there is no demand or supply mentioned in your statement, NOR is the market allowed to set the price. The price is being set centrally – just like the old USSR. That worked out real well.
It is also how a progressive tax works.
From the article on the BC carbon tax:
Another relevant factor is the surge in cross-border shopping. The past five years have seen a doubling in the number of British Columbians visiting Washington state, most of whom fill their tanks while there (many B.C. truckers and commercial vehicle owners also buy fuel in the U.S. and in Alberta).
Growing cross-border fuel purchases artificially lower reported energy consumption here in B.C.
What about the economic effects of the carbon tax? SP argues it has had little impact on B.C.’s macroeconomic performance, because the carbon tax revenues have been fully recycled back into the economy through personal and business tax relief measures. This claim makes sense. However, the rising energy costs stemming from the carbon tax have hurt some of B.C.’s export industries, as well as manufacturers forced to compete with imports in the domestic market.
In aggregate, the government’s “tax shift” policy has imposed a net financial cost on businesses: the carbon tax paid by all B.C. enterprises (about $600 million per year) exceeds the revenues they save from slightly lower business tax rates. And with the provincial government’s recent decision to lift the corporate tax rate from 10 to 11 per cent, any economic benefits accruing to the business sector as a whole under the carbon tax regime will be further diminished.
Read more: http://www.vancouversun.com/business/2035/carbon+hurting+businesses/8739247/story.html#ixzz32ZlazSPa
==> ” Setting a high price artificially on carbon by (fiat) limiting its availability is based on the implicit assumption that future negative consequences will be more costly at net present value than the present day economic cost of that artificial scarcity. ”
Yes, much better that we set a low price “naturally,” by externalizing the cost of particulates in the air and the geopolitical costs of keeping oil flowing, not to mention any potential costs due to the effects of ACO2 emissions on the planet.
jim2 | May 23, 2014 at 5:18 pm |
Why not read more from guys who are actually Economists from the US side of that border, instead of from newspaper hacks from the Canuckistan side?
So you have a choice. You can believe friend of Climate, Etc., Yoram Bauman, the Stand Up Economist and your fellow American, or you can believe http://www.bcbc.com/pdfs/Jock%20Finlayson%20Bio%20with%20Photo.pdf Executive Vice President and Chief Policy Officer at the Business Council of British Columbia, an association representing 250 large and mid-size BC companies, a man with so many conflicts of interest it makes your teeth spin.
Is it your plan to talk us all to death, barty? Seriously, don’t you have anything else to do? Any other opportunity for human contact?
You should read more carefully. I said you propose limiting supply of fossil fuels by fiat, not setting a price by fiat…. though as any self proclaimed capitalist like you understands, reducing supply will increase price, which is the economically damaging consequence of this policy. You have carefully avoided addressing the real issue here: Your implicit and unsupported assumption that future harm from CO2 emissions at net present value justifies higher energy costs today. (Or if you prefer, that the external costs of CO2 emissions are high enough to justify forced CO2 emissions reductions.) I do not expect you will actually address this issue, since you do not appear to actually have a reasoned argument, only a desired policy outcome, so it is likely not worthwhile to spend any more time on it. I suggest you vote for politicians who think like you, and I will vote for those who think like me.
Your sarcasm does not serve you well; it only discredits whatever else you say.
stevefitzpatrick | May 23, 2014 at 5:55 pm |
Capitalism is not a fiat.
Neither the price nor the supply is a fiat.
The price is a natural consequence of the Law of Supply and Demand. The amount the Market will supply is a natural consequence, too.
We do not need to know whether there is any harm whatsoever to know that scarcity is a property of the carbon cycle’s ability to dispose of CO2 waste of industry. We know there is scarcity because there is rise in CO2 level. It needn’t be a dangerous rise or a dangerous level. It could even be a beneficial rise. If it’s a benefit, then those who want the benefit ought to be willing to pay for it. If they’re unwilling to pay, they’ve proven it’s no benefit.
You really don’t grasp how Capitalism works?
I find that hard to believe.
In Capitalism, we don’t call future harms or proof thereof the fundamental issue, nor need for that issue to raise its head at all. Proof of harm is a subject for the Law of Torts, not the Law of Supply and Demand.
Are you trying to construct a case to sue carbon burners in civil court?
Well, that won’t work. SCotUS has ruled the EPA has the sole right to deal with the issue of harms of CO2 emissions.
Tell me; do you prefer the EPA to have that power, or the Market?
You are in need of an intervention, barty. I am going to stop encouraging your odd behavior. It’s no longer funny.
Bart R fails Econ 101: “The price of emitting CO2 is raised until the point the next penny of rise in price results in less total revenue collected. That’s the Law of Supply and Demand. That’s how goods are priced in Capitalism. That’s how the Market works.”
There is no justification for maximizing the revenue collected from an emissions tax. None. The resulting tax rate from such a procedure (assuming counterfactually that it could be implemented) would in general either be above or below the optimal rate.
The optimal emissions tax rate balances the marginal benefit of emissions reduction with its marginal cost. The way you do that is to estimate the incremental social benefit of reducing emissions and set the tax to that level. Then anyone whose cost of mitigation exceeds the social benefit keeps emitting (and pays the tax) and anyone whose cost of mitigating is below the social benefit mitigates. That result would maximize the total economic surplus created. The revenue-maximizing tax has no connection to marginal social benefit, only to cost, and so could be way too low or way too high relative to the optimal benchmark.
It’s surprising that Bart R supposedly favors market mechanisms for pollution control but doesn’t understand the basic economics behind them. This subject has been studied to death since the 1970s and there is a vast literature discussing the finer points of using taxes versus tradeable emission permits, implementation problems, etc. Bart R’s Laffer-curve maximization approach rightly has no place in this literature.
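[Editor's note: the distinction can be shown with a toy model, a sketch with made-up parameters rather than a calibrated estimate. Assume a linear demand for emissions, Q(t) = (a − t)/b, so at a tax of t per tonne only emitters who value emitting at more than t keep emitting.]

```python
a, b = 100.0, 1.0            # hypothetical demand parameters ($100/tonne choke price)

def quantity(t):
    """Emissions demanded at a tax of t per tonne."""
    return max(a - t, 0.0) / b

def revenue(t):
    return t * quantity(t)

# Revenue-maximizing rule: scan candidate tax rates, keep the highest-revenue one.
rates = [r / 10.0 for r in range(0, 1001)]
revenue_max_tax = max(rates, key=revenue)        # lands at a/2 = 50.0

# Pigouvian rule: set the tax equal to marginal social damage, an empirical
# quantity that has nothing to do with a/2 (the 20.0 here is purely hypothetical).
pigouvian_tax = 20.0

print(revenue_max_tax, pigouvian_tax)
```

Under these (made-up) numbers the revenue-maximizing rate is 50 regardless of what the damages are, while the damage-based rate is whatever the damages are; the two rules coincide only by coincidence, which is the point of the critique above.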
“SCotUS has ruled the EPA has the sole right to deal with the issue of harms of CO2 emissions.”
Ummmm… no, the Congress gives that authority to the EPA; what the EPA can and can’t do with regard to CO2 emissions, or any other environmental issue for that matter, can at any time be changed by Congress…. by fiat, if you will. I actually would not be surprised if Congress does take some steps to restrain the EPA’s actions on CO2 emissions, depending of course on the outcome of the next two election cycles.
Honestly, your understanding of capitalism seems to me so odd as to force a smile. This is not a subject upon which there seems any possibility of agreement between us.
I am in Europe and it is late. Ciao.
stevepostrel | May 23, 2014 at 6:28 pm |
Again, you don’t grasp Capitalism.
It’s not a freaking tax if every penny of the revenues go to the owners, directly to their pockets, in the form of cash money.
Why do you insist on getting this wrong?
Are the revenues from cell phone bills taxes?
Are the revenues from selling apples taxes?
Are the revenues from shoe shines taxes?
Pollution control is not the issue, and is a red herring.
We’re not talking about an externality, because we have no more way to justify treating CO2E costs as an externality than we do to justify shoe shines, apples or cell phones as externalities.
Give your head a shake.
stevefitzpatrick | May 23, 2014 at 6:31 pm |
Ah. Europe. Explains why you can’t understand Capitalism. Sorry to have confused you. It’s an American thing.
Of course, you are correct stevepostrel. What you are missing is that barty thinks he is mocking/parodying Capitalists/”skeptics” (us). It’s a variation on joshie’s silly games. It’s not funny. That’s why you didn’t get it.
From the article:
VANCOUVER, May 9, 2014 – The Cement Association of Canada (CAC) is calling on the B.C. government to once and for all change how the carbon tax is applied to the cement sector to address the unintended consequences of the tax and to ensure a sustainable industry that provides family-supporting jobs.
“The cement industry wants to be part of the solution to climate change through equitable application of the carbon tax,” says CAC President and CEO Michael McSweeney. “We continue to push the government to live up to its own Budget 2013 and B.C. Standing Committee on Finance recommendations to examine the carbon tax and address the devastating impact on the cement industry.”
After six years of inaction by this government, local producers have lost nearly a third of the market share to imports since the inception of the carbon tax in 2008. Imported cement coming from the U.S. and Asia is exempt from carbon tax. This creates an unfair advantage for foreign producers, having a negative impact on climate, the Jobs Plan and investment in B.C.
“The cement industry is vital to develop the required infrastructure for an LNG industry, mines, the Site C clean energy project, and the roads and bridges that keep our economy moving,” says McSweeney. “B.C. is the only jurisdiction in the world that does not recognize Energy Intensive/Trade Exposed industries like cement and concrete”.
From the article:
How about the forestry industry? There is a lot of talk about the need for “value added: in the forestry sector in order to create good paying jobs. I couldn’t agree more. But is subjecting our softwood lumber manufacturers to a carbon tax on their production going to achieve that? No, it won’t, because the carbon tax is effectively a tax on processing which is another term for value added. In fact, what this might encourage is the export of raw logs to places like Washington and Oregon where they can be processed carbon tax-free and then reimported to be B.C. where they compete unfairly with domestically manufactured lumber. Again, not only are we not reducing carbon emissions, but we are putting the B.C. economy at a competitive disadvantage.
jim2 | May 23, 2014 at 6:56 pm |
The cement industry in BC is 125% the size it was in 2007, before the carbon tax, and produces only 56% as much CO2E today as it did then, while experiencing substantial savings from converting to less carbon-intensive processes through the benefits of modernization.
It has received substantial inputs from the revenues of the carbon tax, and although it has ‘lost’ market share, this is principally because the exchange rate for the Canadian dollar has heavily shifted.
In short, another industrial lobbyist is lying.
“Boo hoo. I’m rich. Give me government handouts.” That’s what the Cement Association is crying, all the way to the bank.
Thanks for shining a light on this.
You should look into the false claims of the BC Greenhouse Growers next.
jim2 | May 23, 2014 at 6:59 pm |
That’s hilarious. You’d have been better off focusing on the Greenhouse Growers; forestry is getting so much benefit from the carbon offsets market in BC it’s laughable to hear them complaining about the carbon tax.
Maybe if you ask them how they’d feel about losing their offsets funds, you’d hear their true feelings?
From the article:
Prince George, B.C. – Premier Christy Clark says the carbon tax has had a “disproportionate impact in rural communities.”
She says she lives in Vancouver and has access to alternate forms of transportation, including public transit, which is just not the case for all British Columbians.
While she says the government has little to do with the price of gas at the pumps, governments do collect a lot of taxes that are tacked on to that price. She says that is one of the reasons why the Government has agreed to take a pause on increasing the carbon tax.
Speaking on the Meisner program on CFISFM, Premier Clark asked for listeners to provide their input on the carbon tax “What should we do with the carbon tax, where should we go with it? What changes would they like to make ? It certainly wasn’t perfect when it was brought in.”
She recognizes that B.C. is a leader in climate change issues, “But no one has followed, so we look at where the carbon tax has taken us and there have been all the offsetting taxes, but if there are no followers, it has put us at a real competitive disadvantage for all of our competing businesses if we continue to raise it. So we have to be cautious about that balance that we find.”
From the article:
The carbon tax is an unfair tax in that it affects some business sectors such as processing, manufacturing and resource development more than other sectors, thereby selecting from the entire business community both winners and losers – not an intended consequence.
It also affects agriculture in that they are “price-takers”. The prices at which they are required to sell their products are set in larger markets where BC agriculture producers must “take” the price set in that market. They have little or no influence on the level of pricing set in national and international markets and cannot pass on their increased costs due to carbon tax. The greenhouse industry is a price-taker and was recognized as requiring assistance since their competition came from other jurisdictions which were much larger and effectively set the pricing in BC. They received assistance from the government by means of grants relative to the amount of the carbon tax paid by the greenhouse industry.
If the purpose of the carbon tax is to reduce carbon emissions, then the tax has not been effective in achieving that purpose. Carbon emissions in BC have actually increased during the period when the carbon tax has been in effect. Several sectors, such as cement and mineral processing that require heat for their operations, have no choice but to use carbon products. The result of imposing a carbon tax on their operations has been to substantially increase production costs in comparison to companies in other jurisdictions. The unintended result of the carbon tax then becomes the export of jobs from BC to jurisdictions without such a tax, such as Washington, Alberta, China and other jurisdictions. In most cases the end product from such operations will still be produced in other jurisdictions so that the amount of carbon gas generated will be the same worldwide, whether people are employed in BC to produce the goods or not. The carbon tax has created a detrimental effect on portions of the BC economy because BC is the only jurisdiction within our trading sphere where such a tax is in force.
Bart R is now sounding like a bad student arguing over the grade he got after screwing up the exam question.
“Again, you don’t grasp Capitalism.”
The capitalization of capitalism is a bit of a tell, but more important is that emissions fees are not set by private parties but by a central government authority and so are a form of regulation, albeit one that is more compatible with harnessing the profit motive than traditional technology requirements.
“It’s not a freaking tax if every penny of the revenues go to the owners, directly to their pockets, in the form of cash money.”
This makes no sense. First, because whether you call it a tax or a fee or a fine or a penalty or a price is pointless semantic quibbling. Second, because within the pointless semantic quibbling it does not comport with standard usage, as income tax rebates don’t stop us from calling income taxes “taxes.” Tax rebates do not negate the taxiness of taxes.
“Why do you insist on getting this wrong?”
Because I’m sane.
“Are the revenues from cell phone bills taxes?
Are the revenues from selling apples taxes?
Are the revenues from shoe shines taxes?”
Again, the semantics are unimportant, but the normal usage is that when the government forces you to pay for something (e.g. earning income, importing goods, owning land, buying gasoline or alcohol) in excess of what private parties would charge, we call that a tax. When I pay the government gasoline taxes at the pump on top of the price I pay to the gas station owner, is that not a tax? Emission fees can be called anything you want, but the key point is that they are government imposed payments in excess of payments enforced by private owners.
“Pollution control is not the issue, and is a red herring.
We’re not talking about an externality, because we have no more way to justify treating CO2E costs as an externality than we do to justify shoe shines, apples or cell phones as externalities.”
Well, we could certainly have a vigorous discussion about the merits of the externality concept as used in neoclassical economics. The late Aaron Wildavsky argued that it was simply a political club used to coerce others when you don’t like their behavior. But if you want to stick to the standard definitions, CO2 emissions would be the classic, archetypal example of an externality. Literally textbook. Shoe shines would not, because I cannot physically consume shoe shines without the owner’s consent, because the provider of a shoe shine is easily identifiable, and because the shoe shiner has a private property right to his labor and materials, unlike atmospheric “services.”
“Give your head a shake.”
Oh, my head is shaking all right, but it is over your arrogant ignorance.
jim2 | May 23, 2014 at 7:16 pm |
Did you save these clippings up all year?
Regurgitating pap straight out of newspapers may be entertaining and all, but it doesn’t prove anything.
Thoughtful analyses, actual data, real accounting and measurement, show the benefits to BC of their carbon pricing scheme. People whinging and complaining and making stuff up is normal and natural, like petty theft. It doesn’t mean we should encourage it.
Prove the truth of the claims you’re reciting with independently verifiable analyses.
stevepostrel | May 23, 2014 at 7:24 pm |
The government also forces you to pay for the goods you walk out of a store with. If you don’t pay, government police officers take you to a government holding cell where a government justice sets a government bail and remands you to a government court where a government judge judges your crime of theft, and you can get sent to a government prison.
Weights and measures? The government enforces those, too.
Currency? Yup, that’s the government.
Me, I want less government.
When some far future version of bitcoin operates without intervention of government, we should all be better off and government should be smaller. When we can all carry and use devices to evaluate the quantity and quality of goods on offer without some government bureaucrat putting a stamp on the package, we should all be better off, and there should be fewer bureaucrats. When we don’t need clumsy systems of courts and lawyers and prison guards to render due process, I’ll be fine with that if that minimization can achieve — as it should — more justice without becoming soft on crime. But that isn’t today, yet.
I don’t care the particulars of how carbon pricing gets done: it can be a set of lobby groups demanding sellers price the carbon in their goods or they will boycott them; it can be a system of weights and measures enforced by meat inspectors or vehicle inspectors or private contractors; it can be LIDAR-equipped probes sent out by Google used by their rivals marketing departments to drive consumers from emitters toward non-emitters. The how doesn’t matter, so long as the carbon price is instantiated under the Law of Supply and Demand and the revenues go fully and directly to every citizen per capita. And I don’t care whether harm or benefit are proven to me, because I am not such a tyrant as to believe I can replace the judgment of every individual in the world with my own.
Why do you believe you’re so all-knowing that you can?
Ooops! Hockeystickesque ad hoc stats problems with research by famous left-wing cherry-picking economist:
“Thomas Piketty’s book, ‘Capital in the Twenty-First Century’, has been the publishing sensation of the year. Its thesis of rising inequality tapped into the zeitgeist and electrified the post-financial crisis public policy debate.
But, according to a Financial Times investigation, the rock-star French economist appears to have got his sums wrong.
The data underpinning Professor Piketty’s 577-page tome, which has dominated best-seller lists in recent weeks, contain a series of errors that skew his findings. The FT found mistakes and unexplained entries in his spreadsheets, similar to those which last year undermined the work on public debt and growth of Carmen Reinhart and Kenneth Rogoff.”
In his defense, the suddenly famous left-wingo economist protested:
“Contacted by the FT, Prof Piketty said he had used “a very diverse and heterogeneous set of data sources … [on which] one needs to make a number of adjustments to the raw data sources.”
In other words, the raw data has to be cooked to get it to taste right. Sometimes it’s got to be used upside down. And if it still ain’t palatable, it has to be censored.
Now Bart R descends to the level of undergraduate late-night dorm arguments:
“The government also forces you to pay for the goods you walk out of a store with. If you don’t pay, government police officers take you to a government holding cell where a government justice sets a government bail and remands you to a government court where a government judge judges your crime of theft, and you can get sent to a government prison.”
Wow, and we all might just be living in the toenail of a giant, man. Mind-blowing. To think that free enterprise is not necessarily the same as anarchy! Who would have thought of that? But you see, the issue is that the owner of the store can identify which property is his and then call the police, and we agree that he has a right to exclude all others from that property without his voluntary consent. Neither of these conditions holds for the air. So maybe the real issue–that government policy in enforcing private property rights is entirely different from enforcing restrictions on the use of un-owned common property–hasn’t been illuminated by Bart’s bull-session bull.
“Weights and measures? The government enforces those, too.
Currency? Yup, that’s the government.”
More belaboring of the irrelevant obvious.
“Me, I want less government.”
“When some far future version of bitcoin operates without intervention of government, we should all be better off and government should be smaller. When we can all carry and use devices to evaluate the quantity and quality of goods on offer without some government bureaucrat putting a stamp on the package, we should all be better off, and there should be fewer bureaucrats. When we don’t need clumsy systems of courts and lawyers and prison guards to render due process, I’ll be fine with that if that minimization can achieve — as it should — more justice without becoming soft on crime. But that isn’t today, yet.”
I once had a conversation with Fred Smith of the Competitive Enterprise Institute wherein he postulated a future in which each of us would be able to monitor the bubble of air around us at all times and exclude others from trespass, as if we were each walking around in a spacesuit. I think he referred to a future hypothetical civilization on Ganymede or something. It was fun but had little bearing on today’s policy problems, as does Bart’s fantasy. Of course he neglects the problem of enforcement in his utopia, but what the heck. It has no bearing on the choices before us now, which are about applying new forms of government regulation and coercion on the private activities of citizens under the justification of controlling externalities.
I say again, justified by controlling externalities. Taxes and regulations imposed on the unwilling and willing alike, without any private owner who can identify individual loss of use of his property. Nothing to do with extending capitalism or whatever dissociative ravings Bart prefers.
“I don’t care the particulars of how carbon pricing gets done: it can be a set of lobby groups demanding sellers price the carbon in their goods or they will boycott them; it can be a system of weights and measures enforced by meat inspectors or vehicle inspectors or private contractors; it can be LIDAR-equipped probes sent out by Google used by their rivals marketing departments to drive consumers from emitters toward non-emitters. The how doesn’t matter, so long as the carbon price is instantiated under the Law of Supply and Demand and the revenues go fully and directly to every citizen per capita. And I don’t care whether harm or benefit are proven to me, because I am not such a tyrant as to believe I can replace the judgment of every individual in the world with my own.”
Not caring about the particulars means not caring if the policy is workable and beneficial or not, but let’s set aside that particular piece of silliness. Let’s also not laugh out loud at the non-governmental-enforcement fantasies in the above paragraph.
It’s still true that setting the tax rate to maximize the revenue from a carbon tax, as Bart suggests, would be completely stupid. It would not be according to supply and demand, because it ignores the DEMAND for emissions reductions, which ought to be based on the marginal social harm of emissions or equivalently the marginal social gain to mitigation.
“Why do you believe you’re so all-knowing that you can?”
You see, another obvious distinction between CO2 mitigation and shoe shines is that we can all choose our own level of shoe shining but are forced to share the same level of atmospheric CO2. So there can’t be free-market, individual choice about the demand for mitigation because we all share the same atmosphere. We’re not buying air to put in our spacesuit tanks and choosing how much of different chemicals we want in our personal mixture.
It is amazing that someone could write the same obsessive-compulsive slogans so many times without engaging in the elementary reflection that would reveal these obvious flaws in his position. Think!
stevepostrel | May 23, 2014 at 8:12 pm |
That is a nice clear refutation of Bart R’s silly refusal to address the substantive issue at hand: ‘externalities’ must be evaluated by non-market means, and only then addressed by taxes, if that is justified. The silly discussion of the mechanism by which taxes are imposed only avoids the real issue. Or as I said in my very first comment, Bart R uses his desired policy outcome to draw a set of conclusions…. which he then tries to use to justify his desired policy. Utter sophomoric rubbish.
I happen to be traveling in Europe right now; I am from the States.
” private parties who are offering the use of their share of the carbon cycle to dispose of waste CO2 get paid by the consumers of that waste disposal service.”
OK, so now I see you are assigning ownership of ‘the carbon cycle’. And you implicitly think the only proper CO2 level is, well, where that cycle is ‘in balance’ so that atmospheric CO2 is constant. Which once again avoids addressing the real issue of external costs: Why is a constant level of atmospheric CO2 the only acceptable level? Why not a falling level, or a rising level, or a level which must be cycled by human activities +/-50 PPM from a long term mean each 102.5 years? There is no justification offered, just an assumed ‘right’ balance.
Your arguments amount to nothing more than bizarre rubbish. They are based on nothing more than a presumed need to stop any activity which adds to the atmospheric CO2 level…. in the complete absence of a cost/benefit analysis. But at least they are mildly amusing. Too weird for me though. I will not bother engaging you any further on this….. or any other subject. As my Brazilian friends might say, ‘Adeus’.
Bart R thinks he lives in a perfect, crystalline world, and by Golly, he does.
Steve Fitzpatrick | May 24, 2014 at 7:30 am |
How hard is this for you?
You think there’s someone who wants to raise the CO2 level for benefits?
Then they ought to pay for the benefits.
And the Law of Supply and Demand sets the price.
What do you have against Capitalism?
Why do you hate America?
The climate modeller’s innate desire to be able to say “cracked it!”; to pat their colleagues on the back saying “that looks realistic to me!”; and the natural human instinct to shy away from opening the Pandora’s Box of niggling thoughts and ideas that might complicate matters. These are where we find most of the flaws in climate models; in the genes of their parents.
We find ourselves in violent agreement.
While analogs are not having a good day in this post, I’ll bravely note that climate warmist modelers and theorists strike me as having perspective distortion quite similar to that of financial prognosticators. i.e. “The markets are going down. We’re all going to die.” on the one hand. With, “Everything is so awesome we’ll soon all be drinking free Bubble Up and Rainbow Stew.” on the other hand.
To state the obvious, herd mentality is self-reinforcing. It results in limitation of perspective which in turn tends to drastically limit the possible range in the result. Because financial markets are 100% human, this information can be used there to make a dime or two. (This is based on focusing on the desired result, i.e. “I want to make money,” rather than struggling to predict actual direction.) In climate however, the great sage Yogi Berra would have correctly observed something to the effect of, “It is what it is.”
Judith: Do I see some editorial comments inadvertently left in the text, e.g. NEEDS CLARIFICATION and COULD USE MORE EXPLANATION?
Fact I Each new generation of Boeing/Airbus jetliner presses harder against the thermodynamic limits to energy efficiency, and informatic limits to control efficiency.
Fact II Each new generation of Boeing/Airbus jetliner relies *MORE* heavily on computational simulation, and *LESS* heavily upon wind-tunnel validation.
How can this be? Tomas Milanovic’s ill-posed ill-justified faux-mathematical hand-waving arguments in regard to simulation efficiency and accuracy in respect to turbulent dynamical flows are “not even wrong.”
Reference Forrester Johnson, Edward Tinoco, and N. Jong Yu’s Thirty years of development and application of CFD [Computational Fluid Dynamics] at Boeing Commercial Airplanes, Seattle
The Valuation of the Market The market’s valuation of dynamical simulation companies has been soaring ever-higher for the past twenty years; wind-tunnel companies, not so much.
These math-and-science considerations are plainly evident, eh Climate Etc readers?
Planes are small compared to the entirety of the climate system. That’s how.
I nominate this for Most Vapid Comment.
Harold, you must be new here. That doesn’t even come close to FOMD’s most vapid comments.
Fan misses the key point when he writes:
” Each new generation of Boeing/Airbus jetliner relies *MORE* heavily on computational simulation, and *LESS* heavily upon wind-tunnel validation.”
Fan, I spent 20 years as a Boeing engineer. The reason computational simulations are now relied upon is that they have been demonstrated to match what was observed in wind tunnels within reasonably tight and consistent margins of error.
That is not true of climate models. A simple truth.
Rob Starkey: Fan, I spent 20 years as a Boeing engineer. The reason computational simulations are now relied upon is that they have been demonstrated to match what was observed in wind tunnels within reasonably tight and consistent margins of error.
Kudos for bringing in the concept of the “reasonable” and “consistent” margins of error. Milanovic hand-waves an argument that a simplified model cannot be constructed that is exactly equivalent to the complex model with many complex sub-models. However, that is not the same as claiming that a simplified model with an accuracy of, say, 0.25C integrated mean square error in the estimates of regional mean temperatures could not be constructed.
With an allowance that FOMD ignored the extensive testing and margins of error, I think FOMD’s Boeing example is pertinent to this discussion, and an example of why Milanovic’s argument has no practical force. There is no guarantee that a model with an integrated mean square error of 0.25C in estimating regional means will ever be constructed, but there is no reason to think it is impossible. It is just really hard.
If Boeing aircraft were designed and built using simulations as off-the-mark from reality as climate models, there’d be a 97% crash rate.
Of course we were going to hear the silly mentioning of computer modeling of planes and air flow. That’s not relevant, because on one side we have one planet with one climate and little chance of experimenting with it in a controlled way, and on the other side a million wings in a million different, controlled conditions that have been studied for decades and translated into working, verified models as a matter of course.
Regarding the Boeing argument, I’ve used the inverse of what’s mentioned to point out that we can model successfully across most of the wing chord (the trailing edge can often be a problem) at high Reynolds numbers where kinematic viscosity doesn’t create ugly issues that send us back to the wind tunnel.
I should also note that the Navier-Stokes equations are invalid if the system undergoes evaporation or condensation. We ignore those in CFD equations for aircraft, even though there’s obvious condensation occurring during flight at particular altitudes, temperatures, and humidities, but in climate science it would amount to ignoring the weather itself.
Taking a completely different spin on the discussion of orbits (real instead of chaotic), a change of 1 W/m^2 of downwelling radiation is the same as you’d get from moving the Earth sunward by one-fifth of the Earth-moon distance. If our planet can’t survive that kind of shift, the Prime Mover must’ve been really, really careful to place the planet exactly where it is. One slip of his giant hand and we’d be toast!
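The arithmetic behind this comparison can be checked in a few lines. A minimal sketch, assuming simple inverse-square scaling of a solar constant taken to be about 1361 W/m^2; the exact fraction depends on whether the 1 W/m^2 is counted at the top of the atmosphere or as a global average:

```python
# Inverse-square law: S ~ 1/r^2, so dS/S = -2 dr/r, giving
# |dr| = r * dS / (2 * S) for a small change dS in the solar constant.
r_sun = 1.496e8     # km, mean Earth-Sun distance
r_moon = 3.844e5    # km, mean Earth-Moon distance
S = 1361.0          # W/m^2, solar constant (assumed value)
dS = 1.0            # W/m^2, the perturbation in question

dr = r_sun * dS / (2 * S)
print(f"orbital shift: {dr:,.0f} km = {dr / r_moon:.2f} Earth-Moon distances")
```

This comes out near 0.14 of the Earth-Moon distance, the same order of magnitude as the one-fifth quoted above; counting the 1 W/m^2 as a global average rather than a change in the solar constant multiplies the answer by four.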
George Turner: I should also note that the Navier-Stokes equations are invalid if the system undergoes evaporation or condensation. We ignore those in CFD equations for aircraft, even though there’s obvious condensation occurring during flight at particular altitudes, temperatures, and humidities, but in climate science it would amount to ignoring the weather itself.
And then there is the faint young Sun paradox:
The faint young Sun paradox or problem describes the apparent contradiction between observations of liquid water early in the Earth’s history and the astrophysical expectation that the Sun’s output would be only 70% as intense during that epoch as it is during the modern epoch. The issue was raised by astronomers Carl Sagan and George Mullen in 1972. Explanations of this paradox have taken into account greenhouse effects, astrophysical influences, or a combination of the two.
Naturally, and magically, CO2 levels have accommodated the earth’s temperatures to keep it warm enough, but not too cold, according to the warmistas.
I’ll try to summarize the post….
1. The microscopic affects the macroscopic in both space and time, not the other way around – likewise my neurons make me think, but my thinking doesn’t get my neurons working
2. Taylor approximations only work for very small changes
3. You can simplify the climate to study it, but the end result is that you study the simplified climate and not the real one
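Point 2 of the summary above can be seen numerically: even for a perfectly smooth function, the first-order Taylor (linear) approximation is only good for small perturbations. A minimal sketch, using sin(x) about 0 purely as an illustration:

```python
import math

# First-order Taylor expansion of sin(x) about 0 is just x.
# Watch the relative error grow with the size of the "perturbation".
for x in (0.01, 0.1, 0.5, 1.0, 2.0):
    rel_err = abs(x - math.sin(x)) / abs(math.sin(x))
    print(f"x = {x:5.2f}:  relative error of linear approximation = {rel_err:.2%}")
```

At x = 0.01 the linearization is accurate to a few parts per million; at x = 2 it is off by more than 100%.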
Exactly! Simple is as simple does. For example, changes in the amount of cosmic rays that bathe the Earth as it dashes through the leftovers of busted stars is like having a galactic gravestone fall on your toe. You can’t ignore that, right? Assuming they even care, there is only one sane and rational answer as to why Climatists fail to take account of it: because there is not enough computing power on Earth to reckon with it, so we must put that variable on ignore.
Many thanks for posting a very interesting and understandable summary of the problem of chaos in climatology.
The attention of Climate Etc readers is directed to the free-as-in-freedom on-line April 2014 theme issue of the Notices of the American Mathematical Society, including in particular Chen et al.’s OpenFOAM for Computational Fluid Dynamics and Bohun’s Introduction to Modern Industrial Mathematics
Even on desktop computers, OpenFOAM in particular has no trouble in accurately simulating precisely the turbulent flows over complex structures that Tomas Milanovic believes are infeasible to simulate.
And OpenFOAM is free.
Conclusion Milanovic’s “no-go” arguments seemed plausible twenty years ago … nowadays, not so much.
Ok, now go back and read what he actually wrote.
Linearity could be expressed this way: Does something happen? Yes. For instance, pick an extreme, say, a river flood. Do floods happen? Yes. Do they sometimes overwhelm the river valley? Yes. Will they? Yes. The upper limit of a flood can be defined geomorphologically, by looking for the highest level above ‘normal’ flow that shows evidence of fluvial processes (erosion or deposition). That is the upper limit defined by the most extreme event. However, the linearity is not “linear” between normal and extreme, owing to the factors that come into play as intensity rises… the increase in flow is not linear. The size of the weather system as it develops is not linear. Which brings us to a wall. These things happen. That is a certainty. But how? What is the path of causation? Can one predict? There’s the problem. At any moment in time, like the airplane wing under design conditions, we can predict the outcome. Go beyond those design conditions and things become fuzzy as to what causes the plane to crash. We know it will, but how? At any moment in time, we can simplify. But how do we integrate a series of simplifications to accurately ‘model’ the outcome?
It would seem that no matter what we do there will always be a range of outcomes….essentially chaos….chaos that nonetheless must obey the laws of physics. Maybe there is an element of fractal subdivision, so finding the set that replicates at all levels from a small eddy to a hurricane-sized vortex might be the holy grail…..but if there is a range of outcomes….which one will it be? THAT is the difference between prediction and postulation, I guess!
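The “range of outcomes” point can be sketched with the textbook logistic map, which is only a toy chaotic system and not a climate model: two initial states differing by one part in a billion end up completely decorrelated after a few dozen perfectly deterministic steps.

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x), a standard chaotic toy."""
    return r * x * (1.0 - x)

a, b = 0.4, 0.4 + 1e-9   # two nearly identical initial conditions
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(f"largest divergence over 60 steps: {max_gap:.3f}")
```

With a positive Lyapunov exponent the tiny initial gap roughly doubles each step, so after about 30 steps it is of order one: deterministic, yet unpredictable in practice.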
In other words, the climate system is a massively complex, chaotic, non-linear coupled system affected by many variables – some known, and who knows how many unknown unknowns. To “simplify” the system to say that ANY single element acts as THE control knob, especially when that element makes up 0.04% of just one of the subsystems of the overall system, and when man’s contribution to that 0.04% is anywhere from 4 to 30%, is, in a word, nonsense.
Only the surface of an airplane wing provides lift. The rest of the wing subsystems do not.
Only radiative gases cause the greenhouse effect. Oxygen and Nitrogen do not.
Non-condensing greenhouse gases provide a backbone for the total greenhouse effect.
So much for the 0.04% argument.
You will have to do better than that.
Not to mention that Barnes contradicts himself when he appeals to chaos and then argues the effect of CO2 must be small because it’s a trace gas.
Guess we can’t apply the KISS principle to
The KISS principle is for things that humans design. Other things (including things that evolve) are under no such obligation.
Do not modellers design climate?
If we prick them do they not bleed?
Tomas recognizes climate involves a nonlinear, dissipative thermodynamic system far from equilibrium. As an experiment, let’s construct an enormous gas discharge lamp of kilometric scale and irregular shape. For good measure, we will build in a number of asymmetrically placed I/O ports. When turned on, can we calculate the local distribution of internal fluxes of matter, charge, energy? Of course not. But can we determine the rate at which it’s dissipating energy based simply on external observation? I do believe so. Can we similarly determine the effects of boundary perturbations on dissipation? Ditto. And thus it is for climate sensitivity – the ratio of a dissipative flux change to a surface potential change. Simplicity? Quite!
I agree that attempts to construct macroscopic theories from microscopics are ‘premature’. The inverse path may be more revealing. For instance, the stability of a thermodynamic steady state depends on an inverse correlation of local temperature fluctuations and energy flux (hot spots cool off). And yes, there are macroscopic invariants, non-divergent energy and free energy flux functionals. Thermodynamics remains an exception to the assertion “All laws of nature are local.”
Pushing the envelope originates in aeronautical engineering. You have a model, say some way of solving the NS equations via this or that approximation, and then compare it to the real thing.
The region over which your model is tested as matching is inside the envelope.
Pushing the envelope means going outside, where the model may not work. It hasn’t been tested.
That’s how you get reliable results with airplanes. You stay inside the envelope or verify outside.
Climate science doesn’t do envelopes. It just models guesswork. No checking, or if there is checking, it’s hand-waved away. There’s some reason to ignore the data, always.
Or you refine the model, which amounts to curve-fitting. You might as well use a polynomial and forget it. There’s no model at all.
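A toy version of the envelope point, with sin(x) standing in for reality and an arbitrary quadratic standing in for the tuned model:

```python
import math

# A quadratic "model" tuned to agree with sin(x) at three points
# inside the tested envelope [0, pi]:  q(x) = (4/pi^2) * x * (pi - x).
def q(x):
    return 4.0 / math.pi ** 2 * x * (math.pi - x)

# Inside the envelope the fit looks respectable...
inside = max(abs(q(i * math.pi / 100) - math.sin(i * math.pi / 100))
             for i in range(101))

# ...one period outside, it is hopeless.
outside = abs(q(2 * math.pi) - math.sin(2 * math.pi))

print(f"max error inside [0, pi]: {inside:.3f}")
print(f"error at x = 2*pi:        {outside:.1f}")
```

Inside the envelope the error is a few percent; at 2π the model is off by 8, and nothing in the in-sample fit warns you.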
Airplanes fly by throwing air downwards. The wing shape happens to be most efficient for doing that with least forward drag, a result of the NS equations rather than any first principle.
Everything you heard about Bernoulli is wrong. The air is going faster because it ran down a pressure gradient, not the reverse. Cause and effect are backwards.
To back up your observation, you can make a piece of plywood with a propeller engine attached fly even though the cross-section of the plywood looks nothing like those diagrams of the Bernoulli effect in the encyclopedias I read as a kid. It’s basically just adding up force vectors (at least outside the stall regime).
Confounding our statistical analyses, the geophysics of our dynamic water world may be unique in all the universe, and so it is just possible that with our sample size of only one the degrees of freedom are zero — i.e., our observation is our only reality: what we see is what we get, and nothing more.
Locality isn’t valid in quantum mechanics. The field seems to be the fundamental thing.
Isaac Held is completely right and “Tomas Milanovic” is completely wrong.
As substantiation of Held’s views, I have been working on a model of ENSO. ENSO is the Pacific ocean sloshing behavior that drives El Nino and La Nina. Ostensibly, this would be considered an intractable problem of hydrodynamics by such folks as “Tomas” and “Skippy”. Yet it is simpler than these dudes would imagine, and like Held asserts (and I agree) it is close to being deterministic.
Why is it simpler? Because everything does reduce to first-order physics, and the forcing on the system is what matters. And forcing is easy to model.
I have a differential wave equation, slightly perturbed with a periodic nonlinear factor, that does a remarkable job of modeling the behavior of the Southern Oscillation index over the last 130+ years.
Now, what happens if this model is wrong due to it being too simple ?
Well, it still fits the observable system and it has utility in that regard.
How do you like them apples?
I see we have future Physics Nobel Laureates ’round here. I am impressed.
Now if only the concept of forcing corresponded to a real world thing…
Enough to ask for your model’s ENSO prediction for 2014-2015. That is a hot topic (pun intended) with Trenberth hoping for heat. Make the prediction, and we can revisit at Christmas time how good your ENSO model is.
Webster’s world view is the very essence of simplicity. He’s a genius while all who disagree with him are little better than drooling morons.
Rud, It’s just too bad I don’t work for you and I don’t kowtow to your wishes big boss-man. Ha ha.
It was a polite request for you to follow the scientific method.
In poker, it is known as calling a bluff. You just folded your credibility cards.
Rud Istvan | May 23, 2014 at 12:43 pm |
Scientific method = poker.
That makes sense.
No Rud, What you did was a classic fallacious argumentation technique called moving the goalposts or raising the bar.
I do whatever I do when I am ready, and not on the whims of some vulture capitalist that hasn’t contributed anything of significance to the advancement of science.
I’m no vulture capitalist and I’ve asked you for your prediction a month ago.
‘The essence of science is validation by observation. But it is not enough for scientific theories to fit only the observations that are already known. Theories should also fit additional observations that were not used in formulating the theories in the first place; that is, theories should have predictive power.’ http://www.project2061.org/publications/sfaa/online/chap1.htm
Webby uses solutions to Mathieu functions for standing waves in an elliptical bathtub. He then uses a variety of means to scale the resultant to the SOI. It is a weird sort of homeopathic math that I am sure has no basis in theory or reality.
Solutions to the bathtub model look like this.
But hey – the proof is in the pudding. Ante up webby.
Try getting apples out of cider.
Web. If you make and publish your prediction and it is validated by observation, think how much more your model will be worth. You could sell it for actual money or sell your predictions like the hurricane prediction company our kind host operates. Actual cash!
I too think you should tell us your prediction so we can see if it is validated this fall, winter or next year.
On the other hand if you don’t tell us your prediction and then after ENSO does whatever it does this fall – and you then report perfect match with observation, we will not be impressed. Publication gives credibility.
WebHubTelescope: Rud, It’s just too bad I don’t work for you and I don’t kowtow to your wishes big boss-man. Ha ha.
This is a great opportunity to test your model by making an actual prediction of out-of-sample data. Show your stuff big man! Show us what you can do! It’s not just Rud Istvan and Matthew Marler; all the denizens will be watching.
WebHubTelescope: No Rud, What you did was a classic fallacious argumentation technique called moving the goalposts or raising the bar.
Not so. The test is always whether the model can predict out of sample data.
Most of the commenters here are pseudo-scientists if they believe that a model is only worthy if it can predict. That is the land of wayne’s world worthiness.
For example, models used for prediction that have no scientific basis behind them are usually referred to as heuristics.
Consider that a heuristic is useless once it fails to predict, as it adds no knowledge or understanding of the physical principles behind the behavior.
A scientific model is still just as useful if it fails to predict, since it adds to the knowledge of the contributing factors.
The outcome is that if one gets the science right, the rest of the pieces fall into place,
OK, show us what your heuristic can predict. No pain, no gain.
Marler, You are letting your statistician bifocals get in the way of logical reasoning. If the behavior is deterministic and there is only one sample series available, then the technique you describe is less powerful than you imagine.
Take the case of following the trajectory of a baseball thrown in the air. So one measures the path for a certain distance, notices that it follows a parabolic arc, and then comes up with a model. Yet along comes Marler, who wants it to go further and see if you can predict beyond the arc that was measured. So you measure that further distance and see that it works as well. Check. But then comes Marler Jr, who is not satisfied and wants to see even more data. That gets tiresome real fast.
This is another one of those fallacious arguments demanding impossible perfection.
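For what it’s worth, the baseball version of the out-of-sample test costs almost nothing to run. In this sketch (all numbers invented) a parabola fitted to three early observations nails a point far beyond the measured arc, precisely because the underlying model is structurally right:

```python
# "True" physics of the throw (used only to generate the observations).
g, v0 = 9.8, 30.0
height = lambda t: v0 * t - 0.5 * g * t * t

# Fit y = a*t^2 + b*t + c exactly through three early measurements
# at t = 0, 0.5 and 1.0 seconds.
y0, y1, y2 = height(0.0), height(0.5), height(1.0)
c = y0
b = 4.0 * y1 - y2 - 3.0 * y0   # closed-form solution for these three t values
a = y2 - b - c

model = lambda t: a * t * t + b * t + c

# Out-of-sample check well past the fitted arc.
t_test = 4.0
print(f"error at t = {t_test}: {abs(model(t_test) - height(t_test)):.1e} m")
```

A mere curve fit with the wrong structure fails this kind of check; a structurally correct model passes it for free, which is why the check is worth running.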
Look, my model is very simple. It is a second order differential equation with a periodic perturbation. I know how to numerically solve these and the agreement is outstanding. Lots of ways to slice up 130 years worth of SOI data, and it looks good however it is sliced.
I will write up a post this weekend on my blog. This has been a work in progress and all I can say is that being persistent in developing a simple model does pay off.
Can I hear the sound of webby wimping out to the strains of a vague song and dance about ‘sloshing’ being a fundamental advance in understanding the dynamics of ENSO?
Of course the AAAS – in the quote above – disagrees as to the relevance of prediction to theory. A model is just a theory.
WebHubTelescope:This is another one of those fallacious arguments demanding impossible perfection.
Not so. I suggested 0.25C imse for judging climate models with respect to regional means. For ENSO, how about imse of 0.1C for next Jan to March, or a 3 month span with your predicted peak in the middle? Choose your own publicly available data set, so you don’t need an Eddington expedition or new satellite launch for the data. Predicting SOI or MEI instead? A reasonable standard of accuracy can be specified.
Every test of hypotheses has flaws, but testing the prediction against future data is the best test. Are you backing away from your previous claim that ENSO is predictable?
This is not “raising the bar”: PK models used for dosing regimens are tested this way. Calibrated measuring instruments are tested this way. One of our Boeing experts can tell us whether the Boeing models were tested this way.
Already no readers here believe your model can predict ENSO (chime in, Oh Readers!), so you have nothing to lose.
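For concreteness, the accuracy criterion being proposed fits in a couple of lines. The series and the 0.1 C threshold below are invented placeholders, and whether the threshold applies to the imse itself or to its square root is a convention to be agreed in advance:

```python
def imse(predicted, observed):
    """(Time-)averaged mean square error between two equal-length series."""
    assert len(predicted) == len(observed)
    return sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)

# Toy check of an invented 3-month prediction against invented observations,
# comparing the imse against the square of a 0.1 C threshold.
obs = [0.12, 0.35, 0.41]
pred = [0.10, 0.30, 0.45]
score = imse(pred, obs)
print(f"imse = {score:.4f} C^2; passes threshold: {score <= 0.1 ** 2}")
```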
@Matthew R Marler
“Already no readers here believe your model can predict ENSO…”
That is not my frame of mind. I would like to see a prediction if one can be appropriately made. (Some models forecast but some models quantify relationships, etc. I do not know the particulars of WHUT’s model.)
Certainly Tomas is much further from predicting numbers or even qualitative behavior. So perhaps we should release the hounds of hell on him for such non-testable speculation. No, I do not think so. Effort at all levels of abstraction seems to be a proven approach–thinking of QM in the first few decades of the last century. [Before it got cute.]
If WHUT is not ready, then he is not ready–but it sure is fun goading him given whut he dishes out at times.
Nicola Scafetta had the guts to make his model and to publish his model’s predictions vs the IPCC’s. He also says he can forecast/hindcast from each half of the historical data to the other half.
You claim a much more precise fit. When will you put your model to the test?
Make your prediction and compare it against both Scafetta and the IPCC.
Nya, nya, a boo-boo.
‘Finally, the presence of vigorous climate variability presents significant challenges to near-term climate prediction (25, 26), leaving open the possibility of steady or even declining global mean surface temperatures over the next several decades…’
Chaos ‘predicts’ the pause persisting for decades along with more frequent and intense La Nina.
ye of no faith. I do know what I am doing. Why would you think otherwise?
ye of no faith. I do know what I am doing. webby
Already no readers believe that he does.
You must be projecting. Mine is not a heuristic, but a simple model based on first-order physics.
WebHubTelescope (@WHUT) | May 24, 2014 at 2:11 am |
You must be projecting. …
Oh no, it’s not that. I was just messin’ with you:
“For example, models used for prediction that have no scientific basis behind them are usually referred to as heuristics.” [WebHubTelescope (@WHUT) | May 23, 2014 at 5:38 pm]
Chaotic systems are defined as deterministic and dependent on initial conditions – which makes them unpredictable (because we can’t measure sufficiently accurately). So the patterns you see vary, e.g. the sun’s sunspot cycle varies around 11 years (we’re just having an unusually long inactive one).
I understood Tomas’ post to imply that the climate is also fractal i.e. self-similar at all scales. (that is similar – not identical).
I hear some kid out here on the playground has a perfect half-court shot. OK, we’re here with the cameras.
mwgrant: If WHUT is not ready, then he is not ready–but it sure is fun goading him given whut he dishes out at times.
It sounds as though you do not believe that WebHubTelescope’s model can make a reasonably accurate prediction.
WebHubTelescope: ye of no faith. I do know what I am doing. Why would you think otherwise?
a. because you have not published your model, or even presented it publicly at a scientific meeting.
b. because you have not made a successful out of sample prediction with a pre-specified margin of error on a relevant data set.
However, I have not claimed literally that you do not know what you are doing, I have claimed that there is no demonstration to back up your claim that your model can predict any aspect of ENSO. I am critiquing the model not the person, except for noting that you personally did claim predictive power for your model.
It was sad when Mighty Casey struck out. You decline to take a turn at bat.
Matthew R Marler
Which part of If WHUT is not ready, then he is not ready–but it sure is fun goading him given whut he dishes out at times.
“It sounds as though you [I] do not believe that WebHubTelescope’s model can make a reasonably accurate prediction” ?
I do not have an opinion on what the model will predict or even if it can be used to forecast–maybe it just fits sets of concurrent observed variables. As I stated, ” I do not know the particulars of WHUT[‘s] model.” I just enjoyed seeing him unwittingly step on a fresh cow paddy of his own making by ‘declining to bat’. :O)
I find it interesting that he will take the hits for now–there is the promised post on his blog–and am happy to pile on in moderation by twisting his words. But that is nothing more strong than opportunistic sport. I note WHUT closes with some legitimate qualification/expectation:
“Now, what happens if this model is wrong due to it being too simple ?
“Well, it still fits the observable system and it has utility in that regard.”
[Full disclosure here — many years ago I used/flirted with the Mathieu equation in a completely different field to characterize a process–chemical bond formation. It was useful (to me at least.) Nothing inherently wrong with simple models; it is all a matter of when, where and how.]
mwgrant: I do not have an opinion on what the model will predict or even if it can be used to forecast–maybe it just fits sets of concurrent observed variables.
I did not mean to imply that you think his model will fail. Of the people who think his prediction will fail, people who have no opinion, and people who believe that his model will succeed (within the appropriate margin of error), two classes comprise those who do not believe that his prediction will be successful.
Look at how impatient they are.
Yet they snap to attention at this “Tomas Milankovic” character.
But hey – the proof is in the pudding. Ante up webby or I am entitled to continue to believe that your fractured math and fantasy physics is as ridiculous as I describe. A bathtub conceptualization of ENSO using homeopathic magic.
WebHubTelescope: Look at how impatient they are.
Yet they snap to attention at this “Tomas Milankovic” character.
Who are they?
This is a sneak peek at how the model works.
This chart shows a solution to a Mathieu equation using the Mathematica numerical solver, applying two periodic forcing functions — one corresponding to the QBO frequency of about 28 months and another corresponding to the inertial wobble of the earth at about a 6 year beat period. Since both the QBO and Chandler wobble vary slightly over time, the model goes in and out of phase with the SOI data.
It’s really a very simple model which uses known factors as forcing inputs.
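A minimal sketch of the class of model being described — a second-order equation with a Mathieu-type periodic coefficient plus two periodic forcings — can be integrated in a few lines of plain Python. Every parameter value here is an illustrative placeholder, not a fitted value from the actual model:

```python
import math

# x'' + (a - 2*q*cos(w_m * t)) * x = F1*sin(w1 * t) + F2*sin(w2 * t)
# Time unit: months.  All parameter values are placeholders.
a, q = 1.0, 0.2
w_m = 2 * math.pi / 28       # parametric modulation at a ~28-month period
w1 = 2 * math.pi / 28        # "QBO-like" forcing period
w2 = 2 * math.pi / 72        # "~6-year wobble" forcing period
F1, F2 = 0.10, 0.05

def accel(t, x):
    return -(a - 2 * q * math.cos(w_m * t)) * x \
           + F1 * math.sin(w1 * t) + F2 * math.sin(w2 * t)

x, v, dt = 0.0, 0.1, 0.05
peak = 0.0
steps = int(130 * 12 / dt)   # ~130 "years" of simulated time
for i in range(steps):       # semi-implicit Euler integration
    v += accel(i * dt, x) * dt
    x += v * dt
    peak = max(peak, abs(x))

print(f"integrated {steps} steps; peak |x| = {peak:.2f}")
```

Turning a sketch like this into the actual model would require the fitted parameters and the measured QBO and wobble inputs, which the comment does not give; whether the result stays in phase with the SOI is exactly what an out-of-sample comparison would settle.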
‘We examine the El Niño–Southern Oscillation (ENSO) influence on the quasi-biennial oscillation (QBO) modulation of the cold-point tropopause (CPT) temperatures. An analysis of approximately five decades (in most cases 1950s to near-present) of radiosonde data from 10 near-equatorial stations, distributed along the Equator, shows that the ENSO influence on the QBO is quite zonally symmetric. At all stations analyzed, the QBO has larger amplitude and longer period during La Niña conditions than during El Niño over this total period. We also show that as a consequence of the ENSO influences on QBO periods and amplitudes, the differences between the warmer CPT temperatures during QBO westerly shear conditions and colder temperatures during QBO easterly shear conditions are larger during La Niña than during El Niño for all stations for the entire period considered here. This strengthens earlier findings that the greatest dehydration of air entering the stratosphere from the troposphere occurs during the winter under La Niña and easterly QBO conditions. In addition, stratosphere/troposphere wind and temperature profiles are derived to establish the degree of QBO downward penetration necessary to influence zonal winds and temperatures in the upper troposphere.’ http://onlinelibrary.wiley.com/doi/10.1002/qj.2247/abstract
In other words – he modulates a bathtub solution for a standing wave by a strong effect of ENSO – QBO – and a weak effect – the Chandler wobble. Neither effect can be predicted any more than ENSO itself can. What a bore. I keep telling him that poorly fitting a curve to data – using circular logic even – is not very interesting.
The deniers are very afraid of simple models because they aid in understanding, which is counter to their aims of creating FUD. The simple models also help to democratize science, enabling an easy entry to the world of mathematical modeling.
Moreover, when it comes to the way science needs to be done, the deniers are first-class hypocrites. When they are aware of someone that is trying to democratize science, they immediately go to the fainting couch and demand that the science go through conventional peer review. That is the same peer review process that they are always belly-aching about in terms of being an incestuous process. You see, the deniers are all about conservatism, which is law and order and by-the-book rules and regulation, necessary to keep their corporate masters happy. Yet, in a massive case of psychological projection, they consider it perfectly OK for their own kind to post all sorts of questionable analyses on krank sites such as WUWT, but not the other way around. Recall the acronym IOKIYAAR.
The way that I am going about this is like a race. The bear is chasing me and whoever else is working to solve the mystery of ENSO. I don’t have to worry about whether the bear catches me, all I have to worry about is that I outrun my colleagues. Blogs are nice that way. Generate a blog post, kerplunk, it goes into the internet archive wayback machine. No worries.
Keep an eye on http://ContextEarth.com
Too bad, concern trolls.
What are you prattling on about now? You were challenged to validate your misbegotten and misshapen Frankenstein’s monster triple plus blogospheric unscience in the usual way that science is validated as per the AAAS quote above – and as should generally be recognized by any competent scientist anywhere.
This has got nothing to do with peer review but about the practice of the scientific method itself.
Your ridiculous bathtub curve scaled to absurdities has no chance at all of being a fundamental advance in understanding ENSO or of predicting squat. Prove me wrong dipschidt.
Grow up and get over yourself.
Poor lil Skippy has a temper tantrum. Typical larrikin behavior.
Words are cheap. Your twaddle is especially cheap.
You have one honourable and scientific course open. Provide the validation.
If you could in the interim cease with the inane prattling and preening and gratuitous projections – that would be neat.
Thanking you in advance.
I’ll put my money on the bear. Faster webby, faster!
WebHubTelescope: This is a sneak peek at how the model works.
No need for sneak peeks, because the model has been available for study for some time now, and some of us have studied it. An interesting question for now is whether it can make accurate predictions, and now is a good time to test whether it can make accurate predictions. Should it make a reasonably accurate prediction, even more people will study it, especially after you publish it, along with the record of its accurate and public prediction.
Are you guys still having problems with simple models?
Paraphrasing Nicholson, wait till they get a load of this:
Lakers fans that we are, all I can say is SWISH !!!
nothin but net
‘The essence of science is validation by observation. But it is not enough for scientific theories to fit only the observations that are already known. Theories should also fit additional observations that were not used in formulating the theories in the first place; that is, theories should have predictive power. ‘
It is scaling a bathtub concept to fit a data series and not a credible theory at all. Which is why webby wimps out of providing actual validation. Why am I not surprised?
You shouldn’t be suggesting to scientists that simple models don’t work.
It bounces back at you. And you know what that means …. OWN GOAL!
You have yet to see the validation check.
Quite apart from considering it fringe triple plus blogospheric science – I dispute that it is a model at all.
It uses homeopathic math of standing waves in a bathtub – scaled with what are effects rather than causes of ENSO at best. If you are not simply waving your pencil around for no one to admire.
I love simple models. This first-order differential global energy equation is based on the 1st law of thermodynamics and actually makes sense.
d(W&H)/dt = energy in (J/s) – energy out (J/s)
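A minimal numerical sketch of this first-law budget: integrate the top-of-atmosphere imbalance year by year to get the change in planetary work-and-heat content. The imbalance series below is an illustrative assumption, not observational data.

```python
# Hedged sketch: d(W&H)/dt = energy in - energy out, integrated annually.
# The imbalance values are made-up placeholders for illustration only.
import numpy as np

SECONDS_PER_YEAR = 3.156e7
EARTH_AREA_M2 = 5.1e14          # Earth's surface area

# Assumed annual-mean TOA radiative imbalance (W/m^2, energy in minus out)
imbalance_wm2 = np.array([0.5, 0.6, 0.4, 0.7, 0.5])

# Convert W/m^2 over the whole planet for a year into joules, then
# accumulate: the running sum is the change in planetary heat content.
annual_joules = imbalance_wm2 * EARTH_AREA_M2 * SECONDS_PER_YEAR
heat_content_j = np.cumsum(annual_joules)
```

This is the same bookkeeping Loeb et al. use when closing the radiant imbalance against ocean heat content: a positive imbalance must show up as stored energy, most of it in the ocean.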
I noted Loeb et al 2012 – http://www.nature.com/ngeo/journal/v5/n2/full/ngeo1375.html?WT.ec_id=NGEO-201202 – closed out the radiant imbalance using ocean heat. But hey I was first.
Fitting a line to a data series, on the other hand, is far from impressive. Predicting ENSO beyond a few months – which is the best we are capable of so far – would be impressive. But somehow I doubt that a homeopathic bathtub model is going to do the job. Prove me wrong instead of just waving your pencil around.
That’s misbegotten and misshapen Frankenstein’s monster fringe triple plus blogospheric unscience
The Aussie is out of his element. He has to resort to sockpuppet comments to argue his POV.
chief, skippy, robert, dingo-boy, whatever
You lose. Now watch this swing
More to come
Hey, got me a pic of one of those complicating thingies. It’s just a drawing of a place I’ve never been. In fact, nobody’s ever been there, not even James Cameron – hence just the rough sketch. Bit embarrassing really. It looks awfully large and would probably squash you or give you a bad scald if you went there. I can see why people with fragile models and delicate data collections stay right away.
The place is called Earth.
“All laws of nature are local.”
except the law that they are all local, which is global.
This is what I like about Mosher. Smart fellow always thinking outside the circle.
No, this is what happens when you let a philosophy clown in the donkey tent.
“thinking outside the circle”
On the other hand,
“Then, as his planet killed him, it occurred to Kynes that his father and all the other scientists were wrong, that the most persistent principles of the universe were accident and error.”-Frank Herbert, Dune
“We will not attempt to define simplicity which is a subjective notion ”
err.. really depends upon the field of study
The problem is not that Stefan–Boltzmann is wrong, or that the gas laws are wrong, or that Newton’s law or Kirchhoff’s law is wrong. It is that current climate models ignore well-known thermodynamics. Let’s apply what is known correctly, then we can examine non-ideal behavior at the fringe. Of course, this would spell the end of the gravy train.
Could you critique how Nobel Prize winner Dr. Molina is wrong in his thinking: http://theenergycollective.com/davidhone/60610/back-basics-climate-science
Stephen, where are evaporation and convection in that model? They dominate the surface cooling.
@Stephen Segrest | May 23, 2014 at 12:27 pm |
Could you critique how Nobel Prize winner Dr. Molina is wrong
It’s not that he is wrong in what he said. He is begging the question. The problem isn’t with the basic principles of physics underlying climate, it is that the climate system is a very complex system based on the simple principles. It is the complexity of climate that is the problem, not fundamental physics.
The CFC issue was a relatively simple problem in chemistry. Climate isn’t simple at all.
Doing a lab test on carbon dioxide in a confined space does not prove that the same thing will happen in a completely unbounded atmosphere. Physics says that enthalpy (potential for heat or energy transfer) will rise due to additional carbon dioxide. It could just cause expansion of the atmosphere.
Ann, what you say is true. But that does not mean an experiment (using matter and energy) won’t give you valuable information. Judy put up a post about ocean warming. This post focused on the role of the atmosphere-ocean interface. The interface between sea water and the atmosphere can be studied in the lab. In a well-designed experiment, heat flows can be traced, wind can be simulated, different wavelength ranges can be tested. I think such experiments could tell us a lot about the behavior of that interface.
OTOH, creating a model without knowledge of the system under study to constrain the model is just begging to be misled.
I wonder if all this enthusiasm for models of something we cannot truly test can be made useful by applying it to the study of another complex system.
Individual neurons work very simply. Please provide a functioning computationally feasible and working model of the human brain.
IMHO: The field of computational fluid dynamics (CFD) is much like climate science. If one reads the peer-reviewed literature one would think it predicts aerodynamic performance well. If one is an experimentalist one knows its usefulness is much less than advertised. Depending on CFD for predictions outside the tested envelope is extremely risky.
For example, if one looks out the window of a commercial aircraft one will see small metal tabs protruding from the wing. These are vortex generators and are added to fix aerodynamic problems CFD and wind tunnel testing did not identify. They are the aerodynamic equivalent of a band-aid and can wipe out a bunch of a new wing’s drag reduction.
Error by Dan, correction by FOMD.
Modern computer-designed wings (e.g. Boeing’s 787) require few-or-no vortex generators … with attendant gains in efficiency, needless to say!
FOMD is much more on his game today.
Take a close look at the 787-8 tail: lots of VGs plastered over it.
Per Doug Ball’s recent “Recent Applications of CFD to the Design of Boeing Commercial Transports“ (2010), the 787’s vortex generators were designed-in with CFD guidance (not add-ons to fix CFD deficiencies).
But heck, what credibility does Doug Ball have? He’s just Boeing’s Chief Engineer for 787 Aero Characteristics and Flight Performance!
See also Ball’s “The Use of Modeling and Simulation at Boeing Commercial Airplanes“ (2008), also “Contributions of CFD to the 787 — and Future Needs“ (2008).
These presentations are highly recommended to Climate Etc CFD-skeptics.
Plenty of job opportunities here for young mathematicians/dynamicists/programmers. Old fashioned wind-tunnel operators, not so much.
The value of CFD is one can wrap an optimizer around it and produce a clean 787 wing; however one can’t depend on the answer without verification with wind tunnel or flight test data. You can bet there were many CFD low drag designs that did not stand up to air over the wing. They looked good until the data came in. Yes CFD was instrumental in the 787 design but so was wind tunnel testing.
From the post:
Before starting the review, it is necessary to make a preliminary remark which deals with the problem of scales and reads: All laws of nature are local.
This is manifestly not true. One of the fundamental assumptions of physics is that the laws of nature apply everywhere in the Universe. If not, astronomy is doomed.
All models are wrong. Some are useful. Useful means suitably fit for purpose. Weather models are useful. Look how far hurricane uncertainty cones have come. But they are not useful for climate.
One of the takeaways from this most interesting post is that ‘dominant Fourier’ simplifications like the stadium wave may be a useful way forward for climate. That requires periodicity data that is unfortunately in short supply prior to the satellite/ARGO era. Which points to a potentially fruitful direction for empirical research into high-periodicity climate proxies. The sort that ClimateReason is doing for CET temperature, or could be done, for example, for regional rainfall or whatever.
More data, less treating model outputs as if they were equivalent. Cut model funding and put the money into rehabbing the equatorial Pacific buoy system that has been allowed to deteriorate. A specific example of the more generalized thought.
Thanks Tom for addressing “linearizability” issues.
I would welcome your comments on the usefulness of modeling “climate persistence” using “Hurst–Kolmogorov dynamics”, e.g.,
Markonis, Y., and D. Koutsoyiannis, Climatic variability over time scales spanning nine orders of magnitude: Connecting Milankovitch cycles with Hurst–Kolmogorov dynamics, Surveys in Geophysics, 34 (2), 181–207, 2013.
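For readers unfamiliar with the persistence that Hurst–Kolmogorov dynamics describes, the classic diagnostic is the Hurst exponent estimated by rescaled-range (R/S) analysis. The sketch below is a textbook illustration, not code from the cited paper; the window sizes and the white-noise test series are arbitrary choices.

```python
# Hedged sketch: rescaled-range (R/S) estimate of the Hurst exponent.
# H ~ 0.5 indicates no memory; H > 0.5 indicates long-term persistence
# of the Hurst-Kolmogorov kind. Window sizes are illustrative.
import numpy as np

def hurst_rs(x, window_sizes):
    """Return the slope of log(mean R/S) versus log(window size)."""
    logs_n, logs_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())       # cumulative departure
            r = dev.max() - dev.min()           # range of the departure
            s = w.std()                         # standard deviation
            if s > 0:
                rs_vals.append(r / s)
        logs_n.append(np.log(n))
        logs_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(logs_n, logs_rs, 1)[0]

rng = np.random.default_rng(0)
h_white = hurst_rs(rng.normal(size=4096), [16, 32, 64, 128, 256])
```

Applied to a climate series instead of white noise, an estimate well above 0.5 is the signature of the long-range persistence that Markonis and Koutsoyiannis argue connects Milankovitch scales to interannual variability.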
Willis Eschenbach exposes the presumption of linearity in GCMs in:
Model Climate Sensitivity Calculated Directly From Model Results
Willis further shows some limits to “linearity” in The Fatal Lure of Assumed Linearity
I’m not your boss either, WHT, but I’d really like to know your ENSO prediction for this year. Come on, throw us a crumb.
Do we indulge the weekend only to catch our collective breath on Tuesday at the spatio-temporal chaos we observe on large inertial scales?
Pingback: Climate Modeling | Transterrestrial Musings
Are we sure that Isaac Held did not copy me? I have been saying the same for years and proved it mathematically, all available on the web.
Climate change and seasonal variations occur infinitesimally with time, therefore any climate parameter must exhibit constant change and linearity. This is true for all slow processes. Mathematically speaking, climate parameters can be developed in a Maclaurin series, and constant change and linearity is the case. Yes, climate and season calculations are simple and can be very accurate, and those who want to make it look impossible have not tried hard enough.
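The argument can be made explicit (a restatement of the claim, not an endorsement). Expanding a smooth climate parameter $T(t)$ about the present gives

```latex
T(t) = T(0) + \dot{T}(0)\,t + \tfrac{1}{2}\ddot{T}(0)\,t^{2} + \cdots
\;\approx\; T(0) + \dot{T}(0)\,t
\qquad \text{for } t \ll \left|\dot{T}(0)/\ddot{T}(0)\right| .
```

The linear term dominates only while $t$ stays small compared to the curvature timescale; the expansion says nothing about abrupt forcings or about whether $T(t)$ is smooth in the first place, which is where the objections below come in.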
Does this apply to weather as well? You may become very rich.
So, very small items cause slow and gradual change –
like an asteroid impact?
Or whatever caused the Younger Dryas?
I have not tried the weather yet, for I am happy with a three to five day weather forecast. More important are the future of ENSO, sea level, and surface temperature. These can be calculated accurately enough for all of our practical needs.
Nabil, from your book, it appears that you are using unconventional concepts. For example, I searched and found no references to spectrum, photons, wavelength, etc. Any theory of warming based on pure thermodynamics is suspect.
What counts is at the macroscopic level. No one has ever measured, or can measure, radiation between photons or molecules; whether they cancel out or not, we will never know.
If you believe that thermodynamics is suspect, you are way on the wrong track. Although unnecessary, you can test the work if you wish. The equations have passed the test of time, and if you think that they are suspect, then you may as well suspect your schools, degrees, education and all of the books that took us to the moon. Besides, the global warming projected in Tables 1 and 2 has been on track since 2007. Why then is it suspect in your opinion?
The Planck response is a part of statistical mechanics (and of course quantum mechanics) and statistical mechanics is at the heart of thermodynamics, so I guess I need to ask you why you do not believe in statistical mechanics.
You are way off topic.
A fan of *MORE* discourse | May 23, 2014 at 12:37 pm | wrote:
Dan asserts [wrongly] “If one looks out the window of a [modern] commercial aircraft one will see [few or no] small metal tabs protruding from the wing.”
Error by Dan, correction by FOMD.
Modern computer-designed wings (e.g. Boeing’s 787) require few-or-no vortex generators … with attendant gains in efficiency, needless to say!
Take a close look at the 787-8 tail…….. Looks like at least 12 VGs
CFD is an important tool but it is often over sold.
“All laws of nature are local”
Give the man a prize.
DocMartyn | May 23, 2014 at 2:33 pm |
It’s some form of hubris to believe no more universal explanation can be found, if the current one is found wanting by addition of new observation.
Bart, I had a look at the text and find it to be just like the Isle of White Ferry.
DocMartyn | May 23, 2014 at 3:03 pm |
Given that the square of the length of the hypotenuse of a right triangle is equal to the sum of the squared lengths of the other two sides, why do we care?
Do triangles behave differently on the Isle of Wight?
Here is simple
I await a physicist to treat it as an equilibrium
BTW The Isle of White ferry is black, steaming and comes out the back of Cowes.
DocMartyn | May 23, 2014 at 6:40 pm |
Well, it _is_ Stanford, and Philosophy.
Eppur si muove
Last I heard from you weren’t you doing some work experience project at NASA? Did you get your phd?
Not strictly true. For example, if we know the starting and end points of a light ray across a medium of variable refractive index (such as the atmosphere), we can calculate its entire path on the condition that it be the fastest one. This law of optics is not local, although it is mathematically equivalent to a completely local description.
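The equivalence is easy to demonstrate in the simplest case, a two-layer medium: minimizing total travel time over the interface crossing point recovers the local law (Snell's law) at the interface. The refractive indices and endpoints below are illustrative assumptions.

```python
# Hedged sketch of Fermat's principle in a two-layer medium: choose the
# interface crossing point that minimizes optical travel time, then check
# that the global minimum obeys the local law n1*sin(t1) = n2*sin(t2).
import numpy as np
from scipy.optimize import minimize_scalar

N1, N2 = 1.0, 1.5        # refractive indices above and below y = 0 (assumed)
A = (0.0, 1.0)           # ray start, in the upper medium
B = (2.0, -1.0)          # ray end, in the lower medium

def travel_time(x):
    """Optical path length for a ray crossing the interface at (x, 0)."""
    d1 = np.hypot(x - A[0], A[1])
    d2 = np.hypot(B[0] - x, B[1])
    return N1 * d1 + N2 * d2

res = minimize_scalar(travel_time, bounds=(A[0], B[0]), method="bounded")
x_cross = res.x

# Sines of the incidence and refraction angles at the minimizing point
sin1 = (x_cross - A[0]) / np.hypot(x_cross - A[0], A[1])
sin2 = (B[0] - x_cross) / np.hypot(B[0] - x_cross, B[1])
```

The global (variational) statement and the local (Snell's law) statement single out the same path, which is the mathematical equivalence the comment refers to.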
Variational principles like this, if valid, can indeed be useful in characterizing complex systems. One such candidate is the Maximum Entropy Production Principle. Paltridge had some success with this approach.
There is even something that looks like a firm theoretical background in non equilibrium thermodynamics.
Journal of Physics A: Mathematical and General Volume 36 Number 3
2003 J. Phys. A: Math. Gen. 36 631
Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in non-equilibrium stationary states
Unfortunately Paltridge only considered entropy production by turbulent dissipation, although more than 90% of entropy production occurs when incoming shortwave radiation gets absorbed and thermalized. Absorption is argued to be irrelevant.
However, that cannot be the case, because the very “absorptivity of the material under consideration”, that is, planetary albedo, depends on climate, first of all on the distribution of clouds. And that is regulated by highly nonlinear internal processes indeed.
Journal of Climate, Volume 26, Issue 2 (January 2013)
The Observed Hemispheric Symmetry in Reflected Shortwave Irradiance
Aiko Voigt, Bjorn Stevens, Jürgen Bader and Thorsten Mauritsen
The difference in clear-sky albedos of the two hemispheres is huge; in spite of this, their actual albedos are virtually the same. That is, the difference in annual average reflected shortwave radiation between the two hemispheres is almost two orders of magnitude smaller than it would be with no clouds. It indicates a very strong regulative process mediated by weather.
So, dismissing radiative entropy production is unsupportable.
However, in this case rate of entropy production is clearly not maximized, because Earth is not black, it reflects some 30% of shortwave irradiance right back to space with very little increase in its entropy.
Please note there is no contradiction with Dewar 2003, because he only considers reproducible thermodynamic systems, for which there is a straightforward definition of Jaynes entropy.
But the terrestrial climate system is chaotic, so microstates belonging to the same macrostate can evolve to different macrostates in a short time. That means Dewar’s results are only applicable to it up to a limit and even then only with extreme caution.
The fact is, currently we do not have a physical theory of irreproducible quasi-stationary non-equilibrium thermodynamic systems at all (to which broad class the climate system belongs).
Is the Maximum Entropy Production Principle a law of physics? It seems to be used mostly in very opaque situations, like a “climate science”, but is it valid in the first place?
More about “all laws are local” – probably strictly true on a microscopic level; as Peter correctly remarks, not so for thermodynamics (of course a macrostate is nonlocal by definition). That does not mean we should always start from that platform. For the purpose of climatic systems evolution, nonlocal descriptions like Newton’s gravity and Coriolis force are extremely precise and much easier to solve than local equations of General Relativity. Avoid being squeezed into a “simplicity” box.
Could there be a critique on this blog on where Nobel Prize winning scientist Dr. Molina is considered wrong on his thoughts on AGW?
As a layman, I’m certainly not trying to be argumentative — I’m trying to understand, and I fully agree that Climate Science is extremely complicated. But Dr. Molina believes there are basics that should drive us.
Maybe the problem is how one frames the question. Taking the hard science that Dr. Molina refers to, I ask the question: Isn’t it a bad idea to follow a trajectory path of 800 to 1,000 ppm?
Dr. Molina believes taking this path is a bad idea. From a science perspective, can people critique why they believe Dr. Molina is incorrect?
Leave policy out of this — and if you want to bring policy into this, assume that the ONLY policy enacted is a tremendous R&D effort by World Governments in areas like nuclear power or safe fracking (natural gas which Dr. Muller proposes).
When people critique Dr. Molina, please remember that he is a Nobel Prize winner. When you say “he does not consider X” — surely you would have to admit that he is aware of your X but still feels the way he does.
So is Al Gore
… 800 ppm or 800/1,000,000th of the atmosphere = 8/10,000th = 0.0008 = 0.08% of the atmosphere. That CO2 is not poison is the most solid thing we know. The weather and by extension the climate changes without our help. For example, Dr. Roy Spencer shared this with us on Wednesday —
… this proxy reconstruction of past temperatures suggests climate change is the rule, not the exception:
“Isn’t it a bad idea to follow a trajectory path of 800 to 1,000 ppm?”
Bad as in morally bad, philosophically bad, or what bad? What is bad?
This is a non-scientific question.
Maybe it’s BAD like I am, or like George: ;)
Sloppy thinking from a Nobel Prize winner. Huh.
We know it cannot be morally bad. The hubris and hypocrisy of global warming alarmists is palpable and that is what Richard Muller, et al., are shining a spotlight on. “Environmentalists who oppose the development of shale gas and fracking,” Muller said of the greenhouse gas fearmongers, “are making a tragic mistake… [and] concerns are either largely false or can be addressed by appropriate regulation… [S]hale gas is a wonderful gift that has arrived just in time. It can not only reduce greenhouse gas emissions, but also reduce a deadly pollution known as PM2.5 that is currently killing over three million people each year, primarily in the developing world.”
If Nobel Prize Winning Warmers want to assert they can describe AGW in a scientific way, then they are going to have to stop mixing in non-scientific appeals if they are going to remain scientifically credible.
“Taking the hard science that Dr. Molina refers to, I ask the question: Isn’t it a bad idea to follow a trajectory path of 800 to 1,000 ppm?”
Ya, its a bad idea.
Well the imperfect science we have suggests that if we get to 1000 ppm
we will as a species face a situation we haven’t faced before.
How certain is this? That’s really hard to put a number on. hmm..
What can we do?
Raise the price of energy ( tax c02)?
is it a bad idea to follow this path?
Ya its a bad idea.
The imperfect science of economics suggests that if we tax carbon we could hurt people. Oh ya, it also suggests that the hurt will be great.. no wait.. it also suggests the hurt will be small.. no wait it suggests the benefits will be great.. err wait not so great.. errr the science of economics is probably more divided than the science of climate.
So we have one uncertain science suggesting some danger ( from small to large) and another uncertain science suggesting some harms, wait, benefits..
one approach is to take the actions which you should take anyway regardless of the uncertainty.
Gas is better than coal. Focus on that. Hope that the science picture gets a bit more clear. Gas will buy you time. With wicked problems, you need more time.
When people critique [Barack Obama], please remember that he is a Nobel Prize winner. When you say “he does not consider X” — surely you would have to admit that he is aware of your X but still feels the way he does.
Steven Mosher | May 23, 2014 at 3:58 pm |
You propose that Economics is divided on the effects of carbon pricing.
However, this is an overgeneralization and an oversimplification.
Generally, economists are even more united on the double dividend benefits of carbon pricing than climatologists are on AGW. Even Ross McKitrick agrees on this; it was his PhD thesis.
Specifically, where carbon pricing is employed exactly like the pricing of goods in the Market, either economists have not done the work to evaluate this case, or they can find no argument against it. Tol hasn’t looked at this case. Nordhaus hasn’t differentiated this case from carbon taxes and done an in-depth analysis of it. Nigel Lawson has assiduously avoided it, and let’s face it last had anything relevant to say about economics in the last millennium. Lomborg isn’t actually an Economist. Everyone who has looked at it has agreed with Ostrom’s fundamental premise.
It’s possible cap and trade and carbon taxes might be harmful, especially if badly implemented. It’s hard to argue against Capitalism. It worked for the mobile phone industry, in the sale of bandwidth. It will work for CO2E.
And even if the worst predictions of economic arguments — however unlikely — came to pass, they easily remain two orders of magnitude below the cost of the best outcomes of a rapid transition to an 800 ppmv CO2 world.
You’re asking the wrong question. Humanity needs more energy not less. So, the only rational question is, what’s better: 800 ppm CO2 or nuclear energy?
I don’t understand your carbon pricing idea. How much are you going to charge me to exhale CO2?
” From a science perspective, can people critique why they believe Dr. Molina is incorrect.’
Because, from a science perspective, warmer is better – colder is worse. There is a reason the Holocene Climate Optimum (HCO) is called, well, optimum. Back before CO2 was considered by many to be evil, the HCO was generally shown as warmer than the late 20th century. More recently – not so much.
Better science? Or agenda driven science?
RickA | May 23, 2014 at 4:52 pm |
What a great question.
Given that you exhale CO2 only after consuming carbon sequestered entirely from CO2, and in that sense add zero CO2 to the carbon cycle in the short run and hence are neither rivalrous nor encumbered by scarcity, that your exhalations are not excludable or administrable, and that few could be enticed to invest in ways to stop your exhalations or purchase the cessation of your exhalation, you are not participating in a lucrative exchange just by breathing.
However, because you breathe to live, you are unalienable from your ownership of the air; we cannot justifiably separate you from that ownership, and as you are a person and thus have ownership rights, we must acknowledge your right to be compensated for the use of your air where there is a scarce, capitalizable, rivalrous, excludable, administrable, marketable usage such as the sale of fossil fuels.
I hope that helped you out with anything you too missed from grade 4 Civics.
Wagathon | May 23, 2014 at 4:29 pm |
You fall victim to Unruh’s Carbon Lock-in paradigm.
Carbon is not the same as energy. Nor is nuclear fission.
RobertInAz | May 23, 2014 at 4:56 pm |
Anyone who works in HVAC can tell you it takes over three times the resources to cool a room five degrees as to warm it five degrees.
Human beings can cope extremely well in extreme cold if properly dressed and shielded from wet and wind; there’s little you can do for humans in extreme heat but cool them down or demand less productivity of them.
It appears you have a romanticized view of ‘warmth’.
I am with barty, on this one: Generally, all economists usually always agree on the double benefits of carbon taxes. In fact economists are even more united on the benefits of the carbon tax, than climate scientists are on AGW. It must be something like a 142% consensus.
And barty said something about carbon being priced just like it is in the Market. Which is the way it is being priced now, but he actually has something entirely different in mind. Like the way the gubmint sold bandwidth to the mobile phone industry. But in reality it’s like we pretend the gubmint is the Market and we let the gubmint screw us for our own good by making us pay more for carbon, until it really hurts and we stop using the stuff. It’s just like the high taxes on tobacco and alcohol. Tobacco and alcohol are not good for us and neither is carbon. I just remembered that people have not stopped using tobacco and alcohol, so the carbon taxes are really going to have to sting to be effective. I don’t care what they do, because I have money to burn. Let those poor suckers suffer.
“It appears you have a romanticized view of ‘warmth’”
As someone who has lived with arthritis for two-thirds of their life, I will take warmth over cold any day; it was not romance that led me to Texas from England, it was climate.
Half a million Canadians are “snowbirds” who head to the US south every winter, and the number of Americans, mostly retirees, who do the same is about 9 million. The snowbirds escape the misery and death that come with cold winters.
My general argument doesn’t depend upon the details of the harms/benefits of a carbon tax. I would say the science of economics is more divided than you would; the devil would be in the details of the tax, and as an empiricist I don’t think there is enough evidence to settle the matter.
My general argument is that there are two sciences (well, actually three) at play here. Climate science, economics, and technology projection.
None sits on entirely solid ground. What we know from climate science is that switching to gas and then renewables will allow us to avert the most dangerous scenarios. It buys us time to conduct more British Columbia carbon tax experiments and gives us time to place some wiser bets on technology.
None of these sciences is precise enough to ENGINEER the future. write that down.
folks are just going to have to accept some fumbling and bumbling as we try out various approaches. What should be avoided is a fragile approach.
An approach that bets it all on technology saving us. An approach that bets it all on a carbon tax. An approach that bets it all on nuclear, or solar, or wind. An approach that bets the farm on global treaties. There is no science of how to place these bets. But when you screw up (betting on biofuels for example) and the results suck, then fess up and move on. When your bets on Solyndra go bad, fess up. The public dollar is better bet on the long-term, deep-research bet than the go-to-market bet. When your efforts to corral the world of cats into a global treaty fail decade after decade, it’s time to fess up and try something else.
We don’t know enough climate science, economics and technology projection to engineer the future. That doesn’t mean we do nothing. It suggests instead a more flexible, more open-minded, more “try and see what works” approach, with feedback and accountability.
DocMartyn | May 23, 2014 at 6:47 pm |
I sympathize with your special circumstances. However, as we have seen from the winter of 2014, quite predictable from Jennifer Francis’ work, as demonstrated by Tim Palmer, “warming” doesn’t mean we’ll necessarily all get the nice even kind of warming that soothes the joints of sufferers, at least not reliably.
What we’re getting from these increasing Forcings is increasingly erratic weather conditions, and increasingly uncertain outcomes.
Rich snowbirds aren’t especially known for their vulnerability to death due to cold; they fly south because they want to avoid extremes, or else they’d stay south all the time, and face the heat of summer.
Steven Mosher | May 23, 2014 at 9:51 pm |
I’m the last person to argue that Economists are agreeable. There are few I could stand to be in the same room with for more than a few minutes, which is a trait also shared by many Economists.
Likewise, you aren’t hearing me arguing to place all eggs in any one basket. We’ve had all eggs in the carbon basket for far too long, and are putting more in at an increasing rate even still; and look at the mess Carbon Lock-in has put us in. If you think the arguments I’ve put forward are too commonplace, too overused, too overdone, you’d be the first here to say so.
Sure, http://thesolutionsproject.org/infographic/#ca is only a dozen baskets (counting conservation and ‘miracle’), but how is that not better than one? Sure, each nation choosing its own carbon pricing scheme that suits its population is only 196 baskets, but that’s 171 baskets more than we have now.
Four in five technologies fail within five years. Four in five businesses fail, even if their technology is solid. You can always name two dozen kinds of fail for every success. I know ninety-six people with patents; all but one of those patents has failed to earn back even a fraction of what was spent on it. That other one? It could pay for the other ninety-five people's patents ten times over.
So we’re in broad agreement. Except where we differ, which is something I can respect.
My point is, the same fail as got us here is never the right one to try to ride out of this mess.
“What we’re getting from these increasing Forcings is increasingly erratic weather conditions, and increasingly uncertain outcomes.”
Actually the data show that some things, like temperature, get less erratic. This is true in historical data and it's true in climate models.
Mark Jacobson's work is not 12 baskets. In the paper that established the basis for his 50-state plan, he removed gas as an option based on bogus science. He basically places everything in two baskets:
One is efficiency, which is good.
The problem is that the logistics of getting from today to a renewable future will require a bridge. A bridge he tries to burn down.
Moreover, the problem is China.
WRITE THAT DOWN.
over and over again.
In the book The Fountainhead, the philosopher Ayn Rand was writing about corruption, not architecture. We are living Rand's novel today. Rand could have been writing about climate change instead of architecture: Michael Mann could be Ellsworth Toohey, and William Gray might be Howard Roark.
In our modern-day ‘Fountainhead’ the federally-funded finger-pointers of global warming are academia’s corrupt monks spreading seeds of self-defeatism. All humanity must fight against and overcome the use by the government of the peoples’ own money to put the people out of business.
Wagathon | May 23, 2014 at 11:49 pm |
Which is she, novelist or philosopher?
Ayn Rand isn’t really good enough at either as to make it easy to tell which it is she’s failed at.
If you really see the world today in terms of Fountainhead, it’s small wonder you’re so wrong about so much; one suggests you read Harry Potter to get a more near view of how the world operates, or possibly Peter Rabbit.
Steven Mosher | May 23, 2014 at 10:41 pm |
The problem is a different quarter of the world than the one you understand best and have most influence within?
Sounds like shirking and excuses to me.
Also, I try to write up, not down. Easier on the wrist. Give it a try.
“It appears you have a romanticized view of ‘warmth’.”
More like a realistic understanding of warmth. Take a look at where the planet has “warmed” in the anthropogenic CO2 era and relate that back to your air conditioning example. Please incorporate the observation that much warming appears to be in higher nighttime minimums.
Ad hom attacks are what the Left does for a living. It certainly is no wonder the global warming believers also turn their noses up at the objectivist writings of atheist Ayn Rand. Rand writes for individuals who love freedom. Rand writes for Galileo, not the science authoritarians of his day. Rand writes for Einstein, not the German citizens who voted Hitler into power. Rand writes for the person anyone could be who is not looking for a free ride. Rand writes for everyone who seeks the truth and is willing to pay with their own time, sweat and tears, which is the price required to obtain real knowledge.
‘as a species’. Well, moshe, proto-humans, practically physiologically indistinguishable from humans, certainly survived higher CO2 levels. Given the paucity of atmospheric CO2, the ease with which animals rid themselves of it as a waste product, and the difficulty with which plants acquire it for themselves as a nutrient, I wouldn't be surprised if, however much or little atmospheric CO2 functions as a climate control knob, it also has a biomic thermostat.
Wagathon | May 24, 2014 at 10:29 am |
You should have stopped at Galileo.
People know what Einstein and Rand thought of each other, and it was not the rosy-hued picture you set out. Rand is known to have gotten Einstein wrong on Special Relativity, and Einstein was something of a Socialist and a man of strong religious faith. Objectivists are famous as a group for anti-relativity – the denial of Einstein's Special Relativity. There are few more opposite pairs of historical contemporaries one could name.
My problems with Objectivism aren’t philosophical, any more than my problems with Jediism are. I don’t have problems with either. They’re both quite popular and lucrative forms of fiction. And who could oppose harmless free enterprise?
RobertInAz | May 24, 2014 at 10:21 am |
I used to give some credit to the “most warming at night” generalization, until I looked into it a little more closely.
The NASA map you refer to is hardly enough information for us to go by.
http://www.nws.noaa.gov/om/hazstats/resources/hazstat-chart13.gif also gives us too little information, however it kinda sorta tells us that you’re simply wrong on fact.
“I used to give some credit to the “most warming at night” generalization, until I looked into it a little more closely.”
Do tell – I’m eager to see your data.
RobertInAz | May 24, 2014 at 2:10 pm |
Donat, M. G., and L. V. Alexander (2012), The shifting probability distribution of global daytime and night-time temperatures, Geophys. Res. Lett., 39, L14707, doi:10.1029/2012GL052459.
There’s a few dozen other scholarly articles, some better or more relevant, but you get the gist.
We’ve known for a couple of years now that the “warmer at night” or “warmer in winter” isn’t really a reliable or indicative generalization, and to a quite high degree of confidence.
Where your map shows the least warming is generally where you get accompanying higher impacts from severe storm and other extremes, so it’s not like it’s exactly a comfort.
Leftist tendencies are always to deny–e.g., denying they’re against free enterprise capitalism. Other tendencies are ad hom attacks everyone and especially the heretics, which we see happening in the area of global warming. “This tendency, partly articulated as a worldview in the writings of Thomas Malthus, takes what might be reasonable concerns over issues such as air and water quality and embeds them in an ideology deeply hostile to economic progress and the majority of human beings… The overall thrust was still clear: the U.S. and the world should move in the direction of ending population growth, and protection of the environment should be given an importance equal to or greater than that of improving the standard of living… Economic growth and technology were portrayed as problems.” ~Dr. Donald Gibson
Excellent. It has been my view for some time that the stadium wave paper is predictive only in the trivial sense: “if what has been happening continues to happen, then this is what we can expect.” Tomas put it much more eloquently, of course.
Error by Tomas Milanovic, link by FOMD …
… that is, a link to high-school teacher Ryan Termath’s outstanding citizen-science project/poster “Can Wing Tip Vortices Be Accurately Simulated?”
Conclusion Science marches on … denialist cognition, not so much.
‘Finally, Lorenz’s theory of the atmosphere (and ocean) as a chaotic system raises fundamental, but unanswered questions about how much the uncertainties in climate-change projections can be reduced. In 1969, Lorenz  wrote: ‘Perhaps we can visualize the day when all of the relevant physical principles will be perfectly known. It may then still not be possible to express these principles as mathematical equations which can be solved by digital computers. We may believe, for example, that the motion of the unsaturated portion of the atmosphere is governed by the Navier–Stokes equations, but to use these equations properly we should have to describe each turbulent eddy—a task far beyond the capacity of the largest computer. We must therefore express the pertinent statistical properties of turbulent eddies as functions of the larger-scale motions. We do not yet know how to do this, nor have we proven that the desired functions exist’. Thirty years later, this problem remains unsolved, and may possibly be unsolvable.’ Julia Slingo and Tim Palmer – http://rsta.royalsocietypublishing.org/content/369/1956/4751.full
The weirdness of FOMBS is so pronounced that it appears to be deliberate misdirection for ideological purposes. At any rate – any resemblance to rational discourse is purely coincidental.
Anyone who has put a tea-kettle on to boil has observed cool/dense surface water atop heated/less-dense bottom water. The resultant Rayleigh-Taylor dynamical instability can be predicted theoretically, and simulated numerically, and verified experimentally, and finally, observed (dazzlingly!) geophysically.
Conclusion There’s not much scientific mystery as to *WHETHER* the earth’s energy-budget is imbalanced and *WHY* the earth’s energy-budget is imbalanced … isn’t that correct, Climate Etc readers?
“There were a lot of fools at the conference – pompous fools – and pompous fools drive me up the wall. Ordinary fools are alright; you can talk to them and try to help them out. But pompous fools – guys who are fools and covering it all over and impressing people as to how wonderful they are with all this hocus pocus – THAT, I CANNOT STAND! An ordinary fool isn’t a faker; an honest fool is alright. But a dishonest fool is terrible!” – Richard Feynman
The sort of fool who can mention a boiling pot and ‘Rayleigh-Taylor dynamical instability’ in the same sentence.
Julia Slingo – head of the UK Met Office – and Tim Palmer – of the European Centre for Medium-Range Weather Forecasts – are in fact quoted. Not to mention Edward Lorenz.
If one knew absolutely nothing about it – and FOMBS undoubtedly does not – one would assume that these leaders in the field would know more than FOMBS. Division by zero really.
One wonders – fleetingly – as to his motivation. Mere disruption of rational discourse for the purpose of marginalizing a forum for skeptical views?
ENSO is non-linear, but maybe it can be gotten rid of by sufficiently long averaging; ICE AGES and MANKIND, however, surely shoot down any belief that the climate is predictable. I’m assuming here that MANKIND is just an example of the natural BIO world. If we ain’t natural, at what point did that happen?
It strikes me that there is less distance between Held and Tomas than might first appear. Tomas addresses the problem of predictability – the difficulty of using models initialized with a set of physics principles and parametrized to match existing climatology as a means of determining how climate will evolve in response to an imposed perturbation. I think most modelers would agree, although the evolution of weather prediction over the course of decades belies the notion that predictability can’t be improved by appropriate model refinements.
I expect Held would also agree, but as I interpret him, he adds an additional source of information – history. In essence, he states that although quasi-linear behavior may not be predictable from principles alone, there are a number of climate phenomena that empirically demonstrate something resembling linearity. The consistent 3 C variation in mean global temperature between NH and SH summer (quite striking considering the short, six month timescale) is one example, and reflects the strong climate forcing involved, but he cites others that exemplify the same benefits of empiricism. My impression from the evidence is that this works best on moderate timescales (multiple decades but not one or two decades or multiple centuries) and on broad spatial scales (see Quondam’s comment above in the thread), but not regional ones. We should not expect perfect accuracy at any point, but the experience with weather prediction suggests that we haven’t yet reached an accuracy limit.
I think one further point is relevant to dispel what may be some confusion about problems of attribution (see the link to Held in Tomas’s post). There is a critical distinction between prediction and explanation. Predicting how external forcing will interact with internal climate variability is still a risky enterprise, at least on centennial or multidecadal timescales. On the other hand, attribution in hindsight is much easier if sufficient evidence is available. We can now conclude with high confidence that almost all post-1950 warming was attributable to external forcing (primarily anthropogenic), based on the evidence. Pre-1950 warming is less certain because evidence regarding ocean heat uptake was unavailable. Similar uncertainty lies ahead. Without knowing how the oceans will behave in the future, we can’t confidently make an attribution prediction for the next several decades or even longer. The recent slowdown in warming, due mainly to an unpredicted variability in ocean dynamics, is a salient example of model weakness in this regard. Ultimately, these variations are likely to average out over the long term, but we can’t be sure how completely or how long it will take, and a small but non-zero possibility of a major surprise can’t be excluded.
Fred Moolton – You say “We can now conclude with high confidence that almost all post-1950 warming was attributable to external forcing (primarily anthropogenic), based on the evidence. Pre-1950 warming is less certain because evidence regarding ocean heat uptake was unavailable.”.
With all due respect, this is complete nonsense. If you do not know what caused the pre-1950 warming, then you cannot know if the same factors operated after 1950. If you do not know if those factors were operating after 1950, then you cannot know what caused the post-1950 warming. What you have expressed is a classical case of circular logic – by implicitly assuming that those factors were not operating after 1950, you deduced that something else (in this case, anthropogenic warming) caused the warming. ie, you end up “proving” your initial assumption.
We can easily conclude the modern warming is well inside the bounds of the Roman Warming and the Medieval Warming, the data does show that to be true, and that it is being caused by the same thing that caused those warming periods. What was happening then has not stopped. NOTHING HAS STOPPED. This Warming is just like the ones before.
Hi Mike – I think you missed my main point, which is how to divide attribution of post-1950 warming between internal variability and external forcing. We can do this on the basis of the post-1950 data that include ocean heat uptake. The latter was unavailable earlier, which is why we can’t apportion warming prior to 1950. As to the causes of post-1950 forcing, the evidence that it was almost entirely anthropogenic comes from many sources that have been discussed previously. It would be a distraction to review all of them here, since that’s not what this post is about. We also have good evidence that pre-1950 forcing was a mixture of greenhouse gas, solar, and volcanic influences, but we can’t exclude a role for internal variability to the extent we can post-1950.
My original comment stands. Nothing you say negates it. When you say “As to the causes of post-1950 forcing, the evidence that it was almost entirely anthropogenic comes from many sources that have been discussed previously. It would be a distraction to review all of them here, since that’s not what this post is about. We also have good evidence that pre-1950 forcing was a mixture of greenhouse gas, solar, and volcanic influences, but we can’t exclude a role for internal variability to the extent we can post-1950.”, you are conveniently ignoring the fact that you can’t explain the pre-1950 warming. Your post-1950 “evidence” is not evidence at all. It is constructed to fit the post-1950 warming (it says so in the IPCC report: look for “constrained by observation”), and includes stuff for which there is in fact no evidence (cloud feedback, for example).
Fred won’t see it. It’s quite willful.
Mike Jonas, suppose Fred had said we have enough data to explain 1900-1950, then you would say what about pre-1900. We can’t explain that, so this is still no good. See where this leads? At what point is the data enough. Most would say that having a quantitative understanding of the last 60 years is enough, but maybe you need 100 or perhaps 200.
It’s like: “Oh, and just by the way, the claims re AGW are valid…but we’re not here to talk about that. Been dealt with elsewhere at length.”
Well, I’d like to say that the claims re AGW are invalid…but we’re not here to talk about that. Been dealt with elsewhere at length.
Even Stevens? And remember, we’re not here to talk about that.
A 1% change in albedo is a 3.4W/m2 change in reflected SW. How bloody likely is that in a chaotic Earth system?
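The arithmetic behind the 3.4 W/m² figure is simple to check: it is an absolute albedo change of 0.01 times the global-mean top-of-atmosphere insolation. A minimal sketch, assuming the usual value of ~340 W/m² (the solar constant divided by 4 for the sphere's geometry); the function name is my own:

```python
# Back-of-envelope check of the albedo figure quoted above.
# Assumes a solar constant of ~1361 W/m^2, giving a global-mean
# top-of-atmosphere insolation of ~340 W/m^2 (divide by 4 because
# the Earth intercepts sunlight on a disc but radiates as a sphere).
SOLAR_CONSTANT = 1361.0                  # W/m^2 at 1 AU
MEAN_INSOLATION = SOLAR_CONSTANT / 4.0   # ~340 W/m^2, global average

def reflected_sw_change(delta_albedo):
    """Change in globally averaged reflected shortwave (W/m^2) for an
    absolute albedo change delta_albedo (e.g. 0.01 for a '1%' change)."""
    return MEAN_INSOLATION * delta_albedo

print(round(reflected_sw_change(0.01), 1))  # ~3.4 W/m^2
```

Note that this treats "1%" as an absolute change of 0.01 in albedo; a 1% relative change of a ~0.3 albedo would be roughly a third of that.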
GS, the ice albedo feedback could do that, but that is positive so it won’t help you. A lot of the recovery from the last Ice Age was of this kind.
Eh – the cloud ‘feedback’ is taking care of that.
In Jim’s world everything is a ‘feedback’ – even jars of CO2 sitting on a windowsill. I am quite sure that if he fell on his arse it would be a ‘feedback’.
Jim D – Yes, I considered that. “At what point is the data enough” is a very valid question. I think it is really necessary for the models to work for at least one significant pre-manmade-CO2 warming period (which conveniently we have in the first half of the 20th century), because otherwise there is no reason to suppose that the post-1950 warming attributed to CO2 wasn’t actually caused by something else. If that was achieved, then whether it was sufficient would depend heavily on how it was achieved – if it was just a retro-fit then it would not count (“with five [parameters] I can make him wiggle his trunk”). If done properly, that would go a long way towards acceptance. Final acceptance would, I think, require it to work for the MWP and LIA too.
I’m reluctant to re-enter the discussion, having said most of what I wanted to say above, but I noticed exchanges of comments elsewhere involving Jim D, climategrog, climatereason, R. Gates, and others, related to the roles of volcanic and solar forcing during pre-industrial times, including the LIA. Here’s one link concluding that solar forcing was the more important cooling influence, because unlike volcanic forcing, surface and stratospheric effects operated in the same direction – Solar and Volcanic Forcing. My purpose, though, is not to compare their respective roles but to point out that in combination, these forcings induced profound and long-lasting climate responses. The relevance to this blog topic is that neither volcanic nor solar variation during that era was periodic – i.e., there was no evidence to support phase-locking with a putative chaotic oscillation, as could be claimed for seasonal or diurnal variation. The observational record therefore appears consistent with evidence for simplicity (quasi-linearity) in response to strong forcings that dominates over the unpredictable variability inherent in chaotic behavior of the climate that may have operated at that time. Again, I believe Tomas is correct in emphasizing the limits to predictability, but Held is correct in reminding us that those limits don’t preclude our ability, with fairly high confidence, to make fairly accurate long term predictions under Holocene climate conditions.
another solar volcanic paper
re: The Boeing analogy
Further support – NASA is retiring wind tunnels in favor of supercomputers. The flow is well understood and computationally tractable.
RobertInAz, are you a troll? If not, I have some bottom land I can sell you cheap.
NASA uses supercomputers to solve the flow around a vehicle. It is verified with wind tunnel tests and flight tests.
The Climate is a much larger problem. It will be many years before the supercomputers are super enough to deal with that.
That is a small part of the problem. Current Climate Theory does forecast temperatures that warm while Real Data does not warm.
You cannot Model Climate when you really don’t understand Climate.
Marshall Space Flight Center, Huntsville, Ala.
RELEASE : 12-058
NASA’s Marshall Center Concludes Wind Tunnel Testing to Aid in SpaceX Reusable Launch System Design
HUNTSVILLE, Ala. – NASA’s Marshall Space Flight Center in Huntsville, Ala., completed wind tunnel testing for Space Exploration Technologies (SpaceX) of Hawthorne, Calif., to provide Falcon 9 first stage re-entry data for the company’s advanced reusable launch vehicle system.
Under a Reimbursable Space Act Agreement, Marshall conducted 176 runs in the wind tunnel test facility on the Falcon 9 first stage to provide SpaceX with test data that will be used to develop a re-entry database for the recovery of the Falcon 9 first stage. Tests were conducted at several orientations and speeds ranging from Mach numbers 0.3, or 228 miles per hour at sea level, to Mach 5, or 3,811 miles per hour at sea level, to gauge how the first stage reacts during the descent phase of flight.
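As an aside, the mph figures in the quoted release are internally consistent: they both correspond to a sea-level speed of sound of roughly 761 mph (a standard-atmosphere value; the exact number varies with temperature). A throwaway check, with `mach_to_mph` being my own helper, not anything from the release:

```python
# Sanity check of the Mach/mph conversions quoted in the NASA release.
# Assumes a sea-level, standard-atmosphere speed of sound of ~761 mph.
SPEED_OF_SOUND_MPH = 761.2  # approximate, at 15 C sea level

def mach_to_mph(mach):
    """Convert a Mach number to miles per hour at sea level."""
    return mach * SPEED_OF_SOUND_MPH

print(round(mach_to_mph(0.3)))  # close to the quoted 228 mph
print(round(mach_to_mph(5.0)))  # close to the quoted 3,811 mph
```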
I am not a troll
Here is where most of the wind tunnels are:
Here is a 2011 analysis of demand – reduced due to fewer airframe programs and increased computational efficiency.
NASA inspector general:
At least 6 of NASA’s 36 wind tunnels were underutilized or NASA managers could not identify a future mission use. NASA’s use of wind tunnels has declined in recent years due to a reduction in the Agency’s aeronautics budget, fewer new aircraft developments by the Department of Defense and private industry, newer and more capable foreign testing facilities, and alternative testing methods such as computational fluid dynamics.
Tomas Milanovic – Many thanks for that excellent essay. I have stated several times on this and other blogs that there are no climate models, only weather models. I have thought about what a climate model might look like (I don’t think I have put it in writing) and I am pleased to find that I have been thinking along the lines that you express so clearly :
“to focus on observed dominant Fourier modes and to study their dynamics, hoping that at least at shorter time scales these Fourier modes will continue to dominate.”
That surely is the first step for the creation of the first climate model. Like Boeing, we can then learn by testing, and improve.
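For readers who want to see what "focusing on dominant Fourier modes" might look like in practice, here is a minimal sketch of my own (not anything from Tomas's essay or an actual climate model): keep only the strongest spectral components of a temperature-like series and use them as a low-order description.

```python
# A toy illustration of studying "dominant Fourier modes": reconstruct a
# series from only its few strongest (non-constant) spectral components.
import numpy as np

def dominant_modes(series, n_modes=3):
    """Reconstruct `series` from its n_modes strongest Fourier modes
    (plus the mean), discarding everything else."""
    coeffs = np.fft.rfft(series)
    power = np.abs(coeffs)
    power[0] = 0.0                       # exclude the mean when ranking
    keep = np.argsort(power)[-n_modes:]  # indices of strongest modes
    filtered = np.zeros_like(coeffs)
    filtered[0] = coeffs[0]              # retain the mean
    filtered[keep] = coeffs[keep]
    return np.fft.irfft(filtered, n=len(series))

# Example: 100 years of monthly data, an annual cycle buried in noise.
t = np.arange(1200)  # months
series = 10 * np.sin(2 * np.pi * t / 12) \
    + np.random.default_rng(0).normal(0, 2, t.size)
recon = dominant_modes(series, n_modes=1)  # recovers the annual cycle
```

The point of the sketch is the hope expressed in the quote: if a few modes dominate, a low-order description of their dynamics may suffice at short time scales; whether that holds for the real climate system is exactly what is in dispute.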
For every problem that is muddled by over-complexity, a dozen are muddled by over-simplifying.
That is totally backwards.
For every problem that is muddled by over-simplifying, a dozen are muddled by over-complexity. When you use complexity, you can easily fool a lot of other people and especially you can fool yourself.
They use computers and come to believe the output of their computers, and they stop thinking.
Look for simple answers. You won’t understand the complicated answers and you cannot determine if they are right or wrong.
To err is human; to really screw things up you need a computer.
Great post. The Boeing analogy is indeed complete nonsense.
You know I agree entirely – but I like to bring it back to natural sciences. The following is a bit disconnected in my view – but contains a hodgepodge of ideas that are familiar to sophisticated denizens.
There are a few hydraulics equations based on fundamental properties. Conservation of energy – Bernoulli’s equation. Conservation of mass – the storage equation. Conservation of momentum – the travelling wave relationship.
Likewise – statistical relationships can be derived for regional rainfall and global temperature from the state of major modes of ocean variability – although I take the Tsonis view implicit in the stadium wave that these are best viewed as chaotically oscillating nodes on the underlying network of the Earth system.
e.g. http://s1114.photobucket.com/user/Chief_Hydrologist/media/USdrought_zps2629bb8c.jpg.html?sort=3&o=132
Decadal probabilistic climate prediction relies on the residence time of the climate system within finite volumes of the phase space of the global attractor. In English – climate stays in certain states for a period before there is a shift in the combination of factors that determine climate as an emergent phenomenon.
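The phase-space language can be made concrete with the classic Lorenz '63 toy system. The sketch below is my own (a crude forward-Euler integration, not a climate model): the trajectory lingers in one lobe of the attractor for an irregular stretch before shifting to the other, which is the "residence within finite volumes of the phase space" idea in miniature.

```python
# Toy sketch of residence times on a strange attractor: integrate the
# Lorenz '63 system and record how long the trajectory stays in each
# lobe (x < 0 vs x > 0) before switching.
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude forward-Euler step of the Lorenz '63 equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def lobe_residence(n_steps=20000):
    """Lengths (in steps) of consecutive stays within one lobe."""
    state = (1.0, 1.0, 1.0)
    runs, current, sign = [], 0, 1
    for _ in range(n_steps):
        state = lorenz_step(state)
        s = 1 if state[0] > 0 else -1
        if s == sign:
            current += 1
        else:
            runs.append(current)
            current, sign = 1, s
    runs.append(current)
    return runs
```

Running `lobe_residence()` gives a list of irregular run lengths: the system "stays in a state" for a while, then shifts, with no periodicity to the shifts.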
‘The global climate system is composed of a number of subsystems – atmosphere, biosphere, cryosphere, hydrosphere and lithosphere – each of which has distinct characteristic times, from days and weeks to centuries and millennia. Each subsystem, moreover, has its own internal variability, all other things being constant, over a fairly broad range of time scales. These ranges overlap between one subsystem and another. The interactions between the subsystems thus give rise to climate variability on all time scales.’ http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.303.1951&rep=rep1&type=pdf
On decadal scales it manifests as shifts in ocean and atmospheric states that persist for 20 to 40 years. The character of ENSO, the PDO, the AMO, etc. – chaotic nodes on the global climate system. It suggests that the current cool state could persist for decades.
High resolution ENSO proxies suggest centennial and millennial shifts. Hundreds of years of dominance of one state or the other – a shift 5,000 years ago from La Niña dominance to more frequent and intense El Niño. Similar shifts would have an immense impact on modern civilization. We are currently at a 1000 year high point in El Niño intensity and frequency.
The new paradigm of abrupt climate change requires a new definition of climate and of climate sensitivity. Climate is an average of weather between climate shifts and sensitivity is dynamic – high at climate shifts but not otherwise. It is statistically a non-stationary series – with changes in means and variance. Climate averages are valid for the periods 1915 to 1943, 1947 to 1975, 1979 to 1997 and 2002 to date. The shifts are anomalous – typically known as noisy bifurcation or more poetically – dragon-kings.
‘We develop the concept of “dragon-kings” corresponding to meaningful outliers, which are found to coexist with power laws in the distributions of event sizes under a broad range of conditions in a large variety of systems. These dragon-kings reveal the existence of mechanisms of self-organization that are not apparent otherwise from the distribution of their smaller siblings. … We emphasize the importance of understanding dragon-kings as being often associated with a neighborhood of what can be called equivalently a phase transition, a bifurcation, a catastrophe (in the sense of Rene Thom), or a tipping point. The presence of a phase transition is crucial to learn how to diagnose in advance the symptoms associated with a coming dragon-king.’ http://arxiv.org/abs/0907.4290
In the satellite era, shifts in ocean and atmospheric circulation are associated with cloud changes that dominate global energy dynamics. This is what the data say – and the IPCC agrees that the different sources converge and are consistent with ocean heat content. Natural variability has dominated the small changes from anthropogenic greenhouse gases. Over this century there is no sense in which the 20th-century pattern of warmer to cooler to warmer is certain to continue. The potential exists that the 1000-year natural high in surface temperature has plateaued and that the next shift is to yet cooler. Given both the large natural variation in global energy dynamics and the unpredictable nature of climate shifts – centennial predictability a la Held may be a little misplaced.
‘The climate system has jumped from one mode of operation to another in the past. We are trying to understand how the earth’s climate system is engineered, so we can understand what it takes to trigger mode switches. Until we do, we cannot make good predictions about future climate change… Over the last several hundred thousand years, climate change has come mainly in discrete jumps that appear to be related to changes in the mode of thermohaline circulation. ‘
Wally Broecker seems closer to the mark than Isaac Held. Nonetheless – dynamic sensitivity – a la Wally – has some disquieting implications. One of these is the presumably small but finite possibility of catastrophic climate change in as little as a decade.
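To illustrate the "average of weather between climate shifts" idea on synthetic data (hypothetical numbers, not a temperature reconstruction), one can compute means within the stated regimes rather than fit a single trendline through the whole record:

```python
# Toy sketch of a piecewise-stationary view of climate: averages are
# taken within regimes, not across the whole (non-stationary) record.
import numpy as np

REGIMES = [(1915, 1943), (1947, 1975), (1979, 1997), (2002, 2013)]

def regime_means(years, values, regimes=REGIMES):
    """Mean of `values` within each inclusive (start, end) regime."""
    years = np.asarray(years)
    values = np.asarray(values)
    return {r: float(values[(years >= r[0]) & (years <= r[1])].mean())
            for r in regimes}

# Synthetic stepped series: each regime sits at its own (made-up) level.
years = np.arange(1910, 2014)
levels = {(1915, 1943): 0.1, (1947, 1975): 0.0,
          (1979, 1997): 0.3, (2002, 2013): 0.5}
values = np.zeros_like(years, dtype=float)
for (start, end), level in levels.items():
    values[(years >= start) & (years <= end)] = level
means = regime_means(years, values)
```

A single least-squares trendline through such a series would imply a steady drift that no individual regime actually exhibits, which is the commenter's point about shifting means and variances.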
Robert I Ellison | May 23, 2014 at 7:17 pm |
Climate averages are valid for the periods 1915 to 1943, 1947 to 1975, 1979 to 1997 and 2002 to date. The shifts are anomalous – typically known as noisy bifurcation or more poetically – dragon-kings.
Good. Drawing a trendline through the whole time series implies something different. Thanks for not getting too over my head.
You wrote –
“The lesson is that even if global may often seem simpler than local, the laws of nature always work in the other direction – from local to global.
That is why the often repeated statement “Climate is not weather” is misleading. The right statement is “Climate is uniquely dependent on weather because its properties can only be derived from a known weather by averaging it over some arbitrary space or time domain.””
Precisely so. Anyone treating climate as an entity in itself, with unique properties, is quite simply a fool. Or simply simple, if you wish to avoid calling a climatologist a pseudo scientific charlatan.
Once again, climatology provides stark evidence that intelligence does not necessarily provide protection against attacks of extreme and persistent gullibility.
Live well and prosper,
Way to go for verbolical nullity.
Tomas is actually wrong. “Weather is not climate” refers to using cold records to disprove global warming. Climate is defined by the WMO as the average of weather taken over a sufficiently long period – usually taken as a 30-year ‘climate normal’.
Climate is actually the average of weather within one state of the set of states that makes up the system's global strange attractor.
As a trivial take home message from the post – and rationale for the bulk of your bellicose and pointless whines – it makes even FOMBS look good.
Life is too short for bad coffee
You wrote – “Tomas is actually wrong.”
If you say so, it must be true – and black is white, and the lack of warming is proof positive that the globe is, indeed, getting hotter.
On the other hand, Tomas is right. Both he and I agree, so you are outvoted two to one. Nature appears to agree with Tomas also. That makes three to one. Neither devout belief nor the power of prayer are capable of changing fact, but you are welcome to try to bend Nature to your will. Until that happens, you lose.
No CO2 induced warming. No missing heat. No benefit to date from all the climatological nonsense produced at great cost. Can you produce just a tiny fact or two to support your odd assertion that reducing the CO2 content of the atmosphere will have a quantifiable beneficial effect for anyone at all, and no negative effects? Of course not – silly question. I apologise.
Live well and prosper,
What a splendid example of acute vacuity.
Nature rules and if you had any clue at all about the substance of the post you would understand that climate shifts and climate can only be meaningfully averaged while it persists in some finite volume of the state space.
I have had quite a few discussions with Tomas over the years – who, unlike you, is an intelligent, knowledgeable and rational person – and I wouldn't presume, so let's ask.
What do you think Tomas? “Weather is climate” is a bit of a furphy, as it relates only to specific weather events. Longer-term averages of weather are climate – with the proviso that means and variances shift at bifurcations, which happen on decadal scales.
1979 to 1997 and 2002 to date, for example.
The whines about warmists are not merely tedious and irrelevant but misplaced. If you had the intellectual capacity to understand anything – rather than insisting on fantasies of the Earth cooling over eons from the core dominating climate, of day and night disproving vibrational transitions in greenhouse gas molecules, and of other eccentric skydragon-type notions – you would know better.
In reality clouds dominate the global energy dynamic in the satellite era, shifts in ocean and atmospheric circulation suggest the Earth is not warming for decades at least and prediction beyond that is problematic. I realize that goes right over your head – but for future reference those are the words you should put in my mouth. Although I vaguely recall saying this to you previously. Which would make this what? Lies and deception? Deliberate misdirection? Bad faith and false witness? Thick as a brick? None of which would surprise me in the least.
Whoops – forgot the pretentious sign off.
Just another tequila sunrise,
I don’t buy Tomas’s response to the seasonal and geographic objections to his unpredictability thesis. We can indeed accurately predict that summers will be warmer than winters holding location constant and that low-latitude spots will tend to be warmer than high-latitude spots holding season constant. (I know that you also have to correct for altitude and other things, but for clarity ignore those factors.) Moreover, we can make these predictions based on “linear” energy calculations. If we knew that the sun were going to be brighter, for example, that would lead to a general prediction that the entire spatio-temporal temperature distribution would shift toward higher temperatures, with lesser confidence as we tried to predict the impact on smaller regions of time and space.
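The “linear energy calculation” intuition here can be made concrete with a toy zero-dimensional energy balance – a sketch of my own, using round standard values for the solar constant and albedo, not anything from the comments – in which a brighter sun raises the effective emission temperature, shifting the whole temperature distribution upward:

```python
# Toy zero-dimensional energy balance: absorbed solar = emitted thermal.
# S(1 - a)/4 = sigma * T^4, solved for the effective emission temperature T.
S0 = 1361.0       # solar constant, W/m^2 (standard round value)
ALBEDO = 0.3      # planetary albedo (standard round value)
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium_temp(solar_const, albedo=ALBEDO):
    """Effective emission temperature from the energy-balance equality."""
    return (solar_const * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

t_now = equilibrium_temp(S0)             # ~255 K
t_brighter = equilibrium_temp(1.01 * S0) # sun 1% brighter

# The direction of the shift is unambiguous even in this crude model:
assert t_brighter > t_now
```

Of course this says nothing about regional or decadal detail – which is exactly where the confidence drops off, as noted above.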
The slightly odd aspect of Held’s argument to me came out in his discussion with our hostess and others at the AAS debate where he seemed to be plumping for the possibility that heat going into the oceans might not be well-mixed and so might emerge suddenly with drastic impact on surface temperatures. That hypothesis would make the climate LESS linear and more unpredictable.
‘Recent scientific evidence shows that major and widespread climate changes have occurred with startling speed. For example, roughly half the north Atlantic warming since the last ice age was achieved in only a decade, and it was accompanied by significant climatic changes across most of the globe. Similar events, including local warmings as large as 16°C, occurred repeatedly during the slide into and climb out of the last ice age. Human civilizations arose after those extreme, global ice-age climate jumps. Severe droughts and other regional climate events during the current warm period have shown similar tendencies of abrupt onset and great persistence, often with adverse effects on societies.’ http://www.nap.edu/openbook.php?record_id=10136&page=1
Assume for a moment that ‘Earth’s climate system is highly nonlinear: inputs and outputs are not proportional, change is often episodic and abrupt, rather than slow and gradual, and multiple equilibria are the norm.’ http://www.globalcarbonproject.org/global/pdf/pep/Rial2004.NonlinearitiesCC.pdf
One problem is not in predicting that summer will be warmer than winter but in whether this summer will be warmer than one in 10 years’ time should a climate shift intervene. But the problem extends to whether this summer will be wetter or drier – cooler or warmer – as a result of dynamic changes. Obviously direct irradiance at a point changes with season – and this changes seasonal temps. Climate shifts in the modern record are a bit more subtle – although they could be much more obvious in the longer term.
Heat in the oceans is never well mixed – btw – the distribution is determined by a mix of turbulent transport and warm water buoyancy. Buoyancy dominates. Although as ocean heat content follows CERES net – it seems unlikely that there is any extra heat currently – as opposed to last decade.
Those quotations do not contradict the premise that higher forcing will shift the spatio-temporal distribution of surface temperatures to the right. It might be sudden or slow, but the direction should be unambiguous. Unless you want to postulate an overshooting albedo increase triggered by the forcing, which would be an interesting hypothesis if a believable mechanism could be identified.
Forcings rely on atmospheric changes as well as albedo. Albedo changes, and so do natural contributions to atmospheric content, as the world naturally warms and cools and with volcanoes etc. And these shift unpredictably.
Assuming that net forcings are increasing is just that – an assumption.
That simplification by analogy is possible is best illustrated by analogue computers. The earliest computational devices were indeed analogue computers, and it was many years later that comparable dynamic computing power was available from digital computers. But early analogue computers had to be validated by showing that they obeyed the same laws as the systems they simulated. This process emphasized the importance of the mathematical model as the essential intermediate step in programming the analogue computer.
The “earliest computational devices” were used to model situations that conformed to linear assumptions. If they had been used in Lorenz’ efforts, they would simply have given results all over the board, with little or no indication of why. Since actual analog inputs could never be replicated exactly, and since differences, no matter how small, could send the overall system off in a different direction, it’s unlikely he could have made any significant scientific investigation of the phenomenon he discovered without digital computers.
AFAIK human efforts at analog computers have never even begun to match the sophistication of biological systems, which have their own method of being “validated by showing that they obeyed the same laws as the systems they simulated.”
AK: Thank you for your comments. In the 1950s the Adelaide Children’s Hospital wanted to study cardiac blood flow, and I lent them an experimental analogue computer constructed by P R Benyon, a pioneer of both analogue and digital computing. I conveniently forgot that I had lent it to them, so they were able to study, for the first time, blood flow in arteries and veins.
In those days we could handle non-linearity, like supersonic aerodynamics, with Taylor-series-like expansions, which worked well. AGWAC, the largest and most famous of the analogue computers, was originally constructed by Jack Lonergan at RAE and extended with Peter Benyon’s mark/space variables multiplier. In the fifties and sixties this was probably the world’s largest analogue installation, and was described by me at the Third International Conference on Analogue Computation in Opatija, Yugoslavia, 1961.
Thanks Tomas Milanovic,
…while ENSO is a particular case of spatio-temporal chaos related to large inertial scales.
Have you assigned a kind of weight to ENSO? Perhaps how important it is to what’s happened in the last 1000 years?
If there is a disagreement about the climate and if we’re looking for success, perhaps it’s about being in the right place. As much as Tisdale has written about the Pacific Equatorial Region, I get the impression he is in the right place, and that’s what it may take for success. Wyatt & Curry seemed to be in the right place with the Stadium Wave, as Milanovic wrote. Looking for system changes, as I understand it. Being in the right place.
This is a thoughtful essay, which gets to the point of (and rebuts effectively) Isaac Held’s earlier argument for simplicity (or even linearity) of our climate. Thanks for posting it.
Held has fallen into the trap of oversimplification and myopic fixation on only one relatively small aspect of our climate (human forcing), while essentially ignoring all the rest.
Sort of like concentrating with a microscope on a tick on the tail of a dog to describe the dog.
Held asks the question:
And then goes on to cite several analogies in order to try to make the point that it is not (and, therefore, that predictions made by the climate models are “useful” or realistic).
Our hostess pointed out her reservations to Held’s paper on the earlier thread.
She then raised the question:
And your post points out in more detail why Held’s analogies are contrived and why future climate cannot be predicted by linear models.
IOW the climate system IS “just too complex for useful prediction” with the tools and knowledge we have today.
And it certainly makes no sense for climate model predictions that are “not useful” to be used to advise “policy makers” on important decisions – decisions that could prove deleterious for us all, without achieving anything positive, if the models were wrong in the first place.
Thanks again for posting this.
When Dr. Molina (Nobel Prize winner) talks about AGW he references Lord Rayleigh (Rayleigh scattering, Rayleigh distillation), van der Waals (equations of state), Wien’s law, the Planck constant (central to radiation theory), the Boltzmann constant, and Kirchhoff’s law.
In today’s blog we are talking about simplicity – and my simple question is: Per the established science referenced by Dr. Molina, is it a bad idea to take a trajectory path to 800 to 1,000 ppm? Dr. Molina says it’s a bad idea.
As I follow this blog, I conclude that the skeptics rightfully cite the complexity of feedback loops and interplays within natural variability. But where is the established science that says things like cloud formation will offset/neutralize AGW (using Molina’s cited science principles) — especially at levels of 1,000 ppm?
There is nothing wrong in poking holes in science theory. But poking holes in Theory X doesn’t prove Theory Y – that’s called anti-science.
The key phrase is “Dr. Molina says it’s a bad idea.” He says it; he does not prove it. Dr. Molina’s gut feeling carries a lot of weight with me; but it is still a gut feeling – not science.
It’s not that he is wrong in what he said. He is begging the question. The problem isn’t with the basic principles of physics underlying climate, it is that the climate system is a very complex system based on the simple principles. It is the complexity of climate that is the problem, not fundamental physics.
The CFC issue was a relatively simple problem in chemistry. Climate isn’t simple at all.
The climate is kind of like Germany’s Enigma code machine. It was made of primitive mechanical elements – gears, levers, etc. But put it all together and it’s complicated. This is a concept you don’t seem to be getting.
Chaos doesn’t negate physics – but it is certainly the central organizing principle of complex dynamical systems. In climate we either have chaos or global warming – and the latter is just absurd. Chaos explains better both known data – and improves at least decadal predictability – so it is a better paradigm. Nor does it negate risk – indeed the spread of possible outcomes is broadened considerably. Nor can it reduce uncertainty – which at this stage is almost total.
So what do we liken increasing CO2 equivalent emissions to 8%, 16%, 32% of natural fluxes as economies grow this century? Perhaps careening down a hill on a skateboard with your eyes closed.
Acting with almost total ignorance of the consequences.
Stephen Segrest: But where is the established science that says things like cloud formation will offset/neutralize AGW (using Molina’s cited science principles) — especially at levels of 1,000 ppm?
The science says “might offset/neutralize”; just as the science says the sun “might” cool and the cooling sun “might” cool the earth. The science says that an increase in CO2 “might” warm the surface and troposphere, by an incalculable amount over an incalculable passage of time.
About crop and ecosystem primary productivity, the science right now leans toward increased CO2 causing increased productivity. That would not be “bad”. Dr. Molina’s gut feeling that 1000 ppm is bad needs to be supplemented by evidence. Newton had the gut feeling that location in space and motion in space could be “absolute”; Einstein had a “gut feeling” (a deep religious conviction) that probabilistic models could not be adequate science. It’s to Dr Molina’s credit that he does not assert certainty. But everyone is wrong sometimes, and Dr Molina might be wrong about this.
Based on his successful experience solving the CFC problem, Dr. Molina poses the (rhetorical?) question:
With all due respect to Dr. Molina, the CFC problem is a completely different story from the CO2 question.
First of all, we must look at whether or not a “trajectory path to 800 or 1,000 ppm CO2” is even realistic. It appears from what we know about possible future recoverable fossil fuel resources (WEC 2010) that 1,000 ppmv would be the upper possible limit when all inferred fossil fuel resources have been totally used up, so this range could be “possible” (if no cost competitive replacement for fossil fuels is developed and commercialized in the next 150 years or so – a highly doubtful assumption).
Then we must all realize that, unlike CFCs, CO2 is a natural trace gas in our atmosphere, which is essential for all life on our planet. Higher CO2 levels up to this range will not be harmful to animals (including humans) but could be a boon for plants.
And, third, we have no notion what the climate impact of such a CO2 level would be.
The recent “pause” in warming (actually slight cooling) has occurred while almost one-fourth of all human CO2 was emitted into the atmosphere, even though climate models erroneously predicted warming of 0.2C per decade as a result of AGW.
Even IPCC concedes that the temperature impact of a doubling of CO2 could be as low as 1.5C “at equilibrium”. Several independent recent studies tell us it could raise global temperature by around 1.35C (or 1.9C “at equilibrium”), while other studies show that up to 2C above today’s temperature the net impact on humanity would be positive rather than negative, so it appears that even this range of atmospheric CO2 would provide more benefits on balance than negative impacts.
Increasing atmospheric CFCs had no positive impacts.
And, finally, it was an easy thing to replace CFCs with a new series of refrigerants that do not cause problems in the atmosphere – the molecules had already been developed and were just waiting for commercialization.
Eliminating CO2 emissions is a totally different story, as a significant portion of the world economy is based on the ready access to a low-cost source of energy based on fossil fuels. Forcibly curtailing or abating these emissions would have the highest impact on the least affluent of us.
So, with all due respect, Molina is comparing apples with oranges.
And the answer to his question is another question: “why not?”
And I do not believe that Dr. Molina has the answer to that question.
And, third, we have no notion what the climate impact of such a CO2 level would be.
Don’t get the problem with this in a chaotic climate? Or equivalently that we can’t anticipate biological or hydrological shifts?
ENSO is not chaotic but follows a deterministic pattern.
It is pretty simple actually.
We construct a network of observed climate indices in the period 1900–2000 and investigate their collective behavior. The results indicate that this network synchronized several times in this period. We find that in those cases where the synchronous state was followed by a steady increase in the coupling strength between the indices, the synchronous state was destroyed, after which a new climate state emerged. These shifts are associated with significant changes in global temperature trend and in ENSO variability. The latest such event in the 20th century is known as the great climate shift of the 1970s. Extending this analysis in the 21st century confirms that another synchronization of these modes, followed by an increase in coupling occurred in 2001/02. This suggests that a break in the global mean temperature trend from the consistent warming over the 1976/77–2001/02 period may have occurred. We also find the evidence for such type of behavior in three forced and unforced climate simulations using state-of-the-art models. This is the first time that this mechanism, which appears consistent with the theory of synchronized chaos, is discovered in a physical system of the size and complexity of the climate system. – Anastasios Tsonis
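The “coupling strength” in the Tsonis abstract can be illustrated, very loosely, as the mean pairwise correlation among indices in a sliding window. The sketch below uses synthetic stand-in series and an invented window length – it is not Tsonis’s actual method, indices, or data:

```python
import numpy as np

# Synthetic stand-ins for climate indices (think PDO, ENSO, NAO, PNA) --
# an illustrative sketch of the coupling-strength idea only.
rng = np.random.default_rng(0)
n_years, n_idx = 100, 4
common = rng.standard_normal(n_years)            # shared signal
noise = rng.standard_normal((n_idx, n_years))    # independent noise
indices = 0.6 * common + 0.8 * noise             # partially coupled series

def mean_coupling(series, start, window=11):
    """Mean |pairwise correlation| over a sliding window -- a crude
    proxy for the 'coupling strength' of the index network."""
    seg = series[:, start:start + window]
    corr = np.corrcoef(seg)
    off_diag = corr[~np.eye(len(seg), dtype=bool)]
    return np.abs(off_diag).mean()

coupling = [mean_coupling(indices, s) for s in range(n_years - 11)]
```

Peaks in such a coupling series would be the candidate synchronization episodes; in the real analysis the interesting events are those where high coupling destroys the synchronous state and a new climate state emerges.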
The use of a coupled ocean–atmosphere–sea ice model to hindcast (i.e., historical forecast) recent climate variability is described and illustrated for the cases of the 1976/77 and 1998/99 climate shift events in the Pacific. The initialization is achieved by running the coupled model in partially coupled mode whereby global observed wind stress anomalies are used to drive the ocean/sea ice component of the coupled model while maintaining the thermodynamic coupling between the ocean/sea ice and atmosphere components. Here it is shown that hindcast experiments can successfully capture many features associated with the 1976/77 and 1998/99 climate shifts. For instance, hindcast experiments started from the beginning of 1976 can capture sea surface temperature (SST) warming in the central-eastern equatorial Pacific and the positive phase of the Pacific decadal oscillation (PDO) throughout the 9 years following the 1976/77 climate shift, including the deepening of the Aleutian low pressure system. Hindcast experiments started from the beginning of 1998 can also capture part of the anomalous conditions during the 4 years after the 1998/99 climate shift. The authors argue that the dynamical adjustment of heat content anomalies that are present in the initial conditions in the tropics is important for the successful hindcast of the two climate shifts.
You can actually see this – as people have for decades. More and bigger blue to 1976, more and bigger red to 1998 and blue again since.
Do I hear a sad and lonely voice faintly bleating about how simple it all is, making misshapen and grotesque math on the far fringes of the blogosphere, emerging briefly only to be chased back to Wilma’s basement by revolting peasants with torches and pitchforks like some Minnesotan Frankenstein? Yes? Thank God – I thought I was hallucinating.
Second quote – http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00626.1
And, finally, it was an easy thing to replace CFCs with a new series of refrigerants that do not cause problems in the atmosphere – the molecules had already been developed and were just waiting for commercialization.
The major problem with CFCs that was urgent to get fixed was that the patent on R12 had run out: everyone was producing it, and it cost less than a dollar a pound – and that was for the small cans of less than a pound. The producers had a different product, with a new patent on it, that they desperately needed to force people to buy. The new cans are even smaller and cost more than an order of magnitude more.
The ozone hole grows and shrinks in natural cycles that had just never been measured before. They caught it in the increasing cycle and took full advantage of it. What do you pay for the new Freons? You pay hugely more.
Dear Matthew Marler — You wrote: The science says that an increase in CO2 “might”. This is what is at issue. Nobel Prize winning Dr. Molina says, using established science laws (Planck, etc.), that an increase in CO2 “will”.
Dr. Curry has constantly agreed with the basic science statement of Dr. Molina when she says: All things remaining constant, an increase in CO2 “will” (not your statement of “might”). Using established science and documented observations, we can also use the word “will” (rather than “might”) on topics like aerosols, volcanoes, etc.
Of course, all things don’t remain constant. We have tremendous complexity in climate science in CO2 feedback and likely impacts on natural variability. But this doesn’t change the “basic science” that Dr. Molina cites.
Today’s blog is discussing simplicity — where a simple question can be asked: Is it a bad idea to go down a trajectory path of 1,000 ppm? The basic science tells us that this “will” have a heating effect.
Your use of the word “may” is totally appropriate when we add in all the unknowns of feedback and impact on natural variability — and these factors (in total) may or may not significantly “offset” what the established science (that Dr. Molina cites) is telling us.
Where I struggle on this blog is when this “basic science” is refuted or questioned. Questions on the non-established science of feedback loops (with probably the biggest being cloud formation) and the impact on natural variability are 100% appropriate.
Going down a trajectory path of 1,000 ppm will have a warming impact — established science tells us this. The amount, timing, and possible offsets to this warming — we don’t know.
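For what it’s worth, the “established science” warming effect can be put in rough numbers with the standard logarithmic forcing fit (the 5.35 ln(C/C0) approximation of Myhre et al.) and a no-feedback Planck response. This is a sketch I am adding, not anything from the comments above – and the feedback-free number is precisely the part the complexity arguments dispute:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Standard logarithmic approximation to CO2 radiative forcing, W/m^2,
    relative to a pre-industrial baseline of ~280 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

PLANCK_RESPONSE = 3.2  # W/m^2 per K, approximate no-feedback response

f_1000 = co2_forcing(1000.0)              # ~6.8 W/m^2 vs pre-industrial
dt_no_feedback = f_1000 / PLANCK_RESPONSE # ~2.1 K, before any feedbacks
```

The amount, timing, and sign of the feedbacks on top of this no-feedback figure are exactly the “we don’t know” part.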
Stephen Segrest: Nobel Prize winning Dr. Molina says using established science laws (Planck, etc.) an increase in CO2 “will”.
Yes. That is what Nobel Prize winning Dr. Molina says, using the established laws along with insubstantial claims about “equilibrium” in a simplified model of the Earth that has uniform illumination, uniform surface, and uniform temperature.
And, Nobel Prize winning Dr Einstein said that “God does not play at dice.”, and Nobel Prize winning Dr Rutherford said that Wegener’s hypothesis could not be true because there did not exist a source of power great enough. It is not that hard to find unsubstantiated and indeed false claims by Nobel Prize winning scientists, even when using established laws. We won’t elaborate on Nobel Prize winning Dr Shockley’s writings regarding genetics and public policy. Everybody is sometimes wrong, which is why the appeal to authority is fallacious.
The evidence to date does not substantiate, confirm etc. that the simplified equilibrium model is sufficiently complete and accurate to be relied upon. I grant you that if the public policy debate were strictly between Dr Molina and me, nobody should take my word for anything; but nobody should take Dr Molina’s word for anything either. Everyone who can should try to evaluate all of the evidence with respect to every proposition.
Dear Matthew Marler — I am a layman who uses this blog to try and understand issues of climate science better. I studied and professionally practice in the field of Ag science. As such, listening to folks like Oppenheimer (Princeton) talk about the near-term catastrophic impacts of AGW drives me nuts. What he doesn’t mention is that (1) his claims assume the climate models will be correct as to impact, timing, and region; (2) he cherry-picks regions (which current models are not very good at) rather than the world (where crop productivity may increase); (3) he assumes Ag science cannot possibly make any significant adjustments in areas of soil science to retain moisture (e.g., soil polymers, increased organics), or in plant genetics to use less water.
So — I am not some liberal socialist — OK?
What confuses me about questioning the basic science is the great many real-world applications of that science we use every day. Things like transistors, the engineering things Mosher talks about all the time, shoot — even things like photography.
The basic science (with tons of real world applications) say that increasing CO2 will have a warming effect. This is not saying that increased CO2 levels WILL increase the Earth’s temps (by X degrees at certain time frames) as we have so much complexity in things we don’t understand very well right now.
I hope I’m being respectful, as this is my intent. But I scratch my head in your (and others on this blog) questioning the established basic science. The established science tells us things on CO2 levels, aerosols, volcanoes, etc.
Stephen Segrest. You are assuming the “science is established.” It isn’t.
Dear Jim2 — It seems like you are asking me to list and then justify (in a blog post no less) the gazillion applications in our everyday lives of the established science laws that Dr. Molina cites. This doesn’t seem to be “good faith” on your part.
It doesn’t seem you can separate (1) warming effect versus (2) global temperature change — and why making this distinction is important in any dialogue.
Stephen – Newton’s laws are solid, radiational physics is solid, thermo is solid, all the basic building blocks of physics are solid. But that in no way means climate science is equally solid. The climate is a complex interaction of the basic building blocks of physics. No one knows how the climate operates with enough understanding to say more CO2 will lead to a catastrophe for humans. If there is no catastrophe, then we have better, more pressing problems on which to spend time and resources.
Beth, are you suggesting climate is turbulent? Perhaps then Fibonacci and fractals would best explain it. Feeling you serf girl.
Dr. Molina says he believes in AGW based on the established science (Planck, etc.). AGW skeptics say they don’t know whether 1,000 ppm is a good or bad idea because of science uncertainty.
What are the skeptics’ established-science arguments for how things like cloud formation will offset/neutralize (at 1,000 ppm levels) Molina’s established-science citations? Do skeptics have a “theory” based on established science that has any semblance of wide peer review?
A vigorous spectrum of interdecadal internal variability presents numerous challenges to our current understanding of the climate. First, it suggests that climate models in general still have difficulty reproducing the magnitude and spatiotemporal patterns of internal variability necessary to capture the observed character of the 20th century climate trajectory. Presumably, this is due primarily to deficiencies in ocean dynamics. Moving toward higher resolution, eddy resolving oceanic models should help reduce this deficiency. Second, theoretical arguments suggest that a more variable climate is a more sensitive climate to imposed forcings (13). Viewed in this light, the lack of modeled compared to observed interdecadal variability (Fig. 2B) may indicate that current models underestimate climate sensitivity. Finally, the presence of vigorous climate variability presents significant challenges to near-term climate prediction (25, 26), leaving open the possibility of steady or even declining global mean surface temperatures over the next several decades that could present a significant empirical obstacle to the implementation of policies directed at reducing greenhouse gas emissions (27). However, global warming could likewise suddenly and without any ostensive cause accelerate due to internal variability. To paraphrase C. S. Lewis, the climate system appears wild, and may continue to hold many surprises if pressed. http://www.pnas.org/content/106/38/16120.full
Science comes in many flavours – try to understand the above in the context of the post and the dominant climate paradigm. Then we might be able to move onto non-linear and nonequilibrium. Although I will understand if you are unable to respond rationally.
Wide peer review is corrupted, especially in climate science, where it’s all pseudoscience and (C)AGW.
I hope it isn’t too controversial to say that if you’re not obsessed with minute variations in temperature, some climate models have done an excellent job of charting the broad sweep of climate change. The form of the curve in (some of) these models is really close to what has happened.
Instead of blithely dismissing models as insufficient for the task, maybe we should redefine the task… and have a few words in private with some modelers who have been busy over-egging the pudding…
Hubert Lamb once said about historic temperatures:
‘We can understand the tendency but not the precision.’
We know from CET, dating to 1659, the general sweep of climate, and can see it has been warming for hundreds of years longer than is visible in records such as the GISS instrumental record from 1880. In other words, GISS is a staging post of rising temperatures, not the starting post.
To believe we know either the historic temperature OR the ‘global average’ or Ocean heat content to fractions of a degree is hubris.
Hi Tony, how are you?
We’re sort of talking past each other, I think. In a post that seems to postulate that climate can’t be forecast to any great extent, it seems only fair to mention that some models in fact have done a creditable job.
You and I both would agree that some consensus activists have gone way over the top in assigning infallible credibility to their six sigma predictions. But some of the model outputs, especially early on, have done a decent job.
The pause may be making some of their charts look bad, but decadal variability was kind of thrust upon them, wouldn’t you say?
Good, thanks. Are you still in China?
The temperature data bases show approximately what has been happening for hundreds of years – a Long Slow Thaw.
Why we had the Roman Warm Period, the MWP, the LIA and other road bumps in the climate is not explained by the models, which are OK at broadly showing what we already know to be happening.
There are too many known unknowns, unknown unknowns and the like for forecasting to be an accurate science, whether for weather or climate. Yes, you are right that the claims of consensus activists have somewhat muddied the issue with their belief that models are highly credible.
Tom Fuller – When you say
“[..] some of the model outputs, especially early on, have done a decent job.
The pause may be making some of their charts look bad [..]”,
you need to point out that the models all assumed that the warming current then was due to CO2, so they all predicted a continuation of the warming. While the warming did continue, it looked like they were doing a “decent job”. But the failure of the warming to continue does more than make the models “look bad”, it shows that they are seriously wrong, and that the major error was that initial assumption.
“I hope it isn’t too controversial to say that if you’re not obsessed with minute variations in temperature, some climate models have done an excellent job of charting the broad sweep of climate change.”
That’s a little like “Other than that Mrs. Lincoln, how did you like the play?”
The problem is that the “minute temperature variations” are what the CAGW movement is all about. That is why billions have been spent on climate research, and billions more are budgeted, including on the models.
The purpose for which the models are marketed is predicting future “minute” temperature changes. And they can’t do it.
Those are the IPCC’s before and after (politicized alterations) of the comparison of the models with reported temperatures. So they suck at temperature projections. If the caption of the first graph is accurate, that the models were first programmed to the “TAR and AR4 scenario design,” then they have been wrong for almost their entire existence.
What parts of the “broad sweep of climate change” have they accurately forecast? The “form of the curve is remarkably close?” To what? For what time period?
Using tuned hindcasts for models that can’t project the future worth a damn only shows that, given enough time and money, a few modelers have been able to fit their model’s curve to some small portion of the historical reported observations. And they can’t even do that for any length of time.
This comment unfortunately reads to me like the “Obama is a really good president” comment. More of a wistful prayer, than an actual description of reality.
Thomas Piketty’s book, ‘Capital in the Twenty-First Century’, has been a publishing sensation. The French economist’s central claim is that wealth inequalities are heading back up to levels last seen before WWI. This, he claims, is because when the rate of return on investment in a country exceeds the growth in GDP, it tends towards a positive feedback loop, causing further gains in passive investment wealth while general wealth stagnates.
So, as in ‘climate change’ – where CO2 heating drives water vapor rises, leading to increasing heat, melting permafrost, releasing CO2/CH4, and still more heat – when the return on investment is greater than general economic growth, the people making money, the investors, invest more and get greater returns through capital gains, while the economy stagnates because non-investors have less economic activity.
He based this postulate on analysis of a huge amount of data. However, the FT has cried foul, stating that his numbers are wrong or invented; using the correct published data, the FT finds that Europe, for example, has shown no tendency towards growing inequality since 1970.
I quote the FT:
‘ It also appears that some of the data are cherry-picked or constructed without an original source.’
So what is Prof Piketty’s response to this reanalysis of his work?
“[I had to use] a very diverse and heterogeneous set of data sources … [on which] one needs to make a number of adjustments to the raw data sources.
I have no doubt that my historical data series can be improved and will be improved in the future … but I would be very surprised if any of the substantive conclusion about the long-run evolution of wealth distributions was much affected by these improvements”
So: settled economic science. The data can be improved, but the conclusions drawn from it will stand forever.
To think that people like FOMD, Bert and Appell worry that Climate Science isn’t making a large enough impact on people’s thinking.
Sounds very like the theory of surplus value – which is as far as I got in Das Kapital. Nonsense which serves only to confirm the resurgence of socialist thought – which needs squashing politically when it raises its ugly Hydra head.
I was reading Bart’s link in reference to Elinor Ostrom and ‘subtractability’. Bart used the term to commandeer Ostrom’s gravitas in support of his perennial obsession. I have a high regard for the late Elinor Ostrom, and it sounded very un-Ostrom-like.
Goods and facilities can generate a flow of services that range from being fully subtractable upon consumption by one user to another extreme where consumption by one does not subtract from the flow of services available to others. The withdrawal of a quantity of water from an irrigation canal by one farmer means that there is that much less water for anyone else to use. Most agricultural uses of water are fully subtractive, whereas many other uses of water–such as for power generation or navigation–are not. Most of the water that passes through a turbine to generate power, for instance, can be used again downstream. When the use of a flow of services by one individual subtracts from what is available to others and when the flow is scarce relative to demand, users will be tempted to try to obtain as much as they can of the flow for fear that it will not be available later.
Effective rules are required if scarce, fully subtractive service flows are to be allocated in a productive way (E. Ostrom, Gardner, and Walker 1994). Charging prices for subtractive services obviously constitutes one such allocation mechanism. Sometimes, however, it is not feasible to price services. In these instances, some individuals will be able to grab considerably more of the subtractive services than others, thereby leading to non-economic uses of the flow and high levels of conflict among users.
Allocation rules also affect the incentives of users to maintain a system. Farmers located at the tail end of an irrigation system that lacks effective allocation rules have little motivation to contribute to the maintenance of that system because they only occasionally receive their share of water. Similarly, farmers located at the head end of such a system are not motivated to provide maintenance services voluntarily because they will receive disproportionate shares of the water whether or not the system is well-maintained (E. Ostrom 1996).
Ostrom’s major focus was on management of commons through what she called polycentric governance. Multiple tiers of governance including government, business and the community. It was an idea she called going beyond the tragedy of the commons. This is a very quick and fun overview from the girl herself. Complexity is a big part.
These systems have worked over long periods. Here is a very beautiful example from Japan.
So what is it about the concentration of carbon dioxide in the atmosphere that makes it a ‘common good’ to which ‘subtractability’ applies? We might easily burn all of the available fossil fuels and add the emissions to the atmosphere. So it is not subtractable in that sense. There is no cost to accessing the sky – so there is no sense in which one user subsidizes free riders in the maintenance of infrastructure. It seems not a commons which one user can monopolise.
What Bart means is that there are external costs attributed to everyone and benefits to the few. Which seems to make it a different problem in theory – sloganeering notwithstanding. The real problem is that there are costs attributed to everyone – very nebulous and provisional costs – and very real benefits to all. Just so we are clear on just what the real theory is.
The combined result of all these problems is to make wealth concentration during the past 59 years rise artificially … uh oh
… sounds like deja vu all over again.
DocMartyn | May 23, 2014 at 10:38 pm |
Piketty has to do with something I’ve said how?
It seems you’re engaging in some self-comforting tribalism, echoing the call of your side in the hopes of reinforcing already-held views.
Piketty’s work is somewhat important, and may be largely valid, as part of a much larger discourse. FT’s critiques are largely valid, and may be important too, though also hardly definitive.
And if you had said anything new or useful, I wouldn’t be saying you’ve said nothing useful here, either toward that discussion nor this one.
Being useless, how does that comfort you?
Robert I. Ellison | May 24, 2014 at 3:10 am |
Nice to see you in approximately the right ballpark.
But no, you’re still zero for three. Next batter. It may have something to do with you holding the wrong end of the bat, and facing away from the pitcher. (That’s a baseball analogy. Baseball is a sport, played in America. Google it.)
What is scarce about CO2 in the atmosphere isn’t the issue. It’s what is scarce about the service provided by the carbon cycle to dispose of waste CO2 put into the atmosphere that is the issue.
If you keep inverting that simple element, you’ll never get on base.
Oh Bart – you were only peripheral to the discussion. Can’t bear it? Need to assert yourself like some over indulged child? You have to maintain that smarmy stream of calumny and misrepresentation just to pad out the vacuum of your commentary? Please – you rehearse the same obsessive song and dance at every inopportune opportunity. Sort of like a webby – without the Quasimodo cadence admittedly but with the intensity and persistence of a mean spirited and perennially discontented Captain Ahab chasing a white gnat.
I have given a definition of the problem in terms of externalities – costs imposed as a result of increased CO2 – and you waffle on about services divorced from effect or rationale. One wonders what these services might be – why they should be provided – how they are not being provided – but these are not questions I would ever ask a serial miscreant with an attention deficit disorder, who has a bad case of jingoism and to crown the infamy indulges in an opportunistic theft of reputation by association with and misrepresentation of a true champion of humanity. One with complex ideas and complex multifaceted solutions to the problem of the commons.
One realizes that you are incapable of shame – but we feel shame and embarrassment for you.
That’s gonna leave a mark.
The reason the rich, people in the stock market, and holders of other assets are doing well is because of government intervention in the economy – specifically, quantitative easing and low interest rates.
Quantitative easing has flooded the economy with money, making it easy for companies to raise money to buy back stock. This inflates the price of the stock since there is more value (of the company itself) per share of stock. This sort of stock price increase isn’t due to a booming economy, but stock holders make money nonetheless.
Meanwhile, more people in the middle class join the poor. Keynesian economics at its best.
Chief makes a good point –
==> ” The real problem is that there are costs attributed to everyone – very nebulous and provisional costs…”
Yes. Deaths due to particulates and trillions of dollars and tens of thousands of lives spent to keep oil flowing – nebulous and provisional indeed.
Billions flowing to governments of countries where some 50% of the citizenry don’t have basic civil rights – nebulous and provisional indeed.
Let’s just turn a blind eye to some aspects of capitalism. It’s fine. Nothing to see. Just move along. You know, ’cause it’s all “nebulous and provisional…”
If it weren’t for the government policies, even the rich wouldn’t be doing as well; that is part of the point.
==> “Typical of a pro-government-control freak.”
Indeed. You got me pegged. Totalitarianism is my raison d’être. I especially like it when governments invade other countries to keep oil flowing.
Josh – really?
The last decade is a strong example of how the conservative idea of trickle-down economics doesn’t work. Instead of everyone getting richer, the middle class have become poorer. Ideas for raising net income at the lower end through redistributing tax burdens, lower-cost benefits like healthcare, and having a minimum wage that at least keeps up with prices, are aimed at putting the money where it will be spent in the economy. A country is judged by how its poorest 10% live.
Read about what the architects of the invasion of Iraq had to say. Get back to me. We’ll talk.
Jim D: A country is judged by how its poorest 10% live.
The poorest of the US now do pretty well compared to the poorest of other places and times. However, they did better before the current administration; there are not literally more in the bottom 10%, but the number below the poverty level has increased under the current administration. The current administration has widened the gap between rich and poor in the US by, among other things, borrowing money to support wasteful crony enterprises such as Solyndra. The proposed regulations of CO2 will do more of the same, as have the other tens of thousands of new regulations written by this administration. The economic failures of this administration are fairly dramatic compared to America’s history. It’s hard to find another administration in our history that, in peacetime, increased the federal debt so dramatically, drove people out of their medical insurance plans, drove as many people into permanent unemployment and out of the job market altogether, maintained a permanently low GDP growth rate, reduced American military power, and used the tax power to hound its political opponents.
As “etc” goes, that was pretty outside the usual interests of this blog.
Piketty’s book is bound to be widely read and to stimulate much discussion of much-discussed topics in political economy. It will probably also stimulate a resurgence in the sale of Friedrich von Hayek’s books.
Bart, like FOMD you seem to have ‘faith’ in this view of science
reason → mathematics → physical theory → experimental observation → inventive technology → viable enterprise → political ideology
Problem is the starting point; reason is personal and subject to bias. People who actually do science are aware of bias and know that they can only temper it, not eliminate it.
My anticancer drug is going to be prepared for a phase I trial. Its toxicology will be examined in two types of mammals; in our case mice and monkeys. The trials will be performed by an outside agency and will be performed double blind. I will take no part in any of the work, other than providing drug. The drug will also be sent for independent chemical analysis and will be certified by a licensed expert.
If it passes the toxicology, it will go through:-
Phase I: Researchers will test it in a small group of people for the first time to evaluate its safety, determine a safe dosage range, and identify side effects.
Phase II: The drug will be given to a larger group of people to see if it is effective and to further evaluate its safety.
Phase III: The drug will be given to a group of people to confirm its effectiveness, monitor side effects, compare it to commonly used treatments, and collect information that will allow the drug or treatment to be used safely.
Phase IV: Studies are done after the drug or treatment has been marketed to gather information on the drug’s effect in various populations and any side effects associated with long-term use.
The success rates at the different steps in drug development are:
in vitro to animal model 10%
animal model to human patient in phase I 30%
Phase I to phase II 64%
Phase II to Phase III 35%
Phase III to Phase IV 60%
Phase IV to NDA/BLA approval 80%
Failure after NDA approval and withdrawal 12%
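Chaining the stage-wise rates listed above gives a back-of-the-envelope probability that a compound starting in vitro eventually wins approval; a quick sketch (the stage labels simply follow the list above, and multiplying the rates assumes the stages are independent):

```python
# Sketch: overall odds of a compound surviving the whole pipeline,
# taken as the product of the stage-wise success rates quoted above.
stages = {
    "in vitro -> animal model": 0.10,
    "animal model -> phase I":  0.30,
    "phase I -> phase II":      0.64,
    "phase II -> phase III":    0.35,
    "phase III -> phase IV":    0.60,
    "phase IV -> NDA/BLA":      0.80,
}

overall = 1.0
for stage, rate in stages.items():
    overall *= rate  # compound the survival probability

print(f"overall success probability: {overall:.4%}")  # about 0.32%
```

That is roughly one approval per three hundred compounds that look promising in vitro – which is the commenter’s point about the cost of getting from reason to a working drug.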
I wish I could start off with reason and then use math to work out how to cure cancer without all the cost/time spent faffing around. We drug designers must be useless at this science and modeling business.
Matthew Marler, that is your party line, but note that in the first two years the recovery saved the auto industry, re-employed a lot of people left out of work by the 2008 crash, and enacted affordable health care. Further progress was stopped by the 2010 mid-terms that led to blockage of any further recovery efforts, and that has been the situation in the last 4 years. They have done well considering the limited power they have. Employment has still risen. They won’t get a federal minimum wage or immigration reform which would also have helped the recovery but would be blocked. They won’t get infrastructure building for jobs, and they are going as far as they can for the environment without Congress. The Republicans suddenly regret their raising the debt prior to 2008, and won’t let the country rebalance its income and expenditure even by closing tax loopholes, such as giving low tax rates for certain kinds of income. The bottom line is that the Republicans clearly by their actions want to reduce or prevent anything that directly benefits the poor, old, unemployed or unhealthy, so who do you think is working in the right direction in this regard?
DocMartyn | May 24, 2014 at 11:38 am |
What a load.
Per Newton: all observation + parsimony + simplicity + universality → reason → explanation, until new observations require amendment.
Do you see the difference?
” The real problem is that there are costs attributed to everyone – very nebulous and provisional costs…” and benefits to all.
I had in mind particulates – and I don’t live in a city because it makes me nauseous and headachy – but the risk is not greater than many others – cholesterol, high blood pressure, car accidents, diabetes, iatrogenic disease, etc. It adds to the risk of death per 100,000 people – hence nebulous. Other costs are even more nebulous and provisional.
Jim D: but note that in the first two years the recovery saved the auto industry, re-employed a lot of people left out of work by the 2008 crash, and enacted affordable health care.
Anyone who can write that with a straight face has not been paying attention.
Even skeptics would agree that if the sun warmed by 1%, it would get noticeably warmer, especially since the LIA was a much smaller perturbation. Similarly volcanoes cause measurable cooling. It is that simple. But when it is pointed out that doubling CO2 has the same effect on the energy balance as this 1% solar increase, they immediately bring up chaos and say its net effects would not be so obvious. Something changes in their thought process between these two forcings from a linear response demonstrated by observational evidence to unpredictable chaos. It just looks like inconsistent thinking to me.
So, now Jim D speaks for skeptics.
Linear thinking is pretty good. People have done a lot with it. If we could see closer, add more decimal places, we might see that what appears linear is more complex than that.
Trying to say it another way: I agree a 1% change matters and either cools or warms almost all the time, and a linear representation of that has some utility. But perhaps the mechanism for the change is many transitions from the beginning to the end. And the interesting part, assuming they exist, is the transitions.
Connecting the cooling in LIA with the sun’s inactive period is an example of linear thinking. Skeptics used to like that. Perhaps they have changed recently, but I missed it. Are they debunking it somewhere now?
There is linearity and there is turbulence. There, does that make it better?
kim, so the LIA was just turbulence or a linear response to a forcing change? How about the Pinatubo response?
How about chaos and a linear forcing? Maybe it’s both, as Forrest Gump said.
Much more likely that clouds will change 1% from chaos.
The volcano erupted in June 1991. NASA says it causes winter warming.
Temperatures had been on a five-year downward trend long before the volcano.
What objective evidence do you have for the Pinatubo effect you claim? How are other similar dips accounted for?
‘Since irradiance variations are apparently minimal, changes in the Earth’s climate that seem to be associated with changes in the level of solar activity — the Maunder Minimum and the Little Ice Age for example — would then seem to be due to terrestrial responses to more subtle changes in the Sun’s spectrum of radiative output. This leads naturally to a linkage with terrestrial reflectance, the second component of the net sunlight, as the carrier of the terrestrial amplification of the Sun’s varying output.’
It’s not so much skepticism as actual science of internal amplifying mechanisms. This seems pretty consistent with a complex and dynamic system. But I have quoted this to Jim before. So what does that make this? Deliberate bait and switch? Pandering to the bleachers? Scoring points in some private and very sordid game? The attention span of a goldfish?
The world wonders – but not too long or deeply.
Two issues here. (1) The Sun is external to the atmosphere whereas carbon-dioxide isn’t; and (2) Earth’s surface temperature does not change linearly with the sun e.g. after volcanos, El Ninos, etc. Also, doing a lab test in a confined space does not prove that the same thing will happen in a completely unbounded atmosphere. So everything’s to play for.
I think Donald Rumsfeld has the state of climate science summed up ;
“There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.”
Jim D. Ignoring UV for the sake of discussion, a 1 watt increase in visible light from an increase in solar output or a decrease in aerosols due to volcanoes WILL NOT have the same effect as a 1 watt increase in back radiation from CO2.
The ocean’s albedo is about 0.1. Therefore, the one watt increase of visible light will result in 0.9 additional watts getting absorbed and converted to heat, with the concomitant rise in temperature.
But the IR from CO2 is different. Much of the 1 watt increase will immediately be absorbed by the conversion of liquid water to vapor. This will occur at a constant temperature, given the relatively constant atmospheric pressure. The resulting heat will be carried into the atmosphere, where it can be convected to the upper atmosphere and radiated away. So, maybe only 0.2 watts is absorbed by the ocean.
While I don’t have a number for heat absorbed by the ocean due to an increase in CO2 IR, I think Dr. Curry put up an estimate at one point. I believe it was about 20% of the IR was absorbed.
But the point is that water will have a different reaction to visible vs IR. Due to this alone, it is reasonable to expect the net heating will be different between the two wavelength ranges.
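The comment’s back-of-envelope comparison can be written out explicitly; note that both the 0.1 ocean albedo and the ~0.2 IR absorption fraction are the commenter’s assumed figures, not measured values:

```python
# Sketch of the comment's back-of-envelope numbers. The 0.1 ocean albedo
# for visible light and the ~0.2 absorbed fraction for CO2 back-radiation
# (the rest assumed spent on evaporation and carried off by convection)
# are the comment's assumptions, not measurements.
forcing = 1.0  # W/m2, the hypothetical increase in either band

ocean_albedo_visible = 0.1
absorbed_visible = forcing * (1.0 - ocean_albedo_visible)  # 0.9 W/m2 absorbed

ir_absorbed_fraction = 0.2
absorbed_ir = forcing * ir_absorbed_fraction               # 0.2 W/m2 absorbed

print(absorbed_visible, absorbed_ir)
```

On these assumed numbers the ocean would retain roughly 4–5 times more heat from a visible-light watt than from an IR watt, which is the asymmetry the comment is arguing for.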
The effect of a CO2 change of 1 W/m2 only differs from a solar 1 W/m2 in distribution. CO2 tends to warm land and higher latitudes first, and cools the stratosphere, for example, while the sun would warm tropical latitudes first. Since these both affect the earth’s energy balance they have equal effects from this external view of the system. To imply, as Tomas does, that there is no linear predictable effect is plain wrong both for the sun and for CO2.
climatereason doubts that Pinatubo caused any cooling, and that is up to him. Lots of publications have said it did, and none have said it didn’t, so I just go by that.
Jim D says: Since these both affect the earth’s energy balance they have equal effects from this external view of the system.
Please support this statement with some facts.
There is a sizable influence of large volcanoes on cooling the climate system, which has been demonstrated in many research studies. The effect is both immediately observable in the troposphere, but also has a delayed effect, in that large volcanoes reduce ocean heat content. Given the larger thermal inertia of the ocean, and the strong influence of the ocean heat content on tropospheric temperatures, a large volcano can influence tropospheric temperatures via ocean to atmosphere heat flux decreases for many years after the initial eruption and tropospheric cooling. See, for example:
GS, if you are arguing that even more subtle sensitivities caused the LIA, your argument seems to be with Milanovic, not me. I was saying that the climate shows strong observable responses to forcing changes. Maybe you agree now. Read what Milanovic says again.
jim2, you have seen the facts here countless times. The ideas go back to Arrhenius, Callendar, etc. It can be explained with one-dimensional radiative/convective models, or full GCMs. You have chosen not to believe them, possibly even including the no-feedback response. I can’t help you.
Jim D | May 24, 2014 at 10:22 am |
To imply, as Tomas does, that there is no linear predictable effect is plain wrong both for the sun and for CO2.
My take on what he wrote:
“Facing this complexity, is there any chance that all or parts of this system can be deterministically predicted with a reasonable accuracy ?”
It is safe to say that the answer can at best be only a partial yes and even then only if the system can be simplified.
No analogy with statistical thermodynamics applies to this dissipative non equilibrium system so that an analogous simplification will not take place.
Taking CO2, one of a number of variables, can we then predict? All things being equal, yes. But are all things equal? I am not sure. There’s also the quality of the prediction to consider. And if that’s not high, maybe it’s time to reconsider things.
GS, if you are arguing that even more subtle sensitivities caused the LIA, your argument seems to be with Milanovic, not me. I was saying that the climate shows strong observable responses to forcing changes. Maybe you agree now. Read what Milanovic says again.
Connecting the cooling in LIA with the sun’s inactive period is an example of linear thinking. Skeptics used to like that. Perhaps they have changed recently, but I missed it. Are they debunking it somewhere now?
What I am saying is that you are mischievously misrepresenting the case and now are deviously shifting the goalposts.
Jim D: But when it is pointed out that doubling CO2 has the same effect on the energy balance as this 1% solar increase, they immediately bring up chaos and say its net effects would not be so obvious.
“Pointed out”? The assertion that a doubling of CO2 would have the same effect on the energy balance as a 1% increase in solar output is based on calculations from an extremely simple and not well-tested model. Hence it is a “dubious” assertion. It is correct to say that the system is chaotic and the net effects of doubling CO2 would not be so obvious. There is not a single energy transport process for which the net effects of doubling CO2 have been demonstrated, and there is no “equilibrium” that has been demonstrated.
Matthew Marler, it can be quantified well: 3.7 W/m2 for doubling CO2, 3.4 W/m2 for a 1 % solar increase, 150 W/m2 for the total greenhouse effect, etc. Certain numbers like this come just from physics.
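These standard figures do follow from simple textbook formulas; a quick check (a sketch, not a model – the 5.35 logarithmic CO2 fit is the commonly cited Myhre et al. form, and TSI ≈ 1361 W/m2 is an assumed value):

```python
import math

# Reproducing the round numbers quoted above from simplified formulas.
# CO2: logarithmic fit, dF = 5.35 * ln(C/C0) W/m2 (Myhre et al. form).
dF_co2_doubling = 5.35 * math.log(2.0)  # ~3.7 W/m2

# Solar: 1% of total solar irradiance, spread over the sphere (divide by 4).
TSI = 1361.0                            # W/m2, assumed value
dF_solar_1pct = 0.01 * TSI / 4.0        # ~3.4 W/m2

print(round(dF_co2_doubling, 2), round(dF_solar_1pct, 2))
```

Note that the quoted 3.4 W/m2 solar figure omits the albedo factor; applying a planetary albedo of ~0.3 would reduce the absorbed increment to roughly 2.4 W/m2.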
Jim D: Matthew Marler, it can be quantified well: 3.7 W/m2 for doubling CO2, 3.4 W/m2 for a 1 % solar increase, 150 W/m2 for the total greenhouse effect, etc.
The claim that they have the “same effect on energy balance” comes from simplified models that ignore the dynamics of the energy transfer processes, just as I said. The “physics” is explained in many places, such as the books by Pierrehumbert and Randall that I have often cited, and the “explanations” are based on the counterfactual assumption of equilibrium.
Your earlier phrase “pointed out” is just your way of saying that you accept the counterfactual assumptions as “physics”.
Jim D. : Skeptics used to like that. Perhaps they have changed recently, but I missed it. Are they debunking it somewhere now?
You ought to quote at least one “skeptic” exactly. “Linear thinking” is a start. That’s all. Sometimes the result is accurate enough, sometimes it isn’t. Clearly it is not adequate, as “equilibrium thinking” is not adequate, to assessing the effects of CO2 increase on future climate.
Is the Stefan-Boltzmann law an example of “linear thinking”? Some skeptics believe it (to a certain degree of accuracy), some don’t.
Matthew Marler, the important thing about equilibrium is that you have to know where you are relative to it in order to predict a climate trend. Currently we are somewhat below it, hence warming is expected.
Jim D: , the important thing about equilibrium is that you have to know where you are relative to it in order to predict a climate trend. Currently we are somewhat below it, hence warming is expected.
No. The important thing about “equilibrium” is that it does not exist. The calculated value for “it” does not correspond to anything in the Earth’s climate, so knowing “its” calculated value provides no information whatsoever.
Meh, Matt, these guys think that equilibrium was before man disequilibriated it. Hey, that’s gotta be right.
Matthew Marler, that is a very odd view if you are saying that we can’t state whether the earth’s energy balance is in equilibrium or not. How about after a volcano, wouldn’t we be out of equilibrium for a while, with less heat coming in, and don’t we move towards it by cooling? Not sure what you mean. Equilibrium is just a description of a state of the energy balance that can be calculated quantitatively from earth’s radiation budget. Equilibrium just means zero net from this calculation, so the distance from equilibrium can be calculated, unlike what you said.
This is another one of those things that need re-calculating every instant. First, tell me how long an instant lasts. Then we might be getting somewhere, but when, who knows?
The entire Earth IS very close to equilibrium with space, excluding solar variations, and yet (probably) no given spot on Earth has attained equilibrium.
kim, yes, the energy balance changes probably daily with clouds. As you take longer averages you see it dominated by the annual cycle due to the shape of the earth’s orbit and its varying albedo to the sun. As you take multi-year averages it will asymptote to the actual climate imbalance, probably by the time you get to decadal scales. Not to say it will flatline at the exact value, but the oscillations at those time scales would be small enough that the mean would be noticeably offset from zero.
Jim D: Equilibrium is just a description of a state of the energy balance that can be calculated quantitatively from earth’s radiation budget. Equilibrium just means zero net from this calculation, so the distance from equilibrium can be calculated, unlike what you said.
The equilibrium calculation follows from a simplified model of the Earth. What it refers to on Earth is not made explicit.
What can be estimated on the Earth is the mean (over time and space) temperature, but not the equilibrium. That the Earth is never in equilibrium is illustrated by the persistent temperature gradients from the Equator to the poles; I say that the gradients are persistent, but quantitatively they fluctuate. There are other obvious deviations from equilibrium, such as daily and seasonal mean temperature changes and other mean regional changes. What the mean temperature is with respect to the equilibrium has never been made explicit. In his book “Principles of Planetary Climate”, Pierrehumbert explicitly addresses the error inherent in trying to claim the “equilibrium” as the estimate of the “mean”.
Over long periods of time there are imbalances between net input from the sun and net radiated and reflected energy. There is, for example, an alternation of this imbalance related to the Earth’s revolution about the sun, because the Earth is closer at perihelion, when the insolation is greatest, and farther at aphelion, when the insolation is least. Besides that, something has evidently caused imbalances in the past leading to the history of temperature fluctuations. Over the last 10,000 years, none of these imbalances has been very great for very long. What CO2 will do to the imbalance is not known (except conditional on an unrealistically simplified model, which I call “counterfactual” because it is obviously false in detail).
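The size of the perihelion/aphelion alternation mentioned above follows from the inverse-square law; a quick sketch using Earth’s orbital eccentricity, e ≈ 0.0167:

```python
# Sketch: insolation contrast between perihelion and aphelion.
# Flux scales as 1/r^2, with r = a(1 - e) at perihelion and
# r = a(1 + e) at aphelion; the semi-major axis a cancels in the ratio.
e = 0.0167  # Earth's orbital eccentricity

ratio = ((1.0 + e) / (1.0 - e)) ** 2
print(f"perihelion/aphelion insolation ratio: {ratio:.3f}")
```

The ratio comes out near 1.07, i.e. roughly a 7% peak-to-peak swing in top-of-atmosphere insolation over each orbit – a seasonal-scale imbalance that the climate system absorbs routinely.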
An Applied Mathematician seems to believe that it is possible to sit in a closed room and from theory alone deduce what the Universe must be like by a process of pure reason, rather like a mediaeval scholastic determining how many angels can sit on the head of a pin.
The Navier-Stokes equations are not the real world. They describe the behaviour of a continuum, a hypothetical medium which is continuous and infinitely differentiable. No real fluid is like that. Fluids exhibit Brownian motion. The N-S equations do not take this into account, they are pre-atomic. Perhaps this is the reason why these equations are so poor at dealing with turbulence.
I should also point out that neither is chaos theory about the real world; it largely concerns the pathological behaviour of sets of partial differential equations. In contrast, physics, statistics and communications engineering use the idea of a stochastic process. It is much more powerful and has led to some important results. Perhaps it is about time fluid dynamics did something similar.
Chaos is a metatheory – of a system with multiple positive and negative feedbacks. Its strength as a theory of climate and Earth sciences is that it explains data that has all the appearance of abrupt transitions and multiple equilibria – and suggests at least decadal predictability, perhaps even the potential of predicting phase shifts.
The manifestation of non-linear effects can be discovered in a wide variety of examples, from sociology, population dynamics, economics and ecology. In each case mathematical models can be built that have the potential for a wide range of behavior from stability, gradual growth, persistent oscillations, self-organization, rigidity to change, infinite sensitivity to externalities, all the way to chaotic and unpredictable swings.
My take is that nature is chaotic and real world. Maybe I am just seeing what I want to see.
And another interesting part at the above:
Not only will growth rates change but the whole balance of a region will be modified, with some species being favoured over others. For example, what may be good conditions for the growth of a certain crop may be even better for weeds and predators. In turn, the effects of these changing vegetation patterns will feed back into the atmosphere, both directly – in terms of the amount of carbon dioxide that is fixed by plant-life – but also indirectly, for as the mixes and yields of different vegetation changes so too will the economics and even the lifestyles of a given region. As the economy and social structure of a region changes so too does its energy demands, which results in different amounts of carbon dioxide being released into the atmosphere. Moreover, there will be a variety of lags in the various feedback loops of such a system, so that attempts to control variations in one part of a cycle may have the effect of magnifying another. ‘Even the attempt to isolate a single variable in this whole complex system becomes incredibly complex. A single variable will exhibit the whole range of behaviors from extreme sensitivity to extreme stability as well as limit cycles, bifurcation points, large oscillations and possibly even chaotic behavior.’
Ragnaar – I fully agree that nature is chaotic. However that is not the same thing as being “chaos theoretic”. Modern mathematicians have taken the ancient word “chaos” and used it for their own narrow purpose which is to describe the peculiar behaviour of certain classes of non-linear deterministic differential equations. This confusion of meanings is at the heart of what I am talking about. At issue is whether this peculiar behaviour has anything much to do with the real world apart from specially constructed gadgets such as multiple pendulums. The term “chaos” gives a mystical frisson to this branch of pure mathematics.
There is no empirical evidence that chaos theory provides a useful description of natural processes. At best it provides an excuse for the failure of deterministic equations to deal with real world phenomena such as turbulence. In my view turbulence is associated with increased entropy (i.e. disorder) but the Navier-Stokes equations cannot account for this because they are deterministic whereas entropy is a property of a stochastic system.
I am aware that chaos theory has its own special definitions of entropy. It is a very interesting and exciting branch of mathematics but its “entropy” is not the same thing as the entropy of physics.
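As an aside on the terminology dispute: a toy example shows how a fully deterministic one-line rule can generate output whose coarse statistics look like fair coin flips, which is exactly why chaos theory needed its own definitions of entropy. The logistic map sketch below is purely illustrative (the seed value 0.2 and the iteration count are arbitrary choices, not anything from the discussion above):

```python
# The logistic map x -> 4x(1-x) is a one-line deterministic rule, yet the
# coarse-grained symbol sequence it generates (0 if x < 0.5, else 1) looks
# statistically like a sequence of coin flips.
x = 0.2          # arbitrary seed in (0, 1)
bits = []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    bits.append(0 if x < 0.5 else 1)

# For the chaotic r = 4 map, roughly half the iterates land on each side.
frac_ones = sum(bits) / len(bits)
```

The same program rerun with the same seed reproduces the identical bit sequence, so the output is deterministic in the strict sense even though its statistics resemble noise.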
Robert – can you give me an example of chaos theory making a “risky prediction” (in the Popper sense) which later turned out to be correct?
‘The Earth’s climate system is highly nonlinear: inputs and outputs are not proportional, change is often episodic and abrupt, rather than slow and gradual, and multiple equilibria are the norm. While this is widely accepted, there is a relatively poor understanding of the different types of nonlinearities, how they manifest under various conditions, and whether they reflect a climate system driven by astronomical forcings, by internal feedbacks, or by a combination of both.’ http://www.globalcarbonproject.org/global/pdf/pep/Rial2004.NonlinearitiesCC.pdf
We could call it abrupt climate change – it is an outcome of a system pushed past a threshold.
‘Technically, an abrupt climate change occurs when the climate system is forced to cross some threshold, triggering a transition to a new state at a rate determined by the climate system itself and faster than the cause. Chaotic processes in the climate system may allow the cause of such an abrupt climate change to be undetectably small.’ http://www.nap.edu/openbook.php?record_id=10136&page=14
Deterministic chaos explains the otherwise perplexing nature of climate data, although I am not sure why entropy would be different. We are talking about a real physical system that is dynamic and complex.
‘Using a new measure of coupling strength, this update shows that these climate modes have recently synchronized, with synchronization peaking in the year 2001/02. This synchronization has been followed by an increase in coupling. This suggests that the climate system may well have shifted again, with a consequent break in the global mean temperature trend from the post 1976/77 warming to a new period (indeterminate length) of roughly constant global mean temperature.’ http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.380.7486&rep=rep1&type=pdf
Abrupt shifts are very evident in records of ocean and atmospheric indices and in hydrology.
Skippy – Yes of course climate processes are often non-linear and sometimes abrupt changes do occur. Why would anyone think otherwise? The arguments presented in these Comments boil down to:
(1) the real world is sometimes chaotic and unpredictable,
(2) the solutions to the Navier-Stokes equations are sometimes chaotic and unpredictable,
(3) therefore the Navier-Stokes equations describe the real world.
People seem to be reluctant to let go of the idea that we live in a deterministic universe (typified by Einstein’s comment that “God does not play dice”). See http://www.scienceheresy.com/2010_09/NavierStokes/index.html . The alternative description, the stochastic one, has been around since Planck, but for some reason it has not been taken up by Fluid Dynamics, which remains firmly entrenched in the 19th century. Instead fluid dynamicists have become infatuated with Chaos Theory. In my view Chaos Theory is a red herring; a branch of Pure Maths which tells us nothing about the real world.
Re “risky prediction”: One of the criteria needed to confirm a model or hypothesis is that it predicts, in detail, behaviour which other theories do not and which at first sight appear counter-intuitive. That is what I mean by a risky prediction. An example is the gravitational deflection of starlight passing the sun as predicted by the General Theory of Relativity and later confirmed by Eddington. For Karl Popper’s description of the scientific method see: http://www.scienceheresy.com/2010_10/Popper/index.html
‘Lorenz was able to show that even for a simple set of nonlinear equations (1.1), the evolution of the solution could be changed by minute perturbations to the initial conditions, in other words, beyond a certain forecast lead time, there is no longer a single, deterministic solution and hence all forecasts must be treated as probabilistic. The fractionally dimensioned space occupied by the trajectories of the solutions of these nonlinear equations became known as the Lorenz attractor (figure 1), which suggests that nonlinear systems, such as the atmosphere, may exhibit regime-like structures that are, although fully deterministic, subject to abrupt and seemingly random change.’ http://rsta.royalsocietypublishing.org/content/369/1956/4751.full
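The sensitive dependence described in this quote can be sketched directly with the Lorenz ’63 equations. In the sketch below the step size, perturbation size and run length are all arbitrary choices, and the forward-Euler integrator is for illustration only, not serious numerics. It shows both halves of the point at issue: an exact copy of the initial state evolves identically (the system is fully deterministic), while a perturbation of one part in 10^8 grows to the size of the attractor itself:

```python
# Lorenz '63 system advanced with a crude forward-Euler step.
# Classic parameters: sigma = 10, rho = 28, beta = 8/3.
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def dist(p, q):
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

a = (1.0, 1.0, 1.0)          # reference trajectory
b = (1.0, 1.0, 1.0 + 1e-8)   # minutely perturbed copy
c = (1.0, 1.0, 1.0)          # exact copy of the reference

max_sep = 0.0
for _ in range(6000):        # integrate out to t = 30
    a, b, c = lorenz_step(a), lorenz_step(b), lorenz_step(c)
    max_sep = max(max_sep, dist(a, b))
# max_sep reaches order-of-the-attractor size; a and c remain identical.
```

In exact arithmetic the perturbed run would shadow the reference for a while and then depart exponentially; it is this amplification of an undetectably small difference that makes long-range point forecasts hopeless even for a perfectly deterministic system.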
We come to the crux of your problem John – imagining that chaos in the sense of complexity theory is not completely deterministic.
As I said – chaos is a metatheory describing the behavior of a complex and dynamic system and assigning systems that have that behaviour to the broad class of deterministically chaotic systems that include economies, population dynamics, growth and decay dynamics, heart beats and nerve impulse, lake eutrophication, etc.
So in a sense you are right. If it quacks like a duck and looks like a duck – we will assume it is a duck. N-S is a duck. The three body problem is a duck. Weather is a different duck. Climate is a different duck again. They all share behavior.
Is predicting non-warming decades out not risky enough for you? Humbug.
@ Rob I. Ellison, John Reid
“Its strength as a theory of climate and Earth sciences is that it explains data that has all the appearance of abrupt transitions and multiple equilibria – and suggests at least decadal predictability.” [mwg bold]
“explains” carries the baggage of looking back, not forward; “suggests” shows little hope of going beyond the qualitative. One thinks of how catastrophe theory, bursting on the scene with promise, has languished so many years in purgatory. It explains things, it provides imagery, but it doesn’t quantify.
Also, meta-theory is an additional layer of abstraction, removing us even further from reality. Meta-theory is lovely on the mantle but out of place in the workshop.
At some time some folks have got to come up with a whole bunch of useful numbers–sometime. And there will always be risk to that endeavor. Folks just need to get used to that. To me John Reid has simply noted practical limitations on current expositions of chaotic systems and (noted) how stochastic approaches have demonstrated utility in that regard.
Now you can argue that is where the quantifying science ultimately has to go and that may be the case. But that is not the urgent question at this time and thus chaos is indeed a bit of a red herring.
Anyway that is my read.
Skippy – if you are referring to the Robert Ellison article predicting a non-warming decade, it would certainly have been a risky prediction had it been made prior to 1998, i.e. prior to the event in question. Ellison’s article dates from 2007, when the “Pause” had already been going for 9 years. Hardly a prediction. Ellison does make a prediction – he says it is going to be cooler over the next few decades. Given the Pause that is not particularly risky – an autoregression model would no doubt predict something similar.
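The autoregression point can be made concrete. The sketch below fits an AR(1) model to a purely synthetic “paused” series; the 0.4 level, the noise scale and the series length are all invented for illustration, not taken from any real temperature record. The fitted model’s forecast simply relaxes toward the recent mean, i.e. it predicts more of the same:

```python
import random

random.seed(0)

# Hypothetical 'paused' anomaly series: a flat 0.4 mean plus noise.
series = [0.4 + random.gauss(0.0, 0.1) for _ in range(100)]

# Least-squares fit of the AR(1) model x[t] = c + phi * x[t-1].
xs, ys = series[:-1], series[1:]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
phi = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
       / sum((x - mx) ** 2 for x in xs))
c = my - phi * mx

# Iterating the fitted model forward, the forecast relaxes to the
# long-run mean c / (1 - phi): 'more of the same'.
forecast = series[-1]
for _ in range(20):
    forecast = c + phi * forecast
long_run_mean = c / (1.0 - phi)
```

Because a stationary AR(1) forecast decays geometrically toward its long-run mean, predicting continued flat temperatures after nine flat years is exactly what the simplest statistical benchmark would do.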
There is a huge difference between making a prediction before the event and coming up with an ad hoc explanation after the event, which is what Slingo and Palmer appear to be doing.
It looks like we might be starting to be on the same wavelength. If we are going to use the oxymoron “non-deterministic chaos” we may as well use the accepted terminology, i.e. “random” or “stochastic”. There is a wealth of techniques for dealing with these concepts in statistics, physics and communications engineering. We don’t need to invoke the unnecessary complexity of Chaos Theory – it’s a furphy.
Thanks mwgrant, you are spot on. I particularly like your mantelpiece-workshop analogy. Science is, or should be, a workshop.
I’m a bit late to the comments, but once again a very enjoyable discussion by Tomas that makes a huge amount of sense. It doesn’t take much experience of real world numerical modelling to realise that the points Tomas makes are absolutely at the heart of the problems of climate modelling and should be the absolute number one priority of climate modellers to address. The problems Tomas describes are fundamental and current modelling attempts only tell us one useful thing – how much we don’t know. The belief that their outputs are some kind of indication of future trajectory of climate is jaw-droppingly bizarre.
I do get frustrated by absurd analogies, and they are appropriately skewered here. The aircraft analogy is absurd – aircraft are constrained by design to be built within the limits of what we understand. We cannot model, design or build aircraft outside of these highly constraining limits; for example the basic shape and construction of the wing has hardly changed at all. Unlike aircraft manufacturers, mother nature did not design the world and its climate with an eye to constraining the design conveniently to what we are capable of modelling. This ludicrous analogy is nearly as bad as the analogies drawn to the Lorenz system of equations. These are not climate; there are fundamental differences between climate and these systems, so results from the systems cannot be extrapolated to climate.
I am glad also that David Hagen above linked to the work of the Itia group at the National Technical University in Athens. Dr Koutsoyiannis is one of the few scientists doing work in this area. I would draw people’s attention to his invited talk at the EGU General Assembly a month ago in Vienna:
“Hydrology, Society, Change and Uncertainty” http://itia.ntua.gr/en/docinfo/1441/ (click through to “full text”)
There are several interesting slides: ones on “future-telling industries” from the Oracle of Delphi to modern climate models, and an eye-opening slide on how poor climate models are at modelling global rainfall (slide #13). I would add that it is a great credit that the hydrological community will permit such views to be aired without vilifying the author.
” I would add that it is a great credit that the hydrological community will permit such views to be aired without vilifying the author.”
Indeed. Researching and reading work in stochastic hydrology has always been a pleasurable and productive experience because of the open nature of the community. Growing pains in the 1970’s and 1980’s somehow were handled differently, and over the years stochastic methods found their way into application. I suspect a number of factors boded well for the development of a healthy community, including but not restricted to:
1.) there were at least two important real and immediate problem areas under active consideration by government agencies–environmental pollution and nuclear waste disposal–where significant interest and support were sustained;
2.) some extraordinary talent functional in all the niches of the community–academia to consulting–and obviously some enlightened views in government. Strong interaction/integration from academia and ‘practical’ consultants on projects–great for the projects and great for those growing into the science;
3.) the (geo)hydrology problems while knotty are more tractable conceptually and in implementation than climate;
4.) real-world problem sets, i.e., sites, were ‘numerous’ (at least compared to one site only, earth). This afforded numerous and varied opportunities to apply and improve methodologies;
5.) Clearly model verification is an issue in hydrology, but contaminated sites (as an example) are events in progress that offer at least some partial insight into behavior over time;
6.) often substantive quality assurance is put into place although even that does not provide guarantees;
7.) development of formal approaches for integration into decision making started early;
8.) problems are local or at most national;
9.) decades of conceptual and practical experience before stochastic roots began to grow in.
10.) No blogs.
You missed the essential role of hydrology in investigation and design of irrigation and hydro-electric schemes. The analysis of probability distributions of water inflows over various periods (years, decades, etc.) to dams, probability of extreme floods, the maximum probable flood, etc., was being done in the 1950’s and probably earlier. I understand probabilistic safety assessment for nuclear power built on this earlier work.
Thanks for the important addition Peter… subsurface geohydrology is my comfort zone, so I tend to think in those terms. Certainly a lot has come in through the civil engineering portal. That origin certainly made adoption of stochastic and Bayesian tools much easier. Indeed hydrology departments and degrees came on the scene much later.
The other strong influence of course has been the mining engineers and geologists and chemical engineering. As many have pointed out, these endeavors are kept honest because of the monies involved–folks are playing for keeps.
Finally, perhaps relevant to this post by Tomas, both deterministic and stochastic tools are understood to have utility in hydrology; and relevant to Judith’s focus on uncertainty, hydrology has developed a pretty good track record of innovation regarding uncertainty.
[repost with corrected format]
I hadn’t previously realised how closely our interests are aligned. Yours are in geohydrology and I’ve been involved in many hydrogeological studies in Canada and other countries around the world (piezometric profiles down to 1000 m depth in the Canadian Shield and Canadian Rockies, large-scale pump tests and tracer testing).
Hydrology has been a part of civil engineering for a long time.
“Although hydrology has existed as a focused, quantitative discipline addressing water storage and flux for almost a century …”
“History of hydrogeology” by the American Geophysical Union
The Snowy Mountains Scheme (3 video clips from 1952)
Sir William Hudson: excerpts:
Read the 2000 word biography: http://adb.anu.edu.au/biography/hudson-sir-william-10563
Sir William Hudson led the Snowy Mountains project, which was well ahead of its time, and many of the things he implemented are now accepted as standard practice half a century later. The hydrology lab in Cooma was one of the best in the world at the time. Other areas of leadership: rock mechanics, engineering geology, hydrology, biology (soil erosion protection, flora regeneration), industrial relations, safety, compulsory use of seat belts, family support, housing, relations with politicians, public relations and much more.
Back to the main point. He was interested in hydrology in 1920’s and it was a major factor in his career and his enormous success in leading and completing the Snowy Mountains Scheme to meet and exceed all reasonable expectations (other than the usual whinges from the usual suspects).
See the 2000 word biography linked above.
“Hydrology has been a part of civil engineering for a long time.”
Yes, and still is at many universities. Separate hydrology programs and degrees are more recent in the US. Another root is soil science and agronomy departments. I moved into contaminant geohydrology largely with more of a chemical engineering perspective. Looking back I am happy to have landed there at the time I did, but I do not think I would want to be there now ;O). Too much plug and chug–and political agendas, and too little roll your own.
Thank you for your comments. I know what you mean and agree.
I appreciate your very informative contributions. The chemical engineers, civil, hydrogeologists and other disciplines are all needed for aquifer management and for contamination studies. At one stage I was involved in the bid for the clean up of the industrial contamination of the Homebush site in Sydney where the 2000 Olympic sports facilities and Olympic Village were to be built. The chemical engineers had a major input. Very interesting. We didn’t win the bid, so if someone gets sick – it wasn’t my fault :)
Willful ignorance by Robert I Ellison, FOMD-link to computational science.
In the same vein as Robert I Ellison …
Judith Curry dubiously cherry-picks Words of wisdom from Ed Lorenz
The GWPF’s astro-turfers uncritically parrot Judith’s Words of wisdom from Ed Lorenz
The American Institute of Physics contributes some basic science, historical context, and mathematical common-sense:
Conclusion: Five decades after Ed Lorenz’s 1967 work, climate-science has moved on from Robert I Ellison’s ignorance, Judith Curry’s cherry-picking, and the GWPF’s astro-turfing campaigns.
That’s clear to *ALL* mathematicians, scientists, and thoughtful citizens, eh Climate Etc readers?
Just about any decent computer model predicted a “pause”. R.I.P.
Robert I Ellison condemns “The sort of fool who can mention a boiling pot and ‘Rayleigh-Taylor dynamical instability’ in the same sentence.”
Pretentious and quite astonishingly misguided nonsense from FOMBS – met by Richard Feynman’s down-to-Earth distaste for pretentious and misguided nonsense.
‘What defines a climate change as abrupt? Technically, an abrupt climate change occurs when the climate system is forced to cross some threshold, triggering a transition to a new state at a rate determined by the climate system itself and faster than the cause. Chaotic processes in the climate system may allow the cause of such an abrupt climate change to be undetectably small.’ NAS
‘The climate system has jumped from one mode of operation to another in the past. We are trying to understand how the earth’s climate system is engineered, so we can understand what it takes to trigger mode switches. Until we do, we cannot make good predictions about future climate change… Over the last several hundred thousand years, climate change has come mainly in discrete jumps that appear to be related to changes in the mode of thermohaline circulation.’ Wally Broecker
‘Lorenz was able to show that even for a simple set of nonlinear equations (1.1), the evolution of the solution could be changed by minute perturbations to the initial conditions, in other words, beyond a certain forecast lead time, there is no longer a single, deterministic solution and hence all forecasts must be treated as probabilistic. The fractionally dimensioned space occupied by the trajectories of the solutions of these nonlinear equations became known as the Lorenz attractor (figure 1), which suggests that nonlinear systems, such as the atmosphere, may exhibit regime-like structures that are, although fully deterministic, subject to abrupt and seemingly random change.’ Julia Slingo and Tim Palmer
Science has indeed moved on and left a FOMBS fulminating futilely in its wake.
a fan of *MORE* discourse: Five decades after Ed Lorenz’ 1967 work, climate-science has moved on from Robert I Ellison’s ignorance, Judith Curry’s cherry-picking, and the GWPF’s astro-turfing campaigns.
The science has moved on, but there are still no accurate models, and the climate system is still chaotic. A clear presentation of a subset of the climate is Henk Dijkstra’s book “Nonlinear Climate Dynamics”. I should mention that Robert I Ellison recommended to me an earlier book by Henk Dijkstra called “Nonlinear Physical Oceanography”, a clear demonstration that fomd’s characterization of Ellison’s “ignorance” is an unsubstantiated slur.
Thanks to Tomas Milanovic for this very thorough and informed article. This is essentially what IPCC said in FAR before forgetting the problem and spending the next quarter century trying to do the impossible.
It’s a problem, explaining that not only are their models failing, but that they may never work. If they are too direct and honest, it may lead to direct unemployment.
… and that they knew and recognised this 25 years ago.
This caveat was missing from SAR, TAR, AR4 and AR5, it seems.
ENSO is not chaotic but follows a deterministic pattern. It is pretty simple actually.
Well don’t leave us hanging. I’m interested, where is it?
“ENSO is not chaotic but follows a deterministic pattern.”
Perhaps you are using the term chaotic differently, but chaotic systems are quite deterministic, as opposed to some random walk. They are nonlinear and far from predictable, however: even if you know every single component, given almost but not exactly the same initial conditions, the system will never evolve the same way twice. This was one of Lorenz’s great contributions.
The paper on Lorenz modes was interesting; it looked a lot like ice ages.
Here is a good reminder that climatic effects are regional. Much has been made of drought in the US, but globally, there is no significant increase.
“Recruiting a sample of Americans via the internet, they polled participants on a set of contentious US policy issues, such as imposing sanctions on Iran, healthcare and approaches to carbon emissions. One group was asked to give their opinion and then provide reasons for why they held that view. This group got the opportunity to put their side of the issue, in the same way anyone in an argument or debate has a chance to argue their case.
Those in the second group did something subtly different. Rather than provide reasons, they were asked to explain how the policy they were advocating would work. They were asked to trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have.
The results were clear. People who provided reasons remained as convinced of their positions as they had been before the experiment. Those who were asked to provide explanations softened their views, and reported a correspondingly larger drop in how they rated their understanding of the issues. People who had previously been strongly for or against carbon emissions trading, for example, tended to become more moderate – ranking themselves as less certain in their support or opposition to the policy.
I prefer the opposite approach. Ask someone to explain just as thoroughly the position they oppose. I have attempted that several times here at Climate Etc., and not a single warmist taker yet.
But the principle is the same. If you force someone to actually think about a proposition in detail, it undermines the default beliefs they bring to the topic.
(Note, the way even the second phase was phrased – ” trace, step by step, from start to finish, the causal path from the policy to the effects it was supposed to have” – is from a progressive perspective. That is not how conservative policy operates, in a free market. Many things, climate and economics included, do not lend themselves to tracing policy to specific effects. It’s just more complicated than that.)
==> “Ask someone to explain just as thoroughly the position they oppose. ”
Which misses the key point.
For example, you “explain…thoroughly the position [you] oppose” rather frequently in your comments. The problem is that your explanations are inaccurate, caricatures, fantasies. For your “principled” approach to be of any value, it has to be in good faith. Your “explanation” of the positions you oppose has to be a match for their actual positions, and not your contrivance of their positions.
Give it a shot. I challenge you. If you think this is a valuable approach – put yourself on the line. Explain a position that you oppose in good faith.
I extend that challenge knowing full well that the chances you might be able to take it on are minuscule. Why? Because you are on here regularly explaining how those who hold positions you oppose are intellectually and morally inferior to yourself. How could you extend good faith to people you judge in such a fashion (which is, of course, the vast majority of Americans you brand as “progressive” – which includes all Dems and virtually all Repubs), or even to their positions?
But prove me wrong. It’s my challenge to you. Go for it.
I don’t debate liars like yourself. It is a waste of time, as you have proved over and over again, including here. I have done exactly that in the past. Go nip at someone else’s ankles.
==> “I don’t debate liars like yourself.”
You’re ducking. It is quite evident that you construct a world where those who disagree with you are morally and intellectually inferior. That kind of elitism oozes from many of your comments. I have pointed this out to you on quite a few occasions. I will continue to do so. That isn’t a comment on you, personally. For all I know you are as egalitarian as they come. It is a critique of your arguments.
You don’t need to debate me, anyway, to take on my challenge. All you need to do is make a good faith explanation of the positions you oppose, in such a way that those who hold those opinions would assent that your explanation is accurate.
It would have nothing to do with debating me.
It would have to do with you demonstrating the principle that you spoke of.
Equating my challenge with debating me is a duck.
Don’t debate me. I’m not challenging you to debate with me. I’m not asking you to debate with me.
Prove that you can follow through on what you think should be a fair expectation of others. Be accountable.
My last response to you. (It was clearly a mistake to respond to you the first time.)
I have done so. Don’t be so dishonest, or so lazy. If you want to accuse me of something, actually find out if what you are saying is true.
I will not go back through the last two years of posts here to find examples. I feel not the slightest need to defend myself against your lies. Which is why I have ignored them for so long.
However, I know that you are incapable of critical analysis, so I’ll make you a deal.
You pick a climate science subject of debate. Your choice. I will then post what I believe to be the consensus position, as fairly and objectively as a warmist would.
On the condition that you agree to do the same for the skeptic position.
I will even go first.
But when you either refuse to reciprocate, or demonstrate your inability to do so, then you agree not to respond to any further comments of mine here in the future. It’s a small price to pay to prove your intellectual superiority.
Oh, and no googling, copying or pasting. The question is not whether you can find a skeptical article on a subject, but whether you are capable of engaging in critical thought based on what you already know.
Your choice, but if you decline, this is my last response to you. You are a serious waste of time on this site. It is Climate Etc., not Joshua Etc.
JimD : “climatereason doubts that Pinatubo caused any cooling, and that is up to him. Lots of publications have said it did, and none have said it didn’t, so I just go by that.”
And what do you suppose the chances of getting something like that published are?
There is good reason to suspect that the cooling is only temporary. Note also that in the tropics the degree-day integral (i.e. total energy) is actually restored:
I’m in the process of writing this up but you can also see that there is a 2W/m^2 INCREASE in SW entering the troposphere after Pinatubo has settled:
Now that extra 2W/m2 is not going to be cooling anything down!
I agree the cooling from volcanoes is temporary. Meanwhile the GHGs build up, so what you get after the dust clears is the pent-up warming that looks like an acceleration. This happened post-Pinatubo.
“There is good reason to suspect that the cooling (from volcanoes) is only temporary.”
Actually better reason to suspect it is not:
Remembering that ocean heat content dictates tropospheric heat, or said another way, tropospheric heat follows ocean to atmosphere sensible and latent heat fluxes, this article on ocean heat content and volcanoes is worth a read:
The effects of a large volcano can linger in the climate system for far longer than the initial period of tropospheric cooling because of the effects on ocean heat content, which also impacts sea ice, which would reinforce the initial ocean heat content decline.
No Jim, that’s false attribution. It’s not the GHG “build up” on any timescale that is reflecting or not reflecting SW.
Look at my graph. Both stratospheric temp and reflected SW drop and stay low. This is an after effect of volcanism, and cannot be attributed to CO2 & co.
As for “peruse” why don’t you peruse and link me something that backs up what you say. There is nothing that can reasonably be attributed to volcanoes. If you think it’s otherwise : peruse and show.
Indeed volcanoes and aerosols in general can delay, but not prevent, the inevitable net warming from increased GHGs.
I say that because I’m actively seeking such evidence to assess climate sensitivity. I find nothing that can credibly be correlated to the radiative forcings I have.
OK, try this site. It was the top link in my search for the effect of volcanoes on global temperature. There are famous episodes (year without summer, etc.) which would make the cooling effect hard to deny.
Here’s the bit NASA carefully avoid showing
Look at El Chichon: it’s a perfect anti-correlation, and if you look at Mt P, the correlation is mostly spurious. There was variability of the same magnitude in the 10 years prior, and it probably would have gone down anyway.
There is no way to justify that attribution in the context of that kind of variability. That’s why they carefully cropped out the context and avoid talking about El Chichon.
(BTW the pink line is usual volcanic forcing, the red line is something else I’m doing that I won’t bother going into in detail here.)
climategrog, as my link mentions, there was a large El Nino about the time of El Chichon, and both had opposite effects. You can have two big things canceling occasionally. There has been a lot published on detecting Pinatubo’s cooling. It should not be hard to find for yourself.
“Jim D | May 24, 2014 at 12:22 pm |
Indeed volcanoes and aerosols in general can delay, but not prevent, the inevitable net warming from increased GHGs.”
It is a matter of time frames for various forcings plus time frames for internal variability. The long-term forcing from increased GH gases eventually outlasts aerosols and internal variability. It is only the noise from shorter-term forcings and internal variability that allows fake-skeptics to have any room for their faux-skepticism related to AGW.
climategrog | May 24, 2014 at 11:42 am says:
JimD : “climatereason doubts that Pinatubo caused any cooling, and that is up to him. Lots of publications have said it did, and none have said it didn’t, so I just go by that.” And what do you suppose the chances of getting something like that published are?
This guy climatereason is right – there was no cooling from Pinatubo. What is assigned to it as volcanic cooling is nothing more than a normal La Nina cooling that by chance happened to be where they expected to find Pinatubo cooling. There is no such thing as a volcanic cooling, as I said in my book “What Warming?” in 2010. What happens is that the volcanic gases ascend directly into the stratosphere, which they warm at first. In a couple of years cooling follows, but it never reaches the troposphere.

All so-called “volcanic” coolings are nothing more than La Ninas falsely identified as volcanic cooling because they accidentally were in the right place when the eruption happened. There are also volcanoes that have no associated cooling at all, because the time of the expected cooling coincides with an El Nino peak. This makes me laugh, because Self, who originally assigned the 1992/93 La Nina to Pinatubo cooling, was convinced that Pinatubo had created it by suppressing a combination of an El Nino that was originally in that spot and greenhouse warming that was also working against it. While Pinatubo was lucky to find a La Nina for its cooling, El Chichon was not so lucky, because it was followed by an El Nino peak that took over the spot where its cooling should have been. In addition, there can also be intermediate situations where a small temperature dip is observed. This is because the ENSO oscillations are independent of volcanism, and chance determines what part of the ENSO system occupies the expected volcanic cooling site.

So far the so-called “experts” who deal with volcanic cooling simply have paid no attention to what I have written, either because of stupidity, laziness, or arrogance. As a result, even climate models have code built into them for locating imaginary volcanic coolings wherever they might be, and whether or not they exist.
Find a tropical temperature dataset, air or SST, that shows Mt P caused cooling. Then roll back and have a look at El Chichon 10 years earlier.
climatereason inadvertently posted a link above with a lot of graphic evidence. You can peruse that.
There was nothing ‘inadvertent’ about it, as that was the correct place in the nest. Here is the graphic material again. Things started to cool off BEFORE the volcano erupted.
We have another famous example with the 1257/8 super volcano eruption, which caused Dr Mann to write an explanatory piece as to why the cooling wasn’t picked up in tree rings. Again, the problem is that the climate had deteriorated BEFORE the 1257/8 volcanic eruption, and where there was an effect (i.e. Laki) temperatures recovered quickly after.
According to Dr Mann and Dr Miller (the moss man) the 1257/8 volcano and ones that followed soon after, were the cause of the LIA. That is not what the observational evidence tells us at all.
That is not to say that optical depth or super volcanos have NO effect whatsoever, but from the observational evidence it is generally short term and depending where it occurred, the effect ranges from substantial to virtually nothing. It also seems to warm up the Northern Hemisphere winters.
I understand that the current pause is being partially blamed on aerosols and I guess that to support the narrative, minimal optical changes in the past are needed to explain the current pause.
climatereason, you might also want to comment on the 1816 “year without a summer” episode mentioned in what I linked above in connection to your last comment.
The discussion seems to have moved down here so I haven’t seen your comment about 1816. Please link to your comment.
We don’t have any proper data for those historic volcanoes.
Try concentrating on the ones where we had satellites beamed in on what was happening and detailed sea and atm temp records.
Sorry, my comment escaped too early.
I didn’t see your original comment about the year without a summer. Please link to it again.
1816 was undoubtedly cold. However, it is yet another example of the climate already having deteriorated a decade earlier. It then improved immediately after.
There are various examples through history of ‘years without a summer.’
Perhaps volcanos temporarily make an existing cool climate worse but it then seems to be the catalyst for it to improve again? Seems unlikely but IF winters do warm as Nasa states perhaps there is something going on that we don’t understand. Not for the first time.
I only go by observational evidence not models.
these historic volcanos are well documented. We know their effect from the crop records and other data, where for example the Church had to give poor relief.
If crops failed that was catastrophic for people so I think the evidence is as good-in its own way- as the very recent Satellite data.
climatereason, here is the link.
It is mentioned among other evidence. Tambora in 1815 was connected.
Here’s a clearer look at TLS; note the way after each eruption there is a step drop in temp, and then it stays level.
There are some hand-waving explanations about gradually mounting GHGs stopping heat getting OUT to the stratosphere, causing the cooling.
My earlier plot of reflected SW shows this explanation to be false. It’s a step change caused by the volcano, not a steady climb. Also, if the data is low-pass filtered (as it effectively is when taking a near-global average TLS), the forms of the two curves match almost perfectly.
Now if volcanoes don’t cause a permanent cooling offset in surface temps, there’s no need to pump up the CO2 forcing. In fact much of “CO2 forcing” is provided by the +2W/m^2 caused by the volcanoes.
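The low-pass filtering point can be sketched with synthetic data (a hedged illustration using made-up series, not the actual TLS or SW records): a smoothed step still reads as a level shift, while a steady climb keeps its slope throughout.

```python
import numpy as np

def moving_average(x, w):
    """Crude low-pass filter: centered moving average of width w."""
    return np.convolve(x, np.ones(w) / w, mode="same")

t = np.arange(400)
step = np.where(t < 200, 0.0, 1.0)   # abrupt, volcano-style level shift
ramp = t / 400.0                     # steady, GHG-style climb

step_f = moving_average(step, 25)
ramp_f = moving_average(ramp, 25)

# The filtered step is still flat on both sides of the transition,
# while the filtered ramp keeps a constant slope in the interior:
print(step_f[50], step_f[350])         # ~0.0 and ~1.0
print(ramp_f[150] - ramp_f[100])       # interior slope preserved
```

Even after heavy smoothing, the two shapes remain distinguishable, which is the basis of the step-versus-climb distinction above.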
Basaltic Mt Laki wouldn’t be the biggest bang in recent centuries, but it was likely the most polluting with the strongest, longest effect on atmosphere. Yet less than a decade later (related Grimsvotn was still erupting in 1785) the new settlement of Sydney experienced the ferocious heat of the 1790s El Nino conditions which also brought catastrophic drought to India. Shortly after Laki it got very, very hot here in Oz, okay?
Since Decade Volcanoes etc are underfunded and climate “experts” have such a powerful need to look away from what lies beneath, volcanism is merely an ace-up-sleeve in case some alarmist wants to explain something away, like an LIA or some such party-pooping event. Volcanism is there just to maintain the aerosol narrative which Tony mentions.
The one likely cause of short-term and short-lasting climate disaster just doesn’t rate the billions. Yet those creaking whirlygigs and grimy solar panels could look mighty out of place for a year or two if another Laki occurred. And who is going to assure us we won’t have something as powerful as Tambora and as dirty as Laki? Humanity would come through it, but we’d certainly have a problem or two with climate.
Surprisingly – or maybe not? – such marked climate change is of little interest to those who peddle climate change most vigorously. Which seems odd. Maybe the sheer horror of not being able to make jet trails to climate conferences for a year is too much to contemplate.
“these historic volcanos are well documented.”
Yes they had an effect. I don’t call that data. It’s historical evidence that is to be considered but it’s not data.
Jim, thanks for the link. A good resume of the orthodox position.
My stack plots show volcanoes do have some immediate effect even in the tropics, more so outside the tropics. The question is what happens from a couple of years out.
The stratosphere and ERBE SW data seem pretty clear. The stratosphere becomes less opaque than it was before the eruption, resulting in less reflection and more SW energy making it into the lower climate system.
There is a strong possibility that this will get falsely attributed to GHG unless it is recognised and counted. I don’t see any evidence that is the case.
Jim’s link does say a bit about warmer winters and cooler summers, but does not make any mention of the extra 2W/m2 getting through to the troposphere. Am I really the first person to notice this?!
I doubt it but I don’t find any mention.
Here are two extracts from the Exeter Cathedral records I researched last year
‘1740 January ‘£23 to be given to poor in consideration of the severity of the season.’
1783 ‘Extra poor relief in extreme cold’ (due to Iceland volcano?)
Such records are good indicators of severe events. The 1740 reference was to one of the coldest periods on record, and it followed immediately after the 1730s, a decade that was only fractionally lower in temperature than Britain’s warmest-ever decade, which finished at the turn of the 20th century. It caused Phil Jones to query whether science underestimated natural variability.
I am surprised that there are people who don’t think volcanoes do any cooling, but I guess I shouldn’t be. How does this jibe with the Milanovic position that no amount of forcing does anything that looks linear: warming or cooling, or do you disagree with that?
Yes, it’s historical ‘evidence’ rather than ‘data.’ However, the data, which reaches back only to the start of the satellite era, is so short (and not necessarily typical) that I am not sure what sort of scientifically valid conclusions could be drawn from it.
‘I am surprised that there are people who don’t think volcanoes do any cooling, but I guess I shouldn’t be.’
I have said nothing of the sort. We have evidence of volcanic induced cooling but it appears to be short term (when it occurs) and in the winter, according to Nasa, there can be warming.
Interesting about Essex, tonyb. I hope you keep beavering away with those old records.
The CET remembers 1739-40 as one of the big cold events of the record. There was a big cold around Laki, but also heat and drought years shortly after. Certainly no mini ice-age, and Laki-Grimsvotn was an all-time whopper for atmospheric pollution.
Interesting that there was extreme cold before as well as concurrent with Tambora, the biggest bang, though not as messy as Laki. Also interesting that such cold coincided with a well-observed Arctic opening.
If Laki and Tambora explain anything about climate long term then it escapes me. Mind you, the masses of hot plasticky substance which underlie earth and ocean may yet explain a thing or two. When our experts have a moment.
“Yes,, its historical ‘evidence’ rather than ‘data.’ However the data which reaches back only to the start of the satellite era is so short (and not necessarily typical) that I am not sure what sort of scientifically valid conclusions could be drawn from it.”
I’m not suggesting that this can automatically be applied to all volcanoes, but this is the most detailed information we have and is orders of magnitude more use than ecclesiastic anecdotes about helping the poor.
It may be that one of the effects of the last two major events was that they wiped all sorts of industrial muck out of the stratosphere. Who knows?
However, most of the panic and hysteria is based on the latter part of the 20th c. and that’s where the top quality data lies. We’d be wise to wring everything we can out of it.
There was an extra 2W/m2 _averaged_ right across 20S-20N; then there was the 98 El Nino that regurgitated enough heat to warm the planet 0.1K.
Maybe there’s a link.
“According to Dr Mann and Dr Miller (the moss man) the 1257/8 volcano and ones that followed soon after, were the cause of the LIA. ”
No, this is not what they say. But the 1257 volcano, and the general 50 year period of increased volcanic activity around that time, had a major impact on ocean heat content, reversing the trend of rising heat content during the relatively low volcanic activity period of the MWP.
But as large as it was, the 1257 volcano was a SH volcano, with stronger effects on the ocean heat content of the IPWP. The mega-volcano of 1453, equally as large, if not larger, was a NH event, with major impact on the OHC in the Atlantic. Together, these two volcanoes were the largest of the past 1500 years or so, and certainly made a contribution to the cooling of the LIA, when one considers that via decreased OHC, volcanic aerosol cooling can linger far past the initial cooling. But neither volcano explains the full cooling of the period, especially in the NH. One needs to consider the decreased solar activity as well.
Here is the direct quote from GIff Miller confirming those volcanoes precipitated the LIA
‘This is the first time anyone has clearly identified the specific onset of the cold times marking the start of the Little Ice Age,’ says lead author Gifford Miller of the University of Colorado at Boulder.’
Its ok, apology accepted
‘The eruptions could have triggered a chain reaction, affecting sea ice and ocean currents in a way that lowered temperatures for centuries.’
You just take your pick these days, don’t you? That unsightly downward bend in the hockey stick is actually there – maybe. If it is there – and they’re not saying it is, they’re just saying there’s a North Atlantic signal, not just a local one! – it was triggered by volcanos. Whew. Off the hook again. Silly skeptics with their ice fairs and Breughel scenes!
The team used the Community Climate System Model to work it all out. Just so you know. Who argues with a CCSM?
‘The eruptions could have triggered a chain reaction, affecting sea ice and ocean currents in a way that lowered temperatures for centuries.’
More “tipping point” mentality applied to a system that is obviously inherently dominated by negative feedbacks.
The only real evidence of “tipping points” or “chain reactions” which are caused by what (real) scientists and engineers call _positive_ feedbacks is the glaciation / deglaciation transitions.
This does seem to lend itself to the Lorenz attractor type of model.
Exeter Cathdral records . Thx tony b fer yer research…
1740. ‘ Extra $23 to be given to the poor in consideration of
the scarcity of the season,’
That’s weather data, more reliable than some cherry-picked
tree-ring-er-data selected in the context of an ideological or
Aca-dem-ic not ‘dam’ damn it!
Hmm …keep on makin; these typo;s maybe freudian slips,
maybe bad lighting, eye-sight, maybe carelessness?
Came across this at the Bishop’s, Dr. Curry’s interview at Quadrant.
“TONY THOMAS: If the skeptic/orthodox spectrum is a range from 1 (intense skeptic) to 10 (intensely IPCC orthodox), where on the scale would you put yourself
(a) as at 2009
(b) as at 2014,
and why has there been a shift (if any)?
JUDITH CURRY: In early 2009, I would have rated myself as 7; at this point I would rate myself as a 3. Climategate and the weak response of the IPCC and other scientists triggered a massive re-examination of my support of the IPCC, and made me look at the science much more sceptically.”
A 3. That’s barely lukewarmerish.
“Climategate and the weak response of the IPCC and other scientists triggered a massive re-examination…”
Odd, but I thought only actual data should change someone’s skeptical leanings.
Climategate caused a re-examination of the “data” as presented by the consensus. When you base your “science” on consensus and appeal to authority, evidence that that authority is corrupted should cause anyone to reconsider “one’s skeptical leanings.”
Those who live by appeal to authority, die by the death of that authority.
It has been interesting to watch as some people’s perspectives have changed over the past several years. It is also amusing to watch how others continue to hold their positions despite the failing foundation for those positions.
The data showing a continued net accumulation of energy in the climate system has only gotten stronger over the past few years, and this has nothing to do with the IPCC nor “Climategate”. Those seem like convenient excuses for what would then be actually emotional, rather than rational reasons to alter your skeptical stance.
“The data showing a continued net accumulation of energy in the climate system…” are sparse, inaccurate and worse than the surface temp reports.
Climategate just undermined the appeal to authority because it showed so many of those supposed authorities were dishonest, vain manipulators of data and the publishing process. Not convenience, just fact.
Ye Shall Know Them By Their Fruits
• Judith Curry boldly predicts (21 May, 2014) “Solar effects, combined with the large scale ocean-circulation regimes, presage continued stagnation in global temperatures for the next two decades.”
• James Hansen boldly predicts (21 January 2014) “Record global temperature is likely in the near term. However, the rate of future warming will depend upon changes of the tropospheric aerosol forcing, which is highly uncertain and unmeasured.”
A main difference is James Hansen’s public call for more-and-better scientific data (whereas Judith Curry to date has issued no comparable call):
Conclusion In regard to near-term stagnation versus heating, Judith Curry and James Hansen cannot both be right.
Rational Advice Judith Curry, you are missing an outstanding opportunity to cloak yourself with GLORY, by joining James Hansen in consistent rational sustained scientific advocacy for better climate-system observational data!
Climate forcing results in an imbalance in the TOA radiation budget that has direct implications for global climate, but the large natural variability in the Earth’s radiation budget due to fluctuations in atmospheric and ocean dynamics complicates this picture. http://meteora.ucsd.edu/~jnorris/reprints/Loeb_et_al_ISSI_Surv_Geophys_2012.pdf
SW – and IR – radiant flux anomalies are measured with great precision. SW integrates a number of factors including sulphates.
It shows that reflected SW decreased last decade – despite any mooted changes in sulphates – while IR emissions were trendless. In the latest available data there is no trend in SW, IR or, naturally, net TOA radiant flux.
This is the data that Hansen rejects – a bit like John West – because if it is true his whole career and legacy collapse in a heap along with the faith of acolytes like FOMBS.
The reality is that the current state of ocean and atmosphere is a cool mode and these persist for 20 to 40 years. Frankly – I have great difficulty with the idea that the next climate shift is necessarily to a warmer mode and not to yet cooler.
This is apparently related to cloud radiative effects.
Science moves on, leaving FOMBS fulminating in its wake.
Fan, I know you’re a tool for Hansen, but using just one statement by Dr. Curry is not only biased but dishonest! Here is just one statement she recently made at her Senate hearing:
Not only is more research needed to clarify the sensitivity of climate to carbon dioxide and to understand the limitations of climate models, but more research is needed to understand solar variability, sun-climate connections, natural internal climate variability and the climate dynamics of extreme weather events. Improved understanding of these aspects of climate variability is needed to help government officials, communities, and businesses better understand and manage the risks associated with climate change.
This is just one example, I’m sure there are plenty more given she often speaks of the uncertainty.
I believe you owe Dr Curry and the folks here at Climate Etc an apology for your purposeful and blatant misrepresentation.
I’m sure, given her attitude towards more and better research, she would be very supportive of more satellite data, and by your statement there was no explicit call by Dr Hansen to replace that mission.
Oh, Fan’s great at apologizing, for the brain dead, self destructive policy recommendations of the alarmists.
JimD: ” How does this jibe with the Milanovic position that no amount of forcing does anything that looks linear:”
You should pay more attention to what is said and not said. I suggest you try quoting people rather than paraphrasing what you thought they meant or said.
Look out for the word “predictable”.
h/t to Tomas Milanovic and Judith Curry!!!
Best article to date covering the obvious complexity!
Having built a successful “low dimensional deterministic toy model” (to satisfy my own curiosity) of the kind that Tomas Milanovic proves to be impossible, I have some technical comments about the strengths and limitations of his argument.
(1) Attractors. Non-linear systems tend to have multiple attractors, so they can flip from one state to another. But this fact does not constrain how disparate the states may be. One might imagine that recent preindustrial climate conditions have several attractors which differ by a few tenths of a degree; for practical purposes we may consider this to be a single initial state which has been disturbed by anthropogenic factors. Eventually these will drive the system by a few degrees, much larger than the differences between the natural states. So something which can be proved to be false (there is a single equilibrium state) may still be a good working approximation.
(2) Linearity. If f(t) is the anthropogenic forcing function (possibly a vector of forcings), consider the response to a forcing x*f(t) where x is some positive multiplier. The response (which may again be a vector, and may involve averaging – for example to give annual global temperature anomalies) is a function of parameters including x. By Taylor’s Theorem, as x tends to zero the response must be proportional to x, even if the system is highly non-linear. (Exceptions are where the derivatives are zero, infinite, or indeterminate; but these special cases are vanishingly rare.) As x increases, the response ceases to be linear and may become arbitrarily complicated. What about when x=1? The linear approximation is certainly false, which is Milanovic’s point, but whether it is “good enough” is an empirical question. Willis Eschenbach found that climate models behave linearly as various parameters change (http://wattsupwiththat.com/2011/05/14/life-is-like-a-black-box-of-chocolates/). He took that as evidence that the models were broken, but it is more plausible that he had just rediscovered Taylor’s Theorem.
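The small-x linearity argument can be illustrated numerically with a toy nonlinear balance equation (all parameter values here are made up for illustration, not taken from any climate model): the equilibrium response divided by the forcing multiplier x approaches a constant as x shrinks, exactly as Taylor’s Theorem predicts, while at x = 1 the nonlinearity bites.

```python
def response(x, a=1.0, b=5.0, F=1.0, dt=0.01, steps=5000):
    """Steady-state T of the toy balance dT/dt = -a*T - b*T**3 + x*F,
    found by forward integration (hypothetical parameters)."""
    T = 0.0
    for _ in range(steps):
        T += dt * (-a * T - b * T**3 + x * F)
    return T

# As x -> 0 the ratio response/x tends to the linear answer F/a = 1,
# even though the system is cubic-nonlinear; at x = 1 it does not.
for x in (1.0, 0.1, 0.01):
    print(x, response(x) / x)
```

Whether the real system at x = 1 sits inside or outside its linear regime is, as the comment says, an empirical question the toy cannot answer.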
(3) Time scales matter. The solar system is an interacting many-body system and is known to be chaotic. Yet the planets continue to turn up where simple approximations predict, and Halley’s Comet comes back every 76 years without fail. We know exactly where the planets will be in 100 years, but we have no idea where they will be in 100 million years because on that time scale the chaos makes the solution non-computable. If climate is chaotic on some time scales, it may still be highly predictable on other time scales (such as those relevant for public policy).
Putting these thoughts together, it is possible that the climate system has chaotic attractors leading to widely different states over scales of 100,000 years (ice ages), moderately distinct states over millennia (medieval warm period), and distinct but only slightly different states over decades (PDO), and yet may still respond in a near-enough-to-linear fashion that simple models can isolate the effects of a small but persistent push (anthropogenic warming). Milanovic’s analysis of the mathematics proves that the system is chaotic, but that does not really answer the important practical question that Isaac Held raised. Of course, “it is possible that” is not at all the same as “it is true that”. Whether really simple models actually work well enough to be useful in the sense of answering at least some questions that the big models cannot, and if they do not work then how much more realism has to be added until they do, are empirical questions not mathematical ones.
You have included some very important points about climate and chaos in your essay. I would like to mention them in a paper I am preparing. Is there a paper in the scientific literature that I could reference? Also I would like to read your posts.
However, clicking on either link does not connect to any post.
Donald, delete the %5D from the links and you will get to Tomas’ posts. Alternatively, you can go to judithcurry.com, and search for ‘Milanovic’, which will point you to Tomas’ posts
Attention Tomas Milanovic:
We can use EOP, the law of large numbers, and the law of conservation of angular momentum to tune aggregation to detect attractors that will afford prediction of some statistical properties of climate to within the limits of multiaxial turbulence (including energy storage-form axes, not just spatial axes):
Reinterpreting ERSST EOFs 1-4
The Tsonis+ synchronization framework is lunisolar. (There’s more to report at a later date…)
Yes, that’s one thing I noted in the article. It states that Earth rotation and orbit are the ONLY strong cyclic influences that could cause phase locking.
That ignores luni-solar tidal forces. The lunar component is quite complex.
One factor in a synchronised oscillator model could be heat transport in and out of the tropics due to circa 9 and 18 year variations in tides. The timing of declination angle, perigee variations and eclipse cycles acts on the interdecadal time-scale that is of importance. It’s not just monthly tides.
So far it seems to have been ignored as “internal variability”.
Lunisolar is internal (to the Earth-Moon system), including QBO (and the event series dictated by its alignments with the terrestrial year — more details at some future date).
As an illustration for the usefulness of the Fourier method, may I point to our paper: Clim.Past, 9, 447-452, 2013
In the meantime the principal Fourier components (dominant cycles) are identified as the AMO/PDO and De Vries-Suess cycles
I’m sorry, but what you are doing in figure 6 there is just replicating the sample period as a projection. The beginning of the series drops, and this is reproduced at the beginning of the future projection.
You do no windowing of the data, so your major long components represent little but the general U shape of the sample and will simply reproduce this U end to end ad infinitum. I regret to say that the strong downward curve you present is a result of your processing and the length of the data.
Repeat your processing on the first half of the window and see if it looks like the latter half. It won’t; it will jump and then trend downwards like the beginning.
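The extrapolation pitfall is easy to reproduce: an un-windowed Fourier fit of a finite sample is exactly periodic, so “projecting” it forward can only repeat the sample end to end. A minimal sketch with a synthetic random-walk series (made-up data, not the HISTALP record):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 128
sample = np.cumsum(rng.standard_normal(n))   # a random-walk "temperature" series

# Represent the sample exactly by its Fourier components...
coeffs = np.fft.rfft(sample)
freqs = np.fft.rfftfreq(n)                   # cycles per sample step

def fourier_model(t):
    """Evaluate the full Fourier reconstruction at (possibly future) times t."""
    t = np.asarray(t, dtype=float)
    out = np.full(t.shape, coeffs[0].real / n)
    for k in range(1, len(coeffs)):
        scale = 1.0 if 2 * k == n else 2.0   # the Nyquist term is not doubled
        out += scale / n * (coeffs[k] * np.exp(2j * np.pi * freqs[k] * t)).real
    return out

# ...then "project" one window ahead: the projection is the sample all over again.
future = fourier_model(np.arange(n, 2 * n))
print(np.max(np.abs(future - sample)))  # ~0: the extrapolation is periodic
```

Any apparent downward (or upward) “forecast” from such a fit is just the shape of the fitted window coming around again.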
Also, the HISTALP data that you used (“homogenised”) has been heavily manipulated. Most of the long-term signal is “adjustment” rather than data. HISTALP refuses to release the original data or the data that is the basis of the corrections done by Bohm et al. So validation is impossible.
The HISTALP series are very long and may contain some valuable information, but until they are prepared to open up access to the data to allow third-party validation of the substantial modifications they are making, the data has no objective value.
If you have any influence in that organisation you may like to explain that blackbox corrections and obstructing validation are not acceptable scientifically.
Sadly this seems a rather prevalent attitude, especially in Europe.
Chief : regarding climate vs weather.
Weather is a well defined state of all relevant fields (velocity, pressure, temperature, density …) which evolves in time according to mostly Navier Stokes.
It is fundamentally only N-S because at the time scales of weather (hours, days), all slower systems may be considered constant. Then weather is a solution to N-S, and these solutions belong to the global attractor asymptotically, as has been shown by Foias and Temam.
Climate is defined as a time average of weather at scales that are vastly beyond the weather scales (e.g decades).
So knowing weather is a necessary and sufficient condition to know the climate, so that one may say that climate is (derived from) weather while weather most definitely is not (derived from) climate.
I must also make a comment about the “Boeing analogy” which has still been misunderstood by some.
Indeed as has been observed this is a question of scales.
CFD as used for aeronautics uses space scales that are 10^7 or more smaller than the significant scales for the structures seen on the picture.
As the computing time scales like d^3, it means that using CFD for the large scale structures would need a computing time 10^21 or more times longer than the computing time needed for computing the dynamics at the wing scale.
This is clearly impossible as everybody having used CFD perfectly knows.
Besides, if one wanted to go to climate scales with the same resolution, there are still 4 more orders of magnitude to go. The point was to mention that obviously there can be absolutely no analogy between the ability to do CFD at meter scales and the ability to predict the large 1000 km scales of atmospheric dynamics.
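The scaling arithmetic above can be checked in a few lines (taking the stated d^3 cost scaling and the 10^7 scale ratio at face value, as given in the comment):

```python
# Back-of-envelope check of the scaling claim:
# if the resolved scale ratio grows by a factor r and cost scales like r**3,
# a 10^7 increase in scale implies a 10^21 increase in computing time.
r = 10**7                  # ratio between wing-scale CFD and large-scale structures
cost_factor = r**3
print(cost_factor)         # 10**21

# Four more orders of magnitude to reach climate scales:
print((r * 10**4) ** 3)    # 10**33
```

Even allowing generous slack in the exponents, the gap is far beyond any conceivable growth in computing power.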
To some comments about linearity that inevitably appear.
The global attractor for N-S and for weather whose existence is firmly established defines the asymptotic dynamics of the system. It is the envelope of climate because all weather is necessarily contained within the attractor.
I have given links and quite extensively discussed how the attractor is transformed through averaging operators.
Taylor expansions are irrelevant because if a parameter or a forcing changes infinitesimally by dx, we are not interested in how the orbit changes within a time dt (which would be answered by a Taylor expansion).
We are interested in how the attractor changes, and for that we need very long times and not dt. And on long time scales a change dx doesn’t produce a topology change proportional to dx, i.e. it is not linear.
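The point that an infinitesimal parameter change dx need not produce a proportional change in the attractor can be illustrated with the simplest toy available, the logistic map (purely illustrative, nothing to do with the climate attractor itself): a change of 0.06 in the parameter across r = 3 doubles the attractor from one point to two, a topological change that no linear response can capture.

```python
def attractor_points(r, n_transient=2000, n_keep=64):
    """Distinct long-run states of the logistic map x -> r*x*(1-x)."""
    x = 0.4
    for _ in range(n_transient):       # discard the transient
        x = r * x * (1 - x)
    seen = set()
    for _ in range(n_keep):            # collect the asymptotic states
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

# dr = 0.06, yet the attractor doubles (a period-doubling bifurcation):
print(attractor_points(2.99))   # one fixed point
print(attractor_points(3.05))   # a two-point cycle
```

Between bifurcations the attractor does deform smoothly with r; it is at the bifurcation points that the linear picture breaks down entirely.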
I have to go to vote so will perhaps come back a bit later but a very interesting issue is the phase locking.
There are 2 very strong frequency lockings at the 1 day and 1 year frequencies which ensure that winters are different from summers. This is quite trivial and doesn’t extend to any other time scale, but one may mention that there are indeed many other periodical forces acting on the system.
The solar cycle is one. The precession is another.
Then there is a whole lot of gravitational tidal forces (Sun, Moon, Jupiter etc).
It is natural to ask how these other forces act on the attractor.
These forces are many orders of magnitude weaker than the dominating two during the time scales we consider (decades), so that they cannot significantly change the topology of the attractor and can be neglected.
However as we have seen in the point above, the “weakness” of a dx doesn’t of course imply the weakness of the response on large time scales.
So these other forcings are probably locking periods far beyond what we are talking about here.
Indeed the precession is for example supposed to lock ice age cycles but I do not think that anybody has observed or established a theory of frequency locking by the extremely weak gravitational (tidal) forces.
In any case it would probably not be observable during the small time scales where we have validated and reliable data.
You can’t be totally unaware of the literature on decadal climate shifts. The work by Tsonis is one of thousands.
I am not disputing that climate is an average of weather – merely saying that the means and variances of temperature, rainfall, winds and currents change with decadal climate shifts. Hence a stratified approach to climate, based on weather in the decades between transitions rather than on arbitrary periods of decades.
Kyle Swanson at realclimate for instance suggested that the relevant period for recent warming was 1979 to 1998. This avoids the times of noisy bifurcation (otherwise known as dragon-kings) – and focuses on climate as it persists in a specific basin of attraction.
Tsonis identified abrupt climate changes working through the El Niño Southern Oscillation, the Pacific Decadal Oscillation, the North Atlantic Oscillation, the Southern Annular Mode, the Arctic Oscillation, the Indian Ocean Dipole and other measures of ocean and atmospheric states. These measurements of sea surface temperature and atmospheric pressure over more than 100 years show evidence for abrupt change to new climate conditions that persist for up to a few decades before shifting again. Global rainfall and flood records likewise show evidence for abrupt shifts and regimes that persist for decades. In Australia: less frequent flooding from early last century to the mid 1940s, more frequent flooding to the late 1970s, and again a low rainfall regime to recent times.
Tsonis and colleagues used a mathematical network approach to analyse abrupt climate change on decadal timescales. Ocean and atmospheric indices – in this case the El Niño Southern Oscillation, the Pacific Decadal Oscillation, the North Atlantic Oscillation and the North Pacific Oscillation – can be thought of as chaotic oscillators that capture the major modes of climate variability. Tsonis and colleagues calculated the ‘distance’ between the indices. It was found that they would synchronise at certain times and then shift into a new state.
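The “distance”-between-indices idea can be sketched with synthetic indices (made-up series under a shared driver, not the actual Tsonis computation): when two noisy indices fall under a common influence, a correlation-based distance between them drops, which is one simple way to flag synchronization.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1200)

# Two synthetic "indices": independent noise in the first half,
# sharing a common oscillatory driver (synchronized) in the second half.
common = np.sin(2 * np.pi * t / 60)
a = rng.standard_normal(t.size)
b = rng.standard_normal(t.size)
a[600:] += 3 * common[600:]
b[600:] += 3 * common[600:]

def distance(x, y, lo, hi):
    """A simple synchronization 'distance': 1 - |correlation| over a window."""
    r = np.corrcoef(x[lo:hi], y[lo:hi])[0, 1]
    return 1 - abs(r)

print(distance(a, b, 0, 600))     # near 1: unsynchronized
print(distance(a, b, 600, 1200))  # well below 1: synchronized
```

Sliding such a window along real indices is one crude way to see the kind of episodic synchronization the network analysis formalizes.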
It is no coincidence that shifts in ocean and atmospheric indices occur at the same time as changes in the trajectory of global surface temperature. Our ‘interest is to understand – first the natural variability of climate – and then take it from there. So we were very excited when we realized a lot of changes in the past century from warmer to cooler and then back to warmer were all natural,’ Tsonis said.
Four multi-decadal climate shifts were identified in the last century – coinciding with changes in the surface temperature trajectory. Warming from 1909 to the mid 1940’s, cooling to the late 1970’s, warming to 1998 and declining since. The shifts are punctuated by extreme El Niño Southern Oscillation events. Fluctuations between La Niña and El Niño peak at these times and climate then settles into a damped oscillation. Until the next critical climate threshold – due perhaps in a decade to three if the long proxy records are any indication. Nor can we be confident that the next shift will be to yet warmer conditions. A reversion to the mean is much more likely.
Tomas: “We are interested how the attractor changes and for that we need very long times and not dt.”
I am not sure that I am actually interested in how the attractor changes. To decide that, I need to know how long the time scales are, and your argument tells me nothing about that. “dt” is some time scale short enough that linear approximations are good enough; but that does not mean literally only an instant – it means short on the time scales over which chaotic climate effects become important.
If that time scale is 1 year for the sorts of forcing events of interest, then you are quite correct, and linearised models will not give us useful results. But if the relevant time scale is more like 1 million years, then I am quite happy using simple models, and it does not bother me that I cannot predict what the consequences of the current forcings will be millions of years from now.
My guess is that the relevant time scale is actually somewhere between a year and a million years. Do you have some estimate which is better than that? Without it, we still do not know whether linear models will work over 10 or 100 or 1000 years.
The temporal horizon for the maximum Lyapunov exponent in a typical synoptic system is around 0.2 time units, a significant constraint for weather predictability.
The practitioners of geometric mechanics such as Nicolis and Nicolis give some good examples (Foundations of Complex Systems, p. 220), and the geometric constraints in Arnold and Khesin (Riemannian curvatures) are good examples of negative predictions, i.e. you can predict when your prediction is uncertain under fine resolutions.
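For readers unfamiliar with the figure quoted above: a maximal Lyapunov exponent λ converts into a predictability horizon through T ≈ (1/λ) ln(tolerance / initial error). A small sketch; the reading of "0.2 time units" as an e-folding time, i.e. λ = 5 per time unit, is my assumption:

```python
import math

def predictability_horizon(lam, initial_err, tolerance):
    """Time for an error to grow from initial_err to tolerance, assuming
    pure exponential divergence err(t) = initial_err * exp(lam * t).
    """
    return math.log(tolerance / initial_err) / lam

# Hypothetical numbers: e-folding time 0.2 time units (lam = 5) and an
# initial-condition error of 1% of the tolerated error.
print(predictability_horizon(5.0, 0.01, 1.0))  # about 0.92 time units
```

The point of the exchange stands either way: without knowing how many years one "time unit" is for the climate-scale system, this horizon cannot be converted into a statement about 10-, 100- or 1000-year linear models.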
Oh come on. The behavior of ENSO is highly deterministic and is only complicated by the fact that it has a slight nonlinear perturbation via the Mathieu equation. What keeps it coherent over long time scales is the nearly periodic quasi-biennial oscillations (QBO).
Here is a model of the SOI over a 70+ year interval:
More to come.
@maksimovich: Thanks for the references. I knew this in outline, but it will be good to learn a bit more. But you did not answer the question that matters: how many years is a “time unit” for this system (that is, climate-related, not weather-related)?
“but I do not think that anybody has observed or established a theory of frequency locking by the extremely weak gravitational (tidal) forces.”
An individual excursion of a tide may not be globally sufficient. However, there is about a 15% change in lunar distance between perigee and apogee; this implies a change of roughly 50% in tidal force. Tides are essentially a horizontal displacement of water mass, not a vertical one. Where and when this occurs has the means to displace massive amounts of water volume, and hence heat energy, in and out of the tropics on decadal scales.
I think this needs to be evaluated quantitatively before classifying tidal forces as weak. It is not the force itself that needs to be evaluated; it is the movement of heat energy.
Judith was recently co-author on a BEST paper that showed a 9.1-year cycle in the cross-correlation of the AMO and PDO. I think that this is tidal in origin.
The decadal drivers are more complex, and thus less visibly obvious, than the daily and annual ones. They may not be so weak that they can be ignored.
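The perigee/apogee figures above are easy to check, since tidal (differential gravitational) force falls off as the cube of distance. A quick sketch with typical lunar distances (round numbers, not a specific orbit):

```python
# Tidal force scales as 1/r^3, so a ~14% swing in lunar distance
# produces a ~48% swing in tidal force.
perigee_km, apogee_km = 356_500.0, 406_700.0    # typical lunar distances

distance_swing = apogee_km / perigee_km - 1.0   # fractional distance change
force_ratio = (apogee_km / perigee_km) ** 3     # perigee force / apogee force

print(f"distance swing: {distance_swing:.1%}")
print(f"tidal force at perigee vs apogee: {force_ratio:.2f}x")
```

So the "circa 50%" figure in the comment is of the right order; whether that swing moves enough heat on decadal scales is the separate, quantitative question the commenter raises.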
Does Navier-Stokes really have a global attractor? Surely only for weak solutions, or statistical solutions, or similar?
More evidence of the denier/hypocrite here:
You see how two-faced they are about demanding peer review?
We’ve given up asking for actual published science from webby – we’d settle for validation as per the scientific method.
Stop waving your pencil webby – and put ’em on the line.
Here you go — 70+ years of a model of the SOI
The time series is synced by the QBO, with an envelope related to the slower beat period of the Chandler wobble.
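For context on what a "Mathieu equation synced by the QBO" means mechanically: it is an oscillator whose restoring coefficient is periodically modulated. The sketch below integrates an unforced Mathieu-type equation with a quasi-biennial modulation period; every parameter value is a placeholder chosen for illustration, not the commenter's fitted SOI model:

```python
import math

def mathieu_sketch(a=1.0, q=0.3, w_mod=2 * math.pi / 2.33, dt=0.01, years=70):
    """Integrate x'' + (a + 2*q*cos(w_mod*t)) * x = 0 with classical RK4.

    w_mod corresponds to a ~2.33-year quasi-biennial modulation; a and q
    are placeholder stiffness/modulation values in the stable Mathieu region.
    """
    def accel(t, x):
        return -(a + 2.0 * q * math.cos(w_mod * t)) * x

    t, x, v = 0.0, 1.0, 0.0
    out = []
    for _ in range(int(years / dt)):
        k1x, k1v = v, accel(t, x)
        k2x, k2v = v + 0.5 * dt * k1v, accel(t + 0.5 * dt, x + 0.5 * dt * k1x)
        k3x, k3v = v + 0.5 * dt * k2v, accel(t + 0.5 * dt, x + 0.5 * dt * k2x)
        k4x, k4v = v + dt * k3v, accel(t + dt, x + dt * k3x)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6.0
        v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        t += dt
        out.append(x)
    return out

series = mathieu_sketch()
print(len(series))  # 7000 samples over 70 "years"
```

With these placeholder values the solution stays bounded, which is the property the comment relies on: the periodic modulation keeps the oscillation coherent rather than letting it wander. Whether a fitted version actually reproduces the SOI is a separate validation question.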
‘The essence of science is validation by observation. But it is not enough for scientific theories to fit only the observations that are already known. Theories should also fit additional observations that were not used in formulating the theories in the first place; that is, theories should have predictive power.’
webby’s homeopathic bathtub curve fitting doesn’t and can’t possibly predict. I can’t imagine greater nonsense.
No response possible, I see.
Yes – there is one response – actually predict something and not just run off at the mouth.
Sure, I can predict that the pressure difference between Tahiti and Darwin will continue to oscillate positive and negative about its mean value hundreds of years from now … long after you have started your extended dirt nap.
So Ha Ha, if you want that prediction now, well there is nothing you can do with it for the time being except to use it as a cudgel to whack me with.
… I didn’t fall off the turnip truck yesterday.
In conclusion, what I can infer is that you have nothing to respond with except empty rhetoric.
Cameras rolling. Take 29.
“All laws of nature are local.”
Brilliant. Climate is therefore weather in toto – obvious. BUT, IPCC climate is CO2-driven and -defined. So IPCC weather must be changing in toto (to achieve the climate change), BUT each change must be linkable to the same driver, CO2.
A global change in local observations is required. If we see local changes that are not globally observed but “add up” (literally) to a “global” change, we are not seeing a global driver but a global redistribution of local conditions.
Which is what we are seeing.
Bart R isn’t a capitalist and what he spews isn’t capitalism. I was beginning to wonder if anyone was going to comment on Milanovic’s brilliant piece.