The Uncertainty Monster: Lessons From Non-Orthodox Economics

by Vincent Randall

A perspective on economists’ grappling with the ‘uncertainty monster.’

In this essay I will try to introduce non-economists – those working in fields where they are only now coming into contact with the ‘uncertainty monster’, as Judith Curry calls it – to what some economists have learned from their encounter with it. First I will explain why economists encountered the monster before those working in other disciplines. Then I will give the reader an overview of what various economists have said about it. Finally, I will briefly consider the definite similarities and definite differences between how economists are confronted with the uncertainty monster and how those working in ‘harder’ sciences, like climate science, are confronted with it.

A little bit of history

The questions raised by uncertainty seem to have been addressed in more depth and with more clarity in the discipline of economics than they have elsewhere. This appears to be because they were encountered more forcefully in economics than in other disciplines that lent themselves to mathematical modelling and statistical hypothesis testing. The reason they were encountered so much more forcefully is that economics deals with human behaviour – and humans are constantly faced with an uncertain future, so all human behaviour is undertaken in the face of uncertainty.

Take the classic economic example of an entrepreneur who wants to make an investment. Let us say that he wants to build a factory that produces cotton goods. Let us further say that he is fully aware of all the costs – from the cost of the cotton-producing machines, to the raw materials, to the wages that the workers will need to be paid and so on. Now he needs to weigh these costs against the number of unit sales that he can make times the prices at which he can make these sales – that is, revenue R = p × q, where p is the price and q the quantity sold. By subtracting the costs C from the revenue he will be able to calculate his profit – that is, π = R − C. Finally, he can compare the profits that he will make to the investment that he has to undertake and decide whether he should do it or not[1].

The problem is that he has to do this over many years. The initial investment – especially the buildings and machinery – will have to be used for years before it pays for itself. We are probably talking on the order of 10-20 years. Now our entrepreneur may be able to get a good approximation of the price that he will be able to charge for his goods by looking at similar markets in the first year or two. He may also be able to get a fairly good approximation of the amount of market demand that there will be for his product in the first year or two. But beyond the first year or two everything will be a haze. He has no idea whether there will be a recession, a financial crisis or a depression. He will also have no idea how prices will change – will there be a general rise in prices (inflation), a general fall in prices (deflation), or will prices stay the same[2]?
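To make the arithmetic concrete, here is a minimal sketch (all figures – prices, quantities, the discount rate – are hypothetical). It runs the entrepreneur's profit calculation under three guesses about the price path and shows how the same investment looks profitable or ruinous depending on which guess turns out to be right:

```python
# A minimal sketch of the entrepreneur's calculation (all figures hypothetical).
# Revenue is price times quantity; profit is revenue minus cost. The point is
# that the same arithmetic gives wildly different answers depending on which
# guess about the distant future we feed into it.

def npv(profits, rate=0.05):
    """Discount a stream of yearly profits back to the present."""
    return sum(p / (1 + rate) ** t for t, p in enumerate(profits, start=1))

investment = 1_000_000          # up-front cost of buildings and machinery
cost_per_year = 400_000         # wages, raw materials, etc.
quantity = 50_000               # units sold per year

# Three guesses about the price path over 15 years: steady, mild deflation,
# mild inflation. Nothing in the data for years 1-2 tells us which is right.
scenarios = {
    "steady":    [10.00] * 15,
    "deflation": [10.00 * 0.97 ** t for t in range(15)],
    "inflation": [10.00 * 1.03 ** t for t in range(15)],
}

for name, prices in scenarios.items():
    profits = [p * quantity - cost_per_year for p in prices]
    print(f"{name:9s}: project NPV = {npv(profits) - investment:>12,.0f}")
```

The first year or two of each path are nearly identical; it is the later years, about which the entrepreneur knows least, that decide the sign of the answer.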

These issues were brought to the fore in economics in the 1920s and 1930s by economists like Gunnar Myrdal and John Maynard Keynes. Prior to this the questions were ignored and agents in economic models were basically thought to be omniscient. But Myrdal and Keynes smashed this consensus – for a while at least. Here is a famous passage from Keynes’ General Theory of Employment, Interest and Money outlining the impossible problems that face the entrepreneur:

The outstanding fact is the extreme precariousness of the basis of knowledge on which our estimates of prospective yield [i.e. profits] have to be made. Our knowledge of the factors which will govern the yield of an investment some years hence is usually very slight and often negligible. If we speak frankly, we have to admit that our basis of knowledge for estimating the yield ten years hence of a railway, a copper mine, a textile factory, the goodwill of a patent medicine, an Atlantic liner, a building in the City of London amounts to little and sometimes to nothing; or even five years hence.

Keynes concludes that this means that much economic activity is determined not by the calculation of probabilities or anything like it, but rather by the state of confidence.

It would be foolish, in forming our expectations, to attach great weight to matters which are very uncertain. It is reasonable, therefore, to be guided to a considerable degree by the facts about which we feel somewhat confident, even though they may be less decisively relevant to the issue than other facts about which our knowledge is vague and scanty. For this reason the facts of the existing situation enter, in a sense disproportionately, into the formation of our long-term expectations; our usual practice being to take the existing situation and to project it into the future, modified only to the extent that we have more or less definite reasons for expecting a change. The state of long-term expectation, upon which our decisions are based, does not solely depend, therefore, on the most probable forecast we can make. It also depends on the confidence with which we make this forecast — on how highly we rate the likelihood of our best forecast turning out quite wrong. If we expect large changes but are very uncertain as to what precise form these changes will take, then our confidence will be weak.

Once this Pandora’s box was opened it started eating economic theory from the inside out. The whole theory was based on decisions made in the face of calculable risk. But once we admitted that the future is properly uncertain, the theory started to unravel. Within a few years, however, economists had put the ‘uncertainty monster’ back in its box. From where I’m standing this rendered their theories pretty much useless, and I’m sure that many readers can make the connection between this fundamental epistemological error and the inability of economists to see the Great Financial Crisis coming (not to mention their complete inability to deal with its consequences adequately). But enough history. I am interested here in pointing the reader in the right direction if they want to get a sense of what some economists have learned from the study of their discipline through the lens of fundamental uncertainty.

A dummies’ guide to uncertainty in economics

First up is Keynes himself. We have already seen how Keynes introduced the concept into economic theory. But he also did some work on the implications uncertainty had for econometric modelling – that is, the use of mathematical and statistical models to try to predict future economic outcomes. Keynes addressed this in his paper ‘Professor Tinbergen’s Method’, written in 1939. The Tinbergen in question was Jan Tinbergen, a Dutch economist who pioneered multiple linear regression modelling. Keynes had earlier written an entire book on probability and statistics in which he advanced a theory of probability that integrated uncertainty. That work is too complex to examine here, but interested readers should get their hands on a copy of A Treatise on Probability.

Keynes lays out some of the issues with statistical modelling in his Tinbergen paper. For example, he makes clear that…

Put broadly, the most important condition is that the environment in all relevant respects, other than the fluctuations in those factors of which we take particular account, should be uniform and homogeneous over a period of time.

Now most people are taught in statistics class that the coefficients in a multiple linear regression can only be taken at face value if we assume that the statistical model is complete – that is, that all relevant variables have been included in the model. In practice, however, this rule is rarely followed. It should be, and the fact that it is not probably means that we should take such results with more than a pinch of salt. Another problem that Keynes highlights in the paper is as follows:

For, owing to the wide margin of error, only those factors which have in fact shown wide fluctuations come into the picture in a reliable way. If a factor, the fluctuations of which are potentially important, has in fact varied very little, there may be no clue to what its influence would be if it were to change more sharply. There is a passage in which Prof. Tinbergen points out (p. 65), after arriving at a very small regression coefficient for the rate of interest as an influence on investment, that this may be explained by the fact that during the period in question the rate of interest varied very little.

Keynes’ criticism is as fresh today as it was in 1939. Because we have no access to repeatable controlled experiments, the model is limited by the actual variability in the historical data. The relationship between one variable and another may not be linear: the coefficient may rise massively past a certain point. The example of the interest rate is a good one. If the interest rate only moves within the bounds of one or two percentage points in a sample, its impact on investment will probably be minimal or non-existent. A regression would tell us this. But if the interest rate were then raised in an unprecedented way – say, by 15 percentage points – the impact on investment could be enormous. This actually happened in 1979-1980, when the interest rate was raised from around 10% to just over 17%. Investment crashed and the economy went into recession.
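Keynes’ point about limited variability can be illustrated with a toy simulation (all numbers, and the assumed ‘true’ relationship, are invented for the purpose). Investment is made to respond sharply to interest rates only above 10%, but the sample contains rates between only 4% and 6%, so ordinary least squares reports the small local slope, and extrapolating it badly misleads:

```python
# A toy version of Keynes's objection (all numbers and the assumed 'true'
# relationship are invented for illustration). Investment responds sharply to
# the interest rate only above 10%, but in the sample the rate barely moved,
# so a linear regression reports a tiny slope -- and extrapolating that slope
# to an unprecedented rate badly misleads.
import random

random.seed(0)

def true_investment(rate):
    """Assumed truth: roughly flat below 10%, collapsing above it."""
    kink = 5.0 * (rate - 10.0) if rate > 10.0 else 0.0
    return 100.0 - 0.2 * rate - kink

# Historical sample: the rate only wandered between 4% and 6%.
rates = [4.0 + 2.0 * random.random() for _ in range(200)]
invest = [true_investment(r) + random.gauss(0, 1.0) for r in rates]

# Ordinary least squares by hand: slope = cov(x, y) / var(x).
n = len(rates)
mx, my = sum(rates) / n, sum(invest) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(rates, invest))
         / sum((x - mx) ** 2 for x in rates))
intercept = my - slope * mx

predicted_at_17 = intercept + slope * 17.0   # linear extrapolation
actual_at_17 = true_investment(17.0)         # what the assumed truth gives

print(f"estimated slope: {slope:.2f}  (true local slope is -0.2)")
print(f"forecast at 17%: {predicted_at_17:.1f}")
print(f"'truth' at 17%:  {actual_at_17:.1f}")
```

The regression is not wrong about the sample; it is simply silent about behaviour outside the range the data happen to cover – which is exactly Keynes’ complaint.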

The next economist to deal extensively with uncertainty was GLS Shackle. Shackle tried to further integrate uncertainty into economic theory in books like Epistemics and Economics: A Critique of Economic Doctrines. That may not be of much interest to non-economists, but he also made some interesting points about uncertainty more generally. He was especially interested in the issue of decision-making under uncertainty – which he understood to be entirely different from decision-making in the face of a probabilistic or ‘risky’ future. He thought that decisions in the face of uncertainty were unique in that they are often required, yet there is no definite way to approach them. From his Epistemics and Economics:

To be uncertain is to entertain many rival hypotheses. The hypotheses are rivals of each other in the sense that they all refer to the same question, and that only one of them can prove true in the event. Will it, then, make sense to average these suggested mutually exclusive outcomes? There is something to be said for it. If the voices are extremely discordant, to listen to the extreme at one end of the range or the other will have most of the voices urging, in some sort of unison, a turn in the other direction. ‘The golden mean’ has been a precept from antiquity, and in this situation it will ensure that, since the mass of hypotheses will still be in disagreement with the answer which is thus chosen, they shall be divided amongst themselves and pulling in opposite directions. Moreover, the average can be a weighted one, if appropriate weights can be discovered. But what is to be their source? We have argued that statistical probabilities are knowledge. They are, however, knowledge in regard to the wrong sort of question, when our need is for weights to assign for rival answers. If we have knowledge, we are not uncertain, we need not and cannot entertain mutually rival hypotheses. The various hypotheses or contingencies to which frequency-ratios are assigned by statistical observation are not rivals. On the contrary, they are members of a team. All of them are true, each in a certain proportion of cases with which, all taken together as a whole, the frequency-distribution is concerned. Rival answers might indeed be entertained to a different sort of question, one referring to the result of a single, particular, ‘proper-named’ and identified instance of that sort of operation or trial from which the frequency-distribution is obtained by many-time repeated trials. But in the answer to a question about a single trial, the frequency-ratios are not knowledge. They are only the racing tipster’s suggestion about which horse to back.
His suggestions are based on subtle consideration of many sorts of data, including statistical data, but they are not knowledge.

I have quoted Shackle at length to give the reader a sense of how reading his work might be a useful guide to making certain decisions that are encountered with some regularity in climate science. Epistemics and Economics is partly about economic theory but it is also a book devoted to how rational people can make decisions under uncertainty.
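Shackle’s complaint about averaging rival hypotheses can be put in miniature (the numbers below are hypothetical): the weighted average of mutually exclusive outcomes may itself be an outcome that is certain not to occur.

```python
# Shackle's worry in miniature (the numbers are hypothetical). The venture
# either fails badly or succeeds handsomely; averaging the two rival,
# mutually exclusive outcomes yields a figure that cannot possibly occur.
rival_outcomes = {-100.0: 0.5, +100.0: 0.5}   # outcome -> weight

expected = sum(outcome * weight for outcome, weight in rival_outcomes.items())
print(expected)   # 0.0 -- a value that is certain NOT to be realised

assert expected not in rival_outcomes   # the 'golden mean' is no possible event
```

For a team of outcomes over many repeated trials the average is informative; for a single, ‘proper-named’ trial with rival outcomes it describes nothing that can happen.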

The next economist that may be of interest is Paul Davidson. Davidson highlights the fact that economics is a ‘non-ergodic’ science. By ‘non-ergodic’ he means that the future does not necessarily mirror the past; just because x happened in the past does not mean that x will happen in the future. He writes:

Logically, to make statistically reliable probabilistic forecasts about future economic events, today’s decision-makers should obtain and analyze sample data from the future. Since that is impossible, the assumption of ergodic stochastic economic processes permits the analyst to assert that the outcome at any future date is the statistical shadow of past and current market data. A realization of a stochastic process is a sample value of a multidimensional variable over a period of time, i.e., a single time series. A stochastic process makes a universe of such time series. Time statistics refer to statistical averages (e.g., the mean, standard deviation) calculated from a single fixed realization over an indefinite time space. Space statistics, on the other hand, refer to a fixed point of time and are formed over the universe of realizations (i.e. they are statistics obtained from cross-sectional data). Statistical theory asserts that if the stochastic process is ergodic then for an infinite realization, the time statistics and the space statistics will coincide. For finite realizations of ergodic processes, time and space statistics coincide except for random errors; they will tend to converge (with the probability of unity) as the number of observations increase. Consequently, if ergodicity is assumed, statistics calculated from past time series or cross-sectional data are statistically reliable estimates of the statistical probabilities that will occur at any future date. In simple language, the ergodic presumption assures that economic outcomes on any specific future date can be reliably predicted by a statistical probability analysis of existing market data.

He also makes the case – and this is of interest to those in other sciences – that non-ergodicity may apply to systems that are very sensitive to initial conditions. That is, systems which are commonly referred to as ‘chaotic’ today.
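Davidson’s distinction can be illustrated with a small simulation (the choice of processes is mine, purely for illustration): a mean-reverting AR(1) process is ergodic – the time average of any long history lands near the ensemble mean – while a random walk is not, so its history tells us little about the statistics of its future.

```python
# A sketch of Davidson's ergodic/non-ergodic distinction (illustrative only).
# For an ergodic process, the time average of one long realization converges
# to the ensemble average across realizations. For a non-ergodic process such
# as a random walk, it does not: history is a poor guide to the future.
import random

random.seed(1)

def time_average(step, n=20_000):
    """Mean of a single long realization of the process defined by `step`."""
    x, total = 0.0, 0.0
    for _ in range(n):
        x = step(x)
        total += x
    return total / n

# Ergodic: a mean-reverting AR(1) process, x_t = 0.5 * x_{t-1} + noise.
ar1 = lambda x: 0.5 * x + random.gauss(0, 1)

# Non-ergodic: a random walk, x_t = x_{t-1} + noise.
walk = lambda x: x + random.gauss(0, 1)

# Two independent long histories of each process.
ar1_a, ar1_b = time_average(ar1), time_average(ar1)
walk_a, walk_b = time_average(walk), time_average(walk)

print(f"AR(1) time averages:       {ar1_a:+.2f} vs {ar1_b:+.2f}  (both near 0)")
print(f"random-walk time averages: {walk_a:+.2f} vs {walk_b:+.2f}  (arbitrary)")
```

In Davidson’s terms, only for the first process are statistics computed from one realization’s past a reliable estimate of what any future date will look like.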

The next economist that merits mention is Tony Lawson. Lawson has gone right back to basics to try to tackle the problem of uncertainty in economics. He makes the case that recognising uncertainty requires the economist/scientist to occupy an entirely different ontological position – that is, to view the world in an inherently different way from their uncertainty-free colleagues. Lawson’s work is massively complex and attempts to build up new epistemological and ontological foundations through which scientists can access truths in the face of uncertainty. I will try to give the reader something of a flavour here. Much of it rests on Lawson’s attack on mathematical modelling as the end goal of science. Lawson claims that only ‘closed systems’ – that is, systems that are both deterministic and in which we fully understand the determinants driving the system – can be mathematically modelled in any serious way.

The first thing to note is that all these mathematical methods that economists use presuppose event regularities or correlations. This makes modern economics a form of deductivism. A closed system in this context just means any situation in which an event regularity occurs. Deductivism is a form of explanation that requires event regularities. Now event regularities can just be assumed to hold, even if they cannot be theorised, and some econometricians do just that and dedicate their time to trying to uncover them. But most economists want to theorise in economic terms as well. But clearly they must do so in terms that guarantee event regularity results. The way to do this is to formulate theories in terms of isolated atoms. By an atom I just mean a factor that has the same independent effect whatever the context. Typically human individuals are portrayed as the atoms in question, though there is nothing essential about this. Notice too that most debates about the nature of rationality are beside the point. Mainstream modellers just need to fix the actions of the individual of their analyses to render them atomistic, i.e., to fix their responses to given conditions. It is this implausible fixing of actions that tends to be expressed through, or is the task of, any rationality axiom. But in truth any old specification will do, including fixed rule or algorithm following as in, say, agent based modelling; the precise assumption used to achieve this matters little. Once some such axiom or assumption-fixing behaviour is made economists can predict/deduce what the factor in question will do if stimulated. Finally the specification in this way of what any such atom does in given conditions allows the prediction activities of economists ONLY if nothing is allowed to counteract the actions of the atoms of analysis. Hence these atoms must additionally be assumed to act in isolation.
It is easy to show that this ontology of closed systems of isolated atoms characterises all of the substantive theorising of mainstream economists. It is also easy enough to show that the real world, the social reality in which we actually live, is of a nature that is anything but a set of closed systems of isolated atoms.

There is much more to Lawson’s work – including his exploration of an alternative methodology called ‘critical realism’ – but I will not try to dive too deep into it here.
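To make Lawson’s ‘isolated atom’ a little more concrete, here is a deliberately crude sketch (my own toy construction, not Lawson’s): an agent with a context-free response rule yields an exact event regularity that the modeller can deduce, and the deduction collapses as soon as the response depends on context.

```python
# A deliberately crude rendering of Lawson's 'isolated atom' -- a toy
# construction of my own, not Lawson's. An atom responds to a stimulus in the
# same way regardless of context, so aggregate outcomes can be deduced exactly.

def atom(price):
    """Fixed rule: demand falls one unit per unit of price, context-free."""
    return max(0.0, 10.0 - price)

def aggregate(price, n=100):
    """With n identical isolated atoms the aggregate follows mechanically."""
    return n * atom(price)

assert aggregate(4.0) == 600.0   # 100 * (10 - 4): guaranteed by construction

# The closure fails once responses depend on context -- here, on what other
# agents did last period. The same stimulus no longer produces the same
# event regularity, so the deduction no longer goes through.
def contextual_atom(price, others_bought_last_period):
    herding = 0.1 * others_bought_last_period   # context-dependent term
    return max(0.0, 10.0 - price + herding)

print(contextual_atom(4.0, 0.0), contextual_atom(4.0, 50.0))   # 6.0 11.0
```

The fixed rule here plays the role of Lawson’s ‘rationality axiom’: any rule would do, so long as it pins the atom’s response down and nothing is allowed to interfere with it.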

Finally, a recent, more practical approach to studying systems under uncertainty comes from Philip Pilkington’s book The Reformation in Economics: A Deconstruction and Reconstruction of Economic Theory. Pilkington dedicates an entire chapter to approaching the study of economics in a way that takes the existence of uncertainty into account, and tries to formulate pragmatic principles for doing so coherently. He argues that sciences that are not suited to straightforward model-building should instead take what he calls a ‘schematic’ approach. These ‘schemas’ are basic relationships that tell us something about how complex systems work. They are usually derived from logically or empirically demonstrable relationships that exist in these systems. They differ from models in that they do not provide a complete picture of these complex systems – which Pilkington claims is impossible. Rather, they let us get to know aspects of how the system works, which we can then combine with our judgement to decide what can be said about the system.

Economics, properly understood, is not the art of constructing models. Rather, it is the art of furnishing, elaborating, understanding and integrating schemas into one’s process of thought. Economics is not about building abstract castles in the sky. Nor is it learned or perfected by engaging in such constructions. It is more like a language that is learned through understanding and practice. You do not learn good sentence construction by studying linguistics; rather, you learn it by becoming as acquainted as possible with the language, with words and their multifarious meanings.

This is a far more open-ended approach than the strict mathematical and statistical modelling that Pilkington claims does not work when the material being studied becomes too complex[3]. It ultimately rests on informed people forming judgements about the material that they study.

What does all this have to do with climate science?

The reader who has made it this far is probably wondering what all this has to do with climate science. I think that non-orthodox economists have undertaken the most thorough study of uncertainty in the sciences today, and that their work should be consulted by anyone who writes or thinks about these issues. But there are both similarities and differences between the two sciences.

Recall that we made the case that economics studies people who have to make decisions under uncertainty. While climate scientists themselves may have to make decisions in the face of uncertainty, they do not study people who have to make decisions in the face of uncertainty: CO2 does not decide how it will affect the climate. The processes studied in climate science are natural processes, while the processes studied in economics are human processes. This gives climate science an immediate advantage, as the uncertainty being dealt with is only first order – that is, on the part of the scientist – rather than second order – that is, on the part of both the scientist and the object of study.

Despite this, however, the climate, like the economy, is extremely complex. We can only examine little bits of it at a time. Trying to form a coherent vision of the whole will almost inevitably leave something out. Climate science and economics share this problem in common. Because of this, climate models and economic models have a high degree of indeterminacy that must be understood by those using the models. (At the very least, some might say that using models is the wrong approach given the complexity of the systems being studied).

Finally, statistical measurement is notoriously difficult in both disciplines. Economists are well aware that the statistics that they use are highly imperfect. Bad economists simply plug these into models and obtain garbage-in, garbage-out (GIGO) that they then publish in the journals. But good economists have to weigh up the strengths and weaknesses of the statistical material that they use before passing judgements. Climate scientists are arguably in an even more difficult position than economists here because the data that they use is notoriously disharmonious, spotty and ancient. Again, this is a form of decision-making under uncertainty. How much weight should we give this data?

Overall, there are more similarities between climate science and economics than there are dissimilarities. Climate scientists – and any scientist studying highly complex systems – should pay close attention to what non-orthodox economic theorists have been saying about uncertainty and its derivative problems. They could learn a lot.

Suggested readings

These suggested readings can be supplemented by various papers etc. that the authors have written and that are available online. The reader will have to use their judgement as to whether they will be of interest to the non-economist.

Davidson, Paul. (1991). ‘Is Probability Theory Relevant for Uncertainty?’.

Davidson, Paul. (1996). ‘Reality and Economic Theory’.

Keynes, John Maynard. (1921). A Treatise on Probability.

Keynes, John Maynard. (1939). ‘Professor Tinbergen’s Method’.

Lawson, Tony. (1997). Economics and Reality. Parts I, IV & V.

Pilkington, Philip. (2017). The Reformation in Economics: A Deconstruction and Reconstruction of Economic Theory. Chapters 5 & 10.

Shackle, GLS. (1972). Epistemics and Economics: A Critique of Economic Doctrines. Chapters 1-8, 11, 31, 33, 38.

A very comprehensive bibliography can be found here:


[1] All of this is slightly oversimplified. We abstract from interest repayments, depreciation etc. But it serves to make the basic issues clear.

[2] Again we are oversimplifying here. If we count in, say, interest repayments he will also have to guess at where interest rates will be in a few years’ time.

[3] Pilkington also provides a ‘general theory of bias in science’ in Chapter 5 of his book which may be of interest to readers here.

Moderation note:  As with all guest posts, please keep your comments civil and relevant.

137 responses to “The Uncertainty Monster: Lessons From Non-Orthodox Economics”

  1. Good post. Dealing with uncertainty is more art than science.

    Generation planning at the utility level has been strongly driven by concerns around uncertainty. Models projecting out generation costs are uncertain because of potential variance in known driver variables and the potential impacts of unknown variables. The defense against uncertainty was to maintain a diverse portfolio of generating resources. This push for resource diversity has recently supported various technologies that have worked out to be quite costly and that in no realistic way will work as a hedge. As a long term believer in resource diversity, I am wondering if the economics of gas generation should, at least for the USA, call the old ways into question. Of course politics is a wild card. It seems the scope is so big that the economics of it should drive the politics rather than vice versa.

    • Curious George

      Old Aztecs believed that the Sun would rise the next day only if offered a human sacrifice. There was no room for uncertainty. 100% consensus.

      • They had also a good way of dealing with skeptics.

      • russellseitz

        It worked like a charm until Montezuma ran out of economists.

      • Russell,

        A dearth of economists was not what did in Montezuma. It was a surplus of Spaniards.

        Actually, it was a dearth of allies. Many of the tribes the Aztecs had conquered were quick to join Cortez’s small band as they marched on the capital.

    • Peter Lang

      Planning Engineer,

      The defense against uncertainty was to maintain a diverse portfolio of generating resources.

      I don’t see the evidence to support that statement. Australia had reliable electricity supply when it was about 80% to 90% coal, some hydro, a little gas and oil, and little else. Similarly, France has had 76% of its electricity generated by nuclear for 30 years. France and all its neighbours rely on this for reliable baseload power supply. Coal and nuclear have provided most of the electricity in most countries for many decades. These technologies are proven reliable. The risk of disruption is mostly from human causes – organised labour disruption, ideology, beliefs, etc. – not the reliability of the technologies themselves. Diverse locations of power stations and ownership of vendors and utilities provide the main diversification, not the technologies.

      • Peter, if you go to Tallbloke’s latest blog you’ll find an interesting take on Trump’s energy policy compared to what is happening in Australia now.

      • I see a hedge as a supply thing. Natural gas is all the rage. Wait until the price goes up. The users want to play all the suppliers against each other, hopefully in a free market. In Minnesota the powers that be want to retire a coal plant and replace it with natural gas. I’d like to keep that coal option open to hedge against a natural gas price increase. I was in favor of extending the life of our Red Wing nuclear plant. That hedge should provide long term baseload at predictable costs.

        Retiring nuclear plants, as some Green countries did, is the opposite of a hedge. It’s a bet on renewables. Not a bet I want to make. I kind of think we have enough wind and solar and should, over about 3 years, phase out the subsidies. It can be argued wind and solar are hedges. They are, and I think we’ve acquired enough of them until they are proven reliable and economical. Adding more of them is increasing our downside. I’d also say they are disrupting our reliable sources.

      • The ’73 oil embargo, Fukushima, a potential natural gas price spike due to Saudi Arabia’s dispute with Qatar, etc. Some of us learned not to put all our generators in one basket; others didn’t. Some get lucky, others don’t.

      • Peter Lang

        As far as I know, people are still using oil for transport fuels. No diversification there. None needed! Same with electricity. Coal and/or nuclear for baseload, gas and/or hydro for peak and intermediate load is the reliable, lowest cost way to provide electricity over multi-decade periods. Sure, gas is cheaper in the US at the moment, but that is probably an aberration and unlikely to last the life of the generating plants. Coal and nuclear would still be the cheapest for baseload by far if not for the damage being done by the enviro-evangelists, with their irrational religious beliefs.

      • Peter – the hedge was against economic impacts, not for reliability purposes. Can’t say I’ve been to any international conferences on resource planning, so it might be a US thing. Southern Company speaks of needing “all arrows in the quiver” in pushing for resource diversity. You might like this link –

      • Ragnaar,

        I worked at that plant. Prairie Island Nuclear Generating Station. To this day I consider the Westinghouse 2 loop design as perhaps the best ever developed.

      • Peter Lang

        Thank you. That’s an excellent post. How wonderful for the US. How embarrassing for Australia (but absolutely true).
        The Trump Doctrine on Energy

      • Peter, thanks for the link. It’s difficult for me to do them with my Samsung phone.

  2. Roger Knights

    TYPO: Change “it” to “is” in this Shackle quote:

    “when our need it for weights to assign for rival answers.”

  3. Weird to write about uncertainty in economics without mentioning Knight, von Neumann, Savage, Raiffa, Arrow …

    The contemporary source on this stuff is Rodrik’s Economics Rules.

    • Peter Lang

      Also weird that a 4000 word post on climate economics does not contain a single mention of damage functions or of impacts attributable to climate change? And no biosketch of the author.

      And this statement is nonsense:

      The questions raised by uncertainty seem to have been addressed in more depth and with more clarity in the discipline of economics than they have elsewhere.

      • Jeez Peter, did you consider that Vincent’s objective was limited to discussing uncertainty and not impacts and damage functions? Or that at 4000 words he felt his essay was already long enough?

      • Peter Lang

        Yes. You are correct. On second, more careful reading it is a very good, very interesting post. Thank you for pointing it out. I re-read it carefully as a result of your comment (which, when I first saw it, annoyed me).

      • manicbeancounter

        The issues of uncertainty covered in this post render any damage functions utterly speculative. If we could start to define the actual costs of climate change, along with guesstimates of the likelihood of occurrence, then the damage costs would be lower as people built these risks into their plans. It is like people in Japan and California adapting to the risks of earthquakes so that when they do occur the damage costs are much reduced.

      • The issues of uncertainty covered in this post renders any damage functions utterly speculative.

        That statement applies to just about everything in economics and climate science.

      • I accidentally hit the reply button before I’d finished. Continuing …

        We need to validate or refute the damage functions. If we don’t the climate alarmists will continue to believe the estimates of SCC are correct and continue to use them to justify policy – as Obama did and the EU is doing.

    • manicbeancounter

      Von Neumann’s major contribution was made with Morgenstern in developing game theory. This assumed that alternative outcomes were known and measurable. That is, it deals with probabilistic risks, not with uncertainties, which are not calculable in advance.
      Frank Knight’s work mostly pre-dates Keynes.
      More relevant is the work of the Austrian School, although Shackle ‘s Epistemic and Economics draws to some extent on this work.

  4. Reblogged this on Climate Collections and commented:
    “…The climate, like the economy, is extremely complex. We can only examine little bits of it at a time. Trying to form a coherent vision of the whole will almost inevitably leave something out. Climate science and economics share this problem in common. Because of this, climate models and economic models have a high degree of indeterminacy that must be understood by those using the models. (At the very least, some might say that using models is the wrong approach given the complexity of the systems being studied).”

  5. Curious George

    Economics has an “envy of physics”: Physics has 20 laws covering 80% of cases. Economics has 80 laws covering 20% of cases. Climatology is closer to economics than to physics.

    • The climate, like the economy, is extremely complex. We can only examine little bits of it at a time. Trying to form a coherent vision of the whole will almost inevitably leave something out. Physics has 20 laws covering 80% of cases. Economics has 80 laws covering 20% of cases. Climatology is closer to economics than to physics.

      Wow, people take something really simple and pretend it is complex.
      Well, there is more money in alarmism and complexity.

      When oceans get warm, they thaw and it snows more; the snow increases ice on land until the ice spreads and causes it to get cold.

      When oceans get cold, they freeze and it snows less; the ice depletes and retreats, allowing it to get warm.

  6. Lots of hidden uncertainty in the IPCC carbon budget that ascribes changes in atmospheric CO2 to fossil fuel emissions.

  7. Logically, to make statistically reliable probabilistic forecasts about future economic events, today’s decision-makers should obtain and analyze sample data from the future.

    But that’s impossible! Sure… but the state of confidence will always be a problem, except that the future is here now and no sequel to Gore’s “An Inconvenient Truth” can change the fact that there has been no warming since CRUgate that cannot be explained by natural causes.

  8. Lawson has aced it about only closed systems being mathematically modelled in any serious way. Climate science has a lot of unserious people.

    • Yes. But it depends on the meaning of serious. Predictive precision is different from general understanding in ‘toy’ models. The problem with climate GCMs is that they are only ‘toy’ models yet pretend to have predictive precision. Christy’s 29 March 2017 congressional testimony is the gotcha on that.

      • Steven Mosher

        Christy’s wrong. Sorry.

      • Kip Hansen

        ristvan ==> “Lawson claims that only ‘closed systems’ – that is, systems that are both deterministic and in which we fully understand the determinates driving the system – can be mathematically modelled in any serious way.” This is the main point that I have been writing about in my Chaos series and essays on Trends. Non-linear dynamical systems defy predictive modelling (in any useful sense regarding climate) and we certainly do not “understand the determinates driving the system” — we only have vague guesses arrived at from past associations — which is why trends may not be assumed to extend into the future.

      • “Non-linear dynamical systems defy predictive modelling (in any useful sense regarding climate)”

        And Tetlock showed expert prediction under significant uncertainty was actually worse than laymen’s. Experts are great when there’s little to no uncertainty, because they’re typically much more conversant with the hard facts. When they’re not, you get claims with a straight face that half of America will be starving by the year 2000, because HEY LOOK AT ME!

      • russellseitz

        We have been seeing good models drive the bad out of circulation in climate science for over a generation, starting with Steve Schneider & Starley Thompson using an early GCM to pull the rug out from under Sagan’s 1-D nuclear winter model circa 1986. Inflating the model supply in recent years may have devalued that trend, but it’s nonsensical to say “Non-linear dynamical systems defy predictive modelling (in any useful sense regarding climate)” when you are talking about thermodynamically determinate systems, even when complex biogeochemical exchange rates, long time scales and large thermal masses mean they take their own time to equilibrate: complex as planets are, they still are things made of stuff and remain science’s lawful prey.

        Face it: modeling progress is as heuristic in climate science as in economics, and economic modeling has crashed and burned far more spectacularly on occasion. I wish Kip would be as candid as James Hansen in that regard, albeit model intercomparison is a sort of metaphysical blood sport in both fields.

      • Kip Hansen

        russellseitz ==> You might want to read my four-part series on Chaos at WUWT. Start here. Links to the following three parts are at the end of the essay just above the Comment section.
        Part Three: Chaos & Models is particularly intended to address the problems of modelling climate processes.

      • Steven Mosher

        ” Non-linear dynamical systems defy predictive modelling (in any useful sense regarding climate) and we certainly do not “understand the determinates driving the system”

        Wrong. The key thing is understanding what METRIC you are trying to predict. Suppose you want to model the flow of air over an airfoil. At the lowest level of the system you know that your predictions of exactly where the air will go are impossible. Try to predict which way a forebody vortex will go as you increase AOA? Locally and microscopically unpredictable. Still, if you want to know the stall angle or the lift, well, that is quite predictable. At the lowest levels we can’t know which way the wind will blow or how exactly the currents will shift or where exactly it will get warmer, but we do know that the energy of the system must balance.
        We know that even a chaotic system will not defy that law of nature.

        In the 1930s Guy Callendar built a very simple climate model. It was one-dimensional: no oceans, no land, no winds, no rain, no clouds. He simply predicted that if you add CO2 (upsetting the balance) the temperature would respond accordingly. It was a great model. It worked.
        It worked because while the smaller units of the system may interact in unpredictable ways, the system as a whole must follow certain laws.
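
This micro-versus-macro argument can be illustrated with a toy chaotic system. The sketch below uses the logistic map, an arbitrary stand-in chosen only for illustration, not anything climatic: individual trajectories are unpredictable, yet an aggregate statistic of the same trajectory is stable.

```python
def logistic(x, r=4.0):
    # the fully chaotic logistic map
    return r * x * (1.0 - x)

def long_run_mean(x0, n=100_000, burn=1_000):
    # average the trajectory after discarding a transient
    x = x0
    for _ in range(burn):
        x = logistic(x)
    total = 0.0
    for _ in range(n):
        x = logistic(x)
        total += x
    return total / n

# Two nearly identical starting points: pointwise prediction fails fast...
xa, xb = 0.2, 0.2 + 1e-10
for _ in range(60):
    xa, xb = logistic(xa), logistic(xb)
print(abs(xa - xb))  # the 1e-10 difference has been amplified enormously

# ...but the aggregate statistic barely notices the initial condition
print(long_run_mean(0.2), long_run_mean(0.9))  # both near 0.5
```

Pointwise prediction fails after a few dozen steps, while the long-run mean is insensitive to the starting point — the same distinction as between predicting where the wind will blow and predicting a constrained aggregate.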

      • He simply predicted that if you add CO2 (upsetting the balance) the temperature would respond accordingly. It was a great model. It worked.

        Uh huh. What was his prediction for the global average temperature this year?

      • “we do know that the energy of the system must balance.”

        All that tells you is that the Earth won’t catch fire or drop to near absolute zero anytime soon; it does not in any way imply you can predict the annual temperature of the relatively tiny mass of Earth’s atmosphere for the next century to within a degree or two with high reliability, especially not as a function of variations in the level of a trace gas. It is little more than blind faith to assume there are no other significant factors at play.

      • At the lowest levels we cant know which way the wind will blow or how exactly the currents will shift or where exactly it will get warmer, but we do know that the energy of the system must balance.

        It worked because while the smaller units of the system may interact in unpredictable ways, the system as a whole must follow certain laws. [bold by edh]

        The words in bold present a critically important simplification, and to a certain degree, a mis-representation of the conservation of energy Law of Physics when applied to Earth’s climate system.

        Earth’s climate system is an open system relative to energy with both inflows and out-flows of energy at the system boundary which is the focus of interest in these discussions. Additionally, phenomena and processes internal to the system affect, to lesser and greater degrees, both the inflows and out-flows of energy. Further, extraordinary occurrences internal to the system can significantly affect energy flows.

        Conservation of energy obtains everywhere, at all temporal and spatial scales, for all times.

        There is no Law of Physics, none whatsoever, that can be invoked to ensure with absolute certainty that a system which is open to energy transports, with these transports affected by the normal range of effects of phenomena and processes internal to the system, must approach and maintain a state of radiative energy balance at the system boundary.

        It can be hypothesized that such a balance will obtain, but a hypothesis is not a Law of Physics. It is instead a construct obtained by applying radical simplifications to the Law of Conservation of Energy. However, you seldom, if ever, see a rigorous derivation or development of the hypothesized state from the general mathematical representation of Conservation of Energy that includes accounting of all the storage mechanisms, all the materials and their phase and energy content, or all the inter-exchanges of the various forms of energy (mechanical to thermal, for example).

        Instead of authoritative hand-waving invocation of “certain laws”, which in fact are not “certain laws”, state the actual foundation and basis of the statement.
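
The distinction being drawn here can be stated compactly. For a system open to energy transport, conservation of energy fixes only the budget identity (a sketch; here E is the system’s total energy content and Φ the boundary fluxes):

```latex
\frac{dE}{dt} = \Phi_{\text{in}}(t) - \Phi_{\text{out}}(t)
```

A balanced state, with Φ_in ≈ Φ_out so that dE/dt ≈ 0, is an additional hypothesis about the system’s behaviour, not a consequence of the conservation law itself.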

  9. While human behavior is unpredictable, the effects of human behavior in this case are very deterministic. By 2100, we could emit anywhere between 1,000 GtCO2 and 10,000 GtCO2, which has an enormous range of impact on the climate at 2100, probably with a range of 1-6 C for those extremes. Policies have to weigh the effects of these temperatures against what it takes to mitigate emission rates or stabilize the climate. The largest uncertainty factor is human behavior, but the hope is that control is there via truly effective international cooperation.

    • Curious George

      I mostly agree. In my opinion, “climate sensitivity” is the largest uncertainty factor.

      • When emissions and consequent forcing can range by 500%, even a 50% uncertainty in sensitivity is a secondary factor in what happens to the temperature.

      • Curious George

        Should the sensitivity be zero (even though I don’t really believe it), any uncertainty in emissions would be of no consequence.
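
The trade-off being debated here can be put in rough numbers with the usual logarithmic rule of thumb, ΔT ≈ S · log2(C/C0). The concentrations and sensitivities below are illustrative placeholders, not assessed values:

```python
import math

C0 = 280.0  # pre-industrial CO2 concentration, ppm

def warming(c_ppm, sensitivity):
    # logarithmic rule of thumb: `sensitivity` is warming per CO2 doubling (deg C)
    return sensitivity * math.log2(c_ppm / C0)

# Illustrative end-member scenarios only
for label, c in [("low emissions", 450.0), ("high emissions", 1000.0)]:
    for s in (1.5, 4.5):
        print(f"{label:14s} S={s}: {warming(c, s):.1f} C")
```

On these made-up ranges both factors span roughly a factor of three, so which uncertainty dominates depends entirely on the ranges one assumes.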

    • If you go to Tallbloke’s latest blog, the difference between Obama and Trump administration policy is probably like the difference between your two ends of gigatonnage.

    • Jim D (accidently?) hits on a key point. Much of climate policy being pushed is at its core about instituting control over human behavior.

      • As in economics, that is where the uncertainty is. What we do from now could impact the temperature by many degrees C, and this should make people think of consequences for global policies. Seeing the direct line between emissions and temperatures is useful for these decisions. It’s not control so much as just being sensible given what is known. It’s like taking action to put the brakes on before going over that cliff. It is action to take for safety where no action leads to danger.

      • To use your overblown analogy, Jim: what happens when you slam on the brakes at 60 mph in the belief there is a cliff up ahead, when none in fact is shown on the map, and the 18-wheeler immediately behind you hits you at said 60 mph? No danger there, right?

        What people supporting the IPCC want is to set up a road block, make everyone stop and get out of their vehicles, and walk to whatever destination they were headed to. All for their own good. Of course they reserve the right to keep their own vehicles, because how else can they be expected to maintain the good of the new order?

      • It’s about informed decisions. Do we want 700 ppm with its 4 C of warming by 100 years from now? Not if we can avoid it. We prefer a stable climate to a rapidly changing one, right?

  10. I would argue that the economic and climate problem are actually quite related. Empirically both systems seem to maximise entropy production.
    See e.g.
    (the last chapter or so is about economics. Unfortunately the theory as it pertains to economics is quite shallow. Related also: Bejan, Constructal Theory of Social Dynamics; Odum’s Maximum Power Principle.)

    The system dynamics permanently change direction to maximize entropy production. Unforeseen innovation and the emergence of thermodynamic gradients (forces) determine the system path. The more degrees of freedom the system has, the more possibilities there are to optimize.

    Models naturally have orders of magnitude (of orders of magnitude) fewer degrees of freedom and incorporate other explicit constraints which limit the selection of an optimal path. So it is logical that climate models run too “hot”, since they cannot capture all the possible ways to dissipate heat as fast as possible.

    These limits to modeling such systems can be analyzed as uncertainty. The phenomena are actually the same in the atmosphere and economics.

    • Curious George

      CO2 correlates beautifully with the Dow Jones. Also with total human population.

      Still wondering: in the Mauna Loa data, CO2 decreases every Northern summer. Where does it go?

    • CG, the NH summer atmospheric CO2 declines as more is consumed by land plant photosynthesis.

      • Curious George

        This indicates that the photosynthesis rate on land is much higher than in the ocean, probably contradicting “Scientists believe that phytoplankton contribute between 50 to 85 percent of the oxygen in Earth’s atmosphere.” Maybe someone could use the CO2 uptake to come up with a better estimate.

        Let’s green Australia and Kalahari.

      • Primary production in the oceans is limited by light – on land it is water and temperature primarily.

        And we are doing fine thank you for asking.

      • How do you know the decline is not a result of the SH winter cooling the oceans, perhaps increasing their ability to sink CO2?

      • Geoff Sherrington

        Nice. Reminds one of the frequent under-study of reverse causation in science generally, though your question is more about alternatives. Climate research is full of reverse-causation possibilities. Does rainfall reduce surface land temperatures, or do hotter temperatures bring more rain? One of the GCM problems is the failure to fix an absolute anchor point for processes, so one can say “At this stage, all is a normal climate until a perturbation starts one or more cycles.” Another part of the uncertainty monster.
        I can’t see much benefit coming to climate research through the study of economics. You can reverse a lot of economic decisions and try a new approach. Harder to do to the climate. Geoff

      • Geoff,
        The reason I asked the question is because it seems to me there is a whole lot of guessing going on in Climate Change Science.

        I agree with you. There is too much at stake to see how the study of uncertainty in economics can benefit climate change decision making. When there are so many unknowns, the only logical course of action is to first reduce their number through observation, experimentation and data acquisition. What exactly are the oceans doing and why? What exactly are the temperature feedbacks and CO2 feedbacks doing? Where in the world exactly are the increases in CO2 coming from? Where in the world exactly are the CO2 sinks located, what exactly are their magnitudes and how are they changing over time?

        Finding answers, rather than guessing at them, is one of the key roles of science in the climate change decision making process.

      • Peter Lang


        The far more important issue is: does any of it matter? If global warming is beneficial, as it may well be, what do climate science and its projections of global warming matter?

      • Rob,

        And nutrients.

      • Peter,

        Global warming may well be beneficial but we won’t really know unless/until it actually happens. People in general see change as something to avoid. If you try to persuade them otherwise through a fancy prognostication methodology, I suspect you will probably just run up against the notion (as expressed by Niels Bohr) that “Prediction is very difficult, especially about the future” and get ignored.

      • Peter Lang


        The point is we are probably wasting trillions on climate policies and retarding global development as a result of incorrect damage functions. We need to move research effort from studying climate change to studying the impacts of global warming. We need the studies to be done. We need the data. The climatariate are using the lack of data to mislead the world to implement what are probably bad and unjustifiable policies. I fully realise all the excuses you are making to not do the work which means in effect allowing the misinformation and bad policies to continue indefinitely.

      • Peter Lang


        Did you read my comment here: . Did you look at Figure 3 in the link I gave to Tol (2011), “The Economic Impact of Climate Change in the 20th and 21st Centuries”?

  11. It would be foolish, in forming our expectations, to attach great weight to matters which are very uncertain. It is reasonable, therefore, to be guided to a considerable degree by the facts about which we feel somewhat confident, even though they may be less decisively relevant to the issue than other facts about which our knowledge is vague and scanty.

    –e.g., the benefits of not becoming the victims of political hucksters and faux-science flimflammery are much more certain than any putative gains from ignoring the benefits of modernity and destroying our economy to guard against rising seas.

  12. Jim, you imagine certainty where there is little or none.

  13. The third great idea in 20th century physics – along with relativity and quantum mechanics – is dynamical complexity. Dynamical complexity has many applications in ecology, population, epidemiology, physiology, weather and climate, planetary orbits, earthquakes and economics to name a few. We tend to accept relativity and quantum mechanics as great ideas without understanding much about them. Dynamical complexity is more widely known as chaos theory – and is as little understood. The broad class of dynamical systems is known by certain behaviours. This is shown in economics by the potential for small initial shocks – a few hundred billions in toxic debt for instance – to cause a global economic meltdown as a result of collective emergent behaviour. Fear and greed ran rampant putting an end to decades of economic stability and growth. Didier Sornette calls these events dragon-kings.

    There are a few warning signals of crashes that make them potentially controllable – primarily hyper-growth with positive feedbacks. Long before dragon-kings, Friedrich Hayek and the Austrian school of economics developed principles of management of interest rates that are used to maintain price stability (low inflation) and stable economic growth. In monetary policy, the Australian government instituted a consumer price inflation target of 2–3 per cent in 1993. This is managed primarily through the overnight cash rate. When the economy is at risk of overheating, the overnight cash rate is increased, putting a damper on demand. Conversely, rates are decreased during downturns. Over the period of the target Australia has had uninterrupted economic growth. At the same time we had low government debt, conservative banking practices, a strong democracy, an effective legal system and low levels of official corruption. Growth and stability are as much psychological as technical and flourish during periods of moderate change.
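
The “hyper growth with positive feedbacks” warning signal can be sketched numerically. The numbers here are arbitrary illustrations, not Sornette’s actual bubble diagnostics: ordinary compounding grows exponentially, but when the growth rate feeds on the level itself the path is super-exponential and races toward a finite-time singularity.

```python
def compound(p0, rate, steps):
    # ordinary exponential growth: a constant proportional rate
    p = p0
    for _ in range(steps):
        p *= 1.0 + rate
    return p

def positive_feedback(p0, k, steps):
    # the proportional growth rate k * p rises with the level itself
    p = p0
    for _ in range(steps):
        p *= 1.0 + k * p
    return p

print(compound(1.0, 0.05, 30))          # about 4.3 after 30 periods
print(positive_feedback(1.0, 0.05, 30)) # larger by dozens of orders of magnitude
```

The practical reading is the one above: super-exponential growth is one of the few measurable precursors of a crash.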

    Markets do best in a democratic context. Politics provides a legislative framework for consumer protection, worker and public safety, environmental conservation and a host of other things. Including for regulation of markets – banking capital requirements, anti-monopoly laws, prohibition of insider trading, laws on corporate transparency and probity, tax laws, etc. A key to stable markets – and therefore growth – is fair and transparent regulation, minimal corruption and effective democratic oversight. Markets do best where government is large enough to be an important player and small enough not to squeeze the vitality out of capitalism – government revenue of some 25% of gross domestic product maximises economic growth. Economic growth is the path to the future.

    The global economy is worth about $100 trillion a year. To put aid and philanthropy into perspective – the total is 0.025% of the global economy. If spent on Copenhagen Consensus smart development goals such expenditure can generate a benefit to cost ratio of more than 15. If spent on the UN Sustainable Development Goals you may as well piss it up against a wall. Either way – it is nowhere near the major path to universal prosperity. Some 3.5 billion people make less than $2 a day. Changing that can only be done by doubling, tripling and more global production – and doing it as quickly as possible. Optimal economic growth is essential and that requires an understanding and implementation of explicit principles for effective economic governance of free markets.

    Economically the world is locked into a growth cycle – despite any and all reservations and interventions. A high growth planet creates resources to solve people and environment problems. The clearest way to economic growth is markets – and the biggest risk is market mismanagement.

    Climate may be an ergodic chaotic system and economics non-ergodic – but there is a key metaphorical similarity. Greenhouse gas emissions are the rampant inflation that create conditions for further instability in the system. The solution is to decouple – as they say – economic growth from carbon emissions. This is merely a technical problem.

    • Curious George

      Myron S. Scholes and Robert C. Merton shared the 1997 Nobel Memorial Prize in Economic Sciences for a “new method to determine the value of derivatives”. They used their Nobel-anointed theory to run a Long-Term Capital Management hedge fund, which spectacularly crashed in 1998.

      • And the relevance seems typically missing in action.

      • It is actually very relevant. The algorithms were used to enable the trading of options. It was known as black-box trading. It’s all electronic now. The hedge fund they set up was supposed to work like the algorithm, similar to rocket guidance systems, and smooth out the trading mechanisms.

        When the hedge fund failed it was the largest failure in US history. It was due to the collapse of the Russian currency. It also helped George Soros become a billionaire. He was always looking for unstable markets. That one bit the Nobel winners in the arse. The uncertainty monster prevailed.

      • Don Monfort

        Scholes and Merton were directors hired primarily to attract well-heeled investors. They didn’t run the fund. The strategy of the fund was not based on a black box algorithm. It was way more complex than that. Look into the significance of the “Long Term Capital Management” in the name of the fund.

        The Black-Scholes(-Merton) model is still a commonly used tool to estimate the value of options. Nobody thinks it’s perfect, and since everybody uses it, nobody with any sense thinks they can identify mis-priced derivatives with its use.

        LTCM made big money for a few years, then got hit hard by the Asian and Russian financial crises 97-98. They were way over leveraged and the result was a flight to liquidity. Boom to bust. They got bailed out and survived for a year or two. The bailors made a little money.
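
For readers unfamiliar with it, the Black-Scholes formula for a European call is short enough to state in full. This is the textbook closed form, a pricing sketch rather than a trading tool — which is rather the point of the LTCM story:

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, years, rate, vol):
    # Black-Scholes price of a European call option
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol**2) * years) / (vol * math.sqrt(years))
    d2 = d1 - vol * math.sqrt(years)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * years) * norm_cdf(d2)

# A standard textbook check: at-the-money 1-year call, 20% vol, 5% rate
print(round(bs_call(100, 100, 1.0, 0.05, 0.20), 2))  # about 10.45
```

The model’s inputs — a constant, known volatility above all — are exactly the tame, measurable kind of risk that Knightian uncertainty is contrasted with, which is why the formula can be commonly used and still offer no protection against 1998.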

      • “Declining productivity, a high fixed exchange rate between the ruble and foreign currencies to avoid public turmoil, and a chronic fiscal deficit were the reasons that led to the crisis.” Wikipedia

        More the mismanagement gremlin – regardless of what derivative valuing algorithms were doing. What I was talking about was macroeconomic principles for market management and not the microeconomic practices of George Soros. George may do what he likes – as long as markets remain fair and open. If you need to close some loopholes – by all means. That’s what democracies are for.

        Macroeconomic principles include free trade and encouraging innovation to obtain the productivity bonus – and of course a floating exchange rate. You can be pro-market as much as you like – without understanding the mechanics of sound management it is all just gaming economics itself. Something that will never work.

  14. Pingback: Modeling Redux | POLITICS & PROSPERITY

  15. I cannot take seriously claims of looming existential climate catastrophe by those whose only answer is deindustrialization. An unwillingness to consider nuclear power means to me that they don’t really believe their propaganda.

    • Steven Mosher

      Hansen and Muller and others advocate nuclear.
      Now what’s your problem?

      • They are the exception Mosher.

        Or has the Sierra Club started a sister program to their anti-coal campaign, urging construction of new nuke plants?

    • Especially since deindustrialization would likely cause an existential catastrophe.

    • Nuclear can play several roles. Most people overlook the potential benefits of global thermonuclear war acting as a limiter of long term climate emissions while providing a lot of negative forcings from upper atmospheric particulates. Of course if you start a global thermonuclear war the usual nabobs will of course start whining about the environmental damage.

      But if you’re going to live in a pastoral, post-industrial society with greatly reduced populations, you could at least aspire to live in a really edgy, post-apocalyptic, scavenging, post-industrial society. One where the wind-farm militias raid the wussy solar farmers, but both are afraid to tangle with the remote coal towns where people went underground.

      And while sifting through the rubble of the collapsed civilization in the middle of a radioactive California desert, a dad will comment to his son that all the destruction was necessary to make sure the average annual temperature didn’t go up 2 degrees.

      • Peter Lang

        +1 !!

      • George

        Your apocalyptic post reminded me of Shelley’s epic:

        “I met a traveller from an antique land,
        Who said—“Two vast and trunkless legs of stone
        Stand in the desert. . . . Near them, on the sand,
        Half sunk a shattered visage lies, whose frown,
        And wrinkled lip, and sneer of cold command,
        Tell that its sculptor well those passions read
        Which yet survive, stamped on these lifeless things,
        The hand that mocked them, and the heart that fed;
        And on the pedestal, these words appear:
        My name is Ozymandias, King of Kings;
        Look on my Works, ye Mighty, and despair!
        Nothing beside remains. Round the decay
        Of that colossal Wreck, boundless and bare
        The lone and level sands stretch far away.”


      • +100

        Can’t wait to see the movie.

  16. Vincent –

    I’m curious as to what an economist with experience in dealing with economic monsters of uncertainty has to say about people who reach certain conclusions that there is a huge “cost” from mitigating ACO2 emissions, even though they don’t know how to calculate the costs and benefits of internalizing the externalities (positive and negative) of obtaining energy from fossil fuels relative to the costs and benefits of internalizing the externalities (positive and negative) of obtaining energy from other sources?

    • Don Monfort

      The short answer is: Trump Rules! Vincent may want to go into more detail.

      • Hi Don –

        Nice to see you back.

        Please explain.

      • Don Monfort

        About this time last year, I was telling you characters that when The Donald became The Most Powerful Man in the World, the endless bickering about monsters and the internalities of externalities would be rendered moot. Trump Rules!

      • Don

        Have you heard anything about the alleged refusal of Dr Mann to pass his material over to the judge with regards to the Tim Ball legal case? Fake news? Real news?

      • Don Monfort

        That is the story going around, Tony. Haven’t seen anything contradictory. CNN must be too busy to lie about this one. Do Canadian judges care about contempt of court? I wouldn’t bet on it. They will probably just keep granting continuances until Dr. Ball passes on.

      • Don

        Yes, I’m not at all certain that it’s the killer blow that some believe. I can’t believe Dr Mann would deliberately indulge in contempt of court, as that would be headline news, so there is presumably more to this than meets the eye


      • Thanks for the explanation, Don.

        BTW, speaking of your explanation try Googling “Jonathan Haidt and Donald Trump supporters” and check out the Vox article that comes up near the top.

        Some interesting reading there.


      • Tony,
        One source only, principia-scientific.org, seems to be doing the rounds.

    • Maybe they think the same about people who reach certain conclusions, that there’s no cost to mitigating CO2 emissions, even though they don’t know how to estimate the costs and benefits of …..

      • fernando –

        That would be my guess. But as we see in these threads often, there are many who see no problem with reaching confident conclusions about the “cost” of mitigating ACO2 emissions despite not being able to present a coherent and comprehensive analysis related to the externalities of different energy resourcing pathways – even as they speak about the importance of not ignoring uncertainties.

        So I am curious as to how someone with a lot of experience in exploring such economic uncertainties views that situation.

  17. Much of economics is simpler than anyone would allow you to believe.
    Interest collected on debt is exponential.
    Nothing beats the exponential.
    Eventually a few people own everything and turn everyone else into a slave.
    Civilisation collapses.
    It happens over and over. And the universities and their economics programs are bought off to teach ‘junk economics’ rather than point out the obvious.
    This is the thesis of Michael Hudson. I see no reason to doubt his conclusion.
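
The compounding claim itself is simple arithmetic; a toy comparison with made-up numbers shows why an exponential term eventually dominates any linear one:

```python
# Made-up numbers: debt compounding at 5% versus income growing by a flat raise
debt, income = 100.0, 100.0
for year in range(50):
    debt *= 1.05      # interest on interest: exponential
    income += 5.0     # the same nominal amount each year: linear

print(round(debt, 1))    # about 1146.7
print(round(income, 1))  # 350.0
```

Whether real economies actually behave like this fixed-rate toy is, of course, exactly what the disputed thesis is about.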

    • The real exponent is productivity growth. Hudson’s thesis doesn’t really wash in any historical or financial context — theories of how the 1% are secretly screwing everyone else will always be popular, but none of them can offer a sensible model of growth to explain why living standards rise, because the reality is that wealth is created by a small portion of society that captures only a small portion of the gain.

      • Usury works while the rate of production keeps up.
        But it never can indefinitely, and then usury destroys the economy.
        I suggest you read E Michael Jones “Barren Metal”.
        It’s the history of every country since the Romans.
        The only ‘wealth’ is labor, i.e. the ability for a person to create a family on a living wage and propagate humanity.
        Everything else is illusory.
        As you may note, our society no longer has any wealth (not in America, at least).
        But lots of rich people.

      • If the only wealth was labor, we wouldn’t be fabulously better off in every way on a per-hour-worked basis than every society that has ever existed before. Try picking up Hayek’s Law, Legislation and Liberty or Friedman’s Capitalism and Freedom for something less trendy and fun but a lot more sensible.

      • Well, rather than read more libertarians, I think I’ll just take a look out the window at the wonderful world they have brought us and watch the little lady knit her registry.

      • Well, if you prefer the hardcore socialists, there’s still a couple countries left for you to visit and compare.

    • We have the Marxist theory of surplus value and Ayn Rand’s Atlas Shrugged. Wealth is created by people who do productive work – whether it is monetarily rewarded or not.

  18. The nominal interest rate is a deeply and surprisingly uncertain thing, because it is partly a function of both long- and short-term expectations about how the central bank will behave with respect to inflation, which is itself a somewhat nebulous and often capricious prospect.

    There’s also enormous confusion about interest rates stemming from the fact that a successful long-term attack on inflation using higher interest rates, like the Volcker-Greenspan regime, eventually results in… lower interest rates, as investors begin to demand lower inflation premia because they trust the central bank to hold down inflation. Fisher and Friedman both wrote about this and the inverse phenomenon, in which a policy of lower interest rates eventually results in higher interest rates. Interest rates are thus often not a reliable indicator of the stance of monetary policy relative to other periods.

    So, the real change in the Volcker regime wasn’t so much the higher interest rates — the driver of that policy was the fact that the Fed started targeting inflation in preference to targeting employment. At the time, no one knew if that was a good idea, or how long that policy would continue, so investors continued to demand large inflation premia. After thirty years, investor belief in the Fed policy of tamping down inflation was so strong that the Fed could not only lower interest rates to near zero, but also exchange trillions of dollars for assets — and still miss the targeted inflation number on the low side.

    The Fed itself is still waking up to the new “market monetarist” reality, as they now debate how large the balance sheet should be — that giant reservoir of investor belief might now make it not just possible, but perhaps even necessary, for the Fed to maintain a balance sheet on par with the national debt (effectively making that debt nearly cost-free), an idea that would have sounded too insane to even refer to as a “pipe dream” in Volcker’s day.
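    The Fisher point above reduces to simple arithmetic. A sketch with made-up numbers (this is the textbook Fisher relation, not a model of Fed behaviour):

```python
# Fisher relation: (1 + nominal) = (1 + real) * (1 + expected inflation).
# The familiar "nominal = real + inflation" is its first-order approximation.
def nominal_rate(real_rate, expected_inflation):
    return (1 + real_rate) * (1 + expected_inflation) - 1

# A central bank that credibly suppresses inflation lowers the inflation
# premium investors demand, and with it the nominal rate they will accept.
before = nominal_rate(0.02, 0.10)  # 10% inflation premium
after = nominal_rate(0.02, 0.02)   # credibility restored: 2% premium
print(before > after)  # → True: a hard-money policy ends in lower rates
```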

    • Actually, the government is the one keeping inflation down.
      By fudging their indicators, swapping items on the inflation index with cheaper variants and calling them the same.
      Also, by massively manipulating the Gold futures market and keeping the dollar artificially propped up.

      There is no economy.

      • As a Congressional creation with Congressional mandates, the Fed is more or less the government, pretensions to independence aside. It’s certainly true there are questions about the inflation heuristics — hedonics is at best educated guesswork, and fundamentally there’s just no real way to boil down a multidimensional, multivariate, rapidly fluctuating concept like “what people think stuff is worth in dollars” into a single number that applies to everyone equally, let alone describe its quarterly rate of change with (snort) decimal precision.

        But there is, in fact, an economy. I double-checked earlier this morning, just to be sure.

  19. Risk Analysis When Probabilities are Not Enough

    Risk assessment under deep uncertainty: A methodological comparison


    “Probabilistic Risk Assessment (PRA) has proven to be an invaluable tool for evaluating risks in complex engineered systems. However, there is increasing concern that PRA may not be adequate in situations with little underlying knowledge to support probabilistic representation of uncertainties. As analysts and policy makers turn their attention to deeply uncertain hazards such as climate change, a number of alternatives to traditional PRA have been proposed. This paper systematically compares three diverse approaches for risk analysis under deep uncertainty (qualitative uncertainty factors, probability bounds, and robust decision making) in terms of their representation of uncertain quantities, analytical output, and implications for risk management. A simple example problem is used to highlight differences in the way that each method relates to the traditional risk assessment process and fundamental issues associated with risk assessment and description. We find that the implications for decision making are not necessarily consistent between approaches, and that differences in the representation of uncertain quantities and analytical output suggest contexts in which each method may be most appropriate. Finally, each methodology demonstrates how risk assessment can inform decision making in deeply uncertain contexts, informing more effective responses to risk problems characterized by deep uncertainty.”
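    For readers unfamiliar with the methods the abstract names, the “probability bounds” idea can be sketched in a few lines: carry intervals, rather than point probabilities, through the calculation. The numbers are invented for illustration:

```python
# Probability bounds analysis, minimally: when a probability is known only
# to lie within an interval, propagate the whole interval instead of
# pretending to a point estimate.
def and_bounds(p, q):
    """Bounds on P(A and B) for independent events with interval inputs."""
    return (p[0] * q[0], p[1] * q[1])

p_failure = (0.01, 0.05)  # deeply uncertain component failure probability
p_stress = (0.10, 0.30)   # uncertain probability the system is stressed
lo, hi = and_bounds(p_failure, p_stress)
print(lo, hi)  # the analysis reports a range of risk, not a single number
```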

    This is all very well. It would be a substantial refinement on what policy analysts and Treasuries have been doing for centuries – i.e. routine cost benefit analysis – if it were feasible. However, we don’t have even the basic data needed to do the standard approach properly. So there is even less of the data that is needed to do a PRA or other sophisticated approaches. And, as I’ve said many times previously, I would be concerned that “robust decision making” is not robust at all. It would be taken over and run by ideologues with an interest in the outcome – groups like the IPCC. It cannot be properly reviewed and critiqued by most competent organisations that specialise in economic analyses of policies.

    Therefore, the first step should be to gather the required data needed for the basic cost benefit analysis that has been done for centuries (improving all the time) and is the standard way to justify expenditures.

    Clearly, cost benefit analysis can be done for climate policies (and for the no-policy case) because the IAMs do it (DICE, RICE, FUND, PAGE, etc.). These analyses are used to generate the Social Cost of Carbon, which is then used to justify policies. However, the issue is that the data used to derive and calibrate the damage functions is sparse to negligible. The necessary studies have not been done. We’ve spent the past three decades focusing on climate science instead of analyzing the impacts of global warming.
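    To make the sparse-data complaint concrete: the damage functions in these IAMs are typically nothing more than a low-order polynomial in temperature. A sketch of the usual quadratic form (the coefficient is an illustrative placeholder, not Nordhaus’s calibrated value):

```python
# DICE-style damage function: losses as a fraction of output,
# D(T) = a2 * T**2. Everything hangs on calibrating a2 from impact
# studies -- exactly the data described above as sparse to negligible.
def damage_fraction(temp_rise_c, a2=0.0023):
    return a2 * temp_rise_c ** 2

for t in (1.0, 2.0, 4.0):
    print(t, damage_fraction(t))  # damages as a share of GDP at each warming
```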

    • If anything there are way too many impact studies, with a track record reminiscent of Ehrlich (not to say Malthus). Not sure I would believe any amount of data can ever make these predictions better.

      • Peter Lang


        Thank you. I agree there are many impact studies; most are biased, such as those listed here: A (Not Quite) Complete List Of Things Supposedly Caused By Global Warming

        But I don’t agree with this:

        Not sure I would believe any amount of data can ever make these predictions better.

        Of course we can never get as much data as we would like for policy analyses. But we have to do the best we can with what is available. Otherwise, policy cannot be justified, or may be wrongly justified. Funding for all policies is justified on an economic basis. No policy has all the data needed. But the economic analyses have to be done. At the moment, climate policies are being justified on misleading, biased, incorrect analyses – such as the Social Cost of Carbon.

        Most of the studies that have been done so far are biased, selective and skewed towards looking for negative impacts. Some are objective and unbiased, but these are negligible in number and scope, and insufficient (see some quoted from IPCC at the end of this comment). We need many more unbiased analyses along the lines of the good ones that have been done so far.

        Richard Tol has occasionally posted a link with 1200 references, and he lists the ones he has used to derive and calibrate the impact functions in FUND, as does William Nordhaus for DICE and RICE. However, all are insufficient. In particular, I suspect the impact function for energy in FUND is grossly wrong, to the extent that the trend of the impact is wrong in sign. Look at the energy impact in Figure 3, bottom panel here: . The economic impact of global warming on energy consumption from 1900 to 2000 was positive; this part is from empirical data of the real impacts attributable to global warming. Then, when the empirical data ends and the projection begins, the impact turns abruptly negative – not only negative, but negative to such an extent that the projected negative impact on energy consumption exceeds the positive impacts of all the other impact sectors by 2080. If the impact of GW on energy were negligible or positive (as it had been for the last century), GW would be beneficial up to a 4 C increase in GMST (relative to 1990). This is one example of a study demonstrating that impact analyses of global warming can be done. But we don’t have the number and coverage of objective, unbiased studies needed.

        IPCC AR5 WG3 Chapter 3 mentions ‘Damage Function’ eighteen times. Some examples:

        • “Damage functions in existing Integrated Assessment Models (IAMs) are of low reliability (high confidence).” [3.9, 3.12]

        • “Our general conclusion is that the reliability of damage functions in current IAMs is low.” [p247]

        • “To develop better estimates of the social cost of carbon and to better evaluate mitigation options, it would be helpful to have more realistic estimates of the components of the damage function, more closely connected to WGII assessments of physical impacts.”

        • “As discussed in Section 3.9, the aggregate damage functions used in many IAMs are generated from a remarkable paucity of data and are thus of low reliability.”

      • Peter Lang


        … If the impact of GW on energy was negligible or positive (as it had been for last century), GW would be beneficial up to around 4C GMST increase (relative to 1990). …

      • … the new climate paradigm of abrupt climate change… selective quoting at its best…

        “When orbital wiggles and rising greenhouse gases warmed the earth from the last ice age, proxy records show that smooth changes were interspersed with abrupt coolings and warmings, wettings and dryings. By analogy, the expected future warming may come smoothly, but may come with jumps, short-lived or local coolings, floods or droughts, and other unexpected changes. Societies and ecosystems have an easier time dealing with slower or better-anticipated changes, so the abruptness and unpredictability of the possible changes may be disquieting.” Committee on Abrupt Climate Change, 2002, US NAS.

        I may not have been the first to say it – but I am certainly the least illustrious. The Pacific dynamic of more or less upwelling on the eastern margin is the largest coherent change in the ocean/atmosphere interface on the globe. The change between cool and warm surface waters influences heat transfer between ocean and atmosphere as well as cloud dynamics – all wiggly Navier-Stokes. It’s a chaotic system too. Solar activity modulates surface pressure at the poles, which modulates the southern and northern polar annular modes (SAM and NAM), which spins up – or not – the sub-polar gyres. After that it’s all chaotic – as cold water upwelling waxes and wanes – and wind and ocean current feedbacks kick in. The whole Pacific Ocean bounces around in several dominant periodicities. It’s amazin’.

        I have had to come to the Gold Coast. My mum had a health scare – my sister was going to murder her. I figured it must be my turn at long last to care for mum. Suzie has done a sterling job for a long time now and she needs a rest, poor dear.

        But it did give me time to think on the road – and yes there are chaotic parallels in climate and economies. Climate bounces around completely deterministically but seemingly at random. (Julia Slingo and Tim Palmer, 2011) In economics – instability lets loose the twin beasts – bears and bulls – of collective fear and greed.

        Both climate and economics are more stable with… ‘slower or better-anticipated changes’ than abrupt change. The principle applies to interest rates. Talking of economics for the ages – the Austrian school had a focus on smoothing out the business cycle. A lot of it is in the social context. Social stability, peace, laws, freedoms, economics, policy, human well-being, freedom from hunger, health and education development, environmental conservation… targeting demand by buying and selling on the cash markets to target inflation at 2-3%. Something that needs to be frequently adjusted – to inject or withdraw cash to or from cash markets – and thus influence interest rates. The objective is to slow and minimise change in the economic system. Just as we should do with climate – cause we don’t know the frick what climate will do tomorrow, let alone in a hundred years.

        I am starting a new elibrary – the NAS report is the 1st selection. The Epic of Gilgamesh is the second – but that’s a different story. One that is the first known literature. Now that’s old. It is the first epic poem of a hero’s journey. People point out that the NAS report is old. Yeah, sure – so is the theory of relativity and quantum mechanics – the other two great ideas of 20th century physics. The NAS report is a climate application of chaos theory. Chapter 6 – Findings and Recommendations – is still more than relevant as a game plan for science priorities – to get a better understanding of the implications for society and environments – and for those generations of the future.

      • But we have to do the best we can with what is available. Otherwise, policy cannot be justified or may be wrongly justified.

        That’s exactly the problem, though, very often policy should not be justified by the data — very often “we don’t know, and maybe we never will” is a much better policy answer than “here’s our best guess, go ahead and implement multi-trillion-dollar global policy solutions based on it.”

      • Robert — “Climate bounces around completely deterministically”

        Complex, unbounded n-body problems are completely deterministic too, but they generally can’t be solved in practical time. There are entire classes of deterministic problems like this in which uncertainty dominates our ability to predict, climate is only a more popular one.
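        A toy example of that point – the logistic map at r = 4, a standard chaos demonstration (not a climate model):

```python
# Deterministic but practically unpredictable: iterate x -> 4x(1-x).
# Two starting points differing by 1e-10 typically diverge to a large
# gap within a few dozen steps, even though every step is exact.
def logistic_orbit(x0, steps, r=4.0):
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = logistic_orbit(0.3, 50)
b = logistic_orbit(0.3 + 1e-10, 50)
print(abs(a - b))  # large, despite the tiny difference in starting points
```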

      • The crux of the problem is multi-trillion dollar market distortions by governments trying to transform energy. The energy system can only be effectively transformed with innovation in free markets.

        “Climate change can’t be solved on the backs of the world’s poorest people,” said Daniel Sarewitz, coauthor and director of ASU’s Consortium for Science, Policy, and Outcomes. “The key to solving for both climate and poverty is helping nations build innovative energy systems that can deliver cheap, clean, and reliable power.”

        “This pragmatic strategy centers on efforts to accelerate energy innovation, build resilience to extreme weather, and pursue no regrets pollution reduction measures — three efforts that each have their own diverse justifications independent of their benefits for climate mitigation and adaptation.”

        Both technology – in energy generation, efficiency, etc., across multiple sectors – and continual improvement of practices in agriculture and conservation, cement and brick manufacture, etc.

      • Peter Lang


        That’s exactly the problem, though, very often policy should not be justified by the data — very often “we don’t know, and maybe we never will” is a much better policy answer than “here’s our best guess, go ahead and implement multi-trillion-dollar global policy solutions based on it.”

        Exactly! That is exactly why it is so important we put much more research effort into studies to get the impact data to derive and calibrate impact functions and damage functions to correct the misleading incorrect ones that are being used to justify policy.

        This is the point I’ve been trying to make for years. It’s frustrating that many others (not you) do not understand the difference between climate change and the impacts of climate change.

      • Peter — I agree, some meta-analysis of past climate impact predictions so policymakers could at least attempt to gauge the likely reliability of current predictions would be useful, even if the answer was “very few impact studies are useful for policy planning.” Of course Armstrong and others in the scientific forecasting field have established a lot of general guidelines for good practice, but some specific looks at climate impact predictions would be interesting.

    • specific looks at climate impact predictions would be interesting

      Specific looks at climate impact predictions “might be interesting” but “would and could not be helpful”. Flawed theory promotes flawed models and the output is just wrong.

  20. Steve Mosher writes–“The key thing is understanding what METRIC you are trying to predict.”

    Imo- The “key thing” (in regards to climate science) is first to determine what the metrics are that will show whether the climate is improving or worsening at various locations around planet. These metrics have not been established.

    • What metric is used to determine if the climate is improving or worsening? I’d suggest it is the economic impact of the change in GMST. This is exactly what the IAMs are intended to estimate. Unfortunately, as the IPCC points out, the studies to determine the inputs needed are sparse. The IPCC recommends, as does the NAS, that a major research effort is needed to get objective, reliable, unbiased data to enable us to estimate (with reasonable uncertainty) whether the climate would improve or worsen with global warming.

      • Peter
        In most locations moderate changes in temperature are not nearly as important as changes in annual rainfall amounts.

        I have a home in Tucson AZ I’ll use as an example. A change in temperature of 2 degrees in the summer would not mean much. It would still be hot and miserable in the summer. An increase of an additional 2 inches of rainfall a year would change the climate.

        Different regions have different key metrics to measure to summarize the trend of the climate change in their region. As an example, south FL might well put greater weight on severe storms than would Tucson.

      • Climate has cycled in the same bounds, in both hemispheres, for the past ten thousand years. What has happened will repeat. Look at history to understand the future. That does not require major research, just a little history lesson. The IPCC and the NAS need major funding to continue their major scams. They scare us to promote their junk.

  21. Vincent, one of the more interesting posts I’ve read here. Thanks.

  22. Nothing is ”uncertain” in climate – climates on this planet are regulated by altitude, latitude and by H2O / WATER. ”Climate science” doesn’t exist – it would be good if it did, to improve the climate where there is bad climate, as in deserts and semi-deserts. Unfortunately the term ”climate” is substituted for the PHONY global warming, to deceive the public. For the shonks, deserts have good climate, because there is no CO2 & WV; and because deserts give hotter readings of temperature during the day = good for fear-mongering – deserts have COLDER nights, but that is conveniently overlooked:

  23. For the IPCC, along with some others, to confound “climate change” with poverty is wrong-headed. More than that, the idea that poverty everywhere around the world can be eliminated is pure hubris.

  24. Pingback: Weekly Climate and Energy News Roundup #276 | Watts Up With That?

  25. Do we want 700 ppm with its 4 C of warming by 100 years from now?
    The Trump Doctrine on Energy

  26. Do we want 700 ppm with its 4 C of warming by 100 years from now?

  27. Do we want 700 ppm with its 4 C of warming by 100 years from now?

  28. Do we want 700 ppm with its 4 C of warming by 100 years from now?

  29. The science showing anthropogenic climate change is about as certain as the science showing that smoking kills people. That doesn’t mean 100% certainty, since science doesn’t deal in that. Just enough certainty that one has little-to-no reasonable grounds for doubting the science.

    That’s not just my assessment; it’s also the assessment of John Ioannidis. See:

    17:17 to 18:22 of “RS 174 – John Ioannidis on “What happened to Evidence-based medicine?””

    In a previous post on this blog, Ioannidis’ work in medicine was abused in an attempt to cast unjustified doubt on the certainty and usefulness of climate science:

    I hope this blog isn’t now going to abuse work on economics in much the same way.

    • The ‘science’ about the impacts of climate change / global warming is virtually non-existent. There seems to be no valid evidence to support the alarmists’ dogma that global warming would be harmful (on balance, across all impacts).

  30. Do we want 700 ppm with its 4 C of warming by 100 years from now?

  31. Pingback: Why Economists Study “Climate Science”, Uncertainty, Interest Rates, and A Crash | Musings from the Chiefio


  33. Muhammad Anees

    Does that mean that non-economists really don’t know much about uncertainty? I think the nature of economic uncertainty and the uncertainty in the outcomes of decisions in other areas of thought can be linked to build a good understanding of uncertainty and hence of its predictability.

  34. Do we want 700 ppm with its 4 C of warming by 100 years from now?

  35. Got any valid reason as to why not?

    No innuendo, or statements of your beliefs please, just valid, objective evidence to show that 4 C increase in GMST would be more harmful than beneficial.

    Food for thought:

    GMST averaged 7 C warmer than now since complex life began, 650 Ma ago.

    Life thrived during warm times, struggled during cold times.

    The planet is currently in the severest coldhouse phase it has been in for the past 650 Ma. There has been only one previous one in that period that was almost as cold as the current one. That was 300 Ma ago and lasted 70 Ma.

    3 C of GMST increase would be less than halfway to the average temperature for this period, and arguably halfway to the optimum temperature for life on Earth.

    There’s a big-picture start for you. I’m ready to go the next step, if you can understand and accept this.
    Also see comments starting here:
