Global portrait of greenhouse gases

by Judith Curry

NCAR/UCAR has issued a press release: “First global portrait of greenhouse gases emerges from pole-to-pole flights.”

BOULDER–A three-year series of research flights from the Arctic to the
Antarctic has successfully produced an unprecedented portrait of
greenhouse gases and particles in the atmosphere, scientists announced
today. The far-reaching field project, known as HIPPO, is enabling
researchers to generate the first detailed mapping of the global
distribution of gases and particles that affect Earth’s climate.

The series of flights, which comes to an end this week, marks an important
milestone as scientists work toward targeting both the sources of
greenhouse gases and the natural processes that draw the gases back out
of the atmosphere.

“Tracking carbon dioxide and other gases with only surface measurements
has been like snorkeling with a really foggy mask,” says Britton
Stephens, a scientist with the National Center for Atmospheric Research
(NCAR) and one of the project’s principal investigators. “Finally, HIPPO
is giving us a clear view of what’s really out there.”

“With HIPPO, we now have views of whole slices of the atmosphere,” says
Steven Wofsy, HIPPO principal investigator and professor of atmospheric
and environmental science at Harvard University’s School of Engineering
and Applied Sciences. “We’ve been quite surprised by the abundance of
certain atmospheric components and the locations where they are most
common.”

The three-year campaign has relied on the powerful capabilities of a
specially equipped Gulfstream V aircraft, owned by the National Science
Foundation (NSF) and operated by NCAR. The research jet, known as the
High-performance Instrumented Airborne Platform for Environmental
Research (HIAPER), has a range of about 7,000 miles (11,000 kilometers).
It is outfitted with a suite of specially designed instruments to sample
a broad range of atmospheric constituents.

The flights have helped scientists compile extraordinary detail about
the atmosphere. The research team has studied air samples at different
latitudes during various seasons from altitudes of 500 feet (150 meters)
above Earth’s surface up to as high as 45,000 feet (13,750 meters), into
the lower stratosphere.

HIPPO, which stands for HIAPER Pole-to-Pole Observations, brings
together scientists from organizations across the nation, including
NCAR, Harvard University, the National Oceanic and Atmospheric
Administration (NOAA), Scripps Institution of Oceanography, the
University of Miami, and Princeton University. NSF, which is NCAR’s
sponsor, and NOAA are funding the project.

—–Surprises on the way to a global picture—–

The first of the five HIPPO missions began in January 2009. Two
subsequent missions were launched in 2010, and two in 2011. The final
mission comes to an end on September 9, as the aircraft returns from the
Arctic to Anchorage and then to its home base at NCAR’s Research
Aviation Facility near Boulder.

Each of the missions took the research team from Colorado to Alaska and
the Arctic Circle, then south over the Pacific to New Zealand and near
Antarctica. The flights took place at different times of year, resulting
in a range of seasonal snapshots of concentrations of greenhouse gases.
The research was designed to help answer such questions as why
atmospheric levels of methane, a potent greenhouse gas, have tripled
since the Industrial Age and are on the rise again after leveling off in
the 1990s. Scientists also studied how logging and regrowth in northern
boreal forests and tropical rain forests are affecting levels of carbon
dioxide (CO2) in the atmosphere. Such research will provide a baseline
against which to evaluate the success of efforts to curb CO2 emissions
and to enhance natural CO2 uptake and storage.

The team measured a total of over 80 gases and particles in the atmosphere.

One of HIPPO’s most significant accomplishments has been quantifying the
seasonal amounts of CO2 taken up and released by land plants and the
oceans. Those measurements will help scientists produce more accurate
estimates of the annual cycle of carbon dioxide in and out of the
atmosphere and how the increasing amount of this gas is influenced by
both the natural world and society.

The team also found that black carbon particles—emitted by diesel
engines, industrial processes, and fires—are more widely distributed in
the atmosphere than previously thought. Such particles can affect
climate in various ways, such as directly absorbing solar radiation,
influencing the formation of clouds or enhancing melt rates when they
are deposited on ice or snow.

“What we didn’t anticipate were the very high levels of black carbon we
observed in plumes of air sweeping over the central Pacific toward the
U.S. West Coast,” says NOAA scientist Ryan Spackman, a member of the
HIPPO research team. “Levels were comparable with those measured in
megacities such as Houston or Los Angeles. This suggests that western
Pacific sources of black carbon are significant and that atmospheric
transport of the material is efficient.”

Researchers were also surprised to find larger-than-expected
concentrations of nitrous oxide high in the tropical atmosphere. The
finding has significant environmental implications because the gas both
traps heat and contributes to the thinning of the ozone layer. Nitrous
oxide levels have been increasing for decades in part because of the
intensive use of nitrogen fertilizer for agriculture. The abundance of
the gas high in the tropical atmosphere may be a sign that storms are
carrying it aloft from sources in Southeast Asia.

—–Balancing the carbon budget—–

The task of understanding how carbon cycles through the Earth system,
known as “balancing the carbon budget,” is gaining urgency as policy
makers discuss strategies to limit greenhouse gases. Some countries or
regions could be rewarded with carbon credits for taking steps such as
preserving forests believed to absorb carbon dioxide.

“Carbon markets and emission offset projects are moving ahead, but we
still have imperfect knowledge of where human-emitted carbon dioxide is
ending up,” NCAR’s Stephens says.

Before HIPPO, scientists primarily used ground stations to determine the
distribution of sources of atmospheric CO2 and “sinks” that reabsorb
some of the gas back into the land and oceans. But ground stations can
be separated by thousands of miles, which hinders the ability to measure
CO2 in specific locations. To estimate how the gas is distributed
vertically, scientists have had to rely on computer models, which will
now be improved with HIPPO data.

Background information

Details on the HIPPO project can be found at the project web site.

The overview publication by S.C. Wofsy can be found here.

JC comment:  This is an important project that has the potential to sort out many issues related to greenhouse gases and their spatial and temporal variations.

185 responses to “Global portrait of greenhouse gases”

  1. Willis Eschenbach

    More measurements are always a good thing. Thanks for the heads-up, Judith.

    w,

    • I agree, Willis.

      Global temperatures did not follow the AGW predictions. Maybe climate researchers can discover why by studying variations in concentrations of gases that might cause global warming.

    • More data is a good thing, but I get nervous when the people analyzing that data make statements such as:

      “Carbon markets and emission offset projects are moving ahead, but we still have imperfect knowledge of where human-emitted carbon dioxide is ending up,” NCAR’s Stephens says.

      That statement sounds like someone already convinced of what should be done and that they are trying to justify the rationale for “carbon markets” and “emissions offset projects”.

      • “That statement sounds like someone already convinced of what should be done and that they are trying to justify the rationale for “carbon markets” and “emissions offset projects”.”

        They’d be perfectly happy to tell you that they ARE convinced. That’s what the consensus business is all about. They are trying to fine-tune their knowledge, not re-invent the wheel. Whether they are ultimately correct or not is another matter.

  2. “Tracking carbon dioxide and other gases with only surface measurements
    has been like snorkeling with a really foggy mask,”

    But 225 trillion will fix it!

  3. Wofsy paper: “dense pollution high over the Arctic in late autumn/early winter, with a notable component of BC”

    Can BC from coal in China be differentiated from Diesel soot from all those diesel cars in Europe?

  4. As it turns out trees are a greater carbon sink than AGW models ever allowed for. It is now understood that trees can get all the nitrogen they need from sedimentary rock structures such as are prevalent throughout the Pacific NW.

    _____________
    Bedrock nitrogen contributes to nitrogen fertility and carbon storage across temperate forest ecosystems — Scott L. Morford, Benjamin Z. Houlton and Randy A. Dahlgren — Land, Air and Water Resources, University of California, Davis, Davis, CA

  5. The Japanese have had a satellite up for a few years now measuring CO2 and CH4 (although the orbit misses the poles, I notice)… with a few public releases of data, I found the variability of CO2 and CH4 surprising from that. http://www.gosat.nies.go.jp/eng/result/result.htm

    Although, I suppose the resolution/altitude variations would be better with aircraft.

    • Whoa. 30ppm differences between some locations?

      10ppm higher in NH.

      360ppm in Alaska, and 390ppm in China and a few other hot spots.

      • The 10 ppm higher in the NH arises from the fact that there is a lot more industry in the NH. There is something like a five-year lag between NH and SH increases. The seasonal cycle is also much stronger in the NH than in the SH because the SH has a much higher proportion of ocean and the land is generally tropical/warm temperate. This is all pretty well known from various measurement sites including MLO. Check the CDIAC.

      • I don’t know of anyone who has suggested a 30ppm spread.

        Also, the paper suggests the atmospheric transport model shows a 2-4ppm difference, not 10ppm.

      • is that what was meant by a “well mixed greenhouse gas”?

        (maybe we need to re-look at all those pre-Mauna Loa readings cited by Ernst Beck?)

      • Yes.

      • Measuring CO2 in the middle of an agricultural field tells you a lot about the plants and very little about the CO2 mixing ratio in the entire atmosphere. The center of Paris as a place to measure CO2 is also fun.

      • GOSAT shows 30ppm differences.

        CO2 measured in Antarctica shows daily 2.3ppm fluctuations. No agricultural fields nearby.

  6. So the science is settled now? Isn’t it wonderful how, in the absence of such data, the IPCC and the AGW ‘believers’ have been able to arrive at such certainty about human contributions to climate change!
    Seriously, it’s good that HIPPO has gathered such data but I am amazed at the arrogance of the IPCC.

    • No-one has claimed certainty. “Very likely” is not certainty.

      • That anthropogenic CO2 emissions have increased the atmospheric CO2 concentration is very likely. That global temperatures have increased slightly over the full aggregate of that very much accelerated anthropogenic CO2 emissions is somewhat likely. That such temperature increases have on balance caused harm or good is probably unknown.

        That future trends in atmospheric CO2 and temperature can be used to reasonably extrapolate 50+ years into the future is dubious.

        Frankly the rise in CO2 emissions due to the stimulated industrialization of the developing world is so great that most of what is measured now is a direct result of today’s emission.

        This achievement of directly measuring emissions which are immediately apparent in a very likely manner comes at a price.

        Amongst other things, it has gutted the credibility of science. It has shown the peer review process to be research-perverting, junk-producing, time-wasting, politically subjective gamesmanship. It demonstrated beyond doubt that scientists easily get caught in inescapable contests of silly buggers where they fantasize that their objective virility is worth defending at all cost.

        Despite claiming objectivity, scientists are every bit as subjective as all others. The so called ‘objective’ seeking professionals cannot see their own subjective existence. They are scientists. They are immune from such vulgarities.

        The achievement has subjected Western society to a relentless, deceitful, decades-long propaganda war which would do an Eastern bloc cold war commie country proud. People now have some inkling of what it must have been like to live under a daily onslaught of jingoism.

        The achievement has created the biggest inflationary bubble and gold rush the world has ever known. It has taken previously worthless CO2 emission and hyped it into the most valuable asset known to man.

        Selling emissions, hiding emissions, off shoring emissions, using alternative emissions to reduce emissions is one endless ponzi scheme and shell game to monetize and encourage emission.

        The newly created CO2 bubble has ensured that far more carbon shall be emitted than would have been the case when people didn’t care. That’s quite an achievement!

        Yeah, that CO2 is increasing in the atmosphere in direct relationship to its substantial and rapid increase of emission is very likely.

        Science is amazing. It can detect the immediate and direct action-and-response of copious CO2 emission.

        Bully for you. Care to comment about the reliability of estimating there beyond?

      • Raving.

        Well raved !
        It would be interesting to know just how much financial misery worldwide could have been avoided if the IPCC had never existed.
        Apart from the costs surrounding their own existence and the funding thrown around by governments as a result of their pronouncements, we must factor in all the pointless green legislation and pen-pushing green jobs that stop products and energy being produced at the ‘proper’ price. Slice away the Greenie supplements, subsidies, feed-in tariffs, land sale for wind farm kick-backs and land grabs for bio-fuels.
        Help the third world generate power using the latest most cost-effective means and give them clean water. Eradicate malaria, because we can if we want to and let’s get back to enjoying what should be the wonderfully increasingly abundant interglacial thanks to CO2 !!

        We really have never had it so good.

        We are like a hypochondriac who’s just been given a medical book;
        within an hour’s reading we feel we have symptoms of everything known to man.

        Time to lighten up – never mind ‘pre-industrial levels’, let’s get back to pre-IPCC levels and reverse the lunacy!!!

      • You make a good list of some of the damage that the CO2 obsession has inflicted.
        The opportunity costs of the AGW social mania are huge.

      • I noticed of all the soft commodities the other day, all were down except corn. I wonder why?

      • “Very likely” is not certainty.

        No, it’s cockiness.

      • No-one has claimed certainty.

        But IPCC has understated uncertainty.

      • As far as the Australian government is concerned, the IPCC message carries the implication of certainty such that we are about to have a carbon tax inflicted on us. The government is so certain that it plans to disallow any debate on the legislation before it is passed!
        Most rational people know that the whole AGW thing is merely a pretext for governments to effect wealth transfer by screwing more taxes out of developed nations. THAT is the CERTAINTY in practical terms.

      • Has socialism progressed to the point in Australia that you can no longer vote them out? I’m not very familiar with how the political system there works (or doesn’t work as the case may be.)

  7. This is a typical NCAR PR. A few measurements and a lot of pro-AGW hype. The noise to signal ratio is awesome. I am not sure the facts are even discernible.

  8. The task of understanding how carbon cycles through the Earth system,
    known as “balancing the carbon budget,” is gaining urgency as policy
    makers discuss strategies to limit greenhouse gases.

    1. Who decided that greenhouse gases are a bad thing which needs to be limited? As best as I can tell, GHG + warmer temperature is the preferred state for biological processes.

    Raising the alarm about havoc, mayhem and suffering might be true for humans, but mayhem and suffering are part and parcel of population growth, decline, migration and change.

    People and organisms have been suffering and changing, living, dying and going extinct from day ‘dot’. What’s so special about humans emitting CO2 into the atmosphere that’s any different from humans fostering growing populations, curing diseases, waging war, polluting or doing anything else that humans do? Things change. Life suffers and flourishes.

    The assumption of misanthropy is taken as given uncritically and without debate.

    2. “First global portrait of greenhouse gases emerges from pole-to-pole flights.”

    That’s nice but it also hints at another hidden assumption. Global suggests time stationary in the same sort of way that anthropogenic carbon dioxide production is 150 times greater than that produced through volcanism. The global portrait isn’t much of anything until it builds into a portrait across time.

    3. Preoccupation with rate of carbon change is unfair and misleading.

    Partly, it ignores that populations, technology, society and motives flux with time. The world will be a different place in a year hence. The reality of the situation will appear to be different too.

    4. Preoccupation with the rate of carbon change is grossly unfair and ignorantly misleading because it is quite blind to the predominant player.

    “Poor cute polar bear cubs! Who will speak for them? … Life is so fragile and easily stressed ”

    Yeah sure. Life thrives on CO2 heat humidity and trace elements. It causes an explosion of biomass production and a flourish of diversity awaiting the opportunity to break out from a stagnating genome.

    Volcanism has been pumping out CO2 for millions of years. All of the anthropogenic carbon emission to date multiplied by 7X would equal the complete output of CO2 via volcanism for the past 70,000 years since the Toba eruption.

    7X the current cumulative aggregate of anthropogenic carbon emission is the upper limit of the carbon resources that humanity can exploit. It doesn’t seem very impressive juxtaposed against the relentless CO2 emissions from volcanism over millions of years.
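    A quick consistency check of these round numbers, in Python. The ~100–150X annual anthropogenic-to-volcanic ratio and the ~100-year industrial span are the figures used in this thread, not measurements; the sketch only shows that they hang together.

    ```python
    # Back-of-envelope check of the round numbers in this comment (not data):
    # if annual human CO2 emissions run ~100-150x annual volcanic emissions, and the
    # industrial era spans roughly 100 years, cumulative human output equals about
    # 10,000-15,000 years of volcanic output, and 7x that is ~70,000-105,000
    # volcano-years, the order of the "since Toba" comparison above.
    for ratio in (100, 150):             # assumed anthropogenic : volcanic annual ratio
        volcano_years = ratio * 100      # ~100 years of industrial-scale emissions
        print(ratio, volcano_years, 7 * volcano_years)
    ```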

    Most of all it strongly suggests that the biosphere is like a sleeping dragon who is eager to wake up and consume as much CO2 and other trace elements as opportunity provides.

    God bless the poor dumb ignorant greedy selfish industrialist pouring out all that noxious crap to poison burn and choke defenseless shrubbery and cute furry animals. He doesn’t realize that he is sounding the wake up bell. “Breakfast is on. Wakey wakey! Come and get it …”

    There is a sleeping dragon at this AGW party. It’s called life. It has been dismissed as fragile helpless and at dire risk. It’s not seen as the earth’s most potent response to increased CO2 and rising temperatures.

    Better keep an eye on those fluctuating year-over-year ‘global CO2’ measurements. The sleepy dragon of life is very apt to start revving up and overtake all expectations.

    Nobody saw it coming because they were all mesmerized by misanthropy.

    • “There is a sleeping dragon at this AGW party. It’s called life.”

      Life is a major climate forcing, completely unrecognized in mainstream climate science. It is the major player that has regulated the earth’s climate for the past couple of billion years.

      This is obvious when one considers evolution. Life exists over time, because once it gets a foothold, it regulates its environment to promote its survival. Otherwise, how has life survived a couple of billion years of climate changes that make the 0.7C warming in the past century look like a walk in the park?

      Unbelievable as it may seem, the local life forms outside my house can actually survive a TEN TIMES greater climate change than has happened in the past century, even when it happens in a single day! Who would have thought that life could be so hardy and adaptable.

      • CO2 levels, since the primordial high-density atmosphere was established, have been under the control of life forms, though they have not always been judicious. Indeed, the flora have eaten CO2 down to near-starvation levels, and it’s about time for us fauna to pick up our end of the stick and get as much of it as possible back in circulation.

      • Sky Demon! Or Hockey Stick? Boomerang?

      • It would be better to have the CO2 on these graphs logarithmic. Here’s an example, although it also includes the longterm solar forcing too:
        http://www.skepticalscience.com/images/Phanerozoic_Forcing.gif

        It’s worth bearing in mind (in fact it’s crucial) that those past CO2 changes happened over millions of years, rather than the mere hundreds of years over which we are doing likewise.

      • @orange

        Those graphs suggest that the Earth’s climate is controlled by a negative feedback process.

        Volcanoes emit CO2. The biosphere mostly consumes and fast recycles it and partly loses carbon downwards.

        For a negative feedback mechanism to work, the biosphere must consume and lose downwards at increasing rates at higher temperature and CO2 levels.

        It is conceivable that there is a runaway catastrophe at the hot end with excessive CO2 but that doesn’t seem to have happened. Increasing CO2 + temperature results in increasingly aggressive biological activity at scavenging.

        The other catastrophe of falling off the bottom is recurrent and produces ice ages.

        Volcanism can be relied upon to build up atmospheric CO2 levels and thaw things out.

        If biological activity is high during regimes of high CO2, a heck of a lot of carbon must be recycling quickly back into the atmosphere.

        … (big) OR?

        Why such high atmospheric CO2 levels for so long?

      • on long timescales I think the main sink is weathering rather than biological activity.

      • lolwot,

        on long timescales I think the main sink is weathering rather than biological activity.

        Don’t underestimate coccoliths: they have built up chalk layers hundreds of meters thick during the Cretaceous, and are still doing so today. That is a lot of carbon more or less permanently removed from the atmosphere…

      • 324.8ppm at an earlier time in the Holocene.

        “The GISP2 ice core has 120 data points between 10,200 – 16,490 YBP. The Vostok ice core has only 6 data points for the same range. Yet it is the Vostok ice core that dominates the discussion. The GISP2 also has 324.8 ppmv which is the same level that Mauna Loa recorded for 1969. GISP2 said that was the same level that happened 10,960 YBP. There is also the evidence of the high frequency changes when the period from 11-19K years ago is looked at.”

        http://theinconvenientskeptic.com/2011/08/why-the-co2-ice-core-reconstructions-matter/

    • anthropogenic carbon dioxide production is 150 times greater than that produced through volcanism

      maybe that should be revised to known volcanism

      There are studies suggesting that the volcanic CO2 being considered (primarily from terrestrial volcanoes) is only a very small fraction of the total CO2 being emitted through unknown submarine volcanoes and fissures.

      Max

      • That number … and I’ve raised it slightly … but choose any number from 80 – 250 …

        The thing being … that by summing all the known eruptions and emissions from all known sources and generalizing outwards to cover unknown regions in as generous a manner as one chooses, you STILL get no more than 1/50th, 1/100th, or some such fraction of anthropogenic CO2 emissions. It’s not really a numbers game, it’s a magnitude-of-scale thing.

        If it were a matter of unknown submarine volcanoes and fissures [on that scale of magnitude], it would amount to hiding umpteen times more volcanic activity underwater or subsurface than all the very obvious surface and underwater volcanoes currently known to exist. Such N-fold hidden activity relative to observed activity would be noticed … at least enough of the N-fold excess would be noticeable to ‘hint’ that further exploration is worthwhile.

        See “Volcanic versus anthropogenic carbon dioxide” in http://bigthink.com/ideas/38998 for the AGU paper on this.
        ————————–
        When one walks away from the “Couldn’t volcanism produce the large CO2 emission?” question … the anthropogenic 150X volcanic origin of CO2 … isn’t so impressive. That represents 10,000 years of volcanic activity, with humans emitting 100X volcanic emissions for 100 years. By geological time scales it’s a drop in the bucket.

        It begs the question of all that cumulative volcanic emission over the millions of years up to the present. In the past, rates of CO2 emission could have been much higher, or perhaps even lower, than today.

        Aside from other pathways such as rock weathering and diffusion into seawater, it leaves biology as the main and most potent removal mechanism.

        There seems to be a real huge immense biggie FACT here which might be conveniently ignored.

        How the heck did life exist for millions of years in an atmosphere with 10X the CO2 that exists today? Life was teeming, chowing down as fast as it could put it away … but the carbon got a quick turnaround and went right back up skywards.

        And then there is the question of trace elements limiting biomass production … and the question of whether carbon is a trace element insofar as it serves as a limiting factor.

        The oceans of the world have large swaths of dead zones, presumably due to a lack of trace elements.

        Why? … What does it mean?

        Those are the questions which I am asking because pumping all that CO2 into the air will awaken the sleeping biological dragon to something familiar and desirable. Bringing old genomic designs back “on line” to take advantage of opportunity is something that is apt to happen very quickly … years, decades and centuries intervals of time.

        Admittedly, any one single evolutionary modification isn’t so likely. Yet given many different species and many ways of doing it .. That’s quite the multiplier!

        IMO, the way that the biosphere responds to the CO2 increase is by far the greatest wild card in an AGW story which is already full of unknowns.

      • It is estimated that there are 3,000,000 underwater volcanoes.

        Some of them spewing out liquid CO2.

        http://wattsupwiththat.com/2011/08/11/undersea-volcanoes-might-be-more-common-than-previously-thought/

    • Raving,

      “Volcanism has been pumping out CO2 for millions of years. All of the anthropogenic carbon emission to date multiplied by 7X would equal the complete output of CO2 via volcanism for the past 70,000 years since the Toba eruption.”

      Considering you have no idea how much CO2 is being output from sea floor vulcanism, this statement, like many others by Alarmists, is pure speculation.

      • It’s hardly speculation, is it? How much more deep sea vulcanism do you need to change that figure significantly?

        Ie drop it down from 70,000 to just 1000.

        Too much.

      • We simply have no idea whether the number of undersea volcano equivalents annually is twice that of known surface volcanism or thousands of times more. Until we do have a better idea, the IPCC’s and everyone else’s speculation about the CO2 flux and Goreball Warming is a bad joke.

      • Might not have much idea of the CO2 emitted specifically. Nevertheless, as the AGU source reference starkly points out … where there is CO2 there is also lava … on a percentage basis of CO2-emitted/lava-effused … and you can play with that CO2/lava ratio as much as you desire … there simply isn’t enough lava produced each year to match the anthropogenic emission of CO2. Nothing subtle here. Two more orders of magnitude of lava would need to be effused per year to equal human emission.

        No way to hide 100X the observed lava emission. No possible reason to expect submarine volcanism to emit CO2/lava at 100 times the ratio that CO2/lava is emitted above ground. There is NO WAY to make up that 100X difference between observed and matching human emissions. It cannot get more unsubtle.

        There is another effect that submarine volcanism can have on CO2 levels. I’m no expert and I don’t know how and if the following is considered. Nevertheless …

        Submarine volcanoes produce heat. Previously cold but now hot water rises … and perhaps to the surface to release dissolved CO2 … or perhaps the heating is sufficient to create a current …

        I don’t know .. but bringing up water from deep can have a big influence on atmospheric CO2 levels. …and underwater volcanoes produce heat.

  9. A major advantage of satellite measurements is the frequency and comprehensiveness of coverage. A major advantage of aircraft measurements of the HIPPO type is physical sampling. I would think that the latter will ultimately prove to be particularly valuable as a guide to accuracy checking and calibration of satellite measurements – perhaps even more than for its direct contribution to data acquisition. The same applies to its potential use for checking the ability of models to simulate atmospheric changes.

    • Aircraft measurements were used to check the Mauna Loa measurements fairly early on. The excellent match validated Keeling’s choice of CO2 observatory.

      As some may recall the NASA CO2 Observatory crashed and burned on launch which was a huge loss.

      • Probably sabotaged by warmenistas when they realized it would show things they didn’t want us to know.

      • Never attribute to stupidity what can be explained by circumstance. Account well for Bruce

      • Well … the fairing failed to come off, and therefore the satellite was too heavy to make orbit, and NASA had chosen the smallest possible launcher.

        “As a direct result of carrying that extra weight, we could not make orbit.”

        The $270m (£190m) mission was launched on a Taurus XL – the smallest ground-launched rocket currently in use by the US space agency.

        The mission was to have been Nasa’s first dedicated CO2 mapper

        Since its debut in 1994, this type of rocket has flown eight times, with six successes and two failures including this launch. But this is the first time Nasa has used the Taurus XL.”

        http://news.bbc.co.uk/2/hi/7907570.stm

        You have to wonder …

      • It was the same damn thing that happened on the previous launch with an XL

      • Brer Rabbit,

        “It was the same damn thing that happened on the previous launch with an XL”

        You don’t know when to stop supporting your opponents position do you.

      • Eli knows enough when to hand a clown the keys to the car

      • “You don’t know when to stop supporting your opponents position do you.”

        LOL.

  10. I have a feeling that if one were to compare this report to domestic and international air traffic routes there would emerge a signature that is latitude dependent.

  11. I have to agree with Willis and Oliver. Very little can immediately be concluded. Simply useful for future studies, no more, no less. The most useful result may be that analysis of the data can determine how best to proceed in research.
    It has seemed to me that Mark Twain’s comment has been so prevalent in Climate Science: ‘A great return of conjecture on very little investment of fact.’

  12. A greenhouse gas (sometimes abbreviated GHG) is a gas in an atmosphere that absorbs and emits radiation within the thermal infrared range. This process is the fundamental cause of the greenhouse effect. The primary greenhouse gases in the Earth’s atmosphere are water vapor, carbon dioxide, methane, nitrous oxide, and ozone. In the Solar System, the atmospheres of Venus, Mars, and Titan also contain gases that cause greenhouse effects. Greenhouse gases greatly affect the temperature of the Earth; without them, Earth’s surface would be on average about 33 °C (59 °F) colder than at present.

    Since the beginning of the Industrial Revolution, the burning of fossil fuels has contributed to the increase in carbon dioxide in the atmosphere from 280ppm to 390ppm, despite the uptake of a large portion of the emissions through various natural “sinks” involved in the carbon cycle. Carbon dioxide emissions come from combustion of carbonaceous fuels such as coal, oil, and natural gas. CO2 is a product of ideal, stoichiometric combustion of carbon, although few combustion processes are ideal, and burning coal, for example, also produces carbon monoxide. Since 2000 fossil fuel related carbon emissions have equaled or exceeded the IPCC’s “A2 scenario”, except for small dips during two global recessions.

    • What is the greenhouse effect on Mars and Titan? Please explain.

      Since the combustion formula for CH4 is CH4 + 2O2 -> 2H2O + CO2, we get twice as many molecules of H2O (a GHG) as CO2, so let’s call it correctly: burning fossil fuels increases the amount of water vapor in the atmosphere, and we need to stop that now.
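      For what it’s worth, the molecule count behind that remark can be spelled out directly in a few lines of Python; this is just the balanced equation above plus standard molar masses, nothing more.

      ```python
      # CH4 + 2 O2 -> CO2 + 2 H2O: per mole of methane burned, two moles of water
      # vapour are produced for every mole of CO2.
      molar_mass = {"CO2": 44.01, "H2O": 18.02}   # g/mol, standard values
      moles_per_ch4 = {"CO2": 1, "H2O": 2}        # from the balanced equation
      for gas, n in moles_per_ch4.items():
          print(f"{gas}: {n} mol ({n * molar_mass[gas]:.1f} g) per mole of CH4 burned")
      print("H2O : CO2 molecule ratio =", moles_per_ch4["H2O"] / moles_per_ch4["CO2"])
      ```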

    • Since 2000 fossil fuel related carbon emissions have equaled or exceeded the IPCC’s “A2 scenario”, except for small dips during two global recessions.

      Yeah.

      But the compounded annual growth rate (CAGR) of atmospheric CO2 has remained fairly constant around 0.43% per year, which is equal to IPCC’s “B2 scenario” (~580 ppmv by 2100.)

      According to IPCC, this scenario

      is a world with continuously increasing global population, at a rate lower than A2

      Scenario A2 assumes a much more rapid CAGR (approx. 3 times that of B2), reaching ~1280 ppmv by 2100, with a “continuously increasing population”.

      UN estimates of population growth over the 21st century are at a CAGR of 0.3% per year, leveling off at around 9 billion inhabitants by the late 21st century, compared to a CAGR of 1.7% per year from 1960-2000.

      A2 appears totally unrealistic, since the latest estimates by the World Energy Council indicate that all the “inferred possible fossil fuel resources in place” contain only enough total carbon to reach a level of around 1000 ppmv when they are all consumed.

      I’d say common sense tells me that B2 is probably the upper limit and anything above B2 is simply based on exaggerated model input assumptions, in order to get an alarming putative temperature increase.

      Max

      • But the compounded annual growth rate (CAGR) of atmospheric CO2 has remained fairly constant around 0.43% per year, which is equal to IPCC’s “B2 scenario” (~580 ppmv by 2100.)

        Aha, a formal model of CO2, excellent, Max. You appear to be modeling CO2 in the year y as 1.0043^(y − 617) ppmv. When y = 1958 (the first year of the Keeling curve) your model gives 315.4, spot on. When y = 2100 it gives 580.1, spot on again. Outstanding model!

        But if it works so perfectly going forwards 142 years from 1958, we would expect it to also work pretty well going backwards 142 years from 1958, namely to 1816. Setting y = 1816 gives 171.5 ppmv.

        Oops, something wrong there. If preindustrial CO2 is supposed to be 280 ppmv, why is your model saying it should be 171.5 ppmv?

        When projecting your model backward 142 years from 1958 gives such an obviously wrong answer, why should we attach any significance to what it projects 142 years forward from 1958?

        Going into debugging mode, we can ask whether the model should have been allowing the natural level of 280 ppmv to grow exponentially, or just the anthropogenic contribution. The latter would be a different model. Let’s see if taking 280 to be constant makes any difference.

        We know that when y = 1958, CO2 = 315 = 280 + 35, while when y = 2011, CO2 = 391 = 280 + 111. This makes the CAGR of the anthropogenic component 2.2%, giving us another model:

        280 + 1.022^(y − 1795)

        With y = 1958 we get CO2 = 314.7 ppmv, pretty good. And when y = 2011 we get 390 ppmv, also pretty good.

        What about the middle of the Keeling curve, where y = 1984? At Mauna Loa they measured 343.7 ppmv in January 1984. Your model gives 352.6 ppmv, high by 8.9 ppmv. This new model gives 342.5, low by 1.2, more than 7 times as accurate as the model that allowed the natural base to grow exponentially.

        Now let’s go back to 1816 again. This time instead of 171.5 ppmv we get 281.6 ppmv. Given that fossil fuel consumption at that point was largely coal burnt at home by 1/7 of today’s population (there were no steam locomotives or power plants yet, and only one steamship, Henry Bell’s), an anthropogenic contribution of 1.6 ppmv added to the presumed natural background of 280 ppmv doesn’t seem too outrageous.

        Reassured by four promising datapoints, we can now ask what this model has to say about y = 2100. It projects 763 ppmv.
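        For anyone who wants to rerun the arithmetic, here is a minimal Python sketch of the two models just described. The constants are the ones derived in this comment; nothing else is assumed.

        ```python
        def model1(y):
            """First model above: the whole CO2 level grows at 0.43%/yr."""
            return 1.0043 ** (y - 617)

        def model2(y):
            """Second model: fixed 280 ppmv background plus an excess growing at 2.2%/yr."""
            return 280 + 1.022 ** (y - 1795)

        for year in (1816, 1958, 1984, 2011, 2100):
            print(f"{year}: model1 = {model1(year):6.1f} ppmv, model2 = {model2(year):6.1f} ppmv")
        # The output lands close to the figures quoted above (e.g. ~171.5 vs ~281.6 ppmv
        # in 1816, ~580 vs ~763 ppmv in 2100), give or take small rounding in the rates.
        ```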

        Now this model would be less plausible if there were no relationship between its CAGR of 1.022 and the CAGR of fossil fuel consumption, which warmists hold responsible for the atmosphere’s growing anthropogenic CO2, the excess over 280 ppmv. A radically different CAGR for growth of fuel consumption would tend to shoot a hole in this model.

        The Carbon Dioxide Information Analysis Center, CDIAC, maintains a very nice database of global fossil-fuel consumption. For 1751 it estimates a total emission of 3 megatonnes of carbon, and 3 again for 1752. For 2007 it is 8543 MtC, for 2008 8749 MtC (8.749 GtC).

        Since some of this is expected to accumulate in the atmosphere we’re interested in the cumulative sums, which for 1751 will be 3, 6 for 1752, 338009 for 2007, and 346758 for 2008. (As a check, 346758 − 338009 = 8749.)

        This graph plots the CAGR averaged over 5 years (by taking the ratio of entries 5 apart and taking the 1/5 power of the ratio, to give a smoother graph than just taking ratios of adjacent values, though it’s essentially the same).

        In the 19th century CAGR is in the range 4-5%. In the first three decades of the 20th century it declines to 2.37% at the depression, hits 2.7% at the start of WW2, drops back to 2.36% by war’s end, then climbs until it reaches 3.6% in 1970 before slowly falling back to 2.43% at the end of the century.

        In general this is rather on the high side compared to the observed steady 2.2% CAGR of atmospheric CO2 as measured at Mauna Loa. While I’m not sure what to make of the difference, it most certainly bears no connection to the 0.43% CAGR of the first model considered here.
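        The 5-year smoothed CAGR just described reduces to a one-liner. The full CDIAC annual series is not reproduced here, so only the four cumulative values quoted above are spot-checked.

        ```python
        def smoothed_cagr(cum, year, span=5):
            """CAGR over a span-year window of a cumulative series:
            (cum[year] / cum[year - span]) ** (1 / span) - 1."""
            return (cum[year] / cum[year - span]) ** (1.0 / span) - 1

        # Spot check of the cumulative bookkeeping quoted above (values in MtC):
        cumulative = {1751: 3, 1752: 6, 2007: 338009, 2008: 346758}
        assert cumulative[2008] - cumulative[2007] == 8749   # the 2008 annual emission
        ```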

        I’d say common sense tells me that B2 [580 ppmv in 2100] is probably the upper limit and anything above B2 is simply based on exaggerated model input assumptions, in order to get an alarming putative temperature

        I’d say we should be grateful CO2 is not accumulating at the 3% CAGR that our fuel consumption would have predicted. Otherwise we’d be contemplating 1820 ppmv in 2100!

  13. “… forests believed to absorb carbon dioxide”. I thought we had established how plants and trees thrive?

    Anyway do we have any measures pre-now to say definitively if anything is changing? Of course not, so we compare to what we thought; this study will only be valuable when we compare like with like over time. Of course it’s nice to know what’s up there, and where it hangs about just now, but it adds nothing to any discussion of changes over time (just now).

    The press release says nothing really.

  14. It’s good to see some attention being given to observations of the atmospheric distribution of black carbon as well as that of gases.
    See for instance:
    http://globalwarming.house.gov/files/HRG/031610BlackCarbon/ramanathan_part1.pdf
    Ramanathan is a big name in the climate establishment, and his findings may point the way to a more useful direction for research on the elusive anthropogenic effects on climate, particularly in the arctic where black carbon emitted in the middle latitudes of the NH and deposited on the surface of snow and ice may have a measurable effect on melting rates. I don’t have a reference to hand, but I think James Hansen has also published on this topic.

    • “Life time of black carbon in the air is of
      the order of several days to few weeks.”

      It is relatively short, if true. All the new black carbon emitting vehicles/plants are actually very clean (diesel vehicles, coal and biomass fired plants…). Emission control is easy (filters, scrubbers…). A real pollutant and a real pollution reduction. And emission control can/should be enhanced.

      Unlike CO2. No pollution and no possible (efficient) reduction.

    • Maybe you are thinking of this one:

      http://pubs.giss.nasa.gov/docs/2003/2003_Sato_etal.pdf

  15. I wish these guys could explain to me how nitrous oxide “traps heat”. Is nitrous oxide immune to the normal physical processes of other gases whereby they can transfer heat energy to other gases by collision?

    • it’s like a dam “traps water”

      • Thanks orange, now I understand. The nitrous oxide forms an impermeable wall above the atmosphere and traps all the energy in. Neat. Can I build a nitrous oxide wall round my house?

      • Greenhouse gases do present a barrier to infrared radiation.

      • Really! So nitrous oxide is another gas that blocks IR? Amazing physics we have these days! Just amazing.

      • Yeah it’s a greenhouse gas…

      • Nonsense. You can say IR-resonant gases cause diffusion, but they have little thermal mass, so they can’t block, trap, or store significant amounts of energy.

      • The planet begs to differ

      • orange, 49% of incoming solar energy is IR.

        Do CO2 and NOx trap that IR energy and keep it away from the earth?

      • Amazing physics we have these days! Just amazing.

        Actually it’s older than quantum mechanics (1925), or relativity (1905). The discovery that gases with complex molecules (such as nitrous oxide) absorb infrared radiation was made by Irish physicist John Tyndall in the early 1860s, around the same time as Scottish physicist James Clerk Maxwell formulated Maxwell’s equations for EM radiation.

      • “orange, 49% of incoming solar energy is IR”

        nonsense

      • orange,

        Without checking, it may well be correct but it’s a moot point. What greenhouse gases do well is “trap” longwave infrared, whereas the infrared arriving from the Sun is almost entirely shortwave

      • Shortwave radiation is UV and Visible light. Longwave is infrared. Lucky for us most UV is stopped by ozone.

      • Andrew, what is he correct about?

        Science of Doom is interesting, but they think backradiation exists so I take anything they say with a big grain of salt.

        In figure 6.3, all of the black part of the graph is the INCOMING energy absorbed by ozone, H2O, etc. CO2 does cause incoming energy to be absorbed by the atmosphere. If there is more CO2, more incoming IR would be absorbed.

      • Orange, you are correct. It is 51%. Bruce got it backwards.

      • I think it is 51% when it gets to the surface. At top of atmosphere it is different and it appears to fluctuate a lot with UV going up and down. But I can’t find my reference for 49%.

      • If you take 700 nm as the boundary between visible red and infrared, then the
        second table here, which puts the 50% point at 711 nm, makes IR at TOA equal to 51%, which is what I had in mind. Not sure what it would be at the surface but it would presumably depend heavily on percentage of cloud cover.

        Actually I’m guessing from orange’s firmly expressed “nonsense” that more likely he was thinking that 0% of incoming solar energy is IR.
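        That ~51% figure can also be sanity-checked by treating the Sun as a 5778 K blackbody and integrating the Planck spectrum on either side of 700 nm. That is a cruder assumption than the measured solar spectrum behind the table mentioned above, but it lands in the same place; a minimal Python sketch:

        ```python
        import numpy as np

        h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann (SI)
        T = 5778.0                                 # approximate solar effective temperature

        def planck(lam):
            """Blackbody spectral radiance at wavelength lam (meters) for temperature T."""
            return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

        lam = np.logspace(np.log10(50e-9), np.log10(1e-3), 20000)   # 50 nm to 1 mm
        B = planck(lam)
        ir = lam > 700e-9                                            # the infrared side
        print(f"fraction beyond 700 nm: {np.trapz(B[ir], lam[ir]) / np.trapz(B, lam):.2f}")
        # prints ~0.51, consistent with the 51% quoted above
        ```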

      • Bruce,

        The figure shows that only a very small percentage of incoming solar radiation falls within the range at which it can be absorbed by CO2.

      • Water vapor absorbs several percent of incoming solar energy in the near IR; oxygen has a strong but narrow absorption band at 760 nm, which reduces the radiation reaching the surface by 0.5% or so. CO2 has some absorption further out in the tail, where its influence is further reduced by the overlap with absorption by H2O. The influence of CO2 on the incoming radiation is indeed extremely small, less than 0.1%.

      • But since more CO2 would stop some IR from reaching the earth’s surface (not to mention the large amount stopped by H2O/NOx etc.), one should consider the cooling possibilities of more GHGs.

      • Should we really put much weight on something that’s less than 1% of the opposite stronger effect? That’s the order of magnitude of the cooling effect of CO2 in comparison to its warming effect.

      • Well, I think CO2 supposedly is causing 1W/m^2 or so of warming right now.

        Isn’t that about 1/1300th of the TSI at the top of the atmosphere?

        And CH4 also blocks part of that spectrum. As does Ozone and H2O etc.

  16. Judith,

    Nothing on the forest/grass fires or above ocean volcanic activity?
    How much heat do these activities produce?
    No one is doing a comparison measurement in BTU’s per tree or volcano.
    Since most of the land mass is in the northern hemisphere.

    Still no word on how much EXACTLY in water vapor is “Lost in Space”.

  17. Given Wagathon’s overarching conclusions that “trees can get all the N they want from sedimentary rock…” it is worth quoting the conclusions of the study in full.

    “After accounting for confounding climatic and edaphic factors, analysis of the regional FIA data indicates a substantial effect of bedrock N on C storage in aboveground forest biomass: tree C pools were 40% higher in forests underlain by N-rich rock than in sites with low bedrock N. Field data show that sites on N-rich parent material had elevated δ15N (0.69‰ ±1.3, mean ± 1σ) vs. sites with N-poor parent material (-2.31‰ ±1.3), and total N was elevated among sites with N-rich parent material (1.19% ±0.16) vs. N-poor parent material (1.04% ±0.15). Rock analysis shows that N-rich bedrock is found in many meta-sedimentary formations of northern California and Southern Oregon, including the Redwood Creek Schist, Condrey Mountain Schist, and Colebrook Schist formations, as well as parts of the extensive Franciscan formation. Taken together, these data provide evidence that rock-N contributes to increased N fertility and C storage across large areas of temperate forest.”

    One may add “in northern California and southern Oregon.” Whether that applies to other temperate forests, or to boreal forests, would appear to require further study. And in all probability, it does not apply to humid tropical forests where rooting patterns, the depths of ancient soils, and the speed of nutrient recycling appear to preclude significant scavenging from bedrocks. Bottom line, it’s interesting stuff but don’t get your hopes up.

  18. I do not understand. The Science was Settled. Why was this necessary?

    So, there is really a lot they still do not know.

    • Yet at the same time you effectively claim the science IS settled when you state things like:
      “Many of us have looked at the Climate Data and we have figured out what it means. It certainly does not mean that a small part of a trace gas can cause a major ice age or a major global warming.”

      You’ve figured out what it means = it’s settled.

      • You’ve figured out what it means = it’s settled.

        If you figure out that “lapin” is French for “rabbit” would you say you have now mastered French vocabulary? Just because some things in science have been figured out does not mean everything has.

      • What I have figured out is that the temperature over the past ten thousand years is extremely stable in a narrow range. That is because when it gets warm, it snows more; when it gets cool, it snows less. This powerful negative feedback to temperature has overpowered all the unstable temperature forcing for ten thousand years and will continue to do so. For me, this much is settled. When data is stable and Theory and Models are unstable, I will go with the actual data.

        when it gets warm, it snows more; when it gets cool, it snows less

        As a theory of climate change I would say the benefits of your model far outweigh any negatives it might have. In fact after much thought the only possible negative I could think of is that it bears no relationship whatsoever to any extant theory of climate change, which many contributors to this blog would consider a plus.

        On the positive side we have the following.

        1. It is much easier to explain than the complex general circulation models or GCMs that have been developed at great expense to UK and US taxpayers yet which no member of the public could even begin to explain at a cocktail party. Quantum mechanics, relativity, evolution, plate tectonics, and Glenn Beck’s theory of gay marriage are all easier to explain than GCMs, as is your theory.

        2. To this Australian it is obviously correct. As we move into spring and then summer, the snow comes down harder and harder. Conversely as we move into autumn and then winter the snow starts easing up. We expatriate Australians are all too familiar with this phenomenon, which seems strange when first arriving in the Northern Hemisphere (“Snow at Christmas? But it’s summer!”) but to which we soon grow accustomed.

        3. It will have particularly strong appeal to the contiguous US voting public, none of whom live in either arctic or tropical climates and can therefore relate to the idea of snow coming and going. Those non-Americans living in the tropics will be mystified by the concept of more snow (“what snow?”). Those living in the Arctic or Antarctic will be equally mystified by the concept of no snow (“what’s a meadow?”). Those provincial tropical and polar residents don’t get to vote in the US so their opinion doesn’t count.

        I foresee a promising future for this conceptually elegant theory of climate change, especially if the Republicans take the White House in 2012.

  19. Manmade Greenhouse gases are not causing significant dangerous warming. There are things in NASA and NOAA’s theory that don’t stand up to simple physics. They build miles thick ice sheets during the long cold of the ice age when the water is frozen and there is no source for moisture to provide the snow that would be necessary.
    They achieve max ice volume, twenty thousand years ago, and then melt the miles thick ice sheets in ten thousand years while they warm the earth.

    Many of us, engineers, do not believe that was possible. They build ice sheets at the same temperature that they melt ice sheets rapidly, with only small changes to CO2 and Solar Cycles and Orbit Parameters. This totally does not match with simple physics.
    The ice chest does not warm while you melt the ice. The ice chest warms after the ice is mostly gone. Earth, with miles-thick ice sheets, would not warm while you melt the ice. There is no energy source that could melt miles-thick ice sheets and, at the same time, warm the earth in a ten thousand year period. They got the ice wrong and they got the CO2 wrong.

    Look at the data for yourself. Their own data proves them to be wrong.

    We went to the moon because we looked at the data and figured out what it meant.
    The Space Shuttle was successful because we looked at the data and figured out what it meant.
    Many of us have looked at the Climate Data and we have figured out what it means.
    It certainly does not mean that a small part of a trace gas can cause a major ice age or a major global warming.
    NASA and NOAA will not engage us in debate over this. If they had a strong case, they would likely want to talk to us and try to convince us.
    They are on very thin ice.

  20. Most of the commentary here has been on CO2.

    But black carbon is pretty important for warming.

    If Asian black carbon levels over the Pacific, heading to US, are as high as in a major US city, that suggests that the importance of BC has been understated in models.

    If BC warms the atmosphere, then other emissions warm it less. That is nice to know.

    BC can be reduced more quickly than can CO2, and at less cost. Health benefits come quickly when BC is reduced as well.

    • John

      Excellent point, and excellent initiative.

      A little optimistic, but even if you were inflating the importance of particulates by an order of magnitude (which I doubt you are), well worth undertaking the action you suggest, on a purely economic basis.

      No economist, after all, is going to suggest soot is good for plants.

      This would, of course, pretty much kill the coal industry, but I expect they will see reason and do what’s best for all concerned, rather than resist at a cost to us all.

      • It would not kill the coal industry. Particulates are easy and inexpensive to control/reduce (e-filters, fabric filters, scrubbers…). I don’t think they warm significantly, but they are pollutants.

      • Nonsense. This was the original plan that may be put on hold until after the election (because it was dangerous to Obama’s re-election campaign)

        “As the stagnant U.S. economy continues to plague the country, and as regulations increase, opposition from industry leaders has sprung up. The American Legislative Exchange Council and the Edison Electric Institute, the latter an industry representative for investor-owned utilities, have tagged the developing regulations “EPA’s Regulatory Train Wreck,” as they claim the new rules will cost utilities up to $129 billion and eliminate one-fifth of America’s coal capacity. The Edison Electric Institute also noted that the U.S. government’s regulatory war on coal could retire up to 90,000 megawatts of coal-fired electricity generation.

        Another concern is further damage to the very issue President Obama is so determined to reverse — the unemployment rate. According to a Commerce Department analysis, the regulations would cost up to 60,000 jobs, a much higher figure than the agency originally forecast.”

        http://www.thenewamerican.com/tech-mainmenu-30/environment/8700-epa-regulations-to-shut-down-coal-plants-and-raise-energy-prices

      • Are you replying to Edim or Bart?

      • Bruce, US coal is indeed at risk due to the policies of the current Administration. But the policies are multiple, and none have to do with black carbon. Utilities will have to comply with new rules for (1) air toxics, (2) SO2, (3) NO2, (4) the types of particles emitted by coal burning power plants (sulfates, which are light in color, not dark, and which reflect sunlight rather than absorbing it), (5) cooling water, and (6) ash disposal. I may have missed one.

        The rule which President Obama withdrew would have controlled ozone, which requires oxides of nitrogen and carbonaceous gases as precursors for the creation of low level ozone in the troposphere; utilities burning any fuel emit oxides of nitrogen, but coal is the major such source.

        Some environmental reporters erroneously and confusingly refer to power plant particulate emissions as “soot,” implying to most of us that the emission is dark. Measurement experts when using the term “soot” see it as synonymous with black carbon, but environmental reporters simply use a confusing term, which is why many people think power plants in the US emit black carbon. If you live east of the Mississippi, or within a hundred or so miles west of it, you can probably find a coal fired power plant within an hour’s drive of your home. Go take a look at what comes out of the stack.

      • It’s not nonsense. CO2 hysteria is nonsense. Particulates/soot/black carbon from industrial/power-plant furnaces/boilers is easy and inexpensive to control/reduce. Soot from diesel engines is also easy to control/reduce. Newest generations are very clean. It can be made even cleaner.

      • If it weren’t for the black carbon, temperature might have dropped by 0.5°C in the last 10 years instead of staying level.

      • Bruce

        Should point out that, per Bond et al. (see John’s link below), particulate carbon levels fell substantially from 1996 to 2004, although they still remained high and, as you’ve pointed out elsewhere, could have been lower still.

        If particulates were more than a significant fraction of the story, this 1996-2004 drop ought to have resulted in many times the drop you call for; if a post-2011 particulate drop of like proportion occurs, it will by necessity have still less impact.

        When particulates bottom out, the modifying effect of their drop will cease. If they increase, the increase will add to the other increase that, by this very reasoning, we know must also be there.

      • “Black carbon emissions from China doubled from 2000 to 2006.” My guess is they doubled again from 2006 to 2011.

        “Approximately 20% of BC is emitted from burning biofuels, 40% from fossil fuels, and 40% from open biomass burning”

        http://www.igsd.org/docs/BC%20Summary%206July08.pdf

      • Bruce

        Well caught.

        However:

        96-04 ≠ 00-06.
        China ≠ world.

        Certainly is a concerning trend, and may mean we’ve already hit the bottom of the particulates decline curve and ought to expect an upturn.

        Which, by both our reasoning systems, implies a rise in heat soon, absent other effects we don’t account for here.

      • China does burn 48% of the world’s coal – 2.5x more than in 2000.

        Asia-Pacific burns 67% of the world’s coal – 2x more than in 2000.

        (2010 figures)

      • Bart, thank you. With regard to sources, black carbon is produced almost entirely by (1) diesels, (2) household burning of anything carbonaceous — wood, coal, animal dung — for heat and cooking, (3) forest fires and planned burning of agricultural fields, and (4) coal used but poorly controlled in developing-world industrial uses. Coal is mostly used around the world to produce electricity. I was very surprised to find that coal used to produce electricity emits virtually no black carbon, even in China.

        Bond et al., 2004 (“A technology-based global inventory of black and organic carbon emissions from combustion,” Journal of Geophysical Research, Vol. 109) finds that coal burned to make electricity contributes about 1 part per 1,000 of global BC emissions, whereas coal used in industrial applications emits almost 100 times as much BC, and residential use of coal about 70 times as much. See Table 15 in the publication.

      • John

        Thank you, sir.

        The more I learn, the less I know.

        If I hang around you, I’m going to end up knowing nothing at all in a very short span of time.

      • Bart, it is pretty rare to be complimented on the internet, so thank you very much!

        BTW, you can get a copy of the Bond et al (2004) study by googling the full title, nothing else. Click around a bit and a PDF will become available. Again, look at Table 15, under “this work.” Table 12 gives black carbon emissions by region. It is a great study.

      • Diesels. In the 1990s Europe decided to meet Kyoto targets by increasing the share of diesel cars from around 20% to over 50%.

        Recently, huge amounts of rainforest were burned to produce cropland for ethanol production, which is also a major source of BC.

        In effect Kyoto caused global warming by producing more black carbon.

      • I have a car with a 2.2 l diesel engine (from the late 90s, turbo, common rail direct injection, particle filter, only EURO 3, I think). It’s very clean and very efficient (4.5 – 7.5 l/100 km). Newer generations (EURO 4/5) are even cleaner. Very clean even without particle filters, with filters particles are reduced to close to nothing.

        I am not convinced of significant warming by black carbon. Natural factors are overwhelming. But maybe I’m wrong. Nevertheless, black carbon is a pollutant and should be reduced, warming or not. Warming is a distraction.

      • Edim, we would like numbers.

        “Laws that favor the use of diesel, rather than gasoline, engines in cars may actually encourage global warming, according to a new study. Although diesel cars obtain 25 to 35 percent better mileage and emit less carbon dioxide than similar gasoline cars, they can emit 25 to 400 times more mass of particulate black carbon and associated organic matter (“soot”) per kilometer [mile]. The warming due to soot may more than offset the cooling due to reduced carbon dioxide emissions over several decades, according to Mark Z. Jacobson, associate professor of civil and environmental engineering at Stanford University.”

        http://news.stanford.edu/pr/02/jacobsonJGR1023.html

      • Bruce,

        Of course diesel powered cars emit more particulate matter than gasoline powered ones, because the gasoline cars emit essentially no particles! Diesels are more efficient (better mileage, although the new downsized turbo direct injection gasoline engines have great mileage too – but higher particle emission than the older non-DI generation!). Here’s the link for European emission standards:
        http://en.wikipedia.org/wiki/European_emission_standards

        On the other hand, gasoline-powered cars are associated with evaporative emissions (hydrocarbons -> smog), but they’re controlled. Diesel cars’ evaporative emissions are essentially zero. Other tailpipe pollutants are comparable – diesels have somewhat higher NOx and gasoline cars higher CO. At the moment there’s a place for both.

      • Edim, you must agree that more diesel cars in Europe (because of Kyoto) made things worse.

        It is not current standards that interest me; it is the amount of BC created by cars before Kyoto versus the amount created now.

      • Bruce,

        Why worse? Because of warming? I am not convinced that black carbon causes significant warming. Warming is a distraction. Yes we should control/reduce pollutants. But, it’s very hard to do when all the attention is on warming/CO2. As I like to say, CO2 is not a pollutant, but CO2 hype is a big polluter. So, yes Kyoto made things worse, because it grabbed attention from real pollutants and initiated more warming/CO2 hype.

      • John

        Just imagine how much improved Bond et cie will be in conjunction with the HIPPO data.

        I admit to a general bias that what produces particulates often produces other GHGs and other pollutants, so a good reason to scale back one often adds to a list of good reasons to scale back the rest, and merits looking into whether the list applies more generally.

        Edim is very right, by these lights. A pollutant ought to be reduced, warming arguments or not.

        And Bruce is extremely sharp to point out that the law of unintended consequences is particularly vicious where political forces act out of ignorance or to appeal to populist or opportunistic agendas.

  21. The discussion here has confirmed that black carbon is a pollutant, which can be removed fairly easily at source. It is a respiratory irritant to humans and other animals and has no known beneficial effects.

    CO2 is a natural trace gas of our atmosphere, essential for all life on our planet, therefore it is not a pollutant by definition.

    Humans generate it primarily by producing energy, which in turn has helped humanity in the industrially developed world move from abject poverty to today’s high standard of living, with developing and underdeveloped nations rushing to achieve higher affluence by also increasing energy consumption.

    CO2 is harmless to humans and animals at many times the concentrations that could ever occur from human emissions. There are studies suggesting that plants, including most crops, grow more effectively at higher atmospheric CO2 concentrations.

    CO2 cannot be removed fairly easily at source.

    Let’s attack the harmful and easily removable BC and forget our myopic fixation on CO2.

    Max

  22. Apropos of the HIPPO data, as can be seen from this plot, global temperature rose by about half a degree during the last third of a century, or 0.15 °C/decade. It is clear from the left side of the plot, up to say 1970, that natural fluctuations can be large, raising the question of what proportion of the recent rise is of anthropogenic origin assuming that the natural fluctuations are continuing.

    Assuming preindustrial CO2 was 280 ppmv (it’s now 392 or so), and that the impact of all anthropogenic contributors to global warming is delayed by a nominal quarter of a century (some contributors will act sooner, some later), I figure that the proportion of climate change over any given 20-year period, including projections to 2050, due to anthropogenic factors is as follows.

    1850-1869: 38%
    1870-1889: 42%
    1890-1909: 32%
    1910-1929: 32%
    1930-1949: 84%
    1950-1969: 51%
    1970-1989: 79%
    1990-2009: 87%
    2010-2029: 98%
    2030-2049: 96%

    This is figured as a/(a+n) where a and n are the unsigned magnitudes (absolute values) of the anthropogenic and natural changes over that period. Even when a is relatively small, n can sometimes be even smaller because it fluctuates up and down while a rises pretty steadily with increasing population. The 15 years 1885-1899 saw only 22% because the natural contribution dropped .08 °C while the anthropogenic contribution rose less than .02 °C (which may seem high for that time, but bear in mind that surface temperature is only logarithmic in CO2 level and not linear).
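    For concreteness, here is a minimal sketch of the a/(a+n) bookkeeping described above; the per-window numbers in it are illustrative placeholders rather than the actual decomposition behind the table, and only the formula itself comes from this comment.

```python
# Sketch of the a/(a+n) attribution arithmetic described above.
# The per-window changes below are illustrative placeholders chosen to
# land near the table's values; they are NOT the actual decomposition.
windows = {
    "1890-1909": {"anthro": 0.02, "natural": -0.05},
    "1970-1989": {"anthro": 0.25, "natural": 0.07},
    "1990-2009": {"anthro": 0.30, "natural": -0.04},
}

for label, chg in windows.items():
    a = abs(chg["anthro"])   # unsigned anthropogenic change over the window (deg C)
    n = abs(chg["natural"])  # unsigned natural change over the window (deg C)
    share = a / (a + n)      # proportion attributed to anthropogenic factors
    print(f"{label}: anthropogenic share = {share:.0%}")
```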

    Separating a and n is an imprecise art so these figures should not be taken too seriously, not even the first digit. But given that the population has been doubling every 40 years or so over the last century, and that independently per capita fuel consumption has also been increasing, it should not be too much of a surprise to see the anthropogenic contribution rising.

    This is entirely based on the temperature record and the rates of increase of fuel consumption and atmospheric CO2, no fancy climate models. What I don’t know how to infer from that data is how to apportion the anthropogenic component between aerosols (soot etc.) and greenhouse gases. GHGs seem likely to be 30-70% of the anthropogenic component, but pinning it down more precisely is likely to be a bit of a shouting match for many people. I would therefore be very interested in hearing strong reasons for a more precise figure, from any source.

    The HIPPO data motivating this thread may give new insight, but so far it’s not clear to me what exactly it tells us about that proportion if anything.

    • The 15 years 1885-1899 saw only 22% because the natural contribution dropped .08 °C while the anthropogenic contribution rose less than .02 °C

      Correction, .024, not “less than .02”. The .08 number was .082. But this is misleadingly much precision.

    • Vaughan Pratt

      Your analysis for the past 160 years would suggest that natural forcing has been responsible for around 35% of the warming experienced from 1850 to today, with the balance attributable to anthropogenic factors.

      This may make sense, although the anthropogenic portion is a bit higher than other estimates I have seen, but there is a major caveat for the future.

      From 1850 to 1950 population grew from 1.171 to 2.406 billion
      This is a CAGR of 0.72%

      From 1950 to 2000 population grew from 2.406 to 6.08 billion
      This is a CAGR of 1.87%

      From 2000 to 2010 population grew from 6.08 to 6.9 billion
      This is a CAGR of 1.27%

      The UN estimates that population growth rate will slow down over the 21st century, with population reaching around 9 billion by 2050 and leveling off to around 10 billion by the end of the century.

      This equals a CAGR of 0.67% to 2050 and 0.21% from 2050 to 2100.
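      For anyone who wants to check the growth rates quoted above, here is a quick sketch using the population figures as given; the CAGR formula is the standard one.

```python
def cagr(start, end, years):
    """Compound annual growth rate, as a fraction per year."""
    return (end / start) ** (1.0 / years) - 1.0

# Population figures (billions) as quoted above.
print(f"1850-1950: {cagr(1.171, 2.406, 100):.2%}")  # ~0.72%
print(f"1950-2000: {cagr(2.406, 6.08, 50):.2%}")    # ~1.87%
print(f"2000-2010: {cagr(6.08, 6.90, 10):.2%}")     # ~1.27%
print(f"2010-2050: {cagr(6.90, 9.0, 40):.2%}")      # ~0.67%
print(f"2050-2100: {cagr(9.0, 10.0, 50):.2%}")      # ~0.21%
```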

      You should take this unusually rapid growth over the second half of the 20th century and the slowdown thereafter into account in your projections.

      There is also the question of whether or not the current lack of warming since 2001 will continue for another decade or two, as some “crystal ball gazers” project (not including IPCC, who project warming at a rate of 0.2°C per decade).

      If the current cooling continues, it would seem to be difficult to attribute 98% of this cooling to anthropogenic factors per your formula, with CO2 increasing to record levels (unless the CO2 climate sensitivity is much lower than the IPCC models assume).

      Max

      • If the current cooling continues, it would seem to be difficult to attribute 98% of this cooling to anthropogenic factors per your formula, with CO2 increasing to record levels (unless the CO2 climate sensitivity is much lower than the IPCC models assume).

        Unfortunately my model is unable to make short term projections of the “current cooling continues” kind, which are for significantly less than 15-year periods. My model removes all short term (< 15-year) events from the 1850-now HADCRUT3 record and then models just long term climate behavior. This graph shows that so far there’s been no sign of long-term cooling in the sense assumed by my model. The short term data contains way too much noise to base short term projections on.

      • Vaughan Pratt

        my model is unable to make short term projections of the “current cooling continues” kind, which are for significantly less than 15-year periods. My model removes all short term (< 15-year) events from the 1850-now HADCRUT3 record

        Then we’ll have to wait until 2015 for your model to recognize what Trenberth has been quicker to recognize and has referred to as a “travesty”.

        4 more years…

        Max.

      • Then we’ll have to wait until 2015 for your model to recognize what Trenberth has been quicker to recognize and has referred to as a “travesty”.

        There we were exchanging calculations, Max, and suddenly you dropped that into the discussion. This had me scratching my head: were you pulling my leg, or wondering how I’d respond, or what?

        While pondering this I had to run down to the store for a few things. At the checkout counter I noticed this on the magazine rack.

        Then it struck me. You actually were serious!

        Gotta hand it to you, Max, you’re nothing if not persistent.

    • Vaughan Pratt

      What I don’t know how to infer from that data is how to apportion the anthropogenic component between aerosols (soot etc.) and greenhouse gases. GHGs seem likely to be 30-70% of the anthropogenic component, but pinning it down more precisely is likely to be a bit of a shouting match for many people.

      A starting point could be the IPCC AR4 model estimates for total radiative forcing components since 1750, i.e. since “pre-industrial” days.

      CO2: +1.66 W/m^2
      Other GHGs: +1.35 W/m^2 (CH4, N2O, Halocarbons, O3)
      Aerosols: -1.2W/m^2
      Other anthropogenic: -0.21 W/m^2
      Net anthropogenic: +1.6 W/m^2

      As your model suggests, the natural forcing components are most likely significantly higher than the 0.12 W/m^2 assumed by the IPCC models. Your model would point to something roughly like:

      0.35 * 1.6 / 0.65 = +0.86 W/m^2 for all natural forcing from 1750-2005

      As several solar studies have shown, the level of 20th century solar activity was the highest in several thousand years, and this could well be the underlying reason.

      If the above assumptions are correct we would have a 2xCO2 climate sensitivity of around 1°C, as follows:

      total warming = 0.7°C
      anthro = 65% = 0.455°C
      natural = 35% = 0.245°C

      C1 = 280 ppmv
      C2 = 390 ppmv
      C2/C1 = 1.393
      ln(C2/C1) = 0.3314

      ln2 = 0.6931
      dT(2xCO2) = 0.95
      = 0.455 * 0.6931 / 0.3314
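      Here is a minimal sketch of the arithmetic just above, with the same inputs and no lag assumed between CO2 and temperature.

```python
import math

# Back-calculation of 2xCO2 sensitivity from the figures above,
# assuming the temperature response to CO2 is immediate (no lag).
total_warming = 0.7    # deg C since pre-industrial, as assumed above
anthro_share = 0.65    # 65% anthropogenic, per the split above
c1, c2 = 280.0, 390.0  # ppmv: pre-industrial and present CO2

anthro_warming = anthro_share * total_warming                  # 0.455 deg C
sensitivity = anthro_warming * math.log(2) / math.log(c2 / c1)
print(f"2xCO2 sensitivity = {sensitivity:.2f} deg C")          # about 0.95
```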

      The warming appears to have slowed down considerably (or even reversed), especially since the start of solar cycle 24, so this may affect your future projections.

      Max

      • A starting point could be the IPCC AR4 model estimates

        To what extent do these take aerosol warming into account? My interpretation of this whole thread has been that HIPPO is something of a game changer in that we may need to revise upwards our estimates of the proportion of anthropogenic warming due to aerosols. What’s your take on it?

        I have no good empirical basis for estimating feedbacks, without which it’s hard to know what to say about GHG warming. With a theoretical no-feedback sensitivity of 1 °C/doubling and with the temperature over the past half-century climbing at a rate of 3 °C/doubling of GHGs assuming a 25-year climate response time to GHG increases, there’s a huge amount of room for uncertainty as to what role aerosols played in that rise.

      • You ask what my take on the impact of aerosols is.

        IPCC AR4 has estimated that since 1750 the impact of anthropogenic aerosols was a radiative forcing of -1.2 W/m^2, roughly offsetting the positive forcing from other GHGs besides CO2.

        Natural cloud albedo effect was not considered by IPCC (clouds were only included in the models as positive feedback to other forcings) – but IPCC conceded “cloud feedbacks remain the largest source of uncertainty”.

        SB2011 has pointed out that it is not possible to differentiate between natural cloud forcing and cloud feedbacks to other forcings. This makes sense to me, although the “mainstream” team is having a hard time swallowing it (and may eventually try to refute it scientifically).

        Natural forcing components (including solar) were restricted to direct solar irradiance and relegated to a very small impact (3% of the total) – but IPCC conceded that its “level of scientific understanding” of natural forcing factors was “low”.

        IMO aerosols are most recently being used as the “wild card” to rationalize why the planet is cooling – Trenberth’s “travesty” – despite CO2 increasing to record levels (viz. the study on China and others).

        A similar attempt at rationalization of the 1945-1975 cooling cycle despite accelerated post-war CO2 emissions was attempted in IPCC AR4, but it has fallen flat.

        The Earthshine data showed that cloud changes have caused warming in the 1990s and cooling in the 2000s, but there is no differentiation between natural clouds and human-caused ones.

        IMO our knowledge of human-caused as well as natural cloud albedo is so rudimentary it is difficult to say what is really going on.

        But since the reflected incoming SW energy represents a major part of the total incoming energy to our planet, it is an area that requires much more work.

        Then there is always the CLOUD experiment at CERN.

        Max

  23. If the above assumptions are correct we would have a 2xCO2 climate sensitivity of around 1°C, as follows:

    Your math assumes that the impact of a decrease in CO2 resulting from decreasing population rate of growth would be instantaneous. My model assumes that the response time of different depths below the surface to changes in both aerosol and GHG forcing varies but that the brunt of the effect is felt 25 years later. So for example my model attributes the anthropogenic component of today’s temperature to 1986’s atmospheric loading, which in the case of CO2 was 345 ppmv. ln(345/280) = .2088.

    Another difference is that my model shows a 1.02 °C temperature rise between 1750 and now, of which the natural component rose 0.12 °C (as it happens — from 1880 to now the natural component declined 0.065 °C — the natural component fluctuates up and down) while the anthropogenic component rose 0.90 °C.

    Also your math assumes that 35% of the rise between 1750 and now was natural, whereas my model puts the natural component much lower.

    Hence the line

    0.455 * 0.6931 / 0.3314

    in your math becomes

    0.90 * 0.6931 / 0.2088

    in mine, or 2.99 °C/doubling.
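    To make the contrast with the preceding comment concrete, here is a short sketch of the same back-calculation with and without the assumed 25-year lag; the inputs are the ones quoted above.

```python
import math

anthro_warming = 0.90    # deg C anthropogenic component since 1750, per this comment
c_preindustrial = 280.0  # ppmv
c_1986 = 345.0           # ppmv, the level 25 years ago used by the lagged model
c_now = 390.0            # ppmv, roughly today's level

# Crediting today's anthropogenic warming to the CO2 level 25 years earlier:
lagged = anthro_warming * math.log(2) / math.log(c_1986 / c_preindustrial)
print(f"with 25-year lag: {lagged:.2f} deg C per doubling")  # about 2.99

# Dropping the lag gives the lower figure (roughly 1.8-1.9) cited later in the thread.
no_lag = anthro_warming * math.log(2) / math.log(c_now / c_preindustrial)
print(f"with no lag:      {no_lag:.2f} deg C per doubling")
```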

    But that’s just my model extrapolating 100 years back from 1750, which I don’t take any more seriously than what it projects 100 years forward from now.

    Talking of which, where did you get your figure of 0.7 °C as the rise since 1750?

  24. Vaughan Pratt

    Good news!

    Your computer model only has to wait until December 2012 for a 15-year period of no global warming!

    It can “recognize” this in just 15 months!

    http://www.woodfortrees.org/plot/hadcrut3gl/from:1998/plot/hadcrut3gl/from:1998/trend

    Max

    PS Will respond to your latest post separately, but thought you’d be pleased with the good news

    • It can “recognize” this in just 15 months!

      A mere 15 months? Max, I still think you’re putting too much emphasis on short term climate at the expense of long term.

      If you look at the 12-month-smoothed climate since 1950 you’ll notice that the 14 years 1957-1971 bears some resemblance to 1997-2011. The notch at 2000 is deeper than the one at 1960, and there’s a flat bit around 1966-68 that you don’t see in the peak at 2000, but otherwise they’re quite similar.

      There was a 0.2 °C decline during 1970-72 that 2010-2011 might be going to mimic. Or maybe not.

      There was an even bigger downturn around 1976. Maybe we’ll see that again too, in say 2016.

      Are you claiming that the temperature record from 1970 to 1976 should have told us that things would be getting much cooler during the following decades?

      Judging by what you’re saying about 2000-now, the answer would appear to be “yes.”

      • Vaughan

        Are you being deliberately daft – or is it your natural state?

        It means that the random cooling meme stops working soon. Not that we ever believed it.

        Anastasios Tsonis, of the Atmospheric Sciences Group at University of Wisconsin, Milwaukee, and colleagues used a mathematical network approach to analyse abrupt climate change on decadal timescales. Ocean and atmospheric indices – in this case the El Niño Southern Oscillation, the Pacific Decadal Oscillation, the North Atlantic Oscillation and the North Pacific Oscillation – can be thought of as chaotic oscillators that capture the major modes of climate variability. Tsonis and colleagues calculated the ‘distance’ between the indices. It was found that they would synchronise at certain times and then shift into a new state.

        It is no coincidence that shifts in ocean and atmospheric indices occur at the same time as changes in the trajectory of global surface temperature. ‘Our interest is to understand – first the natural variability of climate – and then take it from there. So we were very excited when we realized a lot of changes in the past century from warmer to cooler and then back to warmer were all natural,’ Tsonis said.

        Four multi-decadal climate shifts were identified in the last century coinciding with changes in the surface temperature trajectory. Warming from 1909 to the mid 1940’s, cooling to the late 1970’s, warming to 1998 and declining since. The shifts are punctuated by extreme El Niño Southern Oscillation events. Fluctuations between La Niña and El Niño peak at these times and climate then settles into a damped oscillation. Most recent warming happened in 1976/77 and 1997/98. Most of the rest was the result of cloud cover changes.

        Science suggests that the world is not warming (Mochizuki et al 2010, Swanson et al 2009, Tsonis et al 2007, Keenlyside et al 2008) – albeit with immense uncertainties surrounding the origins of decadal variability.

        AGW is a spaceship cult. Surprise – the spaceship ain’t coming any time soon.

        Robert I Ellison
        Chief Hydrologist

      • Vaughan, are you being deliberately daft – or is it your natural state?

        Robert, are you deliberately closing your eyes – or are you legally blind? :)

        What these three graphs show is that after smoothing out all the short term fluctuations, there remain two signals, an oscillation with period somewhere in the range 60-65 years, and a steadily growing curve.

        The red curve is global temperature smoothed with a 750-month (62.5-year) moving average, which has the effect of completely removing any 62.5-year-period sine wave, as well as all harmonics thereof. This is because the frequency response of a moving-average filter is the sinc function, whose zeros fall at integer multiples of one cycle per window length. Notice how steadily it rises, without any significant oscillation.

        The other two curves are the same thing with window sizes of 550 and 950 months. Notice the oscillation in each, with period about 62 years. Both curves cross the red curve at about the same place every 30 years or so. It’s the same oscillation, but its sign flips as the window length passes through 750 months on the way from 550 to 950 months.

        The oscillation is visible because it is not a harmonic of oscillations with period either 550 months or 950 months. The corresponding frequency responses therefore pass that oscillation through.
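        Here is a small numerical sketch of the filtering point above, using a synthetic sine wave rather than the actual temperature series: a centred moving average whose window equals the period of the sine removes it entirely, while the 550- and 950-month windows pass an attenuated copy through.

```python
import numpy as np

months = np.arange(1920)                    # 160 years of monthly samples
period = 750                                # months (62.5 years)
wave = np.sin(2 * np.pi * months / period)  # synthetic 62.5-year oscillation

def moving_average(x, window):
    return np.convolve(x, np.ones(window) / window, mode="valid")

for window in (550, 750, 950):
    residual = np.max(np.abs(moving_average(wave, window)))
    print(f"window {window} months: max residual = {residual:.3f}")
# The 750-month window leaves essentially nothing of the oscillation;
# 550 and 950 months pass a reduced (and sign-flipped) copy through.
```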

        The red curve grows steadily from 1880 to 1980. (A 60-year moving average knocks 30 years off each end of the 160-year total.) Now if you calculate the expected increase in surface temperature due to the amount CO2 rose over that period, assuming 2 °C per doubling, it matches the red curve beautifully!

        When theory predicts that something ought to happen, and it turns out to happen, anyone who is determined to prove that it did not happen has to close their eyes to the fact of its happening. Or make up some elaborate alternative explanation of what happened, as you seem to want to do.

      • Vaughan,

        “When theory predicts that something ought to happen, and it turns out to happen, anyone who is determined to prove that it did not happen has to close their eyes to the fact of its happening.”

        Umm, well, when there is a reliable data set to make these kinds of analyses and conclusions you would be correct.

      • I have to agree with kuhnkat on this one. Though your 2 C for doubling does look like a reasonable upper limit, with appropriate error bars of course.

      • Vaughan,

        There is an asymmetry here. I referenced peer reviewed literature – you do some simple smoothing at woodfortrees. You accuse me of making it up?

        Let me get this straight – you are attributing all warming last century to CO2?

        And the rate of warming is about 0.06°C/decade?

        Cheers

      • Umm, well, when there is a reliable data set to make these kinds of analyses and conclusions you would be correct.

        Good point, kuhnkat. I’m certainly not claiming to be any more reliable than the HADCRUT3 data, so if that’s unreliable so am I.

        If you have a more reliable dataset, you have my full attention. If not I’ll stick with what I have, which in that case according to you would be the most reliable available, warts and all. I’m perfectly happy not to pass judgment on its reliability myself, I’ll let others do that.

      • Vaughan,

        sticking with what you have is certainly a reasonable thing to do if you cannot find something better. You just need to investigate its limits so you aren’t fooling yourself with the amount of reliance you put into it.

        There are two primary issues I have with the temp series. Adjustments that appear reasonable within a limited application but are smeared over stations that should NOT be adjusted. And the fact that we collect our data where we generate some of the strongest anthro contamination.

        Steven Mosher is fond of talking about how he can get the same answer with virtually any subset of the stations in the temp series. I find it amazing that he can say that with a straight face.

        The CO2 series is based on a BACKGROUND reading that is allegedly the closest to the actual average CO2 concentration in the atmosphere. It is taken away from anthro influence and adjusted for spikes from known contamination. With the temp series we mostly measure it within UHI or similar areas, giving us series that apparently have good correlation to the economic health of the countries!!

        Then there is the whole ridiculous idea that we can AVERAGE these varied locations and come up with something useful without including pressure and humidity data in the computation. Then the even more ridiculous idea that we can ADJUST stations hundreds of kilometers away. This is a VERY circular argument. We can adjust diverse stations because they have similar trends, which we keep similar by adjusting them and measuring within contamination areas.

        There have been interminable discussions on different sites on these subjects. Those supporting the current series make claims such as it is all we have. Fine, it is all we have. FIX THEM!!! If you can’t fix them you can’t tell me that they can show we had exactly 0.67°C global increase in temps over ~100 years. The error bars should be huge and they are NOT reasonable evidence of AGW or ANY kind of warming over the last 100 years. They are suggestive of warming at best.

      • There is an asymmetry here. I referenced peer reviewed literature – you do some simple smoothing at woodfortrees. You accuse me of making it up?

        Sorry, Robert, I’m not following you. I didn’t complain about your peer reviewed literature, which looks perfectly fine to me. What I was accusing you of was ignoring the evidence of AGW. The evidence for it that I pointed out was consistent with the paper you referenced, not contradicted by it.

        You seem to be capable of reading things into a peer-reviewed paper that simply aren’t there. You deny AGW, but the paper you referenced sure as heck didn’t. In fact it is quite explicit about the contrary: “Moreover, we caution that the shifts described here are presumably superimposed upon a long term warming trend due to anthropogenic forcing.”

        If you’re having difficulty with the five-syllable words in that sentence, in words of one syllable what they said is that Earth may try to cool down soon for a while but man could add his bit of warmth to that to bring it back up or more.

        The superimposition they speak of is exactly what my “simple smoothing at woodfortrees” that you were complaining about shows.

        If you still don’t believe me, ask the first author himself, University of Wisconsin-Milwaukee’s Kyle Swanson, who spelled all this out much more clearly than I could in a guest commentary here.

        I can only assume you object to simple demonstrations because they make it clearer where your reasoning breaks down.

      • In slow-fast systems we should see some evidence of persistence in the trend. In the literature we find evidence of anti-persistence, e.g. Carvalho 2007:

        Abstract. In this study, low-frequency variations in temperature anomaly are investigated by mapping temperature anomaly records onto random walks. We show evidence that global overturns in trends of temperature anomalies occur on decadal time-scales as part of the natural variability of the climate system. Paleoclimatic summer records in Europe and New-Zealand provide further support for these findings as they indicate that anti-persistence of temperature anomalies on decadal time-scale have occurred in the last 226 yrs. Atmospheric processes in the subtropics and mid-latitudes of the SH and interactions with the Southern Oceans seem to play an important role to moderate global variations of temperature on decadal time-scales.

        http://www.nonlin-processes-geophys.net/14/723/2007/

        The NZ problem suggests reversibility, as we see in KT’s hometown for the last 15 years.

        0.20 -0.04 0.05 0.94 0.44 0.29 0.25 -0.02 0.10 -0.15 0.59 -0.12 0.14 0.23 -0.16

        Certainly looks like the duck and the devil, i.e. canards on the staircase. Do you think it may be a good time for KT to phone home?

      • With the temp series we mostly measure it within UHI or similar areas giving us series that apparently have good correlation to the economic health of the countries!!

        Oh, Kuhnkat, you never give up, do you? Good for you, persistence is a virtue when not done to a fault. This ancient skeptic argument was disposed of a decade ago or more in the Third Assessment Report of the IPCC, as follows.

        However, over the Northern Hemisphere land areas where urban heat islands are most apparent, both the trends of lower-tropospheric temperature and surface air temperature show no significant differences. In fact, the lower-tropospheric temperatures warm at a slightly greater rate over North America (about 0.28°C/decade using satellite data) than do the surface temperatures (0.27°C/decade), although again the difference is not statistically significant.

        It’s no wonder skeptics hate the IPCC report so passionately; it did such a thorough job of addressing some of their concerns. (But not all, agreed.)

        Another place to look for evidence of UHI is in the difference between global land-sea temperature and sea-surface temperature, since the latter can safely be assumed free of any UHI effect. I’ve plotted a graph of the two time series here, in green (grass) and blue (ocean) respectively, along with their difference in red.

        We do see a slight rate of increase in the difference starting in 1975, about 0.025 °C/decade, representing around 15% of the overall rise. However, this is to be expected because the ocean is a more effective heat sink than the land, due primarily to convection in the ocean that cannot occur on land, so the ocean responds more slowly to sudden changes in radiative forcing.

        Another point to note on this graph is that no such increase in the difference is associated with the almost equal rise from 1910 to 1940. If that rise had some other cause than radiative forcing, in particular ocean oscillations, then one would not expect the difference to rise because it would then be the ocean that is heating the atmosphere and not vice versa as in the case of radiative forcing. But one would not expect it to fall either because the land responds much faster to such changes than the ocean.

        This graph therefore provides a nice confirmation that the rise between 1970 and 2000 does not have the same cause as that from 1910 to 1940, contrary to Girma’s favorite argument, or we would have seen a rise in the difference over the latter period similar to that over the former.

      • “Oh, Kuhnkat, you never give up, do you? Good for you, persistence is a virtue when not done to a fault. This ancient skeptic argument was disposed of a decade ago or more in the Third Assessment Report of the IPCC, as follows…”

        Until we have a long term temperature series with a SIGNIFICANT number of recording stations undisturbed by anthropogenic or geophysical contamination they have disposed of nothing. The claim that measurements of something else prove something about the primary series has to be PROVEN before the correlation means anything. This is worse than using wind speed to interpolate temperature. I thought you had some scientific training, Vaughan?

      • Vaughan

        It was you who wrote that your computer could not recognize data over time periods less than 15 years.

        I simply reminded you that the current “lack of warming” has now lasted almost 14 years, so that there will be a bit more than one year (15 months, actually) until your computer should be able to “recognize” the current lack of warming.

        If you look at the HadCRUT3 record you will see that there have been no periods of cooling lasting this long since the end of the mid-century cooling cycle around 1976.

        Max

      • If you look at the HadCRUT3 record you will see that there have been no periods of cooling lasting this long since the end of the mid-century cooling cycle around 1976.

        We must be looking at different graphs, Max. As I said before, I’m looking at the 15-year-smoothed record, which shows 20 downturns since 1976, namely at

        1977, 1977.25, 1977.33, 1977.42, 1977.5, 1977.58, 1977.75, 1978.33, 1980.58, 1985, 1985.17, 1985.25, 1985.33, 1986.33, 1987.42, 1988.5, 1988.67, 1988.75, 1989.42, 2000.5

        (These are the dates as defined by Woodfortrees, namely the center of each 15-year window.)

        I don’t know what you’re referring to by “no periods of cooling lasting this long.” What “period of cooling?” The most recent downturn was an isolated month 11 years ago (recorded 3.5 years ago), namely 2000.5, when the average temperature for the 180 month window centered on 2000.5 declined by one millikelvin (0.001 K). You can verify all this by clicking on “raw data” at the bottom right of the above Woodfortrees graph and noticing that each month listed above is followed by a colder month.

        This graph plots the month-to-month deltas for the 15-year moving average. The above-listed 20 downturns are the 20 occasions when the delta was negative. During 1990-1995 the deltas hovered around 2 millikelvins, that is, the average was increasing by around 2 millikelvins a month, corresponding to a rise of 12*2*10/1000 = 0.24 °C/decade. Between 1995 and 2005 the rise slowed to around 0.15 °C/decade.
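        Here is a sketch of the delta bookkeeping just described (a 180-month moving average, its month-to-month deltas, and a count of the negative ones), applied to synthetic data rather than the actual HADCRUT3 series.

```python
import numpy as np

def downturns(monthly_temps, window=180):
    """Indices where the window-month moving average declines month to month."""
    smoothed = np.convolve(monthly_temps, np.ones(window) / window, mode="valid")
    deltas = np.diff(smoothed)   # month-to-month change of the smoothed series
    return np.where(deltas < 0)[0], deltas

# Synthetic stand-in for a monthly anomaly series: a small trend plus noise.
rng = np.random.default_rng(0)
fake = 0.0015 * np.arange(1920) + 0.1 * rng.standard_normal(1920)

down, deltas = downturns(fake)
print(f"{len(down)} downturn months; mean delta = {deltas.mean() * 1000:.1f} mK/month")
```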

        You’ll notice two almost-negative deltas at 2002.58 and 2003.58. The raw data shows minuscule increases from 0.384133 to 0.384206 (73 μK) and from 0.397639 to 0.397694 (55 μK), which are so close to no change that they could just as well have been declines; lady luck at work there.

        As can be seen from the monthly deltas graph, isolated downward spikes spaced a year apart are actually quite common. In eight months time, May 2012, there is a fair chance that we’ll see another such isolated spike, and unlike its two predecessors it might actually go negative this time, i.e. a decline for that month.

        But barring any massive volcanoes this year or next, I would bet against there being a sustained cooling lasting three months, between then and May 2013. The last sustained cooling started in 1988.5 and the (smoothed) temperature declined for three months in a row. The previous such 3-month decline started in 1985.17, and the only other sustained decline since 1976 started in 1977.3 and lasted 6 months. Those three runs account for 12 of the 20 declines, the other 8 were isolated declines.

        Your concept of “sustained cooling” has me mystified. How do you define the concept?

  25. Vaughan Pratt

    Response to your latest.

    Your computer apparently “recognizes” things that do not even exist!

    There is no reliable global temperature record earlier than 1850 and your model’s “extrapolation” is meaningless.

    Let’s start in 1850, with CO2 at 290 ppmv, instead of 1750 with CO2 at 280 ppmv. These CO2 numbers are extremely dicey in themselves, as I’m sure you will admit (even though your computer may not “recognize” this).

    Your computer also suffers from the same “low level of scientific understanding” of “natural forcing factors” as was conceded by IPCC, who attribute only 3% of the 1750-2005 warming to natural forcing.

    Yet your earlier figures came out at around 35% natural, taking the percentage your computer attributes to anthropogenic factors for each period and weighting it by the warming observed over that period.

    In addition, several solar studies (made by scientists, who did not suffer from a “low level of scientific understanding” of “natural forcing factors” like IPCC) have suggested that this was very likely closer to 50% as a result of the unusually high level of 20th century solar activity (highest in several thousand years).

    The linear warming (1850-2010) in HadCRUT3 is 0.66°C, so go figure.

    Using your 35%, you end up with a 2xCO2 climate sensitivity around 1°C.

    Lesson to be learned: make sure your computer “recognizes” good data, rather than questionable input – it’s only as “smart” as what you feed it.

    Max

    • There is no reliable global temperature record earlier than 1850 and your model’s “extrapolation” is meaningless.

      So if I measured two power outlets for 10 seconds each, and observed 110V at 60 Hz on the first and 0V on the second, you’re saying that it would be “meaningless” for me to infer that for the 10 seconds before I started the measurement, there was 110V at 60 Hz on the first and 0V on the second?

      And would it be equally “meaningless” for me to infer anything about the next 10 seconds?

      What you’re claiming sounds more like the sort of argument made in philosophy classes than in practical engineering situations. If a tree falls in a forest and no one hears it, does it make a sound? Is that what you’re arguing here?

      • No, Vaughan, you have again misunderstood.

        Your model cannot tell me how much global warming occurred from 1750 to 1850 because there is no global record for this time period. So start your model run with 1850, with the admittedly lousy global record of the time. From 1850 to today the HadCRUT3 record (warts and all) shows a total linear warming of 0.66°C.

        There is also no real record of atmospheric CO2 prior to the Mauna Loa measurements, but here we can at least use the IPCC assumption that the Vostok ice core data is meaningful. This shows a CO2 level of around 290 ppmv for 1850. Mauna Loa tells us it is around 390 ppmv today. So we have two more data points.

        We then have to estimate how much of the warming was caused by CO2 (~ total anthropogenic, according to IPCC) and how much was natural. Your computer model apparently has worked out how much was caused by CO2 for each time segment you picked. Multiplying the total observed warming over each of these periods by these percentages and adding these values up gives a total percentage caused by CO2 of 65% over the entire record. This is your model’s number. IPCC puts the corresponding natural share at only 3%, while several solar studies put it at around 50%, but let’s stick with your model’s number.

        Using this percentage and the observed data one can calculate the 2xCO2 climate sensitivity. It comes out around 1C.

        All very simple engineering-type calculations, with no philosophical stuff thrown in (such as analogies with power outlet measurements a few seconds apart).

        Max

      • I see how easy it is to make an AGW believer happy…just load them up with noisy data from which they can draw endless spurious correlations…then they can happily spend their time creating semi-plausible storylines. They have models and the models are based on math. That’s all they need.

      • They have models and the models are based on math. That’s all they need.

        That and data. What do you have?

      • Your model cannot tell me how much global warming occurred from 1750 to 1850 because there is no global record for this time period

        If I measured two power outlets for 10 seconds each, and observed 110V at 60 Hz on the first and 0V on the second, are you saying that I can infer nothing whatsoever about the preceding 10 seconds because there was no record of the voltage for that period?

        While your argument might score points in a freshman philosophy seminar, I doubt it would impress an electrician out there in the field with her voltmeter. She is most likely going to say that if the voltage is a steady 110 VAC for the 10 seconds she measured it, it almost certainly was the same during the preceding 10 seconds.

        Science tells us nothing; it can only estimate. A model can certainly estimate how much global warming occurred from 1750 to 1850, and it will almost certainly be wrong, but the estimation may be useful (to borrow a handy quote from JC’s latest post). We can quibble all day over “may.”

        Your computer model

        If I said something that made you think I had a “computer model,” my apologies for being unclear. All I have is a simple (one-line) closed form formula for natural climate change, and another for anthropogenic climate change. The only primitives my two formulas use are plus, minus, times, divide, exp, log, and sin. While it’s true I use a computer to evaluate these two formulas at the requisite points for convenience of plotting, I could just as well use a scientific calculator and plot them by hand.

        All very simple engineering-type calculations,

        Correct. And I showed you my simple calculations leading to 3 °C for 2xCO2 climate sensitivity. One crucial difference however is that I’m allowing 25 years for CO2 to impact temperature whereas you seem to be assuming that increasing CO2 instantly impacts surface temperature. If I adopt your assumption then my figure drops to 1.8 °C/doubling.

        So why should we expect that a change in CO2 instantly impacts the surface temperature? I would have thought the ocean would smooth out and therefore delay the response by acting as a sort of thermal capacitor. Do you find that implausible?

      • They have models and the models are based on math. That’s all they need.

        That and data. What do you have?

        I have two things …

        1) AGW supporters are fanatics. Their excessive eagerness blinds their judgement. It doesn’t automatically mean they are wrong. It does strongly suggest they have gone stupid.

        2) I have nothing

        Having nothing has its advantages. It helps me to search for ‘something’ ….

      • Vaughan Pratt

        This exchange is beginning to ramble, but let’s see if we can cap it off.

        You wrote:

        I showed you my simple calculations leading to 3 °C for 2xCO2 climate sensitivity. One crucial difference however is that I’m allowing 25 years for CO2 to impact temperature whereas you seem to be assuming that increasing CO2 instantly impacts surface temperature. If I adopt your assumption then my figure drops to 1.8 °C/doubling.

        Let’s get down to basics here, Vaughan. You have not cited any empirical data that confirms your assumption of ”25 years for CO2 to impact temperature”. Since volcanic forcings obviously do not take 25 years to impact temperature, I find such an assumption extremely dicey.

        Also your own data do not support such a high climate sensitivity.

        It is your estimate I have used for the percentage of warming attributable to CO2, namely:

        I figure that the proportion of climate change over any given 20-year period, including projections to 2050, due to anthropogenic factors is as follows.
        1850-1869: 38%
        1870-1889: 42%
        1890-1909: 32%
        1910-1929: 32%
        1930-1949: 84%
        1950-1969: 51%
        1970-1989: 79%
        1990-2009: 87%
        2010-2029: 98%
        2030-2049: 96%

        So I simply took your percentage for each time period and multiplied it by the linear warming experienced over that time period (HadCRUT3), added these numbers up and divided to arrive at the average anthropogenic proportion of 65%.

        Taking this number plus the other physically observed data since 1850 gives me a 2xCO2 climate sensitivity ~ 1°C.

        One could jiggle this calculation around to accommodate an assumed 25-year lag, but that does not make much sense IMO.

        Basically this would suggest that all feedbacks cancel one another out and our climate system is not unstable, as IPCC would have us believe.

        Max

      • You have not cited any empirical data, which confirms your assumption of ”25 years for CO2 to impact temperature”. Since volcanic forcings obviously do not take 25 years to impact temperature, I find such an assumption extremely dicey.

        Two points here. First, volcanos differ from CO2 in that the addition of aerosols by a volcano to the atmosphere occurs as a single event whose effect then decays over a period of a number (10, 20 or more) of months. CO2 is the opposite: it has been increasing for more than a century. Hence one would expect little or no delay in the impact of a volcano, but a nontrivial delay in the case of rising CO2.

        Second, my model is intended to take whatever delay has been assumed into account and to use the notion of climate sensitivity appropriate to that delay. Thus by “assuming 25 years” I mean that for someone who thinks the delay is 25 years, climate sensitivity would be around 3 °C. For those who think it is zero, climate sensitivity would be 1.8 °C. The actual climate behavior is no different in either case, it’s a question of what definition of climate sensitivity you’re working with. The IPCC considers different possible notions of climate sensitivity, a concept which I incorporate into my model so as to make climate sensitivity a function of whatever one believes the delay to be.

      • So I simply took your percentage for each time period and multiplied it by the linear warming experienced over that time period (HadCRUT3), added these numbers up and divided to arrive at the average anthropogenic proportion of 65%.

        But I could just as well have used 30-year intervals instead of 20-year, and your averaging procedure would likely give quite a different result, which moreover would bear little or no relationship to the definition of the anthropogenic factor for the time interval [s,t] as a(s,t)/(a(s,t)+|n(s,t)|). If the natural increase n(s,t) for that interval is positive, this is simply the proportion of total warming attributable to the anthropogenic factor a(s,t), as one would expect. In the long run (millions of years) the natural fluctuations for each millennium should average out to very close to zero or Earth would be in the same boat as Venus: impossibly hot. In the short run of say 150 years, overall n(s,t) will be smaller than overall a(s,t) because n(s,t) fluctuates around a mean of zero while a(s,t) grows with growing population. Hence for 150 years the anthropogenic factor may well be considerably larger than the average your method produces.

      • Vaughan

        Your latest posts from September 11 and 12 do not add anything new to our discussion.

        Assumptions on time delay, climate sensitivity, etc, are nice. But they are just that: assumptions.

        Physical observations on changes in temperature and CO2 levels tell us a bit more, but we are stuck with making assumptions regarding the relative forcing of CO2, other anthropogenic factors and natural factors.

        Here we have IPCC on one side suggesting that natural factors only represented around 3% of the forcing since 1750, with the anthropogenic forcing components other than CO2 cancelling one another out.

        Then we have several solar studies suggesting that around half of the warming can be attributed to the unusually high level of 20th century solar activity.

        In between these two estimates we have your model assumption, which results in roughly 65% of the warming attributed to CO2.

        If we use the IPCC assumptions above plus the observed data from 1850 to today, we end up with a 2xCO2 CS of ~1.4°C.

        If we use your model’s assumptions, we arrive at ~1.0°C.

        If we use the postulations of the solar studies we arrive at ~0.7°C.

        All of the above without complicating things by adding in an assumed significant time delay (or “hidden in the pipeline” postulation), which would, of course, increase the CS.

        All of this points toward our host’s observation that there is a whole lot of uncertainty out there regarding a) natural climate forcing and b) positive/negative/or neutral net overall feedback, which make it uncertain whether or not AGW is a potential problem or nothing to worry about at all.

        Some of this uncertainty may prove to be impossible to resolve with added scientific knowledge based on empirical data, while other portions may be resolved with new research results. Will the CLOUD experiment at CERN give us new knowledge to help resolve some of this uncertainty? Or will new knowledge come from a totally different source? Who knows?

        It’s still a wide-open question with nothing cast in concrete.

        Max

      • It’s still a wide-open question with nothing cast in concrete.

        Those who define their terms consistently see a consistent world. Those who pay no attention to the definition of the terms they bandy about freely see only a confusing world where nothing is cast in concrete.

        The confusion you speak of is in your own mind. It is created by refusing to define your terms before reasoning with them. The upshot is a mishmash of illogic that makes no sense. You then project this confusion on those around you and blame them for it.

        One cannot reason reliably about terms whose definitions keep changing.

      • @Vaughan Pratt

        “Those who define their terms consistently see a consistent world.”

        Agreed. It’s called being blinded by the truth

        A “Sure bet” is a desirable proposition. It isn’t easy to resist such enticement. When that enticement amounts to premature convergence in a poorly posed problem it represents being caught up in a basin of attraction for an excessively localized critical structure.

        A person becomes blinded by the light … blinded by a premature and localized truth.

        It is difficult to escape such enticing entrapment!
        It’s not easy to decide whether that trap isn’t truthfully an appropriately safe haven.

        … Works both ways

      • Those who define their terms consistently see a consistent world.

        A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines.

        Ralph Waldo Emerson (1803–1882)

      • Well, at least you’re consistent in your reliance on inconsistency to make your points.

  26. Amazing image of CO2 ice on Mars:
    http://hirise.lpl.arizona.edu/ESP_023464_0945

  27. Coal to gas: the influence of methane leakage
    http://www.springerlink.com/content/b430681263425q64/fulltext.pdf

    • Edim, I appreciate your attempts to ensure China burns 10% more coal year in and year out.

      But I personally would still prefer Natural Gas no matter how many idiots are allowed to publish fictional papers.

      “In summary, our results show that the substitution of gas for coal as an energy source results in increased rather than decreased global warming for many decades”

      The corollary must also be true:

      “In summary, our results would also show that the substitution of coal for gas as an energy source results in decreased rather than increased global warming for many decades.”

      Hurrah for coal. It will save us from global warming!

      • Save us from a good thing? Warming is benign, and if the cooling trend deepens and/or if we get a full reversion to Ice Age normal, we’ll be thankful for every scrap of warming we can scrape together.

        And even if there is no cooling trend, the “Optimums” (Roman and Holocene) were much warmer than today. I think we might quite enjoy a replay.

    • Edim

      You cited an interesting study by Tom Wigley.

      Since Wigley has been more alarmist than IPCC on the dire consequences of AGW, we should probably take his article “Coal to gas: the influence of methane leakage” with a grain of salt, but let’s analyze it skeptically anyway.

      On first glance, it sounds like a “plaidoyer” (or plea brief) against shale gas, rather than an endorsement of coal.

      Wigley’s conclusion is that, if increased natural gas consumption introduces a 2% leakage, then the negative greenhouse impact from the leaked methane will offset any CO2 benefit from replacing coal.

      For the conversion from coal to gas, Wigley does concede, however:

      The overall effects on global-mean temperature over the 21st century, however, are small.

      It is to be noted that Wigley did not estimate “the overall effects on global-mean temperature over the 21st century”, but stated:

      In our analyses, the temperature differences between the baseline and coal-to-gas scenarios are small (less than 0.1°C) out to at least 2100.

      This is, indeed, “small”, but let’s see how much less than 0.1°C out to 2100 it would really be.

      But first, let’s do a reality check on Wigley’s assumptions.

      Wigley states:

      For our coal-to-gas emissions scenario we assume that primary energy from coal is reduced linearly (in percentage terms) by 50% over 2010 to 2050 (1.25%/yr), and that the reduction in final energy is made up by extra energy from gas combustion.

      After 2050 we assume no further percentage reduction in coal-based energy (i.e., the reduction in emissions from coal relative to the baseline scenario remains at 50%).

      How realistic is this assumption?

      Over 40% of the world’s electrical power comes from coal today. This represents an installed capacity of around 1,760 GW.

      Worldwide, around 70 GW of new coal-fired power capacity is being added every year, with China and India alone accounting for around 90% of this new capacity. There are no new coal-fired plants currently planned for the USA, but if the country returns to its rate of new additions over 2000-2008, this would represent around 4 GW of new capacity annually.
      http://www.engineerlive.com/Power-Engineer/Focus_on_Coal/Coal-fired_power_plants_capacity_to_grow_by_35_per_cent_in_next_10_years/21600/

      IOW, for Wigley’s assumption to be realistic, both China and India would have to drastically reduce the new construction of coal-fired power plants and switch these over to natural gas.

      Since both nations have large coal reserves and hardly any natural gas, it is highly unlikely that they will make a major switch to gas-fired plants in the future, so Wigley’s basic assumption does not pass a reality check.

      But let’s go along with Wigley’s premise anyway and check out the temperature impact of switching new power plants to natural gas.

      Wigley reckons with a 1.25% linear annual reduction of worldwide coal-fired power capacity over 40 years and no change thereafter. With today’s roughly 1,760 GW installed, this represents a net annual reduction of coal-fired capacity of about 22 GW.

      IOW, over the next 40 years there will be a switch of 22 GW power capacity from coal to gas.

      A typical coal-fired plant generates around 1,100 g CO2 per kWh, while a gas-fired plant generates around half that amount per kWh. Let’s forget about Wigley’s “leakage” estimate (for either natural gas production or coal mining), because this appears to be basically a “red herring” directed at shale gas.

      These plants operate at an average of 8,000 hours per year, so for 22 GW this equals 176,000 GWh per year; at a saving of roughly 550 g CO2 per kWh, that is a net annual CO2 reduction of 0.0968 Gt.

      Over the period 2010-2100 this represents a cumulative reduction in CO2 emissions of 6.78 GtCO2 (roughly 0.0968 Gt per year over an effective 70 years).

      The mass of the atmosphere = 5,140,000 Gt.

      This represents a reduction of atmospheric concentration by 2100 of:
      6.78 * 1,000,000 / 5,140,000 = 1.32 ppm(mass) = 0.87 ppmv (converting by the ratio of molar masses, 29/44)

      Using the logarithmic relation between CO2 and temperature and the mean value for the 2xCO2 climate sensitivity of 3.2°C, as assumed by the IPCC models, we arrive at a net temperature reduction by year 2100 of 0.0069°C.
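
      For anyone who wants to retrace the arithmetic, here is a minimal Python sketch of the calculation above. It simply re-runs the figures as stated in this comment (22 GW switched, 8,000 hours/year, a saving of ~550 g CO2/kWh, an atmosphere of 5,140,000 Gt, 3.2°C per doubling of CO2); the 70 “effective years” of savings and the ~580 ppmv end-of-century CO2 baseline are my own assumptions, chosen only to reproduce the 6.78 Gt and 0.0069°C figures, not numbers taken from Wigley.

      # Back-of-envelope check of the numbers above; ASSUMED values are mine.
      import math

      switched_gw      = 22            # GW of coal capacity switched to gas (as read above)
      hours_per_year   = 8000          # average operating hours per year
      saving_g_per_kwh = 1100 - 550    # coal minus gas emissions intensity, g CO2/kWh
      effective_years  = 70            # ASSUMED effective years of savings over 2010-2100

      kwh_per_year  = switched_gw * 1e6 * hours_per_year      # GW -> kW, then kWh per year
      annual_gt     = kwh_per_year * saving_g_per_kwh / 1e15  # grams -> gigatonnes
      cumulative_gt = annual_gt * effective_years             # ~6.78 GtCO2

      atmosphere_gt = 5.14e6                                  # mass of the atmosphere, Gt
      ppm_mass = cumulative_gt / atmosphere_gt * 1e6          # ~1.32 ppm by mass
      ppmv     = ppm_mass * 29.0 / 44.0                       # ~0.87 ppmv (molar mass ratio)

      sensitivity   = 3.2    # deg C per doubling of CO2 (IPCC mean)
      baseline_ppmv = 580    # ASSUMED year-2100 CO2 level implied by the 0.0069 C result
      delta_t = sensitivity * math.log(baseline_ppmv / (baseline_ppmv - ppmv)) / math.log(2)

      print(f"annual saving {annual_gt:.4f} Gt, cumulative {cumulative_gt:.2f} Gt")
      print(f"{ppm_mass:.2f} ppm(mass) = {ppmv:.2f} ppmv, delta T ~ {delta_t:.4f} C")

      Running it reproduces the chain above: 0.0968 Gt/yr, 6.78 Gt cumulative, 1.32 ppm(mass), 0.87 ppmv and roughly 0.007°C by 2100.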

      This is well below Wigley’s estimate of “less than 0.1°C” (in fact, it’s less than 0.01°C), and if I were China or India, I would ask:

      Why the hell are we doing this in the first place?

      Max

      • Good analysis Max. I know Wigley has been very alarmist. I don’t agree with the paper. I find almost all mainstream AGW/CC papers absurd. I just cited it because it was new. I am making a mockery of the paper by citing it.

  28. Vaughan

    Extracting from another of these interminable nests.

    http://judithcurry.com/2011/09/07/global-portrait-of-greenhouse-gases/#comment-111185

    Swanson extracts the ‘true warming signal’ from the recent warming – 1979 to 1997. That is, it excludes the ENSO fluctuations that occurred in 1976 and 1998/2001, treating them as extreme events of the sort that occur at times of chaotic bifurcation. Although, since the ENSO event spanned 1997/98, 1997 should arguably also be excluded.

    So we have a residual warming of 0.1°C/decade. Is this a problem, and was it all due to CO2? NASA suggests that it was in some part due to clouds – and I know we have had this discussion before. Did you look at my post on decadal variability of cloud? It would save a lot of time if we were on the same page.

    ‘The overall slight rise (relative heating) of global total net flux at TOA between the 1980’s and 1990’s is confirmed in the tropics by the ERBS measurements and exceeds the estimated climate forcing changes (greenhouse gases and aerosols) for this period.’ There was in fact ‘relative cooling’ in the IR.

    From Swanson et al 2007: ‘This suggests that the climate system may well have shifted again, with a consequent break in the global mean temperature trend from the post 1976/77 warming to a new period (of indeterminate length) of roughly constant global mean temperature.’

    The shifts are associated with the roughly 60-year periodicity seen in ocean and atmospheric indices and in biological – especially fisheries – cycles. There are two things to keep in mind here.

    Firstly – the decadal variability is associated especially with Pacific variability – and results in many changes in temperature, convection, clouds and hydrology. ENSO changes the surface temperature in two ways: through energy transfer between ocean and atmosphere, and through changes in cloud radiative variability at TOA. La Nina cools the planet and El Nino warms it.

    ‘Specifically, when the major modes of Northern Hemisphere climate variability are synchronized, or resonate, and the coupling between those modes simultaneously increases, the climate system appears to be thrown into a new state, marked by a break in the global mean temperature trend and in the character of El Nino/Southern Oscillation variability.’ op. cit.

    So what is the nature of decadal ENSO variability?

    It can be seen here – http://www.esrl.noaa.gov/psd/enso/mei/ – a La Nina bias (blue) to 1976, an El Nino bias to 1998 and a La Nina bias since. The decadal modes are linked to the PDO and last for 20 to 40 years in the instrumental record. The cool PDO is associated with more intense and frequent La Nina events, and vice versa.
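
    Purely as an illustration of what a decadal ENSO ‘bias’ looks like in an index – this is not an analysis from the linked NOAA page – the short sketch below averages a monthly ENSO index over the three regimes just described. It assumes you have already saved the index as a simple CSV with “date” and “mei” columns; the file name, column names and epoch end-dates are hypothetical placeholders.

    # Illustrative only: mean ENSO index value over the three regimes discussed above.
    # Assumes a local CSV "mei_monthly.csv" with columns "date" (YYYY-MM-01) and "mei";
    # this is a made-up layout, not the NOAA download format.
    import pandas as pd

    df = pd.read_csv("mei_monthly.csv", parse_dates=["date"])

    epochs = {
        "to 1976 (cool PDO)":   ("1950-01-01", "1976-12-01"),
        "1977-1998 (warm PDO)": ("1977-01-01", "1998-12-01"),
        "1999 on (cool PDO)":   ("1999-01-01", "2011-08-01"),
    }

    for name, (start, end) in epochs.items():
        sel = df[(df["date"] >= start) & (df["date"] <= end)]
        print(f"{name}: mean index = {sel['mei'].mean():+.2f}")

    # A negative mean indicates a La Nina bias for that regime, a positive mean an El Nino bias.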

    So there is quite some justification for supposing the planet won’t warm for another decade or three.

    The other thing to keep in mind is that the variability is not limited to what is seen in the instrumental record. ENSO varies considerably over much longer timeframes.

    http://i1114.photobucket.com/albums/k538/Chief_Hydrologist/ENSO11000.gif

    There is some suggestion that ENSO variability is linked to solar UV drift – which would suggest the possibility…

    The problem with warmists is that they insist that climate is so complex it can only be understood with a supercomputer – and then that it is so simple that only a moron would not get it.

    I don’t think you have a broad grasp of science – not even the concept of spectral absorption in the atmosphere – and so solve the wrong problem in the wrong way.

    I do reject AGW entirely – as a linear concept in a non-linear world.

    See – http://www.whoi.edu/page.do?pid=12455 – and –
    http://www.nap.edu/openbook.php?record_id=10136&page=1 – and – http://www.biology.duke.edu/upe302/pdf%20files/jfr_nonlinear.pdf

    My expectation is, however, that you are an AGW ‘spaceship cultist’ – and can’t process anomalies.

    Cheers
    Robert I Ellison
    Chief Hydrologist

    • My expectation is, however, that you are an AGW ‘spaceship cultist’ – and can’t process anomalies.

      So what’s your long-term plan here, CH? To convert every spaceship cultist one at a time by convincing them of the correctness of your views? How’s that been working out?

      • I keep warning them to stay out of the Kool-aid queue in the AGW equivalent of ‘the spaceship ain’t coming anytime soon’. What is it – a decade and counting, with peer reviewed science suggesting at least another 10?

        ‘Researchers first became intrigued by abrupt climate change when they discovered striking evidence of large, abrupt, and widespread changes preserved in paleoclimatic archives… Modern climate records include abrupt changes that are smaller and briefer than in paleoclimate records but show that abrupt climate change is not restricted to the distant past.’
        http://www.nap.edu/openbook.php?record_id=10136&page=19

        Did you try to understand any of it?

        Instead you seem to specialise in smartarse quips, stupid analogies and simplistic analysis. My plan is to try to rescue something from your nonsense.