by Judith Curry
A few things that caught my eye this past week.
A detailed and understandable account of how raw data make it onto global temperature graphs [link]
Die Klimazwiebel: Hottest summer – what does this tell us? [link]
A relatively sensible perspective: Was that extreme weather event influenced by climate change? [link]
Time running out for Larsen C? [link]
Absence of 21st century warming on Antarctic Peninsula consistent with natural variability [link]
A Nearly Ice-Free Northwest Passage [link]
Arctic sea ice decline is like a ball bouncing down a bumpy hill [link]
History: A reanalysis of Hurricane Camille (1969) [link]
Another paper modeling future tropical cyclones finds lower freq near SE coast, increase in open Atlantic. [link] …
The Record-Breaking 2015 Hurricane Season in the eastern North Pacific: An Analysis of Environmental Conditions [link]
Atlantic bathwater: Why the ocean is so warm right now and what it means [link]
NYTimes: Giant Coral Reef in Protected Area Shows New Signs of Life [link]
Big fish – and their pee – are key parts of coral reef ecosystems [link]
“Two centuries of [natural] decadal climate variability across the Pacific North American region” [link] …
New Survey Finds Antarctica Covered in More Ice Than Previously Thought [link]
Anticipating Disaster: Formal Climate Information vs. Traditional Ways of Knowing Floods and Droughts [link]
The BBC picks private weather forecaster as it ditches Met Office after 94 years [link]
Climate modeling suggests Venus may have been habitable [link]
Death of the Bering Strait theory for the population of the Americas [link]
Ever noticed that #rain has a smell? It’s known as #petrichor [link]
Today’s men are not nearly as strong as their dads were, researchers say [link]
The science of science
New paper condemns ‘Atmosfear’ fear-based approach to communicating effect of Climate Change on Extreme Weather [link] …
From the American Council on Science and Health: Political Correctness Hampers Advancement of Science [link]
Science communication as a moral imperative [link]
New political science initiative calls for evaluating the research process before knowing the results [link]
Flawed citation practices facilitate the unsubstantiated perception of a global trend toward increased jellyfish blooms [link]
Should writing for the public count toward tenure? [link]
People see the weather differently depending on their politics (high importance of minimizing cognitive dissonance). [link]…
The impact of academia on Parliament: 45 percent of Parliament-focused impact case studies were from social sciences [link]
News from the alarmed
Obama: Climate Change ‘Could Mean No More Glaciers In Glacier National Park,’ Threaten the Statue of Liberty [link]
Scientist calls for World War mobilization vs climate change [link]
“Is it useful to think of climate change as a ‘world war’?” [link] …
Climate activists: Should we be having kids in the age of climate change? [link]
World’s leading sea ice expert (Peter Wadhams) warns that the Arctic sea ice death spiral will make global warming worse [link]
New study: Leo DiCaprio got people talking about climate change, if only for awhile. [link] …
Piers Sellers: Space, Climate Change, and the Real Meaning of theory [link]. If you read one article from this subsection, make it this one.
“Fundamentally, a theory in science is not just a whim or an opinion; it is a logical construct of how we think something works, generally agreed upon by scientists and always in agreement with the available observations.”
That’s nice, but no matter how cool the theory, it does need to be supported by empirical evidence. Also, all theories in science require and encourage discussion and debate without calling people “deniers”. Activism and science do not mix, because activism corrupts science.
Agreement with observation and empirical evidence are the same thing.
However, the term “theory” is much vaguer than described here. In many cases it is synonymous with hypothesis, so every conjecture is a theory. Thus there need be no agreement. Nor need there be complete agreement with observation. For over 100 years there was raging debate between the particle and wave theories of light. Each had some observational support, plus some non-support. Science is not simple.
Theories in science become “theories” because they *are* supported by empirical data.
Karl Popper wrote “scientific theories must be falsifiable (that is, empirically testable), and that prediction was the gold standard for their validation”.
The theory of climate change due to the accumulation of greenhouse gasses has never been empirically tested (decrease the concentration of CO2 and observe global cooling), and no models predicted the slowdown in the rate of global warming after 2000, or the current acceleration in the rate of warming.
On both counts, climate change due to the accumulation of greenhouse gasses fails as a viable scientific theory.
Or am I missing something?
“The theory of climate change due to the accumulation of greenhouse gasses has never been empirically tested (decrease the concentration of CO2 and observe global cooling)”
Because it’s impossible to do such experiments — unless you have a second Earth we can play with.
Climate science isn’t an experimental science, it’s an observational science — like geology, like medicine, like astronomy.
David, you are far too restrictive in your interpretation of the word experiment. Some of Einstein’s theories were tested by observing astronomical events. That didn’t need a spare universe nor does climate science need a spare earth.
All that is needed are falsifiable predictions. You know, like children just aren’t going to know what snow is, or the rains which fall will no longer fill Australian dams, or the arctic will be ice free by , or that the earth will soon have a climate like Venus.
I’ve given you a few false falsifiable predictions. Now it’s over to you. State a clear falsifiable prediction, when it was made, and when it came true.
You wrote “Climate science is not an experimental science, it is an observational science”
All too true, and thus the science is prone to being wrong if the wrong conclusions are drawn from the observations (the earth was once thought to be flat, the sun to revolve around the earth, and continental drift not to exist).
You also did not address the inability of the greenhouse gas theory to make accurate (temperature) predictions, a requirement for any valid scientific theory.
Here’s a Quick N Dirty Guide to Falsifying AGW:
It’s their complete inability to accept the falsification of their theory that’s at issue: first trying to fudge things, and then, when one can’t fudge things any longer, adjusting the hell out of their own data. If at first one doesn’t succeed, move the goal posts.
There was a time when a pause of 15 years was (supposedly) enough to kill CAGW. That passed, so they moved the goal posts; then, when that didn’t seem to be going too well for them, they gave us the various pause-busting papers… There are lies, damned lies, statistics, and then there’s Climate Studies.
It doesn’t help that they add extra colour to their works with various op-eds: “Children won’t know what snow is…”, “Death spirals…”, etc.
They can’t even agree on the effects of clouds. Honestly, any other area of science at this level of maturity would generally be beavering away and keeping itself to itself, not trying to change the world, and not sticking its fingers in its ears when someone else helpfully points out that a mistake has been made here and there.
> It’s their complete inability to accept the falsification of their theory
Like the Denizens’ theory that AGW can’t be falsified.
Another response, this time from NG:
That was only five years ago.
Judith, I read the Sellers piece and was totally unimpressed. Just another amateur science communicator talking about things he has only third-hand knowledge of. His explanation of climate models is fourth-rate.
It was also discussed elsewhere. A lot of Sellers’s ‘facts’ are also off.
I looked up Piers Sellers’ academic record. He appears to have ~100 publications, an h-index of almost 50, and 3 of his 10 most highly-cited papers (all 3 of which he is first author) mention GCMs. None of that necessarily means that he is any good at what he does, but it does mean that suggesting that he has third-hand knowledge is bizarre. Personally, I think you’re talking out of your hat.
The thing that always bothered me about “attribution studies” is that they only study disasters — never “lack of disasters”.
You can assume that climate change has some effect on *everything*. So any natural disaster can have some percentage of it attributed to climate change. Might be 1%, might be 50%. So if you take all of the natural disasters and all of their attributions (even if you assume you got them exactly right), you can say “Climate change has cost us x,000,000,000 dollars from natural disasters.”
But the problem is that when *nothing* happens, there is no study. We’ve had a huge lack of major hurricanes hitting the US. I have no idea how much climate change has affected that. But even if it was only 1%, then you could say “climate change *saved* us x,000,000,000 dollars.”
If you only count the costs and not the benefits you really aren’t getting any useful information. I’m not sure there is any motivation to change this, though.
“New paper condemns ‘Atmosfear’ fear-based approach to communicating effect of Climate Change on Extreme Weather [link]”
“In reality, recent increases in damages and losses due to extreme weather events are due to societal factors. Thus, invoking atmosfear through such approaches as attribution science is not an effective means of either stimulating or legitimizing climate policies.”
H/T Dr. Pielke, Jr.
+10 for Jankovic and Shultz for Atmosfear.
The Sellers article was the usual talking points gibberish.
+5 for Tim Gleason’s comments on the lack of attention to possible positive effects of possible climate change. Without a cost/benefit analysis, comments such as the president’s are just more gibberish.
Piers Sellers displays a shocking lack of understanding of the fundamentals of computer models. He doesn’t seem to distinguish between hindcasting and forecasting, apparently considering that hindcasting is just as good. He doesn’t appear to have ever heard of GIGO. Most of the rest of the article is vapid tripe. I am very surprised, Judith, that you would recommend that article.
It occurs to me that your complaint is precisely why she did so. A seemingly authoritative NASA astronaut, completely off base in several ways. But you have to be knowledgeable enough to see that. The general public gets misled.
After reading the article, this was my conclusion re Judith’s “must read” tag as well. To my mind, even the inclusion of this article – not to mention its last-but-not-least position on her list – strongly suggests that she may well have determined it was an excellent substitute for the cartoons with which she has ended such lists in the past :-)
Saw this just this morning:
“NASA Unveils New Public Web Portal for Research Results”
Sounds like a step in the right direction.
Re: A reanalysis of Hurricane Camille (1969)
Very interesting. Thanks for posting the link.
I agree, opluso. It brings back many memories of the storm and its aftermath.
Just one little story: A couple of days after landfall my father, who was the Mississippi personnel director for “a large telecommunications company”, drove from Jackson to the remains of Gulfport with an armed guard and a folding table, two folding chairs, and several boxes full of cash in the trunk of a company car. For the next two days they handed out cash to employees based on nothing more than a company ID, a signature beside each handwritten dollar amount, and a handshake.
The destruction and suffering were epic. While traces of Camille remained for many years, the return to “normalcy” was surprisingly rapid and orderly for most.
Dr. Curry ==> “Piers Sellers: Space, Climate Change, and the Real Meaning of theory [link]. If you read one article from this subsection, make it this one.”
Would you share your thoughts on why Sellers’s essay is so important?
KH, will not speak for Judith, but see my reply to philipcollet just above. I think she points to something seemingly reputable that is very off in several dimensions, as an example of how bad warmunist stuff has become.
Piers Sellers is a huge honcho at NASA and carries huge authority. He is also a good writer and an effective communicator. He is worth reading for those two factors alone. This is a very influential essay. Personally, I disagree with much that he said.
Perhaps criticism of Mr. Sellers’s reasoning may not be fully justifiable.
‘Spacewalking’ astronauts in particular are poorly protected against space radiation; long-term brain damage could be the most likely effect. The prefrontal cortex is the least protected, and it is associated with a range of mental functions, one being logical reasoning.
Piers Sellers is a huge honcho at NASA, carries huge authority. His NASA would never make it to the Moon.
So true Curious.
Miss those ‘Right Stuff’ guys.
Fraidy cats run everything now.
Is Tang gluten and GMO free?
Judith writes, while noting her disagreement:
In my ignorance, I had never heard of Sellers before. But, that aside … There goes my theory as to why mention of this article received its place of honour this week. <sigh>
Judith, if you disagree with Sellers, then as a scientist you have an obligation to explain those differences.
It’s not enough to just state a disagreement and let it go at that. Be a professional.
Humanity lost its frame of reference, and will re-establish contact with reality or become as extinct as a goose without an internal compass to distinguish north from south.
The internal compass of every atom, life and planet in the solar system is indelibly recorded in precise rest mass measurements of every known atom:
Add spin, and charge, and color for quarks and gluons….
Yes, David, this rainbow of glitter from CERN made government propaganda ever more attractive to “theoretical physicists.”
Now even the Higgs boson is government propaganda?
Oddly funny. Funnily odd.
Not to “theoretical physicists.”
More background on Dr. Piers Sellers here. As well as being an astronaut, he was first a climate scientist and well familiar with climate and weather models, and has returned to that with his research group.
Which makes Sellers’s article even more inexcusable than it already is.
No specificity in your attack at all. It is worthless like that.
JimD, Sellers’s trite references to “theory” as justification are the epitome of nonspecificity. It’s just vague nonsense.
The piece is effective, and there is nothing in there that is not supportable from the scientific standpoint. It separates science from opinion and tells you the basic thinking behind the confidence of the consensus. In the end, it is just the science.
JimD, as just one specific example, the fifth sentence of Sellers’s second paragraph contains two separate falsehoods. The sentence asserts that but for CO2, temperature since 1890 would be roughly constant. Falsehood 1: even AR4 said the rise from ~1920-1945 was not attributable to CO2, yet that period is statistically indistinguishable from ~1976-2000. Falsehood 2: based on the Keeling curve, ~35% of all CO2 since 1958 was injected after 1999, a period during which satellites observe no rise in temperature except the now rapidly cooling 2015 El Nino blip.
You say he is a knowledgeable climate scientist. Then I say he is also knowingly misrepresenting important facts about attribution and the CO2 ‘control knob’. That is dishonest.
“Falsehood 2: based on the Keeling curve, ~35% of all CO2 since 1958 was injected after 1999, a period during which satellites observe no rise in temperature except the now rapidly cooling 2015 El Nino blip.”
RSS LT shows +0.12 C of warming since Jan 2000.
UAH LT shows +0.16 C.
Both intervals are too short to be representative of climate change.
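As a side note on how such short-period figures are produced: the quoted warming numbers come from an ordinary-least-squares trend fitted to monthly anomalies. Here is a minimal sketch on synthetic data (this is NOT actual RSS or UAH data; the trend and noise level are invented for illustration):

```python
import numpy as np

# Synthetic stand-in for a satellite lower-troposphere anomaly record,
# Jan 2000 .. Dec 2015. The "true" trend and noise are assumed values.
rng = np.random.default_rng(0)
years = np.arange(16 * 12) / 12.0                    # time in years
true_trend = 0.012                                   # deg C per year (assumed)
anomalies = true_trend * years + rng.normal(0, 0.1, years.size)

# Degree-1 least-squares fit; polyfit returns [slope, intercept]
slope_per_year = np.polyfit(years, anomalies, 1)[0]
print(f"trend: {slope_per_year * 10:.2f} C/decade")
```

With only 16 years of noisy data, the fitted slope carries a wide uncertainty, which is the caveat made above about short intervals.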
JimD asks for specifics, gets this: “Falsehood 1: even AR4 said the rise from ~1920-1945 was not attributable to CO2, yet that period is statistically indistinguishable from ~1976-2000,” and then ignores that devastating fact.
Come on Jim, how does this little fact fit in with the “basic thinking behind the confidence of the consensus”?
He is not a knowledgeable climate scientist, because knowledge implies truth. He is a devout climate scientist. The sorts of errors ristvan points to are common among climate scientists.
You would do well to listen more to the people who worked in the field most. These people have a feel for the system interactions that only comes with experience and using experiments for testing hypotheses. The armchair critics just don’t have that feel.
JimD, as a licensed lawyer for now almost exactly 40 years, I have a profound feel for truth, gained the hard way. As a practicing businessman (strategic consultant, senior exec, VC, entrepreneur) I also have a painfully developed BS detector. If I accepted what every expert witness or expert consultant said, it would not have gone well.
Your faith in warmunist ‘expertise’ and ‘feel for interactions’ is sadly misplaced. You might try essay ‘Cloudy Clouds’ in ebook Blowing Smoke for a particularly cutting, documented example from this mere ‘armchair critic’. You are being manipulated and lied to by warmunists like Sellers. How many examples do we have to give you before you begin to understand that? See two Sellers examples in one sentence, now released from moderation just above.
Your repeated appeals to authority (Sellers) are increasingly pathetic. Why not do some fact checking of them yourself? Here is a starter offer on cloud feedback. What is the r^2 of Dessler’s ‘famous’ 2010 cloud feedback paper? What does NASA (and Dessler himself) say about it on NASA’s ‘official’ climate website? (Clue: answers are documented with footnotes, image captures, and links in essay ‘Cloudy Clouds’; all the homework is already done for you, but available for you to double check yourself.) Now this presumes you know Stats 101. If not, you can google correlation coefficients, how they are calculated, and their statistical meanings.
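For readers who want the Stats 101 refresher mentioned above: a minimal sketch of what a correlation coefficient and its r^2 measure, computed on synthetic data (the numbers here are invented for illustration and have nothing to do with Dessler’s actual dataset):

```python
import numpy as np

# Synthetic scatter: a weak linear signal buried in noise (illustration only)
rng = np.random.default_rng(1)
x = rng.normal(0, 1, 500)                # e.g. a temperature-like predictor
y = 0.1 * x + rng.normal(0, 1, 500)      # weakly related response + noise

r = np.corrcoef(x, y)[0, 1]              # Pearson correlation coefficient
r2 = r ** 2                              # fraction of variance "explained"
print(f"r = {r:.2f}, r^2 = {r2:.2f}")
# A small r^2 means the linear fit explains little of the scatter,
# so any slope (feedback) estimated from it carries wide uncertainty.
```

That is the statistical point at issue: a regression slope from a cloud of points with a tiny r^2 is a weakly constrained number, whatever its sign.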
As a lawyer, you should know that observational evidence is a key determinant of the truth of the matter. Since 1950, we have had 75% of the CO2 increase and 75% of the warming. The rate is consistent with 2.4 C per CO2 doubling, or 2 C per doubling if you consider that CO2 is 80% of the net forcing change. This supports the consensus transient sensitivity. You may protest “but solar natural variation…”, but that one fails because the sun is weaker now than it was in the 1950’s. Then you might say “but the oceans…”, but we have taken a 60-year timeframe that cancels the ocean oscillations to a small fraction of the trend, of indeterminate sign. I could get into the observation-based imbalance too, which shows that the warming has not even kept up with the net forcing change, but that is just part of the >100% attribution argument. So, no, the support of AGW is not just models, but also what has been happening while we have had measurement records.
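The back-of-envelope arithmetic behind the quoted per-doubling figures can be sketched as follows, using round illustrative numbers (roughly 310 ppm CO2 in 1950, 400 ppm today, and 0.85 C of warming — these are assumptions for the sketch, not official values):

```python
import math

# Assumed round figures, for illustration only
c0, c1 = 310.0, 400.0      # CO2 concentration, ppm: ~1950 vs ~today
dT = 0.85                  # warming since 1950, deg C (assumed)

# Forcing scales with the logarithm of concentration, so express the
# concentration change as a fraction of a doubling.
doublings = math.log(c1 / c0) / math.log(2.0)

tcr_all = dT / doublings           # if CO2 explains all of the warming
tcr_co2 = 0.8 * dT / doublings     # if CO2 is 80% of the net forcing change

print(f"{doublings:.2f} doublings; "
      f"{tcr_all:.1f} C/doubling (all), {tcr_co2:.1f} C/doubling (80%)")
```

With these inputs the two estimates land near the 2–2.4 C per doubling range quoted in the comment; the point of the sketch is only to show where such numbers come from, not to settle the attribution argument.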
For JimD below. Yes, we’ve had 75% of the CO2 increase and 75% of the warming. We’ve also had 75% of the bullxxxx from the IPCC. The rate of change in both can be and is consistent with many things, but correlation is not causation. It MAY indicate there is a connection, but ‘consistent with’ is not causation. That observations have not kept up with model predictions is a fact that implies the models are not correct. There are also ice core records that show directly that warming precedes CO2 increase, which implies that changes in temperature cause the increase through simple, well known physical chemistry.
Another fact: the increasing histrionics, such as from the president, correlate with a lack of unusual effects from global warming and imply that the theories behind it are not sound.
“There are also ice core records that show directly that warming precedes CO2 increase, which implies that changes in temperature cause the increase through simple, well known physical chemistry.”
This argument is so wrong I can’t believe anyone ever tries to use it anymore.
First of all, go learn about the PETM (Paleocene-Eocene Thermal Maximum).
Second, this argument (obviously) does not hold when humans are digging up fossil fuels and burning them. Do you think they’re waiting for a temperature change until they emit the CO2? Of course not — the CO2 emissions come first.
Temperature and CO2 are in a mutually reinforcing feedback loop. More of one leads to more of the other. The glacial-interglacial temperature change of this ice age would only be about two-thirds of what is observed without CO2’s feedback.
@ Jim D
Sounds like a new variant of logical fallacies:
“Appeal to experts gut feeling”
“World’s leading sea ice expert (Peter Wadhams) warns that the Arctic sea ice death spiral will make global warming worse”
“… Climate change has been caused by ignorance and stupidity and cannot be solved by endorsing more of the same with geo-engineering. The only answer is reducing greenhouse emissions. Fast.”
How foolish is it to claim that burning coal, oil and gas, which has in the end brought ever-increasing life expectancy, reduced neonatal mortality, and all the rest, would be a bad thing… caused by ignorance and stupidity? WTF are these bozos at The Guardian talking about????
Ars Technica on surface temperature corrections. It is good, but quite incomplete and therefore misleading. TOBS, sure. Instrument changes, sure. Site moves, sure. But it does not, for example, discuss the now significant resulting discrepancies for CONUS between what history shows, supported by press reports from the time, and GISS claims for the 1930’s versus the past decade. It does not discuss the microsite warming issues that homogenization sweeps into the general record. (I posted elsewhere a study of all SurfaceStations Project CRN 1 quality stations, raw/final. Homogenization did remove urban UHI, but contaminated all but one of the suburban/rural stations.) It does not discuss the regional expectations problem inherent in homogenization that messes up Rutherglen in Australia and BE station 166900, Amundsen Scott. (Fn. 24 to essay ‘When Data Isn’t’ in ebook Blowing Smoke.) It does not discuss the discrepancy between all 4 main surface records (NOAA, NASA, HadCrut, BE) and all three satellite records. It does not discuss the simple-to-see pattern of surface adjustments that progressively cool the past and warm the present (just compare the ‘official now’ trend to the past published ‘official then’ trend).
And most tellingly, it does not touch on why Lamar Smith subpoenaed NOAA over the egregious Karlization of SST, an adjustment quite unlike TOBS, which contaminated better, more recent float data with older, worse, and inherently more uncertain (variable depth) engine room intake water data. It implies Smith is purely political, rather than concerned by what whistleblowers from within NOAA brought to his attention.
Scratch the surface, and what seems a reasonable article turns out to be very biased and misleading.
From Piers Sellers piece …
The theory of aerodynamics is another perfect example: the Boeing 747 jumbo-jet prototype flew the first time it took to the air—that’s confidence for you. So every time you get on a commercial aircraft, you are entrusting your life to a set of equations, albeit supported by a lot of experimental observations.
How many hours were spent in wind tunnels before the first flight? How many hours were the engines run in test cells? How many jet-powered airliners were designed, built, flown and wrecked before the first 747 took to the skies?
First flight of an airliner is a pass/fail test. The airliner test equivalent to climate modeling is fuel efficiency — a test frequently failed.
Thanks, broker, I missed that silly misrepresentation. Sellers is more ignorant than I thought. Does this man understand anything about CFD? An ignorant but authoritative sounding “astronaut” is perhaps the most dangerous kind of practitioner of the dark art of “science communication.”
F. I meant to write, *”we do NOT know the initial conditions*”
The point you missed in that analogy is that equations plus observations equals confidence. Not one or the other, but both together. Same for climate.
In which case we cannot have much confidence. When my prior comment escapes moderation for being honest about dishonesty, you will have a very specific set of two examples from sentence 5 of Sellers second paragraph.
You don’t have much confidence, but the people with an understanding of the science and observations do.
JimD, if they have confidence, they are fooling themselves. One can have confidence in the greenhouse effect. Climate projections are another matter. Here it’s really very sketchy, and laymen such as yourself may be fooled by the biased literature or by the practitioners of “Colorful Fluid Dynamics” such as Sellers. I am working on a post on this that lays it all out with references to the real science.
The people with access to more facts on this topic tend to have more confidence. This shows up in polls of scientists. Most “skeptics” have limited factual resources to make judgements on, and not surprisingly they are not very confident of their own view at all.
JimD, Appeals to authority are not going to work with me. Skeptics are not as uniformly unfamiliar with the scientific issues as you imagine. There is a pretty broad range of scientific opinion on the important questions. The greenhouse effect is settled science. The feedbacks are not.
Jim D, it is clear that ristvan has a clear understanding of the science and observations. Far better than yours, in fact. Try good arguments instead of insults. Oh wait.
His understanding is a little more muddy than you seem to think. Up against Dr. Sellers on this particular issue, I would take Sellers every time.
The Arctic Ocean is ice free. And I don’t remember what snow was – or do I? What “confidence” do you mean?
You seem to get your science from newspaper clippings. Try the IPCC reports. The early ones underestimated ice loss, but you won’t remember that.
The people I know who have spent their careers doing CFD for fields like aerospace are the most deeply skeptical about the climate models. They are profoundly aware of the limitations of their own models, which operate in far more constrained conditions than the full earth climate.
One such colleague of mine, a fellow adjunct I shared an office with at a UC school (and so politically liberal he was literally dancing a jig for joy when I saw him the day after the Democrats took control of Congress 10 years ago), is absolutely dismissive of the skill of climate models: “They can’t predict ANYTHING!”, he says.
On the other hand, people who run GCMs are very satisfied with how they do. Isaac Held is an expert with a really good blog on general large-scale atmospheric and modeling and related issues.
What we see is that the jet streams are well represented together with asymmetries related to the seasons and continents. To get these right is no small feat because heat is transported both vertically by convection and horizontally by baroclinic waves, and the resulting thermal structure is only correct if those transports are, together with surface effects. These GCMs can be run for centuries with realistic weather systems, and the CO2 level can be changed to see its effect.
JimD, Colorful Fluid Dynamics is qualitative. Oh look, there is a jet stream!! And the gullible applaud. This is bordering on Continuous Fraud and Deceit. Quantitatively, the models are pretty bad. The recent paper on tuning shows that the uncertainty is far larger than the IPCC has said. Held is a smart guy, but he can also be wrong, and I don’t think he has ever claimed quantitative skill for these models in the long term.
You don’t get things like that right by accident. Many independently developed GCMs exist and give qualitatively similar results to the actual observations. There is a worldwide GCM community that critique each other through the review process. Note also that GCMs coupled to the oceans can’t produce the 1 C warming we have had unless the CO2 is increased in accordance with observations. The natural fluctuations of global-averaged multi-decadal temperatures are just a couple of tenths of a degree.
JimD, of course you can get those things right by accident. What do you say about the recent model tuning paper? As Judith correctly observed, it implies that the uncertainty is far larger than the IPCC has calculated. I have 40 years of experience in these simulations. Accidents happen all the time when you tune parameters, and they are not meaningful scientifically.
It is a multi-dimensional space when you look at the temperature and rainfall and their distributions. These models can’t get the patterns so close and be wrong at the same time.
JimD, you can find this issue discussed in Leschziner and Drikakis, July 2002, in the Aeronautical Journal. They say something like “results can sometimes appear accurate because some turbulence effects are not important in these cases, no matter how wrong the viscous stress may be.” Accidents happen all the time and should be treated as such. I’ll find the exact quote this evening.
“People who run GCMs are very satisfied with how they do.”
Palm readers are very satisfied with how they do, too!
“You don’t get things like that right by accident.”
You can get things like that right by ex post facto tuning. The financial markets are littered with models that show incredible fidelity to existing trends but fail miserably going forward.
“Note also that GCMs coupled to the oceans can’t produce the 1 C warming we have had unless the CO2 is increased in accordance with observations. The natural fluctuations of global-averaged multi-decadal temperatures are just a couple of tenths of a degree.”
If you ascribe a strong warming effect to CO2, as all of these models do, and a weak effect to natural variability, as all of these models do, with them tuned to broad (but not fine) 20th century patterns, then of course if you remove the increased CO2 trends, you won’t show significant warming. That doesn’t demonstrate anything.
The warming is not natural variation. We know that from the remaining imbalance being positive. The warming is driven by a forcing change. We know that much from observations alone.
Jim D, one thing you seem to have missed is that all of the alarmist propaganda and all the significant expenditures on models and other climate research have come since, and been driven by, the United Nations Framework Convention on Climate Change. The Convention formed the IPCC to review the human causes of climate change. Since that point there has been virtually no research into how the climate works that has not been aimed at that goal. When you only study one aspect of a problem, you’ll never learn. Just imagine if the designers of the 747 had only looked at the static loads on the plane.
These facts also show why directed research that is too focused often doesn’t yield effective results. Despite all the research, we still don’t have good theories about how the climate actually works, because 90% of the problem hasn’t even been looked at.
In the future I’m not going to reply to any of your posts until you start showing respect for and understanding of different points of view. Political talking points are not science.
Thanks. Check out paleoclimate and the observation history for independent evidence. There are many strands of consilient evidence. Many of the “skeptics” are the ones with the narrow focus on models, but don’t blame the scientific community for that.
JimD when tuning to global solution functionals getting the right answer for the wrong reasons is an anticipated outcome. Compensating errors are known occurrences. In the case of GCMs global solution functionals are indeed global :-)
Which is why they look at the patterns, not just global averages. This is an important thing to realize.
JimD, Here’s the full quote: The Aeronautical Journal, July 2002, lead article.
This review has provided evidence that eddy-viscosity models are fundamentally flawed and often perform poorly in flows featuring separation, strong shock–boundary-layer interaction and 3D vortical structures. More seriously, perhaps, the models do not display a consistent behaviour across a wide range of conditions. In relatively simple flows, which develop slowly and in which a single shear stress (expressed in wall-oriented coordinates) is wholly dominant, eddy-viscosity models can be crafted to give the correct level of this stress and thus yield adequate solutions. This applies to near-wall flows, thin wakes and even separated flows in which the separated region is long and thin and hugs the wall. Another type of flow in which eddy-viscosity models are adequate is one in which inviscid features (pressure gradient, advection) dictate the mean flow, so that the Reynolds stresses are largely immaterial, however wrong they may be. The fact that many flows are an amalgam of shear layers and regions in which turbulence is dynamically uninfluential explains, in part, the moderate level of success of eddy-viscosity models. Among two-equation eddy-viscosity models, SST forms perform fairly well (at least in 2D flow), due to the limiter which prevents the shear stress from responding to the strain to the extent dictated by the stress-strain relationship. The length-scale equation is a key area of uncertainty and its precise form greatly affects model performance. There is some evidence that models using the turbulent vorticity as a length-scale variable near the wall perform (marginally) better than models based on the dissipation-rate equation, although it must be stressed that performance depends greatly on the nature of viscosity-related damping functions and the numerical constants in the length-scale equation.
In GCMs, the reason they do so well is that they can represent the main scales of transport even at low resolution. Weather systems are well resolved and convective processes have rather well understood vertical transports. Between them, these processes account for how heat and moisture move around in global weather systems. In fact they would not get large-scale features like the jet streams right if they did not represent these large-scale transports well.
Here, Jim D.
Search for “mathematical complexity and, arising from this, numerical difficulties (see Section 4) and higher computational costs” and “While Reynolds-stress models often give better predictions than eddy viscosity models in many complex ‘laboratory’ flows, they cannot be said to guarantee better solutions in practice.”
Yes, in GCMs things like flow around obstacles are not a primary thing to represent. Weather systems are, and these are driven by the large scale heating pattern which is well resolved. Weather systems and continents have scales of thousands of km, so GCM resolutions resolve them well even with 100 km grids such as in the older climate models.
JimD, you will see in the paper I quoted from that vortex dynamics is one of the things current models don’t do very well. Rossby waves are vortex dynamics, and that’s the main feature of GCMs that is claimed to look right.
Rossby waves have wavelengths of thousands of kilometers, so their dynamics is well resolved. Dissipation is several orders of magnitude smaller than the other terms, unlike the small viscous vortices you find in CFD.
JimD, Rossby waves are just vortex dynamics, one of the big challenges for turbulence models as pointed out in the quote I gave you earlier. Unfortunately, all scales influence all other scales. Dissipation is 7 orders of magnitude smaller than kinematic forces in aeronautical simulations, so according to your reasoning it could be neglected. Nothing could be further from the truth. These very small viscous forces make an O(1) difference in the results.
GCMs handle Rossby waves as well resolved features. Molecular viscosity is far less important than eddy viscosity in the atmosphere, and I get the sense that you are conflating these, so I don’t know what this Rossby wave argument is about. I think you are throwing irrelevant things from CFD models in.
I knew a WW2-era test pilot. Great guy. He looked like a test pilot. From one side, his face looked pretty normal; from the other side, nothing but scar tissue. They wanted to try landing a jet on an aircraft carrier at night. He crashed. He lost half his face.
So Wikipedia claims that in the 1950s the death rate of test pilots was one a day. By ~1968, when the 747 taxied down a runway and the heavy SOB first felt lift, what were the pilot’s odds of survival? They were better. Why?
You cannot forecast the weather for 100 hours, so be happy with your 100-year climate forecast. Especially if someone actually spends taxpayers’ money on you.
The old weather-climate saw. I was wondering when someone would try that one. Weather is an initial value problem, while climate is a boundary value problem. It’s predictions versus projections.
JimD, what you say is nonsense. Climate is not a “boundary value problem”; look up boundary value problem in any textbook on PDEs. The only way climate models can possibly be right is if the climate is so stable that change could never take place. The nature of the attractor is critical and we simply don’t know it. Paul Williams has shown that the climate of the attractor can change a lot if you change the time step, even for a three-dimensional system.
It’s the forcing that dominates climate change. This makes it a boundary value problem by definition because the forcing is regarded as a specified boundary condition. This matters more than a perturbation to the initial conditions.
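For what it’s worth, the distinction being argued here can be played with in a toy setting. Below is a minimal sketch (mine, purely illustrative, not anything GCM-like) using the Lorenz-63 system, where the parameter rho stands in, very loosely, for a “forcing”: the long-time mean of z barely moves when the initial condition is perturbed, but shifts when rho is changed. Whether real climate models inherit this property is exactly what is disputed downthread; the function names and run lengths are my own choices.

```python
# Toy illustration only: Lorenz-63, not a climate model.
def rhs(x, y, z, rho, sigma=10.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 equations."""
    return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

def rk4_step(x, y, z, rho, dt=0.01):
    """One classical fourth-order Runge-Kutta step."""
    k1 = rhs(x, y, z, rho)
    k2 = rhs(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1], z + 0.5 * dt * k1[2], rho)
    k3 = rhs(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1], z + 0.5 * dt * k2[2], rho)
    k4 = rhs(x + dt * k3[0], y + dt * k3[1], z + dt * k3[2], rho)
    return (x + dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0,
            y + dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0,
            z + dt * (k1[2] + 2 * k2[2] + 2 * k3[2] + k4[2]) / 6.0)

def mean_z(z0, rho, n_spin=5000, n_avg=200000):
    """Long-time average of z after discarding a spin-up transient."""
    x, y, z = 1.0, 1.0, z0
    for _ in range(n_spin):
        x, y, z = rk4_step(x, y, z, rho)
    total = 0.0
    for _ in range(n_avg):
        x, y, z = rk4_step(x, y, z, rho)
        total += z
    return total / n_avg

# Same parameter ("forcing"), perturbed initial condition: similar climate.
a = mean_z(1.0, rho=28.0)
b = mean_z(1.000001, rho=28.0)
# Changed parameter: noticeably different climate.
c = mean_z(1.0, rho=35.0)
```

The averaging window matters: over short windows the means from different initial conditions still differ appreciably, which is the weather-versus-climate point in miniature.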
Jim’s right — climate change is a boundary value problem.
No one knows the initial state. We don’t even know it today.
Here, Jim D.
Search for “Dyson-effect.”
Try there instead, Jim D.
Search for “Dyson-effect” again.
JimD and David Appell, this is the standard gloss that you repeat, and it’s wrong: “The simulation forgets about the initial conditions.” Philippe Spalart, who knows vastly more than either of you, says they now have time-accurate LES simulations where the “climate” is different for different grid sizes. That’s the issue: is the attractor followed at all? And this is a far simpler problem than the one GCMs solve. And of course computer scientists are experts in chaotic simulations because they say so, right.
In the real world, the climate is determined by forcing changes. All of paleoclimate shows this, not just Milankovitch. In the model world, changing the model is akin to changing the boundary conditions.
It’s important to know what the boundaries are. This paper claims the boundaries at today’s forcings are anything from an ice free world to a snowball Earth.
” A coupled atmosphere-ocean-sea ice general circulation model (GCM) has four stable equilibria ranging from 0% to 100% ice cover, including a “Waterbelt” state with tropical sea ice. All four states are found at present-day insolation and greenhouse gas levels and with two idealized ocean basin configurations.”
So now that you know your boundaries, you know they don’t tell you much.
Sure, and add 5 W/m2 and the available states would likely cease to allow any ice. Note that the Ice Ages are supported by the current insolation levels. GHGs would decrease with cooling temperatures to amplify them. Holding GHGs constant would be unrealistic. Anyway, yes, the continental configuration is part of the forcing. Certain configurations support polar glaciers more easily and would shift the Milankovitch behavior one way or the other. Currently we have a set continental configuration, so some of those modes are not accessible. We can learn a lot from these types of GCM studies, and it is good to see skeptics accepting them finally.
We — hence, climate models — do not know the initial conditions.
This is clearly true of the past, but also true of today. In particular, we lack information about
1) deep ocean currents
2) deep ocean heat content
3) global aerosol concentrations (esp as a function of latitude & longitude)
Without these, you simply can’t specify an initial state for climate models to begin their calculations.
Also read Robert Grumbine’s comments.
You are the one that doesn’t accept them, Jim, or you wouldn’t be constantly peddling your forcing-is-the-only-way crap when I have shown you time and time again not only that changes in ocean heat transport can warm the world, but that the evidence indicates this is exactly what has happened.
Point is, from your paper, continental configuration is very important. This may be via changing the ocean heat transport in addition to the ease of forming ice ages. Also in the current situation of a rapid forcing change the ocean heat transport can be disrupted in some way, e.g. by the Greenland meltwater. Nothing can be taken in isolation.
Data shows there is plenty of variation in ocean heat transport on short time scales without having to worry about continental drift but from an academic perspective I agree it is interesting and has a lot of implications for paleoclimate.
These are sub-climate wiggles; the AMO, PDO, and ENSO are known parts of this.
And there are millennial trends as you are well aware.
Milankovitch explains the millennial cooling trend since the Holocene Optimum.
Here, Jim D.
Can’t help you much more this time, the breadcrumbs have been eaten.
Are you claiming reduced solar slows poleward ocean heat transport?
This Milankovitch phase encourages gradually more polar sea ice which is an albedo effect. Not sure what that does to the transports.
If you decide to take a position let me know.
The position is that forcing changes drive climate changes, and Milankovitch is in that category. No surprise there.
While waiting for more breadcrumbs, you might like to take a look at this:
An excerpt from the key findings:
Too bad we’re dealing with proprietary code. Perhaps the need for “revolutionary algorithmic improvements” may change that. Let’s hope we can improve the efficiency of airplanes, which are not an unimportant source of CO2 in the atmosphere.
JimD and David Appell, you are asking the wrong question. The boundary-value-versus-initial-value controversy is silly and ignorant, and largely irrelevant. The question is whether you can actually compute the “climate of the attractor.” People have found some disturbing things recently on that question. It’s mostly unpublished; I will ask Philippe tomorrow for a reference. But Paul Williams did do a simple example.
Only under very strong assumptions is the attractor climate computable.
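Paul Williams’ result concerns long-term statistics shifting with the time step; the cheaper thing one can verify at home is the underlying sensitivity, namely that two integrations of the same chaotic system, started identically and differing only in step size, end up in completely different places. A toy sketch with Lorenz-63 and a forward-Euler integrator (my own setup, not Williams’ experiment):

```python
# Toy illustration only: Lorenz-63 with a fixed-step forward-Euler integrator.
def rhs(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 equations (standard parameters)."""
    return sigma * (y - x), x * (rho - z) - y, x * y - beta * z

def euler_trajectory(dt, t_end=100.0, sample_dt=0.1):
    """Integrate from (1, 1, 1); return states sampled every sample_dt."""
    x, y, z = 1.0, 1.0, 1.0
    n_steps = int(round(t_end / dt))
    stride = int(round(sample_dt / dt))
    samples = []
    for i in range(1, n_steps + 1):
        dx, dy, dz = rhs(x, y, z)
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if i % stride == 0:
            samples.append((x, y, z))
    return samples

coarse = euler_trajectory(dt=0.01)
fine = euler_trajectory(dt=0.005)

# Pointwise separation at matching times: tiny at first, then of the same
# order as the attractor itself once the truncation-error perturbation grows.
gaps = [max(abs(p - q) for p, q in zip(s1, s2))
        for s1, s2 in zip(coarse, fine)]
```

This only shows trajectory divergence, of course; whether the *statistics* also shift with the step size, which is Williams’ actual point, takes much longer runs and careful averaging to see.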
BTW, there is also the recent paper about tuning GCMs which sheds some light on the question. The paper’s statements about tuning are of course well known to turbulence modelers, who are well aware of the limitations and pretty honest about them. The problem here is that those who “run” the codes are just wildly optimistic about the skill of their runs. We have a new paper on this that is submitted; I can send you a copy if you want to see it.
My interpretation of the model tuning paper is that the uncertainty is larger than previously estimated by the IPCC. Obvious of course.
BTW, there are plenty of NASA codes that are, for these questions, as good as proprietary code. FUN3D from NASA Langley is a good one. Langley also has a web site on turbulence modeling that is pretty good.
David Young wrote:
“JimD and David Appell, You are asking the wrong question. The boundary value vs. initial value problem is a silly and ignorant controversy and largely irrelevant”
Have you ever seriously studied partial differential equations?
DY, the initial and boundary value problems distinguish the weather problem from the climate problem and tell you why initial conditions matter for weather but not for climate, and why changing solar or GHG values matter for climate and not weather. It tells you why the weather problem is prediction and the climate problem is projection. It is about as precise a distinction as you can get even when you can use the same models for both.
David Appell, Yes for 45 years.
Since you can take today’s forcings and get anything you want by changing ocean heat transport, it seems you need to understand ocean heat transport. Just because you prefer to ignore that which isn’t understood doesn’t mean it doesn’t exist or doesn’t matter. I am amused by your imitation of the 3 monkeys though.
steven, “Since you can take today’s forcings and get anything you want by changing ocean heat transport”. If you are referring to the paper you mentioned before, it was by changing the continents that they got this effect. Plus we know that the Ice Age mode is possible with suitable orbital/albedo forcing. We went through all this already.
I doubt that David, or you would — like math undergraduates — understand the difference between a BVP and an IVP.
And you’d understand that we don’t know the initial values for the equations of climate.
It might not be the best of times (if it ever is) to pull out the Says Who? line.
While waiting for breadcrumbs, you might need a break.
Search for “initial conditions.”
Just to be clear, Robert Grumbine’s comments on the “boundary value problem” canard post by the computer scientist are about orbital dynamics. The time scales for that chaotic behavior are very long, tens of thousands of years. Atmospheric turbulence has very short time scales. It’s a totally different set of phenomena.
In any case, the question is really not whether the attractor is reasonably well behaved. Even if it is, actually computing it is a totally different question, and a far harder one I would argue. I have yet to see any discussion that really addresses that, including Annan and Connolley’s old post.
The climate wars often mask uncertainty as people try to “communicate” a sense of urgency.
willard: If you have a point, make it.
I’m not going to guess based on your links.
Yes Jim, we have been through this before. The data has enough variation in it to explain all the warming we have experienced. Is the reason you keep bringing up continental drift diversion? Exactly how much variation is possible without continental drift?
A lot. Look at the Ice Ages.
Exactly. There is plenty of evidence that changes in ocean heat transport can cause us to go in to or out of an ice age without invoking any continental drift. So why do you keep bringing it up?
You have that fixation. What about the normal explanation which is Milankovitch cycles? Have you discounted that?
Not at all. I think poleward ocean heat transport is to a large extent solar driven, which makes climate much more straightforward. That matches up well with the data and with what the models say should happen. LW radiation should slow that transport. So SW has a large positive feedback and LW has a large negative feedback.
That doesn’t really address the point you were trying to make though. There are plenty of examples of abrupt climate change associated with changes in ocean heat transport. No continental drift and no change in orbit.
These examples occur as a response to rapid melt periods, but the rapid melt is forced. The recovery from the last Ice Age is like that. It was not linear. Same with this warming. The Hansen meltwater pulse scenario gives rapid sea-level rise and cooling in Europe as the Gulf Stream is shut down by a Greenland melt. In these cases it is the melting glaciers that drive the changes and the ocean responds (e.g. the prevailing theory about the Younger Dryas).
The rapid melt is from a change in ocean heat transport. The change in ocean heat transport may or may not be forced. You had abrupt changes in the middle of the last ice age.
The next one will be a melt. Guess why.
Hmmm, assuming climate is a boundary value problem isn’t all that helpful if you don’t know the initial conditions. The standard deviation of ocean paleo temperature is about 1.25 C for the interglacial period and about 2.5 C for longer-term reconstructions with 200- to 1000-year sample frequency. So while you can smooth out the variability with multi-proxy and “global” reconstructions, not considering that the tropical oceans have a relatively large swing in temperatures might be a bit too novel.
A big problem is what counts as “average” or “normal” volcanic activity. If you assume too much as “normal,” anything less than that would be a warming condition. Along with volcanic activity you have seismic activity, and ice sheets aren’t the most stable of structures. Huge glaciers might have one or two key boulders slowing their advance, so they can collapse with an earthquake.
Losing significant ice mass tends to increase seismic activity over quite a few time frames, so you have a potentially large climate impact that isn’t really considered, because it is difficult.
Never mind, “CO2 done it” makes life so much simpler. :)
Yes, volcanoes and the sun cause variabilities in forcing, nothing compared to what we are doing to GHGs, but there it is.
The models don’t think the amount of fresh water hosing we have had so far is enough to slow the AMOC anywhere near as much as it has slowed. Are you saying the models are junk now?
The IPCC is very conservative about melt rates. Hansen projected an already observed acceleration of the melt rate to get his model result.
I’m not talking about the future. I’m talking about measured observations.
I am talking about what is important at this time. It is a special situation we are in now and in the near future.
We will see if it is special or not. So far it appears to be pretty run of the mill climate.
The old weather-climate saw. I was wondering when someone would try that one. Weather is an initial value problem, while climate is a boundary value problem.
There’s a reasonable case to be made for the predictability of average temperature increase with increased CO2. That’s because while differing atmospheric conditions do affect emissions to space, for the majority of the earth the effect of doubling CO2 is positive, so while dynamics might change the distribution of atmospheric profiles somewhat, the result would seem unlikely to be zero. Here is a seasonal global map of CO2 forcing from a reanalysis depicting this point (it’s an old radiative code with some margin of error, but I believe the point holds):
Fluid flow tends to change temperature, but it does so by making some areas cooler than normal and other areas warmer than usual ( trough in the east, ridge in the west, or vice versa ) for a net wash.
So, I’m on board with “it doesn’t matter if climate is not predictable to have a rough prediction on the warming effect of CO2 ( beyond the range of natural variability, the sign of which we cannot know ).”
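For readers who want the “rough prediction” made concrete: the widely used simplified expression for CO2 forcing (Myhre et al., 1998) is logarithmic in concentration, ΔF = 5.35 ln(C/C0) W/m². A quick sketch; the function name and the 278 ppm preindustrial baseline are my own choices:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing (Myhre et al. 1998), in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

doubling = co2_forcing(2 * 278.0)   # about 3.7 W/m^2 per doubling
recent = co2_forcing(400.0)         # forcing at ~400 ppm vs preindustrial
```

The logarithm is why each successive doubling adds the same forcing increment, which underpins the "warming effect beyond the range of natural variability" framing above.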
However, one cannot logically maintain the position that it doesn’t matter that climate is unpredictable, but then proclaim any basis for predicting changes in events which are direct functions of unpredictable fluid flow, namely the frequency of precipitation, droughts, heatwaves, extreme heat, hurricanes, etc. These events are determined by unpredictable chaotic fluid flow that has an infinite number of valid distributions, even with the influence of the relatively constant features of mountains and ocean basins.
As a reminder of just how pathetic the models are in this regard, consider this comparison of blocking frequency:
Newton’s ideas, and those of his successors, are all-pervasive in our modern culture. When you walk down a street, the buildings you see are concrete and steel molded to match theory; the same is true of bridges and machines. We don’t build different designs of buildings, wait to see if they fall down, and then try another design anymore. Engineering theory, based on Newton’s work, is so accepted and reliable that we can get it right the first time, almost every time.
This couldn’t be more incorrect. Newton’s laws of motion deal with dynamics, whereas structural building design deals with statics; I don’t use Newton’s theory at all when designing a building. More accurately, theories from people like Hardy Cross or Arthur Casagrande are more applicable to building design. Also, it was quite common practice when trying something new in building to push the limits of current design; many buildings failed due to this, and that served as a large factor in the knowledge behind today’s design.
Action and reaction is one of Newton’s laws.
Which has to do with rockets and one component of jet engine thrust, not buildings. Buildings depend on material properties: things like tension (steel rebar), compression (concrete crush load), flexion (wind sway) and torque (wind twist).
I’ve got a bridge to sell you, Jim D:
We don’t actually use Newton’s third law; we use a corollary of that law: for a body to be at rest, the summation of forces must equal 0.
The implicit claim, that engineering has no experimental frontier, is highly insulting. Moreover, the fact that both science and engineering result in the growth of knowledge, is irrelevant. AGW is not Newtonian mechanics. It is a questionable hypothesis with little observational support.
I understand that Newton’s first law is used to calculate the transfer of loads from one bridge component to another.
Also, the newton (or more often the kilonewton, kN) is the SI unit widely used in the building industry.
We don’t use Newton’s laws when designing a building.
We use some of their corollaries instead.
Verification and Validation
Piers Sellers article is a climate budget promotional piece where he exaggerates the confidence, hides the problems, and uses illogical ad hominem rhetorical arguments with appeals to authority to justify his position.
Sellers cites the development and launch of the 747. Yes, NASA understands stringent verification and validation of computer codes, e.g. for computational fluid dynamics: Overview of CFD Verification and Validation. That engineering is built on very expensive quantitative testing, with people’s lives and engineers’ futures at stake. NASA has an Independent Verification and Validation (IV&V) facility that understands and conducts rigorous software verification and validation against quantitative, low-uncertainty data.
Climate modeling is the opposite. Understanding climate requires modeling nonlinear, coupled, chaotic systems with a large number of variables. Compound that with inadequate data, where the measurement uncertainty of critical elements (clouds) is comparable to the critical feature being determined (warming from anthropogenic impacts including CO2). Combine that with funding feedback and groupthink biasing publications and climate sensitivity estimates: alarmist papers get funded while the essential study of natural systems is slighted.
Has NASA dared to submit its global warming software models or climate program to its own IV&V facility for rigorous verification and validation? The political ramifications to their climate budget would appear to be too great to contemplate their having done so.
Contrast “Right Stuff” NASA engineers at “The Right Climate Stuff”. They developed a much simpler quantitative model that is defensible and finds very different results. Has Piers Sellers ever read their work?
The IPCC recognizes that “we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible” (Third Assessment Report).
In Overcoming Chaotic Behavior of Climate Models, Fred Singer identifies the duration and number of runs needed to get quantitative results. Most models do not conduct anywhere near this number of runs, nor this run length.
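The run-count point is ordinary averaging statistics: the standard error of an N-member ensemble mean falls as σ/√N, so halving the residual noise quadruples the required runs. A back-of-envelope sketch; the 0.15 C variability and the 0.02 C tolerance are made-up illustrations, not numbers from Singer’s paper:

```python
import math
import random

def runs_needed(sigma, target_se):
    """Smallest ensemble size N with sigma / sqrt(N) <= target_se."""
    return math.ceil((sigma / target_se) ** 2)

# Illustrative numbers: 0.15 C of internal variability in some averaged
# quantity, and a desired 0.02 C standard error on the forced signal.
n = runs_needed(0.15, 0.02)   # 57 runs for these illustrative numbers

# Monte Carlo check: the empirical spread of the ensemble mean matches
# the sigma / sqrt(N) prediction.
random.seed(0)
means = [sum(random.gauss(0.0, 0.15) for _ in range(n)) / n
         for _ in range(5000)]
emp_se = (sum(m * m for m in means) / len(means)) ** 0.5
```

The square-root law is why averaging away internal variability gets expensive fast: a 10x noise reduction needs 100x the runs.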
See control expert and chemical engineer Dr. Pierre Latour on climate control in
Letter to the Editor, Hydrocarbon Processing, January 2009
Pierre Latour Engineering Earth’s thermostat with CO2?, Hydrocarbon Processing Feb. 2010. (emphasis added)
Latour… sky dragon crap
NASA has excellent scientists and technicians. Yet if you don’t test and validate under real-world conditions, you can still have a problem.
Richard Feynman: Challenger Crash O-Ring https://www.youtube.com/watch?v=6Rwcbsn19c0
Will Sellers address Irreducible Imprecision?
Irreducible imprecision in atmospheric and oceanic simulations
James C. McWilliams PNAS vol. 104 no. 21, 2007, 8709–8713, doi: 10.1073/pnas.0702971104
John Christy (2016) shows the current “imprecision” (including systematic bias) to be about 300% over 35 years compared to the reality of satellite and balloon measurements.
Why the illogical Ad hominem attack of guilt by association?
It appears you are unable to rebut LaTour’s quantitative structural challenges that climate models are structurally incapable because of the:
David L. Hagen,
If I could only pick one argument for why it isn’t exactly prudent to make radical changes to the radiative properties of the atmosphere, the inherent uncertainty of the long-term outcome would be it. Of course the IPCC goes on to state that the best we can hope for is probabilistic estimates of future climate states. How nice of you to leave that part out.
“But chaos” arguments aside, one obvious issue with predicting future climate states is predicting future climate forcings. How much CO2 will we really emit? How many major volcanic eruptions between now and 2100? Of what magnitude? Where? What’s the Sun going to do over the next eight decades?
Not that it matters to the loud contingent of people (like Latour) who question the theory and observation that CO2 has a non-negligible effect on long-term temperature trends.
For rational people doing risk assessment, uncertainty is generally seen as a risk unto itself. IOW, if someone says they cannot quantitatively tell me the consequences of making a major and difficult to reverse change to a system, I’d argue that we probably should not make the change.
I agree on not making changes until we can quantify their impacts and know we can revert them. I encourage you to further explore each of the terms LaTour uses in explaining the very severe difficulties in measuring, predicting, and controlling our climate. Each is quantitatively and conceptually very important.
“….the Boeing 747 jumbo-jet prototype flew the first time it took to the air”
This amazing comment comes from an aspiring scientist. Science requires logic.
Logically, if it did not fly, it would not be the first time it took to the air.
This is, by every analysis, an unbelievably stupid observation by Sellers.
“How many hours were spent in wind tunnels before the first flight?”
The Boeing 747 first flew in 1969. There were no wind tunnels big enough then to hold one. (Even today the largest wind tunnel will only hold a 737.)
How many hours have been spent verifying the Planck Law? The spectra of atmospheric gases? The laws of thermodynamics? Fluid mechanics?
They make up climate models just as the equations of aerodynamics make up the airplane models.
David Appell, I think the point is that there was vast experience with flight tests of similar swept wing aircraft. Also scale models were tested in wind tunnels before the 747 was even built.
The initial series of flights by any aircraft are called test flights for reasons.
I’m sure there were lots of tests. That’s how we know the aerodynamic equations work.
Same for the equations that go into climate science.
You miss the logic.
The converse might be that the prototype aircraft crashed many times before it got off the ground and took to the air.
Compare with the Sellers quote that means little, “….the Boeing 747 jumbo-jet prototype flew the first time it took to the air”
Flying is the same as taking to the air, no?
It was loud and made a big noise?
It was heavy and weighed a lot?
One might ask how many GCMs crashed before they modeled reality?
But they are proud of the observation that the GCM ensemble does a pretty good job.
One might ask how confident we would be if on average airplanes flew?
CAPE CANAVERAL, Fla. (AP) — An explosion has rocked the SpaceX launch site in Florida.
NASA says SpaceX was conducting a test firing of its unmanned rocket when the blast occurred Thursday morning. The test, considered routine, was in advance of a planned Saturday launch from Cape Canaveral Air Force Station.
And this wasn’t a first flight. Engineering is hard. Climate is harder.
WWW, What’s Worrying Wired?
Wired is/was a thought leader. Same people that make up The Long Now Foundation.
Piers Sellers wrote about the successful first flight of the Boeing 747. He didn’t mention the three rockets that failed while attempting to deliver supplies to the International Space Station between April 2014 and June 2015.
Science and engineering are hard.
World’s leading sea ice expert (Peter Wadhams) warns that the Arctic sea ice death spiral will make global warming worse
Professor Wadhams is a man obsessed by death
Cambridge Professor Peter Wadhams claims: “Three scientists investigating melting Arctic ice may have been assassinated”
Increasing jellyfish. The abstract suggested the paper would be a must-read, and it is fascinating: a very thorough, reproducible study on cascading mis-citations leading to a generally accepted expert consensus that is simply wrong (in the sense that there is no underlying observational evidence). A rate of 46% mis-citation (saying in the citing paper that the cited paper says something it does not). Another huge scientific problem in the publish-or-perish era.
Thanks for drawing attention to this paper, very interesting indeed.
I have noted a practice within climate science, e.g. the IPCC reports, of making statements and referring to papers purported to support them. However, the corresponding statements in the referenced articles are normally not quoted or precisely cited, an extremely poor practice in my view.
No wonder misconceptions spread like wildfire under such practices. Here are a few quotes which illustrate the severity:
“Third, a United Nations Environment Report (Turley & Boot, 2010) and one of the most widely cited reviews on impacts of ocean acidification (Doney et al., 2009; cited 1602 times in GS, accessed January 2016), cited Attrill et al. (2007) to claim a link between ocean acidification and increased jellyfish numbers, but neither papers acknowledged the rebuttal of Attrill et al. (2007) by Haddock (2008), the erratum published by Attrill & Edwards (2008) or the expanded analysis of Attrill et al. (2007) by Richardson & Gibbons (2008) that showed no link between acidification and jellyfish populations. Indeed, the misconceptions resulting from mis-citation are even more dangerous when they are contained in papers published in highly influential journals, which provide a platform for those papers to be highly cited. Duarte et al. (2015) have argued that poor citation practices are one of the elements that have perpetuated perceptions on ocean calamities (including rising jellyfish populations) that are contributing to an overly negative perception on the state of the ocean. Our study confirms that that mis-citation facilitated the perception of rising jellyfish populations.”
“More recently, Todd et al. (2007, 2010) observed high and consistent rates of mis-citation in the general ecology and marine biology literature with only 76.1% and 75.8% of citations, respectively, clearly supporting the assertions made in each discipline. Errors of accuracy in citation are particularly concerning because they may have considerable influence on the development of perceptions within a discipline if they are persistent and biased in a particular direction (Harzing, 2002).”
What I find particularly disturbing is that unelected, megalomaniacal bureaucrats / activists in United Nations endorse such practices and even utilize such practices for their purpose.
Ter whit …
Don’t forget the Doc’s link to Atlantic Bathtub.
Treasure the Chesapeake.
Richard Tol has a new paper out, yet another empty explanation of the climate debate that refuses to accept that the debate is scientific in nature.
“The Structure of the Climate Debate”
Here is the first sentence of the Abstract: “First-best climate policy is a uniform carbon tax which gradually rises over time.” Poor Richard then has to explain how this “first-best” policy got bungled. That it is a very bad policy seems not to be part of the explanation.
In the social sciences, when you start with a false premise you tend to wind up with a false conclusion.
David, i am starting to read Tol’s paper and already there are some important insights about environmentalism and its relationship to environmentalism. I would be careful about rejecting it so quickly.
I did not say that every sentence was false. But unless he contradicts his Abstract he is very wrong. Tol’s problem is that he believes that climate change is a serious problem, which is false. In fact he runs one of the absurd IAMs that claims to know what the world will be like 300 years from now and just how much human caused climate change will hurt us between now and then.
> Tol’s problem is that he believes that climate change is a serious problem, which is false.
Willard, I’ll bite. Observational ECS is 1.5-1.8C. See, e.g., Lewis and Curry 2014 using IPCC AR5 values, or Lewis 2015 using Bjorn Stevens’ newer aerosol range rather than the IPCC value. So, AGW is not a serious problem. Tol thinks it is or he would not be advocating carbon taxes.
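For readers unfamiliar with how such observational estimates are derived, the energy-budget method behind papers like Lewis and Curry (2014) reduces to a single formula. The sketch below uses round illustrative numbers, not the paper's exact inputs.

```python
# Energy-budget estimate of equilibrium climate sensitivity (ECS):
#   ECS = F_2x * dT / (dF - dQ)
# dT  : change in global mean surface temperature between base and final periods (K)
# dF  : change in total radiative forcing (W/m^2)
# dQ  : change in the planetary heat uptake rate (W/m^2)
# F_2x: forcing from a doubling of CO2 (W/m^2)
# The inputs below are illustrative round numbers, not LC14's exact values.

def energy_budget_ecs(dT, dF, dQ, F_2x=3.71):
    return F_2x * dT / (dF - dQ)

print(round(energy_budget_ecs(dT=0.7, dF=2.0, dQ=0.4), 2))  # roughly 1.6 K
```

With plausible round inputs the method lands near the 1.5-1.8C range cited above; the real papers differ mainly in which forcing and heat-uptake estimates they plug in.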
It is only a serious CAGW problem when CMIP5 models are thrown against an arbitrary 2C ‘serious problem’ threshold. Schellnhuber made up the 2C. Tol’s own past work shows net benefits at 2C: no problems, let alone serious ones. The CMIP5 models run hot because of the attribution problem. They are already definitively wrong by peer-reviewed warmunist standards, as shown by the observed/modeled temperature divergence (the pause) and the absence of the modeled tropical troposphere hotspot. So their ‘serious problem’ CMIP5 ECS median of 3.2C and average of 3.4C is also false.
Each of these facts you can easily verify should you wish to.
Original source, Willard.
> So, AGW is not a serious problem.
There’s a step missing, Sir Rud: to lukewarmingly push for the lower bounds of justified disingenuousness doesn’t imply that AGW is not a serious problem.
Your “but NicL” just doesn’t cut it.
Try again, this time with more feeling.
Willard, thanks for demonstrating conclusively that there is something wrong with your thought processes. I offered papers and facts. All you can come back with is personal insults? How about a robust alternative observational ECS? That might “win.”
Hint: you can find observational sensitivity alternatives in the paleoclimate literature summarized by Hegerl. Your problem will be the ‘robust’ part. A long list of those studies, plus their deconstructions, may be found in the sensitivity portion of the Climate chapter of The Arts of Truth, with footnotes for you to research yourself. There is a much shorter discussion in essay Sensitive Uncertainty in Blowing Smoke, around the last figure in the essay.
> I offered papers and facts. All you can come back with is personal insults?
Actually, Sir Rud, I came up with an argument: your papers and facts don’t suffice to infer that AGW is not a serious problem. In other words, you’re committing a non sequitur, or you’re “going a bridge too far,” as the Auditor would put it.
What you’re missing are the conditions by which AGW would stop being a serious problem. These don’t follow from the facts you offered. Handwaving to papers won’t help you either. The best you could argue for is that your favorite central estimate may indicate that the problem could be less serious than the established view, on the proviso that we’re lucky enough that Nature abides by NicL’s data torture, and notwithstanding the intriguing possibility of a sensitivity near zero.
Which personal insults, again?
Willard, you are not worth further debating. Certainly nothing on jellyfish main topic. Nothing on Tol subtopic. Nothing at all worth discussing at all it seems. Bye.
Do hope you try a return to bloviate on all of the above plus more. Will prove stuff already strongly suspected.
I offered an argument against your position and a suggestion to improve yours, Sir, and all you can come back with is personal insults?
Too bad you could not substantiate your claim, for you were going beyond what is usually only dog-whistled.
The unsubstantiated claim makes it obvious that Nic Lewis’ work is having an impact. On the downside, that means Mosher was right….
Wagenmakers (2011, 2012).
“On the downside, that means Mosher was right….”
This is known as a given
Steven, who do you think will be the next POTUS, on Nov. 9 2016?
David, please read more carefully. He says it’s a problem but one whose costs will be relatively small. The religious analogies are beautifully well done too. It’s a great article.
The article is a bit long-winded, but the essentials are in section 4. Basically Tol believes climate change is a modest problem requiring modest solutions.
Then how come Tol’s IAM supports the ridiculous social cost of carbon calculations? Why has he not disavowed the SCC?
“Modest” is econspeak here. He says climate change is not catastrophic, the alternative being modest. But he also says that the tax might be quite high, depending on the target level. Modest carbon tax is an oxymoron.
sorry, I should have said the relationship between Environmentalism and Romanticism. It’s an excellent point made by Tol.
theory of the first best
Economics concept that if all but one requirement for achieving a most desirable economic situation cannot be satisfied, it is always beneficial to satisfy the remaining ones.
It sounds like he favors the second libertarian idea, which is not a 100% solution but satisfies most objectives:
The polluter should pay
To begin, the principle that the polluter should pay has long been a part of libertarian theory. In his 1962 classic, Man, Economy, and State, Murray Rothbard expressed it this way:
In so far as the outpouring of smoke by factories pollutes the air and damages the persons and property of others, it is an invasive act. . . . Air pollution is not an example of a defect in a system of absolute property rights, but of failure on the part of the government to preserve property rights.
A person whose pollution harms another’s person or property should pay for the resulting harm. People do not pollute just for the fun of it. They do so because polluting, when unrestricted, is a cheap way of disposing of wastes. Paying for waste disposal is just as much a proper cost of business or household management as paying for anything else—energy, labor, transportation, or whatever.
A polluter cannot escape the duty to pay for harm to others simply because it would be expensive to avoid polluting. Yes, it may cost more to build a smokestack with a filter than one without, or more to treat sewage than to dump it directly into a river. Beyond some point, the harm, at the margin, may be less than the cost of abatement, in which case releasing pollutants into the environment may be the economically efficient decision. Efficient or not, however, the polluter should still pay for any remaining harm done even after the efficient degree of abatement has been carried out.
All this leaves open the question of how to ensure that the polluter pays. First, though, we need to address another important issue….
The first libertarian idea (a 100% solution) (Mises) was greatly opposed to this method:
Robert Poole cogently defines pollution “as the transfer of harmful matter or energy to the person or property of another, without the latter’s consent.”8 The libertarian — and the only complete — solution to the problem of air pollution is to use the courts and the legal structure to combat and prevent such invasion. There are recent signs that the legal system is beginning to change in this direction: new judicial decisions and repeal of laws disallowing class action suits. But this is only a beginning.9
The remedy against air pollution is therefore crystal clear, and it has nothing to do with multibillion-dollar palliative government programs at the expense of the taxpayers which do not even meet the real issue. The remedy is simply for the courts to return to their function of defending person and property rights against invasion, and therefore to enjoin anyone from injecting pollutants into the air. But what of the propollution defenders of industrial progress? And what of the increased costs that would have to be borne by the consumer? And what of our present polluting technology?
The argument that such an injunctive prohibition against pollution would add to the costs of industrial production is as reprehensible as the pre-Civil War argument that the abolition of slavery would add to the costs of growing cotton, and that therefore abolition, however morally correct, was “impractical.” For this means that the polluters are able to impose all of the high costs of pollution upon those whose lungs and property rights they have been allowed to invade with impunity.
Now, the question has yet to be resolved in court: first of all, is CO2 pollution? Both ideas seem to require a court solution.
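The marginal logic in the polluter-pays passage quoted above (abate while the marginal cost of abatement is below the marginal harm, then pay for whatever harm remains) can be sketched numerically. All numbers below are invented purely for illustration.

```python
# Toy illustration of "efficient abatement, then pay for residual harm":
# abate one unit at a time while the marginal cost of abatement is below
# the marginal harm avoided; the polluter still pays for the harm that
# remains after the efficient stopping point. All numbers are made up.

def efficient_abatement(marginal_costs, marginal_harm_per_unit):
    abated = 0
    for cost in marginal_costs:  # costs rise as abatement gets harder
        if cost >= marginal_harm_per_unit:
            break
        abated += 1
    residual_units = len(marginal_costs) - abated
    residual_payment = residual_units * marginal_harm_per_unit
    return abated, residual_payment

abated, payment = efficient_abatement([1, 2, 4, 8, 16], marginal_harm_per_unit=5)
print(abated, payment)  # abate 3 units; pay for the 2 units of remaining harm
```

Abatement stops where it becomes more expensive than the harm it prevents, yet the payment for residual harm is still owed, which is exactly the distinction the quoted passage draws between efficiency and liability.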
Very interesting. Thank you.
It appears Scalia already addressed your: “Now has the question has yet to be resolved in court first of all is CO2 pollution?”
Does differentiate sources. Should one hold one’s breath?
Ordinarily I might favor the original idea of Mises, but in this case, since CO2 may not be a pollutant up to a point, the first-best idea of Tol should work. That is, as long as the tax is reasonable.
Yes, manmade CO2 is pollution as defined under the Clean Air Act and its Amendments.
See the Supreme Court case Mass. v EPA 2007.
There are no regulations against barbequing, and there wouldn’t be under the Scalia-led ruling. It not only pollutes the immediate neighbors’ air, it also pollutes the atmosphere. It is also more than just CO2 pollution. What do you think about barbequing, David? I think it should be banned.
“And this is where politics and science can find themselves at cross purposes. In many political discussions, climate science gets treated like family planning or tax restructuring. When it comes to these social issues, convictions or personal views count for a lot, and rightly so. But the climate issue, and the business of climate prediction, is different.” – Piers Sellers
Because it’s climate prediction it’s different. That’s some theory you have there, the best there is, and it has the Good Housekeeping Seal of Approval. Now it’s time to sit down with the customer and make your sales pitch. Where theory meets practice. What is it that they want? Not what you have to sell, not what you think they need, but can you help them with what they want? It is like family planning and tax restructuring.
This is kind of interesting.
“North Carolina’s state epidemiologist resigned Wednesday to protest her employer’s depiction that “deliberately misleads” how screening standards were created to test private wells near Duke Energy’s power plants.”
Read more here: http://www.charlotteobserver.com/news/local/article94899332.html#storylink=cpy
The Duke report highlights the problem of setting of extremely low tolerance limits without first determining realistic background levels.
You are right. Interesting. And there are two further related issues beyond background levels.
One is the old threshold-versus-linear harm debate, also embedded in secondhand smoke, radiation exposure, and most carcinogen studies. EPA doesn’t like thresholds because a threshold eventually puts them out of business? Yet most toxicological and medical studies do suggest most exposures, even cumulative, have safe thresholds below which there is no harm.
The other is incidence attribution. Cancer occurs, period. Attribution to hexavalent chromium exposure is usually impossible at environmental levels. It’s the old power-lines-cause-cancer canard writ large.
Federal legislation opens a very wide door for EPA to drive its Linear No-Threshold theory through. If EPA must regulate with a margin of safety for public health, LNT is almost the inevitable result.
That said, EPA’s bureaucratic imperative makes a bad situation even worse.
Yep, good old linear no threshold, the environmental advocates best friend. There is even a how to website for using water quality standards to stop fracking.
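For concreteness, the linear-no-threshold versus threshold dose-response models under discussion can be written down in a couple of lines. The slope and threshold values here are arbitrary illustrations, not regulatory figures.

```python
# Linear no-threshold (LNT) vs. threshold dose-response, in arbitrary
# units. Under LNT, any nonzero dose implies some risk; under a
# threshold model, risk is zero below the threshold dose.

def risk_lnt(dose, slope=0.01):
    return slope * dose

def risk_threshold(dose, threshold=50.0, slope=0.01):
    return max(0.0, slope * (dose - threshold))

for dose in (10, 50, 100):
    print(dose, risk_lnt(dose), risk_threshold(dose))
# At a dose of 10, LNT assigns a risk of 0.1 while the threshold
# model assigns zero: that gap is the entire regulatory argument.
```

The practical consequence is the one noted above: under LNT there is no exposure level a regulator can declare safe, so standards ratchet downward as detection limits improve.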
Glendale, California is a guinea pig for Chrome 6. A superfund project required treating water for various contaminants, and the final treated water had a level of 10 PPB, which is below the rough-guess safe limit of the EPA. Thanks to the Erin Brockovich effect, Glendale at first dumped the water until the EPA sued them for wasting expensive treated water. Long story short, Glendale has had 5 PPB Chrome 6 water until someone sets a new standard. That is since 2001, and no cancer clusters in that area. The EPA still hasn’t set a Chrome 6 standard, just a total chromium limit of 100 PPB. California uses 50 PPB total. California assumes a 50% Chrome 6 ratio, so they do have a default limit of sorts.
Vanadium is kind of funny. Until about 2008 the sensitivity test limit was 30 PPB. In 2011, Oregon tested a large area and found 25 PPB to be pretty common, even though they weren’t testing for vanadium; it was just part of the new lab package. If we can get sensitivities down to 1 part per trillion for most things, we could freak the whole nation out with one good movie.
Here’s a study of residential radon exposure that refutes EPA’s LNT approach. Rud (as usual) is right about this.
I am deficient in the ability to be despondent about decreasing Arctic ice.
Can ice die?
Anyways, I’ve seen a couple of maps of the glaciation during the last Ice Age and it appears to me that much of the Arctic ocean was ice free with the center of the thing kinda being the Hudson Bay.
I am assuming the northern point of the axial rotation was still in the same spot as today.
What am I missing?
Was there an Ice Age with an open Arctic ocean?
Geological alms for an uneducated white Trump supporter?
RR, geological alms: read Jakobsson et al.’s invited review, Arctic Ocean Glacial History, in Quaternary Science Reviews 2014. Open source, not paywalled. Great paper, with more than you will ever need again. Just enjoyed reading it and am saving it for future reference. The last sentence of the abstract basically says most previous stuff we now know was wrong (one theory was frozen solid to a depth of 1 km, another partly open in late summer) and we still dunno what might be right.
We do know two things. 1. The Bering Strait was dry land, so Arctic Ocean currents would have been very different. 2. Siberia did not have an ice sheet (mastodon fossils and all that), so wind and weather patterns on the Asian side were very different than on the North American side.
“In the science world, two and two make four.” – Piers Sellers
This might be accurate:
Mathematicians and Physicists Already Proved 1+1 Doesn’t Always Equal 2
“In reality, mathematicians and physicist are the ones who have proved higher dimensions exist, namely in two branches of mathematics, Topology and Chaos Theory.”
The climate is chaotic, however that doesn’t mean the mathematician is talking about the same kind of chaos where 1 plus 1 does not equal 2.
At the link there’s a banner:
Math is Logical – Programmers
Math is Practical – Engineers
Math is Magical – Unicorns
Thus proving the existence of Unicorns.
Sellers writes: “There are three items of good news about this modelling enterprise: one, we can check on how well the models perform against the historical record, including the satellite data archive; two, we can calculate the uncertainty into the predictions; and three, there are more than twenty of these models worldwide, so we can use them as checks on one another.”
Sellers claims something like an equivalency between theories and models. If so, twenty or more different models represent 20 or more different theories. Agreement or disagreement among model results doesn’t argue for either correctness or incorrectness of the models and underlying theories. It just argues for similarity. They could all be equally wrong or incomplete.
SB, to further demonstrate Sellers’ biases:
1. Models v. observations. They do OK, not great, with hindcasts, because they are intentionally parameter-tuned to do so. That was an AR5 ‘experimental design’ requirement. They do lousy projecting forward against observations. See the famous Christy chart Gavin Schmidt hates.
2. We can calculate the uncertainty. Well, actually not. We can calculate boundary conditions. But because of climate chaos and initial-condition uncertainty, we cannot calculate model output uncertainty. The ‘96% model envelope’ around the ensemble mean is mathematical nonsense, not an uncertainty envelope. Cf. Briggs and RGBatDuke.
3. There are more than 20, and we can use them as checks. Wrong, wrong, and wrong. There are 32 models (not 20), and a total of 104 instantiations of them in CMIP5. Comparing output anomalies conceals that in absolute temperature terms they vary by +/- 3C, so most do not even get water phase states correct, and so get the climatic impact of latent heat of freezing/melting and latent heat of evaporation/condensation wrong. No cross-checks there at all. The only model of the 32 that comes close to projecting observed temperature reality from 2006-2016 is the Russian INM-CM4. It has high ocean inertia, low WVP, and low sensitivity. Note the Russians do not think there is a CAGW crisis, unlike Sellers. The other 31, including all of NASA’s, would have been thrown out already if this cross-check statement had any reality.
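The point that a multi-model ensemble spread measures agreement rather than accuracy can be shown with a toy ensemble; the bias and spread values below are invented, not CMIP5 numbers.

```python
# Sketch: models sharing a common bias can agree tightly (small spread)
# while all being wrong by the same amount, so the ensemble spread is a
# measure of model disagreement, not a calibrated uncertainty envelope.
import random

random.seed(0)
truth = 14.0                      # "true" absolute temperature, arbitrary
common_bias = 2.5                 # bias shared by every model, arbitrary
models = [truth + common_bias + random.gauss(0, 0.3) for _ in range(32)]

mean = sum(models) / len(models)
spread = max(models) - min(models)
print(round(mean, 1), round(spread, 1))
# The ensemble agrees closely, yet its mean misses the truth by roughly
# the shared bias, which the spread alone cannot reveal.
```

Nothing in the spread statistic flags the shared bias: only comparison against an independent observation does, which is the force of the absolute-temperature complaint above.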
In defence of a scientist who did achieve what many can only dream about, but sadly now … (you may add your own label)
The NASA climate scientist, Dr Piers Sellers originally from Crowborough, East Sussex, did not reveal how long he had left to live, but said he planned to continue with his work, despite his lifespan being ‘steeply foreshortened’.
‘I’ve no complaints, I spacewalked 220 miles above the Earth, I watched hurricanes cartwheel across oceans’:
NASA climate scientist, 60, said he had Stage 4 pancreatic cancer
Dr Sellers plans to continue with his work, which includes researching the effects and causes of climate change
He was born in the UK but became a US citizen to be an astronaut
Dr Sellers carried out six spacewalks over the course of three missions.
see also my comment
He is a good guy, therefore he is always right.
He got nearly everything wrong, but he has achieved far more than many, one from the club of a very few.
It is mysterious to me that analyses of the Venusian atmosphere have not considered the heat of reaction of water and sulfur trioxide to form the sulfuric acid clouds of Venus.
It appears to me that Venusian volcanism produced the dioxide of sulfur, which was oxidized in the atmosphere to the trioxide, combining with water, with an extraordinary evolution of heat, to form the sulfuric vapor clouds.
Professor Zharkova has no agenda, which makes what she says meaningful.
Further she has a good chance of being correct.
As far as the climate of the earth this period of time is in no way unique.
The climate in the big picture is controlled by Milankovitch Cycles, Land Ocean arrangements, with Solar Activity and the Geo Magnetic Field Strength of the earth superimposed upon this.
These factors then exert influences on the terrestrial items on the earth that determine the climate.
Sea Surface Temperatures
Global Cloud Coverage
Global Snow Coverage
Global Sea Ice Coverage
All of this gives an x climate over x time. The historical climatic record supports this.
That is WHAT likely makes the climate change, NOT the scam they promote which is AGW.
The historical climatic record shows this period of time in the climate is in no way unique, while changes in CO2 concentrations have no correlation leading to resultant climate changes.
Now how the cooling evolves will have to be monitored. Of course going from an El Nino condition to an La Nina condition is going to cause an initial cooling.
For clues that solar is involved, the depth of the cooling will have to be monitored, along with whether the cooling is accompanied by the terrestrial items I have mentioned above.
Each one of those terrestrial items has been shown to be linked to Milankovitch Cycles and Land Ocean Arrangements in the big, slow-moving picture, while solar and geomagnetic variability are factors that can change these terrestrial items on a much smaller time scale.
The solar parameters needed are
Solar Wind sub 350 km/sec.
AP index 5 or lower
EUV LIGHT 100 units or less
COSMIC RAY COUNTS – 6500 or greater
SOLAR IRRADIANCE – off by .15% or greater.
All very attainable going forward, and compounded by a weakening geomagnetic field, which, if attained with sufficient duration of time, will translate into bringing the terrestrial items that control our climate to values which will cause the climate to cool gradually, if not in a sharp drop-off if certain thresholds should be met.
More ice news…
Vostok ice did not suggest any regattas were being held on the lake back in the day, however.
So much for Salvatore’s predictive abilities…..
“…here is my prediction for climate going forward, this decade will be the decade of cooling.”
– Salvatore del Prete, 11/23/2010
The engineering applications of CFD are in no way whatsoever an analogy for the GCM situation. It is a correct statement that many CFD applications, those that do not resolve the fundamental scales of the un-altered Navier-Stokes equations, involve parameters. However, in general the parameters are few in number. Additionally, almost all of these parameters have a basis in fundamental aspects of turbulent fluid flows, and have been determined (Validated) to correctly capture the small-scale phenomena and processes that are critically important for each application. The parameterizations used in CFD describe small-scale phenomena and processes, and are usually expressed as functions of mean-flow quantities.
GCMs, on the other hand, involve a multitude of parameters for phenomena and processes that occur (1) at very large scales, on the order of, and sometimes an order greater than, the resolved scales, and (2) smaller than the resolved scales. Cloud parameterizations are examples of the former and radiatively interacting aerosols an example of the latter. Some of these parameterizations will explicitly depend on the size of the discrete grid, and these cause difficult problems, both practical and theoretical, when increases in the spatial resolution are considered. It is the parameterizations that carry the heavy load for the degree of fidelity between the model results and the physical phenomena and processes in the physical domain. The parameterizations are descriptions of states that the materials of interest have previously attained. They are not properties of the materials.
That is a singular, and of utmost importance, critical difference between proven Laws of Physics, which are based on descriptions of materials, and the status of GCMs in Climate Science. The proven fundamental laws will not ever, as in never, incorporate descriptions of states that the materials have previously attained. Instead, the proven fundamental laws will always solely contain descriptions of properties of the materials.
GCMs are based on approximate models of some parts of some fundamental equations, plus a multitude of empirical descriptions of states that the materials in the system have previously attained. Even some of the approximate models of the fundamental equations will contain parameters that represent previous states of the materials, and are not material properties. Some of the less important empirical descriptions are somewhat, or completely, ad hoc ( for this case only ). The multitude of parameterizations do all the heavy lifting relative to the fidelity of the results of the model to physical reality.
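As a concrete example of a grid-dependent closure of the kind described above, the classic Smagorinsky eddy-viscosity model ties the modeled subgrid viscosity to the grid (filter) width. This is a 1-D simplification with a commonly quoted constant; treat the specific numbers as assumptions.

```python
# Smagorinsky-type subgrid eddy viscosity, 1-D simplification:
#   nu_t = (C_s * delta)^2 * |dU/dx|
# where delta is the grid spacing. The closure depends explicitly on
# the discretization, so refining the grid changes the modeled physics.
# C_s = 0.17 is a commonly quoted value; treat it as an assumption here.

def smagorinsky_nu_t(dudx, delta, C_s=0.17):
    return (C_s * delta) ** 2 * abs(dudx)

# Halving the grid spacing cuts the modeled eddy viscosity fourfold
# for the same resolved strain rate:
coarse = smagorinsky_nu_t(dudx=2.0, delta=0.1)
fine = smagorinsky_nu_t(dudx=2.0, delta=0.05)
print(round(coarse / fine, 6))  # 4.0
```

This is the benign CFD case: the grid dependence is deliberate and theoretically grounded. The complaint above is that many GCM parameterizations carry grid dependence without that foundation.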
Generally, CFD applications that potentially involve the health and safety of the public are subjected to a degree of Verification, Validation, Software Quality Assurance, and Uncertainty Qualification that the GCMs can only dream about. Decades of detailed experimental testing and associated model/code/calculational-procedure Validation for each response function of interest continues to be carried out for such applications.
There is no correspondence between CFD and GCMs at all. Absolutely none whatsoever.
It is getting to be a joke whenever fundamental approaches to descriptions of material behaviors are invoked as analogies to the status of GCMs in Climate Science. That reminds me of the good ol’ days when “realizations” using GCMs were equated to actual realizations of the Navier-Stokes equations to investigate the basic nature of turbulence.
A “realization” by a GCM is a “realization” of the processes captured by the descriptions of the previous states. Such “realizations” are not in any way actual realizations of the materials that make up the Earth’s climate systems. And they are in no way related to, or analogous to, realizations of the Navier-Stokes equations and turbulence calculations.
The distinctions between descriptions based on material properties and empirical estimates of previous states, the latter are characterized as ‘process models’, are at such a critical basis that they must always be kept in mind. Climate Science seems to completely ignore these distinctions and continues to invoke false analogies.
By the way, the Navier-Stokes equations are very significant extensions of Newton’s Fundamental Laws of Physics. Many decades of work by mathematicians, physicists, and yes even engineers, were required to progress beyond the point masses and static, unalterable internal structure and lack of thermodynamics of Newton. Cauchy, Euler, the Bernoullis, Fourier, Navier, Stokes, Carnot, among many many others provided progress toward application of Newton’s Laws to fluids.
Turbulence has a fundamental scaling, Navier Stokes equations do not.
There is no way to introduce a renormalization to the nonlinearity of the Navier Stokes equations so no way to reproduce turbulence.
For several flows, but not the full transient, compressible, Navier-Stokes equations, the controlling microscales have been identified and proven to be very useful for providing a firm foundation for algebraic engineering correlations.
Vedat S Arpaci, Microscales of Turbulence
Turbulence: The Legacy of A. N. Kolmogorov
Starting from any initial conditions, the Navier-Stokes equations will reproduce a chaotic behavior attributable to the magnification of a nonlinearity, but reproduce no scale of turbulence.
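The chaotic magnification of a nonlinearity is easiest to see in Lorenz’s famous three-variable convection model, a drastic truncation of the flow equations used here only as a low-dimensional analogue; the step size and run length are arbitrary choices for a quick sketch.

```python
# Lorenz (1963) convection model integrated with forward Euler.
# Two runs starting 1e-8 apart in x diverge dramatically, illustrating
# sensitive dependence on initial conditions.

def lorenz_step(state, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = state
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

a = (1.0, 1.0, 1.0)
b2 = (1.0 + 1e-8, 1.0, 1.0)    # perturbed initial condition
max_sep = 0.0
for _ in range(6000):          # roughly 30 model time units
    a = lorenz_step(a)
    b2 = lorenz_step(b2)
    max_sep = max(max_sep, abs(a[0] - b2[0]))

print(max_sep)  # many orders of magnitude larger than the 1e-8 offset
```

The trajectories diverge, but nothing in this divergence reproduces the scale structure of real turbulence, which is the distinction drawn above.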
Dan, You are a little off here. CFD is far more uncertain than you make it out to be. Turbulence models are not that good and only work in easy cases.
Many of their assumptions border on wishful thinking. In any case, GCMs can’t really be any better because they are based on the same CFD methods. They, however, can be worse, and that is what the recent paper on model tuning shows.
Ok, David. Looking forward to your post.
I do say, however, that the closing relationships in single-phase CFD are on a far firmer theoretical foundation than any used in GCMs.
And that CFD and GCMs do not belong to the same population.
DH, one thing many denizens may not appreciate is that Feynman himself spent years attacking the mysteries of Navier-Stokes CFD experimentally. See Feynman’s Lectures on Physics, Volume 2, Chapter 41: The Flow of Wet Water. (The typical Feynman title joke is that a previous chapter of lectures had done the ‘flow of dry water’: ignoring viscosity to derive simple, elegant, but experimentally wrong answers.)
The last three paragraphs of that famous chapter are considered his ‘Sermon on the Mount’ concerning the mysteries of mathematics. Of course, in his time the mathematics of chaos (nonlinear dynamics) were as yet not understood. Still, utterly magnificent deep science for that time. His ‘sermon’ still resonates deeply.
It is still a work of a genius.
As a chemist, I’ve long been puzzled by the faith placed in the Navier-Stokes representation of thermodynamic dissipation. The presumption of linear viscous dissipation is minor compared to a total neglect of thermal dissipation. Vortices dissipate kinetic energy, producing heat, and getting rid of this heat is essential to any solution. In wind tunnels or forced convection, there may be some justification for an isothermal approximation, but for the troposphere? Natural convection, driven by gravitational forces, involves highly nonlinear thermal dissipation, with transport kinetics increasing dramatically for adiabatic thermal gradients (Fluid Mechanics, Landau & Lifshitz).
This has to be one of my favorite comments ever on CE.
The Sellers piece was published in The New Yorker which by reputation has a great fact checking department.
More entertaining is the piece by John McPhee.
Maybe it should be said that The New Yorker once HAD a great fact checking department.
The New Yorker, the Scientific American for non-scientists.
“Two new studies have now, finally, put an end to the long-held theory that the Americas were populated by ancient peoples who walked across the Bering Strait land-bridge from Asia approximately 15,000 years ago. Because much of Canada was then under a sheet of ice, it had long been hypothesised that an “ice-free corridor” might have allowed small groups through from Beringia, some of which was ice-free.”
The Bering Strait is 82 km at its narrowest point – a two days’ walk in the park for Eskimos – “When you have walked as long as you can, you have only walked half the distance you are capable of.”
And just why couldn’t Eskimos have moved along the ice while eating seals? Seals need to breathe; Eskimos need to eat: a perfect match. I move faster on snow and ice than I do on dry land. Eskimos don’t need a freakin’ land bridge; they do perfectly fine on ice.
I know I’m speculating here, but I do so to illustrate that the hypothesis presented in the article is only one among many possible hypotheses. Why on earth are people paid to speculate on things like this?
Does anyone have a .pdf copy of “Atmosfear: Communicating the Effects of Climate Change on Extreme Weather” Janković & Schultz (2016) linked by Curry at AMS here?
Thanks Stephen, but that’s the paywalled link provided in Judith’s essay.
I’m looking for the full .pdf, free (alas, I am not a member of the American Meteorological Society).
enter the DOI
never pay again.
Steven ==> Thanks, got it now….well, got the method but not the journal article. It works for other DOIs but seems to fail for this particular one (it appears to download 11 pages but results only in messages in Russian regarding a suitable proxy found, etc.).
Great to have the resource though….was unaware of it.
Has anyone else managed to get the particular piece through Sci-Hub?
I tried also but no luck.
I tried also, but no luck. (Hoping this doesn’t double post, first attempt vaporized)
The owner of a copyright typically has the exclusive right to reproduce or to distribute the copyrighted work.
This issue is reminiscent of the pirated music download industry of a few years ago. Hopefully, Mr. Mosher will not become a Napster-like victim of the publishing industry’s wrath. We’d all hate to see him shut down.
Reply to opluso ==> From the Science Mag story you link to: Science Mag’s poll shows 87.87% of those surveyed replied “NO” to the question “Do you think it is wrong to download pirated papers?”
These bold Web Social Initiatives — like Wiki-Leaks (often misused and poorly managed in my experience), Napster, and others, including Sci-Hub — have the ability to force change world-wide.
There are many issues involved, but in my opinion, “copyright” (owned by journal publishers) is the least valid. These papers are not being “republished” for profit, such as would be the case in reproducing a copyrighted book for resale. It is more akin to an “available to all” public library that contains copies of the journals, copies which have been donated by various persons, where all are welcome to visit and are allowed to freely read the journals in the collections, only digital and over-the-web.
In my opinion, the journals are involved in suppressing the spread of knowledge and must give way to another, new reality. I’m sure they can find a way to preserve their financial viability under the new regime.
The vast majority of papers published in the US are already paid for with public tax dollars, and thus morally already belong to the citizens of the US (and I suspect we, the US citizens, are glad to share them with the rest of the world).
I do suggest that this is an interesting topic for an essay and discussion here at Climate Etc.
I am totally with you on the question of open access for scientific papers that were funded by the public. It is reprehensible, IMO, that such is not automatically the case.
Nonetheless, I felt it was my duty to warn you of the potential perils of joining Mosher’s Merry Band of Thieves.
By the way, the US Constitution presumes that copyright is critical to the advance of science. Copyright itself is not the problem. Assigning it to money-grubbing journals under the coercion of publish-or-perish is.
Article I, Section 8 – Clause 8 [Congress shall have the power]
I wondered about an ice-free transit of the NW Passage. It appears that this happened even during the ‘cold years’ of the 20th century. I know the passage is most ice-free during the summer months, and more so in the recent years of warming. However, it was passable throughout the 20th century:
Norwegian polar explorer Roald Amundsen was the first to sail through the Northwest Passage in 1903–1906.
Amundsen’s Gjøa was the first vessel to transit the passage.
Main article: Roald Amundsen
The first explorer to conquer the Northwest Passage solely by ship was the Norwegian explorer Roald Amundsen. In a three-year journey between 1903 and 1906, Amundsen explored the passage with a crew of six. Amundsen, who had sailed to escape creditors seeking to stop the expedition, completed the voyage in the converted 45 net register tonnage (4,500 cu ft or 130 m3) herring boat Gjøa. Gjøa was much smaller than vessels used by other Arctic expeditions and had a shallow draft. Amundsen intended to hug the shore, live off the limited resources of the land and sea through which he was to travel, and had determined that he needed to have a tiny crew to make this work. (Trying to support much larger crews had contributed to the catastrophic failure of John Franklin’s expedition fifty years previously). The ship’s shallow draft was intended to help her traverse the shoals of the Arctic straits.
Amundsen set out from Kristiania (Oslo) in June 1903 and was west of the Boothia Peninsula by late September. The Gjøa was put into a natural harbour on the south shore of King William Island; by October 3 she was iced in. There the expedition remained for nearly two years, with the expedition members learning from the local Inuit people and undertaking measurements to determine the location of the North Magnetic Pole. The harbour, now known as Gjoa Haven, later developed as the only permanent settlement on the island.
After completing the Northwest Passage portion of this trip and having anchored near Herschel Island, Amundsen skied 800 kilometres (500 mi) to the city of Eagle, Alaska. He sent a telegram announcing his success and skied the return 800 km to rejoin his companions. Although his chosen east–west route, via the Rae Strait, contained young ice and thus was navigable, some of the waterways were extremely shallow (3 ft (0.91 m) deep), making the route commercially impractical.
The first traversal of the Northwest Passage via dog sled was accomplished by Greenlander Knud Rasmussen while on the Fifth Thule Expedition (1921–1924). Rasmussen and two Greenland Inuit travelled from the Atlantic to the Pacific over the course of 16 months via dog sled.
Canadian RCMP officer Henry Larsen was the second to sail the passage, crossing west to east, leaving Vancouver 23 June 1940 and arriving at Halifax on 11 October 1942. More than once on this trip, he was uncertain whether the St. Roch, a Royal Canadian Mounted Police “ice-fortified” schooner, would survive the pressures of the sea ice. At one point, Larsen wondered “if we had come this far only to be crushed like a nut on a shoal and then buried by the ice.” The ship and all but one of her crew survived the winter on Boothia Peninsula. Each of the men on the trip was awarded a medal by Canada’s sovereign, King George VI, in recognition of this notable feat of Arctic navigation.
Larsen’s 1944 return trip was far swifter than his first: he sailed back from Halifax, Nova Scotia, to Vancouver, British Columbia, in 86 days, setting a record for traversing the route in a single season. The ship, after extensive upgrades, followed a more northerly, partially uncharted route.
On July 1, 1957, the United States Coast Guard Cutter Storis departed in company with USCGC Bramble and USCGC Spar to search for a deep-draft channel through the Arctic Ocean and to collect hydrographic information. Upon her return to Greenland waters, the Storis became the first U.S.-registered vessel to have circumnavigated North America. Shortly after her return in late 1957, she was reassigned to her new home port of Kodiak, Alaska.
In 1969, the SS Manhattan made the passage, accompanied by the Canadian icebreakers CCGS John A. Macdonald and CCGS Louis S. St-Laurent. The U.S. Coast Guard icebreakers Northwind and Staten Island also sailed in support of the expedition.
The Manhattan was a specially reinforced supertanker sent to test the viability of the passage for the transport of oil. While the Manhattan succeeded, the route was deemed not to be cost effective. The United States built the Alaska Pipeline instead.
In June 1977, sailor Willy de Roos left Belgium to attempt the Northwest Passage in his 13.8 m (45 ft) steel yacht Williwaw. He reached the Bering Strait in September and after a stopover in Victoria, British Columbia, went on to round Cape Horn and sail back to Belgium, thus being the first sailor to circumnavigate the Americas entirely by ship.
In 1981 as part of the Transglobe Expedition, Ranulph Fiennes and Charles R. Burton completed the Northwest Passage. They left Tuktoyaktuk on July 26, 1981, in the 18-foot (5.5 m) open Boston Whaler and reached Tanquary Fiord on August 31, 1981. Their journey was the first open boat transit from west to east and covered around 3,000 miles (4,800 km; 2,600 nmi), taking a route through Dolphin and Union Strait following the south coast of Victoria and King William islands, north to Resolute Bay via Franklin Strait and Peel Sound, around the south and east coasts of Devon Island, through Hell Gate and across Norwegian Bay to Eureka, Greely Bay and the head of Tanquary Fiord. Once they reached Tanquary Fiord, they had to trek 150 miles (240 km) via Lake Hazen to Alert before setting up their winter base camp.
In 1984, the commercial passenger vessel MS Explorer (which sank in the Antarctic Ocean in 2007) became the first cruise ship to navigate the Northwest Passage.
In July 1986, Jeff MacInnis and Mike Beedell set out on an 18-foot (5.5 m) catamaran called Perception on a 100-day sail, west to east, through the Northwest Passage. This pair was the first to sail the passage, although they had the benefit of doing so over a couple of summers.
In July 1986, David Scott Cowper set out from England in a 12.8-metre (42 ft) lifeboat, the Mabel El Holland, and survived three Arctic winters in the Northwest Passage before reaching the Bering Strait in August 1989. He continued around the world via the Cape of Good Hope to return to England on September 24, 1990. His was the first vessel to circumnavigate the world via the Northwest Passage.
On July 1, 2000, the Royal Canadian Mounted Police patrol vessel Nadon, having assumed the name St Roch II, departed Vancouver on a “Voyage of Rediscovery”. Nadon’s mission was to circumnavigate North America via the Northwest Passage and the Panama Canal, recreating the epic voyage of her predecessor, St. Roch. The 22,000-mile (35,000 km) Voyage of Rediscovery was intended to raise awareness concerning St. Roch and kick off the fund-raising efforts necessary to ensure the continued preservation of St. Roch. The voyage was organized by the Vancouver Maritime Museum and supported by a variety of corporate sponsors and agencies of the Canadian government.
Nadon is an aluminum, catamaran-hulled, high-speed patrol vessel. To make the voyage possible, she was escorted and supported by the Canadian Coast Guard icebreaker Simon Fraser. The Coast Guard vessel was chartered by the Voyage of Rediscovery and crewed by volunteers. Throughout the voyage, she provided a variety of necessary services, including provisions and spares, fuel and water, helicopter facilities, and ice escort; she also conducted oceanographic research during the voyage. The Voyage of Rediscovery was completed in five and a half months, with Nadon reaching Vancouver on December 16, 2000.
On September 1, 2001, Northabout, a 14.3-metre (47 ft) aluminium sailboat with a diesel engine, built and captained by Jarlath Cunnane, completed the Northwest Passage east-to-west from Ireland to the Bering Strait. The voyage from the Atlantic to the Pacific was completed in 24 days. Cunnane cruised in the Northabout in Canada for two years before returning to Ireland in 2005 via the Northeast Passage; he completed the first east-to-west circumnavigation of the pole by a single sailboat. The Northeast Passage return along the coast of Russia was slower, starting in 2004 and requiring an ice stop and winter-over in Khatanga, Siberia. He returned to Ireland via the Norwegian coast in October 2005. On January 18, 2006, the Cruising Club of America awarded Jarlath Cunnane their Blue Water Medal, an award for “meritorious seamanship and adventure upon the sea displayed by amateur sailors of all nationalities.”
On July 18, 2003, a father-and-son team, Richard and Andrew Wood, with Zoe Birchenough, sailed the yacht Norwegian Blue into the Bering Strait. Two months later she sailed into the Davis Strait to become the first British yacht to transit the Northwest Passage from west to east. She also became the only British vessel to complete the Northwest Passage in one season, as well as the only British sailing yacht to return from there to British waters.
In 2006 a scheduled cruise liner (the MS Bremen) successfully ran the Northwest Passage, helped by satellite images showing the location of sea ice.
On May 19, 2007, a French sailor, Sébastien Roubinet, and one other crew member left Anchorage, Alaska, in Babouche, a 7.5-metre (25 ft) ice catamaran designed to sail on water and slide over ice. The goal was to navigate west to east through the Northwest Passage by sail only. Following a journey of more than 7,200 km (4,474 mi), Roubinet reached Greenland on September 9, 2007, thereby completing the first Northwest Passage voyage made in one season without engine.
Northwest Passage Drive Expedition (NWPDX) (2009–2011)
In April 2009, planetary scientist Pascal Lee and a team of four on the Northwest Passage Drive Expedition drove the HMP Okarian Humvee rover a record-setting 496 km (308 mi) on sea-ice from Kugluktuk to Cambridge Bay, Nunavut, the longest distance driven on sea-ice in a road vehicle. The HMP Okarian was being ferried from the North American mainland to the Haughton-Mars Project (HMP) Research Station on Devon Island, where it would be used as a simulator of future pressurized rovers for astronauts on the Moon and Mars. The HMP Okarian was eventually flown from Cambridge Bay to Resolute Bay in May 2009, and then driven again on sea-ice by Lee and a team of five from Resolute to the West coast of Devon Island in May 2010. The HMP Okarian reached the HMP Research Station in July 2011.
In 2009 sea ice conditions were such that at least nine small vessels and two cruise ships completed the transit of the Northwest Passage. These trips included one by Eric Forsyth on board the 42-foot (13 m) Westsail sailboat Fiona, a boat he built in the 1980s. Self-financed, Forsyth, a retired engineer from the Brookhaven National Laboratory, and winner of the Cruising Club of America’s Blue Water Medal, sailed the Canadian Archipelago with sailor Joey Waits, airline captain Russ Roberts and carpenter David Wilson. After successfully sailing the Passage, the 77-year-old Forsyth completed the circumnavigation of North America, returning to his home port on Long Island, New York.
On August 28, 2010, Bear Grylls and a team of five were the first rigid inflatable boat (RIB) crew to complete a point-to-point navigation between Pond Inlet on Baffin Island and Tuktoyaktuk in the Northwest Territories. Note: A Northwest Passage requires crossing the Arctic Circle twice, once each in the Atlantic and the Pacific oceans.
On August 30, 2012, S/Y Billy Budd, 110 ft, successfully completed the Northwest Passage at Nome (Alaska), the first boat in history owned and crewed by Italians to do so, while sailing a northern route never before sailed by a pleasure sailing vessel. The voyage followed six cruising seasons in the Arctic (Greenland, Baffin Bay, Devon Island, Kane Basin, Lancaster Sound, Peel Sound, Regent Sound) and four seasons in the South (Antarctic Peninsula, Patagonia, Falkland Islands, South Georgia). S/Y Billy Budd is owned by and was under the command of the Italian sporting enthusiast Mariacristina Rapisardi, and was crewed by Mr. Marco Bonzanigo, five Italian friends, one Australian, one Dutch, one South African, and one New Zealander. The northernmost route was chosen: Billy Budd sailed through the Parry Channel, Viscount Melville Sound and Prince of Wales Strait, a channel 160 miles long and 15 miles wide which flows south into the Amundsen Gulf. During the passage Billy Budd anchored (likely a first for a pleasure vessel) in Winter Harbour on Melville Island, the very same site where almost 200 years ago Sir William Parry was blocked by ice and forced to winter.
On August 29, 2012 the Swedish yacht Belzebub, a 31-foot fiberglass cutter captained by Canadian Nicolas Peissel, Swede Edvin Buregren and Morgan Peissel, became the first sailboat in history to sail through McClure Strait, part of a journey of achieving the most northerly Northwest Passage in recorded history. Belzebub departed Newfoundland following the coast of Greenland to Qaanaaq before tracking the sea ice to Grise Fiord, Canada’s most northern community. From there the team continued through Parry Channel into McClure Strait and the Beaufort Sea, tracking the highest latitudes of 2012’s record sea ice depletion before completing their Northwest Passage September 14, 2012. The expedition received extensive media coverage, including recognition by former Vice President Al Gore. The accomplishment is recorded in the Polar Scott Institutes record of Northwest Passage Transits and recognized by the Explorers Club and the Royal Canadian Geographic Society.
At 18:45 GMT September 18, 2012, Best Explorer, a steel cutter 15.17 metres (49.8 ft), skipper Nanni Acquarone, passing between the two Diomedes, was the first Italian sailboat to complete the Northwest Passage along the classical Amundsen route. Twenty-two Italian amateur sailors took part in the trip, in eight legs from Tromsø, Norway, to King Cove, Alaska, totalling 8,200 nautical miles (15,200 km; 9,400 mi).
Setting sail from Nome, Alaska, on August 18, 2012, and reaching Nuuk, Greenland, on September 12, 2012, The World became the largest passenger vessel to transit the Northwest Passage. The ship, carrying 481 passengers, for 26 days and 4,800 nmi (8,900 km; 5,500 mi) at sea, followed in the path of Captain Roald Amundsen. The World’s transit of the Northwest Passage was documented by National Geographic photographer Raul Touzon.
In September 2013 the MS Nordic Orion became the first commercial bulk carrier to transit the Northwest Passage. She was carrying a cargo of 73,500 tons of coking coal from Port Metro Vancouver, Canada, to the Finnish Port of Pori, 15,000 more tons than would have been possible via the traditional Panama Canal route. The Northwest Passage shortened the distance by 1,000 nautical miles compared to the traditional route via the Panama Canal.
Please supply ONE link. Or more.
Repeat after me.
The CAA is not the whole arctic.
I was just responding to JC’s link; nothing scientific about it really. What is the CAA?
Canadian Arctic Archipelago?
Please note that prior to about the year 1980, nobody had satellite telemetry to tell them where the ice wasn’t and where they could safely sail.
I also made that point.
Would the NW and NE passages have been periodically navigable to our ancestors if they had had sat nav, radar, helicopters to search the route, and icebreakers, plus the knowledge of back-up if things should go wrong?
As well as the Vikings being able to navigate the NW passage, I suspect it would have been possible during other known warm decadal periods, including around the 1450s, the 1530s, the 1730s, the 1820s, and various other sporadic individual warm years or short clusters of warm years.
Arctic sea ice decline is like a ball bouncing down a bumpy hill…
The 2013/14 ‘bump’ was fairly predictable on the basis of AMO behavior, where AMO anomalies tend to move anti-phase with solar cycles when in its warm mode.
“However, in future scenarios with sharp reductions in greenhouse emissions, the Arctic sea-ice recovers slightly as global temperatures decline.”
If rising GHG’s increase positive NAO, that would cool the Arctic.
The Arctic is warming, not cooling.
Yes since 1995.
No, since circa 1970.
World’s leading sea ice expert (Peter Wadhams) warns that the Arctic sea ice death spiral will make global warming worse…
“Climate change has been caused by ignorance and stupidity..”
The Arctic warming was driven by a negative North Atlantic Oscillation; rising CO2 cannot be responsible for that. The ignorance and stupidity is believing it is.
How does the NAO explain long-term Arctic sea ice decline?
By driving a warm AMO, and by increasing warm humidity events into the Arctic.
ul: That’s hand waving, not a proof.
Wanna try again? Or is that the best you can do?
Some commentators keep persisting in misspelling Dr. Piers Sellers’ name (it is as in Peter Sellers, another British comedian)
Here is Dr. Sellers’ NASA’s web page
recipe of the Arctic ‘beef goulash’ from NASA’s Dr. Sellers
“Changes in the Arctic may be disrupting the jet stream”
The tail wags the dog.
“DiCaprio issued a wake up call to the public that ignoring climate change poses great peril.”
Looks like the wake up call is only for the little people, not Leo and his massive carbon footprint global elites/embezzlers:
“After watching a short film about the dangers of overfishing, a number of wealthy guests who purportedly support saving the environment were flown in via helicopter to attend the third annual fundraiser in Saint-Tropez for the Leonardo DiCaprio Foundation last month where they dined on whole sea bass.”
When Manbearpig retires his spot as world’s leading carbon-spewing alarmist hypocrite, Leo can take over.
Piers Sellers, are you sure you’re an astronaut not a comedian? Your theory sounds a lot like this Dr. Sellers
I couldn’t get through this item:
I’m not sure if the author is disingenuous or just poorly informed, but some of the things he writes are too misleading for me to finish the piece. My breaking point was this:
Followed by a graphic which conveniently shows Land + Ocean temperatures. Of course, the BEST group only created their own land temperature data set. For ocean temperatures, they just used the HADSST dataset (which they re-interpolated). The result is in addition to the overlap in data for land records, the graph which “looked just like everybody else’s” is being compared in at least one case to things which use the same ocean data. In other words, over 70% of their results are dependent upon the exact same data, data which BEST didn’t re-analyze (re-interpolating is not re-analyzing).
The author conveniently fails to inform the reader of this, so while what he says may not be false, it is certainly very misleading. Similarly, he throws out the obscene canard:
Leaving aside it is impossible to use “all the raw data without performing any analysis” like he suggests (any creation of a global temperature record requires analysis), the author tosses out the idea adjustments decrease the total amount of warming as though that disproves the idea adjustments are being made to exaggerate a global warming trend. That’s nonsense.
If you look at what effect adjustments to the data have, it is true the net effect is to decrease the total warming. This is because warming in the past, prior to ~1950, is decreased. Warming after 1950 is increased. Because human effect on climate has increased greatly over time, we’d expect its influence to be smaller prior to 1950 and larger after 1950. That is exactly the pattern these adjustments show.
The idea decreasing the total amount of warming prior to humans having a strong influence proves the adjustments could not be exaggerating the effect humans have on global temperatures is not only wrong, it is the exact opposite of the truth. It is beyond absurd people keep promoting such a nonsensical idea. Either they haven’t given the subject any real thought, or they are just being disingenuous.
Which also agrees with the Hadley CRU, NOAA/NCDC and GISS land-only products.
It might help to read the bit about the move toward engine room intake vs. bucket readings in the years prior to and during WWII.
The only use of the word “proof” in that article comes by way of a quote from Ted Cruz, echoing Lamar Smith’s unsubstantiated accusations of politically-motivated bias in global surface temperature records:
I would note if you systematically add, adjust the numbers upwards for more recent temperatures, wouldn’t that, by definition, produce a dataset that proves your global warming theory is correct? And the more you add, the more warming you can find, and you don’t have to actually bother looking at what the thermometer says, you just add whatever number you want.
Pointing out that net adjustments over the instrumental surface record (including the oceans) are cooling does tend to weaken his thinly-evidenced and already dubious case, but only perhaps if one takes the time to dig into the main reason for the 0.2-0.3 C pre-War ocean adjustments.
brandonrgates, I get you may have some overall point you wish to make, but comments like this:
Have nothing to do with anything I said. I expressed my disdain with the piece because there are problems in it. Changing the subject away from those problems won’t change that I correctly identified problems in it. Similarly, it is just rude to say:
When this does nothing to address anything I’ve said. As a side note, I’d wager I know far more about this issue than you do. That you are trying to discuss topics I didn’t comment on doesn’t change that belief.
You may find the semantics of whether or not the article used the word “proof” interesting. I do not. If you wish to address what I’ve written, I’d be happy to discuss it, but simply changing the subject then writing:
Is wrong. You’ve done nothing to address anything I wrote, writing only on matters I didn’t discuss then turned around and said the point I made was wrong. That’s not how discussions work.
Yes of course. You should be allowed to put words in others’ mouths at will. How dare I question your right to build strawmen.
I’m sure everyone is so very proud of you.
Lamar Smith makes the accusation: In June, NOAA employees altered temperature data to get politically correct results.
Ted Cruz insinuates: I would note if you systematically add, adjust the numbers upwards for more recent temperatures, wouldn’t that, by definition, produce a dataset that proves your global warming theory is correct? And the more you add, the more warming you can find, and you don’t have to actually bother looking at what the thermometer says, you just add whatever number you want.
The article rebuts: But despite a number of methodological differences and a larger database of stations, [Berkeley Earth’s] results looked just like everybody else’s.
You balk: Followed by a graphic which conveniently shows Land + Ocean temperatures. Of course, the BEST group only created their own land temperature data set.
I rebut: Which also agrees with the Hadley CRU, NOAA/NCDC and GISS land-only products.
How can it be “misleading” for the article to state something which is materially correct? If we follow the first link embedded in the statement: But despite a number of methodological differences and a larger database of stations, [Berkeley Earth’s] results looked just like everybody else’s.
… we find this pretty picture substantiating that claim:
brandonrgates, it’s clear you have no intention of engaging in a meaningful discussion when you begin with:
I remarked on how a person offered a point as though it disproved an idea. Your response focused solely upon where the word “proof” was used in the piece he had written. Nobody believes a person has to use the word “proof” to offer something as proof, so clearly, your response was non-responsive. You now claim I was making a strawman based solely upon this pathetic attempt at semantic games.
As for your question:
I’m going to assume you don’t actually want an answer to that. Everyone knows how trivially easy it is to say things which are “materially correct” yet misleading. I’m sure you know how to do it. I imagine I could even find cases where you’ve described yourself as having done it at times. You are, after all, surprisingly honest in how you tend to behave disreputably in these discussions.
So with that, I’m going to leave this discussion. If you want to discuss the point I made, you’re welcome to. Otherwise, I’m not going to respond to you any further as one should not feed the trolls.
Nope, you need to compare more 30/40-year periods with the estimated forcing for those periods. If you use 40 years, you would have 4 periods to compare using 1880 as a start. Since forcing has increased with time, a 1940-to-1979 period shouldn’t show a larger TCR relative to 1880-to-1919 than 1974-to-2015 shows relative to 1940-to-1979.
In Lewis and Curry, their selected periods have very close estimated TCR; use raw data and there would be a bigger difference in the three ranges tested. You could randomize the periods, and with the early forcing estimates, the beginning-to-middle comparison just wouldn’t fit the middle-to-end.
A simple way, though, is just to fit a CO2 (equivalent) curve to both and look at the difference. Since aerosol/solar forcing was already fudged a bit to explain 1910 to 1940, things would just get worse if you expect aerosol/solar to be linear.
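The period-comparison check described above can be sketched in a few lines. This is only an illustration of the idea (an implied TCR per window from TCR = F2x * ΔT / ΔF); the ΔT and ΔF values below are made-up placeholders, not observational estimates.

```python
# Sketch of comparing implied TCR across successive windows.
# If forcing alone explains the record, the implied TCR should be
# roughly the same in every window; a mismatch suggests other factors.

F2X = 3.71  # forcing for doubled CO2, W/m^2 (figure used in this thread)

def window_tcr(delta_t: float, delta_f: float) -> float:
    """TCR implied by one window: F2X * dT / dF."""
    return F2X * delta_t / delta_f

# Placeholder (illustrative, NOT real) window values of (dT in C, dF in W/m^2):
windows = {
    "1880-1919": (0.10, 0.30),
    "1920-1959": (0.25, 0.45),
    "1960-1999": (0.45, 1.00),
}

for label, (dt, df) in windows.items():
    print(f"{label}: implied TCR = {window_tcr(dt, df):.2f} C")
```

With real data in place of the placeholders, a large spread between windows would be the “just wouldn’t fit” problem the comment describes.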
Ah. You magically know what nobody believes, so clearly you are correct.
Now you magically know what everyone knows, so again, clearly you are correct.
And you are unsurprisingly free with your interpretations of other people’s words. “I don’t always act in good faith” is a far cry from “I tend to behave disreputably”.
What’s there to discuss? Your “point” here is littered with claims like: If you estimate ECS and TCR in any sensible way, the adjustments actually increase them.
Did you describe “any sensible way”?
Did you show work to demonstrate that the adjustments actually raise ESC and TCR?
Did you *prove* that your “sensible way” really is sensible?
Unless I missed a post somewhere, I believe the answer to all three questions is “no”.
That’s right — adjustments reduce the long-term warming trend.
Not many people realize this.
David Appell, you’ve just managed to completely ignore the point of my comment while highlighting only a point I wrote my comment to criticize. If you wish to discuss something I’ve written, fine. I’m happy to. Responding to me while going out of your way to ignore what I wrote is rude though. Please stop.
Most importantly adjustments REDUCE the observational estimates of ECS and TCR which are critical to policy questions.
That is not necessarily true. In fact, I’d say it is more false than true as the way you make it true is to estimate ECS and TCR in terrible ways. If you estimate ECS and TCR in any sensible way, the adjustments actually increase them.
Ah yes, the old proof by conditional. In the meantime back in the real world, the methodology and codes for doing the adjustments are published for all to see in peer reviewed literature and on websites respectively. The raw and adjusted data are provided for all to compare at their leisure.
Clearly this is a conspiracy.
A proper skeptic might ask why the finagled surface temperature records — which are in very good agreement with each other — don’t more closely agree with Teh Modulz … or indeed, why Teh Modulz don’t more closely agree with themselves.
“Most importantly adjustments REDUCE the observational estimates of ECS and TCR which are critical to policy questions.”
I don’t believe that is true. Since sensitivity is defined as the response to CO2, and the response from ~1915 to 1940 is due to something other than CO2 using the “raw” data, you would just end up having stronger BC forcing and more head-scratching over aerosols. Might even have to reevaluate “unforced” variability. The adjustments actually make the data fit the CO2-only logic.
Here’s Zeke’s plot I lifted from the article with the log CO2 curve “eyeball fit” to the raw:
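For anyone who wants to try the “eyeball fit” themselves, here is a minimal sketch assuming a pure log-CO2 response, T = s * log2(C / C0) + offset. The 2 C-per-doubling sensitivity and the rough CO2 concentrations are illustrative assumptions on my part, not figures from the article.

```python
import math

def log_co2_anomaly(co2_ppm: float, sensitivity: float,
                    co2_ref: float = 280.0, offset: float = 0.0) -> float:
    """Temperature anomaly implied by a pure-CO2 fit:
    sensitivity (C per doubling) * log2(C / C0) + offset."""
    return sensitivity * math.log2(co2_ppm / co2_ref) + offset

# Rough historical annual-mean CO2 values (approximate, for illustration):
for year, ppm in [(1900, 296), (1950, 311), (2000, 369)]:
    print(f"{year}: {log_co2_anomaly(ppm, sensitivity=2.0):.2f} C")
```

Plotting this curve over the raw and adjusted series is one quick way to see which version the CO2-only logic fits better.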
Compare this analysis by John Kennedy of the UK Met Office:
That plausibly explains the need for an adjustment between 1935-1941. There is more about the spike over 1940-1945 possibly requiring further correction.
The adjustments over 1880-1935 are ever so slightly rising, clearly warming until 1920 and clearly cooling until 1935. Trends in the raw data are the opposite over roughly the same intervals, so the net effect is to reduce multi-decadal variability by about a tenth of a degree. I don’t think buckets vs. ERI explains that.
Pay attention to how brandonrgates writes:
Notice he doesn’t address any issue actually being discussed. He tries to mock me as a conspiracy nut, but the point I made was incredibly simple. We can see this in how Steven Mosher gets it wrong. I’ll quote captdallas2 0.8 +/- 0.3:
The article in question presented the fact adjustments decrease the total amount of warming since 1900 as proof adjustments could not be exaggerating the global warming trend. This same point has been endorsed by many people who foolishly jump on it as a talking point without putting any critical thought into the matter.
The net anthropogenic atmospheric forcings prior to 1940 were relatively small compared to those of the last 50 years. If human influence was the only factor involved, this would result in recent temperatures rising far faster than in the past. That is not what we see in the unadjusted data. In the unadjusted data, we see the warming in the first half of the 20th century was largely comparable to the warming of the second half of the 20th century.
The adjustments change this. By decreasing the warming of the first half of the 20th century while increasing the warming (by a lesser amount) in the second half, the result of the adjustments is to reduce the total amount of warming but increase the ease of using anthropogenic activity to explain the observed warming. Without these adjustments, we would have to try to find some non-anthropogenic explanation for a significant amount of warming in the early 1900s. The obvious question which would follow that is, “Has that non-anthropogenic influence caused some of the warming we’ve seen in the late 1900s?”
This is an incredibly simple point. It does not demonstrate the adjustments are wrong. It merely describes what effect the adjustments have. Everyone should be able to agree about it. Sadly, that’s not the case.
‘“Most importantly adjustments REDUCE the observational estimates of ECS and TCR which are critical to policy questions.”
I don’t believe that is true. Since sensitivity is defined as the response to CO2, and the response from ~1915 to 1940 is due to something other than CO2 using the “raw” data, you would just end up having stronger BC forcing and more head scratching over aerosols. Might even have to reevaluate “unforced” variability. The adjustments actually make the data fit the CO2-only logic.
yes it is true.
1. Using Nic Lewis’s approach to calculating TCR and ECS
2. Select the base period and final period he does
3. DELTA T changes if you use raw data or adjusted data
It’s pretty damn simple
You have a base period 1859-1892
you have a final period.. 2000-2016
Use raw SST and raw SAT… calculate delta T
Use adjusted calculate delta T
So for TCR you simply have TCR = 3.71 * Delta T/ Delta F
RAW delta T is LARGER THAN Adjusted Delta T
because adjusting lowers the long term rate of warming
Instead of doing it from memory, I will give you the actual figures
1. Use Nic Lewis’s approach. Contrary to unpublished hacks, this is
a sensible way to estimate.
2. You have a base period of 1859-1882
3. you have a final period of 1995- 2015
4. The forcing for doubling c02 is held the same ( 3.71 watts)
5. The total forcing is held the same.
6. Delta Q is held the same
7. Then you have two cases
A) Delta T for raw
B) Delta T for adjusted
From Lewis’ paper using HadCRUT:
∆T = 0.75 C
∆F = 2.07 W/m²
∆Q = 0.48 W/m²
ECS = 3.71*(0.75/(2.07-0.48)) ≈ 1.75
A) Adjusted ∆T = 0.68 C
B) Raw ∆T = 0.81 C
ECS, raw data: 1.88
ECS, adjusted: 1.58
Using Nic Lewis’s approach, and using his figures for everything but delta T, calculate two cases:
A) Delta T for raw data
B) Delta T for adjusted
The ECS for RAW is higher than the ECS for adjusted.
If you prefer a method other than Lewis’s, get it published.
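The arithmetic being argued over here is easy to check. Below is a minimal Python sketch of the Lewis-style energy-budget estimate, using only the figures quoted in these comments; the function name and structure are mine (illustrative), not Lewis’s or Mosher’s:

```python
# Energy-budget sensitivity estimate as described above: the forcing
# for doubled CO2, the total forcing change, and the heat-uptake
# change are all held the same; only delta T differs between cases.

F_2X = 3.71      # W/m^2, forcing for doubling CO2
DELTA_F = 2.07   # W/m^2, total forcing change (base to final period)
DELTA_Q = 0.48   # W/m^2, change in system heat uptake

def ecs(delta_t):
    """ECS = F_2x * dT / (dF - dQ), per the formula in the thread."""
    return F_2X * delta_t / (DELTA_F - DELTA_Q)

print(round(ecs(0.81), 2))  # raw delta T
print(round(ecs(0.68), 2))  # adjusted delta T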
Steven Mosher proves my point about this lame talking point only working if you use a terrible approach to estimating climate sensitivity:
This is true only because the approach ignores all temporal issues. If you take the total change in temperature and total change in GHG levels, you can create a ratio to produce a “climate sensitivity.” There may even be some value in doing so. However, that value is severely limited. I’ll give an example:
Suppose temperatures rose .8C from 1900 to 1940. Now suppose they continued to rise after 1940 at a much slower rate. By 2010, they had risen a total of 1.0C. That is, from 1940-2010, temperatures only rose .2C. The approach Mosher describes would create a ratio of 1.0C/X where X was the total change in forcings.
Now, does that make sense? It depends on X. It certainly wouldn’t make sense if only 20% of X came from the 1900-1940 period with the other 80% coming after 1940. After all, you’d be saying for the 1900-1940 period, 20% of X caused .8C of rise in temperature while after 1940 80% of X only caused 0.2C of rise. Those proportions are all screwed up.
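The mismatch in that hypothetical can be made concrete. A toy sketch, using only the illustrative numbers from the comment (not real data):

```python
# Hypothetical from the comment: 0.8 C of warming over 1900-1940 from
# only 20% of the total forcing change X, then 0.2 C over 1940-2010
# from the remaining 80%.

total_dT = 0.8 + 0.2   # total warming, C
total_dX = 1.0         # total forcing change, arbitrary units ("X")

# The single-ratio approach sees only the totals:
overall_ratio = total_dT / total_dX   # 1.0 C per unit of X

# Looking period by period exposes the mismatch:
early_ratio = 0.8 / (0.2 * total_dX)  # 4.0 C per unit of X
late_ratio = 0.2 / (0.8 * total_dX)   # 0.25 C per unit of X

print(overall_ratio, early_ratio, late_ratio)
print(early_ratio / late_ratio)       # a factor-of-16 disagreement
```

The single ratio looks tidy, but the two periods disagree by a factor of sixteen, which is the "proportions are all screwed up" point being made above.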
If you want to actually figure out how much of a change in temperatures was caused by a change in human activity (or total forcings), you shouldn’t just look at the total amount of change in each value. You should look at how the values change over time, and if they change in similar ways.
Interestingly, the group Steven Mosher belongs to, BEST, did just that in their poor attempt at providing an estimate for climate sensitivity. Were one to repeat their attempt (which I do not think was good) with a temperature series that showed greater warming prior to 1940, one would get a worse fit. The reason is that if you account for how things change over time rather than just how things change in sum, the adjustments to the temperature record improve the narrative that humans are the dominant cause of the rise in temperatures.
“I don’t believe that is true. Since sensitivity is defined as the response to CO2, and the response from ~1915 to 1940 is due to something other than CO2 using the “raw” data, you would just end up having stronger BC forcing and more head scratching over aerosols. Might even have to reevaluate “unforced” variability. The adjustments actually make the data fit the CO2-only logic.”
ECS = 3.71 (∆T/(∆F-∆Q))
Then you have to pick time periods.
Per Nic Lewis I selected 1859-1882 and 1995-2015 as base and final periods.
Skeptic approved. Argue with him.
Oh, I have to laugh. While I was writing that last comment of mine, Steven Mosher talked more about Nic Lewis’s method and made these two remarks:
This is rather remarkable given Mosher himself is part of a group which published a different way of looking at this issue, a way whose results run directly contrary to what he’s claiming here.
His group isn’t the only one either. While some people have used the approach Nic Lewis uses (he didn’t even create it), that is not the only one which has been in use. There have been a number of papers which do exactly what Mosher demands people do. Either he is unaware of the literature on this topic (even though Lewis discussed some of it himself), or he is being disingenuous by intentionally ignoring methodologies that produce results different from the ones he’s promoting here.
Either way, it doesn’t really matter. While there may be some value in trying to estimate climate sensitivity in the simplistic manner of only looking at the net changes in forcings/temperature, it is far, far better to actually consider how things change over time. When you do so, the result of adjustments to the temperature record is to strengthen the AGW narrative. That’s what matters here.
Whether one thinks the adjustments are good or bad, they definitely strengthen the AGW narrative. We should all be able to agree on that.
Oops, should have been here.
mosher, “ERR no
ECS = 3.71 (∆T/(∆F-∆Q))
Then you have to pick time periods.”
Delta F has a number of factors, with the CO2(equ) part considered to be the most powerful, so I pick times with little CO2 (early-middle) and compare that with early-late and middle-late. You get more variability, but the least-adjusted middle-late remarkably stays the same. Not that hard really.
captdallas2 0.8 +/- 0.3, in your misplaced comment you wrote:
Interestingly, that is what the BEST group did, except they didn’t use CO2 equivalent and they included a parameter for volcanoes. On the forcing issue, they just used CO2. They justified using CO2 rather than a CO2 equivalent by saying CO2 could serve as a proxy for the rest. I… do not agree with that decision, but the larger point is this method is an obvious one Steven Mosher’s own group has considered.
Also of interest is what our hostess had to say about the Nic Lewis analysis (which she co-authored a paper on):
I thought this was interesting as that “something else” she refers to is in fact reduced by the adjustments to the temperature record. It would have been an even greater problem for Nic Lewis and our hostess if not for the adjustments. That makes it rather peculiar for Mosher to offer their work to show the adjustments don’t strengthen the AGW narrative.
The issue actually being discussed is this article, which contains these two accusations:
Lamar Smith: In June, NOAA employees altered temperature data to get politically correct results.
Ted Cruz: I would note if you systematically add, adjust the numbers upwards for more recent temperatures, wouldn’t that, by definition, produce a dataset that proves your global warming theory is correct? And the more you add, the more warming you can find, and you don’t have to actually bother looking at what the thermometer says, you just add whatever number you want.
Nope. The article said no such thing. It laid out evidence that the net adjustments since 1880 are cooling, not warming, thereby weakening the case Smith and Cruz are making with their cherry-picked examples.
Yes, it really is.
Ah. Wait for it ….
Ok sure, got it. [wink wink]
Re Temperature graphs.
When you see the temperature graphs together, the differences are fairly minor. The big fuss about them could be caused by the fact that even the slightest uptick in any one of them is announced in heavy type.
Ole Humlum’s accounting of the changes over time is not comforting.
Some changes might be unavoidable, but the tendency is a bit alarming.
The problem is that you can’t see whether it is the reference that changed or the actual temperature.
I am not sure how to do it, but would it be possible to make an anomaly of the reference?
“We are seeing the effects in the shrinking of the summer Arctic sea ice and the melting of the Greenland glaciers. That melt, in turn, has been partly responsible for the three-inch rise in sea levels since 1992. The Earth is warming, the ice is melting, and sea level is rising. These are observed facts.
Are we humans the cause of these changes? The answer is an emphatic yes.”
It does not make sense; the Arctic warming is negative-NAO driven, while rising CO2 should increase the positive NAO:
RE: Obama: Climate Change Could Threaten the Statue of Liberty
Mr. President, ask the Arabs how to ‘save’ the Statue of Liberty. They would turn Liberty Island into the face of Mickey Mouse, like what they did with the Palm Islands in Dubai.
They must be scared of sea level rise; that’s why they built the Palm Islands.
It’s 41 C in Dubai so they built the indoor ski resort to enjoy global warming
What makes you think they thought of sea level rise before building the Palm Islands?
They sold them for a profit, right?
What else matters to the builders?
Yes the builders are so greedy they can build artificial islands in the sea but cannot solve a one-foot rise in sea level in 100 years. That would be catastrophic
How do you “solve” a 1-foot rise in sea level?
How do you “solve” a one-foot sea level rise?
PS: Projection for 2100 is about 3-feet.
That’s too sissy. Why don’t they project 30 feet? Never mind actual observations of 3.3 mm/yr or about one foot
Whose projections? From some model?
David, how is it a guy with a degree in physics can’t calculate the amount of rise by 2100 knowing SLR is 3.3 mm/yr?
Oh yeah, magic tipping points.
Sea level rise is not expected to be linear.
In fact, it’s already accelerating just over the satellite record.
timg56, you don’t need a degree in physics to calculate SLR. Watch and learn:
3.3 mm/yr x 100 yrs = 330 mm x 1 ft / 304.8 mm = 1.08 ft
6th grade pupils did this calculation and it was too easy for them
Linear SLR = 1.7 mm/yr (1900-2000)
Accelerating SLR = 3.2 mm/yr (1993-2015)
Yup it’s accelerating and yup it’s one foot
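The back-of-envelope arithmetic traded in these comments is just a linear extrapolation, which is worth stating explicitly, since whether the rate stays constant is exactly what the thread is arguing about. A minimal sketch (the function name is mine, for illustration):

```python
# Linear extrapolation of sea level rise over a century, converted
# from millimetres to feet (1 ft = 304.8 mm), as in the comments.

MM_PER_FOOT = 304.8

def slr_feet(rate_mm_per_yr, years=100):
    """Sea level rise in feet, assuming a constant annual rate."""
    return rate_mm_per_yr * years / MM_PER_FOOT

print(round(slr_feet(3.3), 2))  # satellite-era rate (1993-2015)
print(round(slr_feet(1.7), 2))  # 20th-century tide-gauge rate
```

This reproduces the 1.08 ft figure quoted above for 3.3 mm/yr, and about 0.56 ft for the 1.7 mm/yr tide-gauge rate; any acceleration would push the century total above the linear number.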
David and timg56,
If you don’t believe in satellite altimetry, look at tide gauges: SLR = 1.2 mm/yr (1849-2009), with no observable anthropogenic SLR. Read this paper published in the Journal of Coastal Research:
“Together with a general sea-level rise of 1.18 mm/y, the sum of these five sea-level oscillations constitutes a reconstructed or theoretical sea-level curve of the eastern North Sea to the central Baltic Sea (Figure 1, lower panel), which correlates very well with the observed sea-level changes of the 160-year period (1849–2009), from which 26 long tide gauge time series are available from the eastern North Sea to the central Baltic Sea.”
“we found that a possible candidate for such anthropogenic development, i.e. the large sea-level rise after 1970, is completely contained by the found small residuals, long-term oscillators, and general trend. Thus, we found that there is no observable sea-level effect of anthropogenic global warming in the world’s best recorded region.”
My arithmetic skills are fine. Better than your reading skills apparently, as this
“you don’t need a degree in physics to calculate SLR.” was exactly my point to David Appell.
I have been stating for some time that one doesn’t need a degree in physics to evaluate the potential impact of climate change. Simple arithmetic works just fine. Whether it is calculating SLR, determining the impact of lowering US (and European) CO2 emissions on a global basis, or even figuring out the fallacies in calculating the social cost of carbon, one need only refer to math we all should have acquired by the 6th grade.
“Sea level rise is not expected to be linear.”
Again, not expected by whom? I’m sure it isn’t expected to be by folks like David Archer or James Hansen, as they are on record claiming very large increases in SLR. All based on models. And what by now is becoming a desperate hope that tipping points do exist out there.
By every sea level group, including Proudman – Simon Holgate’s group. TonyB says Simon Holgate is very conservative on sea level rise; Professor Curry cites Jevrejeva a lot.
“The Arctic’s ice is disappearing. We must reduce emissions, fast, or the human catastrophe predicted by ocean scientist Peter Wadhams will become reality”
Mr. Wadhams, 1,770 people aboard the Crystal Serenity are praying for the Arctic sea ice to disappear, fast, or the catastrophe previously suffered by the ocean liner Titanic will become reality. These people didn’t pay $120,000 to hit an iceberg. The maritime industry is so worried by disappearing Arctic sea ice that they sail a luxury cruise ship for tourists to see the catastrophe you predicted while enjoying their fine dining with wine and violin.
It’s all ice. Where’s the sea?
An expedition to the North Pole intended to measure the effects of global warming ground to a halt this month when the scientists’ ship got blocked by the ice packs near Murmansk, Russia, reports reveal.
The Polar Ocean Challenge set out on a two-month campaign hoping to prove that the ice at the North Pole was melting. As the expedition’s website explains, the group aimed to show “that the Arctic sea ice coverage shrinks back so far now in the summer months that sea that was permanently locked up now can allow passage through.”
Despite their best intentions to show that the ice is melting and the temperature at the pole is higher than normal, the group has only been confronted with the exact opposite as ice continues to block their path.
The website Real Climate Science notes that the Polar melt season is half over, but temperatures have not climbed high enough to sponsor a large melt off of ice. According to the site, there has not been a big melt, and ice gains seem to be very close to the amount of ice lost because temperatures near the pole have been persistently below normal this year. And at the very least, large ice floes have blocked the ocean passages around the area.
The global warming expedition expected to be able to sail all around the Arctic Ocean through the Northeast and Northwest Passages because they assumed the ice would be gone, but they have been stymied because ice blocks most of the route they planned to take.
deja vu all over again
Yes the shipping industry is praying for the sea ice to disappear to open the Northwest Passage all year round. Imagine their fuel cost savings for voyages from Europe to the US west coast not crossing the Atlantic down to Panama Canal and up the Pacific Ocean. Global warming is too weak. What are you waiting for? Melt the damn ice!!
So you argue that for a few weeks at the end of summer the possibility of passage of freight through the NE passage is a price worth paying for AGW?
Yes, your AGW isn’t working. Why don’t you fart on the sea ice? You might succeed in melting it. Your hot gas is most amazing!
turbo tony to the rescue
Thank you for your considered reply. Impressions now confirmed.
BTW: oh, how I larfed.
You have a special library for those do you?
Likewise, impressions now confirmed (warmists have warm butts). Al Gore boldly proclaimed the north polar ice cap would be gone by 2014. Poor Al: the Greenland ice sheet and the Arctic sea ice are giving him the dirty finger. He had to use a jackhammer to break the ice. It’s really a pathetic sight.
The North East passage is above Russia. Presumably your typo?
Who knows when the Northwest passage may previously have been navigable for ships without radar, Sat Navs and other high tech assistance. Perhaps in the 1920s and 1930s, when the NE passage opened up; perhaps not.
The problem with global warming as a crisis is that actual harmful aspects are difficult to actually observe. Average temperature increase is measured to be rising.
But since anti-depressants are pee’d out, anti-depressants in most rivers are also measurable. But the fish are still depressed, because the levels of anti-depressants are sub-therapeutic.
Many things are measurable, but not significant.
A two to three degree global surface temperature change is 50% of the four to six degree difference between the Holocene optimum and the LGM. For some additional perspective, between 12.5 and 10.5 kya, GAT rose about 1.6 C, a rate of 0.08 C/century. A three degree rise over pre-industrial by 2100 is roughly double the change in a tenth of the time.
Your definition of “not significant” is curious.
A two to three degree global surface temperature change is 50% of the four to six degree difference between the Holocene optimum and the LGM. For some additional perspective, between 12.5 and 10.5 kya, GAT rose about 1.6 C, a rate of 0.08 C/century. A three degree rise over pre-industrial by 2100 is roughly double the change in a tenth of the time.
Global average temperature did not cause the HCO or the LGM – orbital variation effect on persistent ice did.
Indeed. Your original argument was about significance of change, not cause.
Yes, and you implied that the change in GAT was significant because it caused the HCO and the LGM, but it is more accurate to say that the HCO and LGM caused changes in GAT.
The HCO was not a time of human extinction but of much of the foundation for civilization from cultures worldwide.
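The rate comparison quoted twice in this exchange checks out arithmetically; a quick sketch using the numbers as given in the comment (variable names are mine, for illustration):

```python
# 12.5 to 10.5 kya is 2,000 years = 20 centuries; GAT rose ~1.6 C.
deglacial_centuries = (12.5 - 10.5) * 10
deglacial_rate = 1.6 / deglacial_centuries  # C per century

# A 3 C rise over pre-industrial by 2100 vs. the 1.6 C deglacial step:
change_ratio = 3.0 / 1.6   # "roughly double"
time_ratio = 200 / 2000    # "a tenth of the time" (~200 yrs vs 2,000)

print(round(deglacial_rate, 2), round(change_ratio, 3), time_ratio)
```

That gives 0.08 C/century for the deglacial step, with the projected change roughly 1.9 times larger in a tenth of the time, matching the comment's framing.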
Anyone ending a *comment* on AGW with the words …
“What are you waiting for? Melt the damn ice!!”
gets my unconsidered contempt, my friend.
Oh – I know I’m down the rabbit hole here so the echo will come back my way.
It won’t make the science wrong or your selfish ideological contempt of climate science any more relevant.
Look who’s talking. Yup, that’s the warmists’ science gone wrong.
Yup, that’s the warmists’ ideology of climate.
But as the warmists’ story goes, there’s no such thing as right or wrong; it’s all the deniers’ contempt.
Just ran across this item. FYI http://www.businessgreen.com/bg/news/2468410/reports-plans-for-ipccs-15c-special-report-start-to-take-shape
And this one: https://www.washingtonpost.com/news/energy-environment/wp/2016/08/22/a-huge-crack-is-spreading-across-one-of-antarcticas-biggest-ice-shelves/?utm_term=.ab570c30f980
Aww, he really loves the ice shelf.
Wonder who said it first or if only one did.
“Using paleoclimate records from the past 500 years, the researchers show that sustained warming began to occur in both the tropical oceans and the Northern Hemisphere land masses as far back as the 1830s — and they’re saying industrial-era greenhouse gas emissions were the cause, even back then. ”
“The team’s reconstructions indicated that significant and sustained warming began in the tropical oceans around the 1830s, about the same time it began over the continental land masses in the Northern Hemisphere. Warming in the Southern Hemisphere was delayed until about 50 years later, the reconstructions suggested — this likely has to do with differences in oceanic and atmospheric circulation there, Abram said. ”
” Michael Mann, distinguished professor of atmospheric science at Pennsylvania State University said he is “troubled” by the researchers’ suggestion that the planet may be more sensitive to greenhouse gas emissions than previously thought. ”
“Mann said a lot of the warming observed in the early 1800s was actually just the climate naturally recovering from this unusual cooling effect, and not primarily caused by the influence of greenhouse gas emissions, which would not become a primary driver of warming until decades later.” (Skeptical?)
(When I clicked on the link to the study, it paused at the abstract and then took me to the full study.)