Climate modelers can’t predict the future, so now they concentrate on the past

WUWT regular Alan Tomalty writes:

This new study seems to be one of the first to acknowledge internal climate variability in the 20th century, something the IPCC is always loath to admit. However, since the study’s conclusions are built on computer climate model simulations, I contend this is still junk science, even though it bolsters the skeptic position somewhat. I draw your attention to the following quote:

“We have for example noted that the temporal variance of the majority of ensemble members is larger than what can be inferred from available observations. The result of the study must be assessed with that in mind. We have no simple explanation to this but it might be that the model projects the variance on larger scales than nature as a consequence of limited resolution. We would consequently encourage other modeling groups to undertake similar studies which will hopefully make use of the latest high resolution models coupled models (Haarsma et al. 2016). Intuitively we might have expected the opposite and that reality might expose a higher level of variance than the climate model.”

Why a climate modeler would expect the climate model to exhibit lower variability than the real-life situation, I have no idea. I guess this is another case of a climate scientist falling in love with his model.
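To make the quoted comparison concrete, here is a minimal sketch of what “temporal variance of the ensemble members versus observations” amounts to, in Python with made-up arrays standing in for the model ensemble and for HadCRUT4 (everything below is a placeholder, not the study’s actual data or method):

```python
import numpy as np

# Hypothetical stand-ins: 56 ensemble members x 156 annual global means
# (1850-2005) and one observed series of the same length. Real inputs
# would be the study's model output and the HadCRUT4 record.
rng = np.random.default_rng(0)
ensemble = rng.normal(0.0, 0.15, size=(56, 156))  # anomalies, K
observed = rng.normal(0.0, 0.10, size=156)        # anomalies, K

def detrend(series):
    # Remove the linear trend so the forced warming does not
    # inflate the variance comparison (a common convention).
    t = np.arange(series.size)
    slope, intercept = np.polyfit(t, series, 1)
    return series - (slope * t + intercept)

# Temporal variance of each member and of the observations
member_var = np.array([detrend(m).var(ddof=1) for m in ensemble])
obs_var = detrend(observed).var(ddof=1)

# The quoted finding: the majority of members exceed the observed variance
frac_larger = np.mean(member_var > obs_var)
print(f"members with temporal variance above observed: {frac_larger:.0%}")
```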

So, it seems that the climate scientists are working their way back in time so that they will have something to do when their CO2-alarmist study of future climate falls apart. However, I don’t think that trying to model past climate will capture the public’s imagination quite the way that modeling the doomsday future climate has.

It seems that we will indeed be inundated with climate studies like the one above (I predict one per week), all stemming from the following:

https://climatedataguide.ucar.edu/climate-data/noaa-20th-century-reanalysis-version-2-and-2c

I will quote from the NCAR/UCAR website even though it is NOAA’s project.

“The Twentieth Century Reanalysis (20CR) provides a comprehensive global atmospheric circulation data set spanning 1850-2014. Its chief motivation is to provide an observational validation data set, with quantified uncertainties, for assessing climate model simulations of the 20th century, with emphasis on the statistics of daily weather. The analyses are generated by assimilating only surface pressures and using monthly SST and sea ice distributions as boundary conditions within a ‘deterministic’ Ensemble Kalman Filter (EKF). A unique feature of the 20CR is that estimates of uncertainty are derived using a 56 member ensemble. Overall, the quality is approximately that of current three-day NWP forecasts.”

So they are saying that the quality is as good as a 3-day weather forecast. Hmmmmm. So does that mean that 3 days backward is as good as 3 days forward, or that the hindcast for June 21, 1852 is as good as a 3-day weather forecast? If the latter, that would be very good quality indeed.

[Fig. 2 from the study] (a) Annual global mean 2 m temperature as a function of time for all ensemble members (light blue), with ensemble mean and ± 1 standard deviation (black). Observational results from HadCRUT4 (red) and JRA55 (green) are superimposed. All results are relative to the respective 1961–1990 mean, as for HadCRUT4. Major volcanic eruptions are indicated; a ‘?’ indicates uncertainty in attribution. (b) Distribution of ensemble linear trends (gray) and linear trends for the ensemble mean (black), HadCRUT4 (red) and JRA55 (green) for the common 1958–2005 period. (c) Ensemble standard deviation as a function of time for the ensemble (black) and standard deviation when masked by the HadCRUT4 observations (red).

So it is obviously the former. That must mean that hindcasting the climate of 1850 is about as accurate as forecasting the climate of the year 2185. Come to think of it, I don’t understand the statement “Overall, the quality is approximately that of current three-day NWP forecasts.”
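For readers who want to see what the panels of Fig. 2 actually compute, they are simple ensemble statistics. Here is a minimal Python sketch (synthetic numbers standing in for the ensemble; nothing below is the study’s data):

```python
import numpy as np

# Hypothetical stand-in for the study's 56-member ensemble of annual
# global-mean 2 m temperature anomalies, 1850-2005; real input would
# be the model output behind Fig. 2.
rng = np.random.default_rng(1)
years = np.arange(1850, 2006)
ensemble = (0.004 * (years - 1850)
            + rng.normal(0.0, 0.12, size=(56, years.size)))

# Panel (a): ensemble mean and +/- 1 standard deviation across members
ens_mean = ensemble.mean(axis=0)
ens_sd = ensemble.std(axis=0, ddof=1)

# Panel (b): distribution of member linear trends over 1958-2005
sel = (years >= 1958) & (years <= 2005)
trends = np.array([np.polyfit(years[sel], m[sel], 1)[0] for m in ensemble])
print(f"member trends: {trends.mean():.4f} +/- {trends.std(ddof=1):.4f} K/yr")

# Panel (c): the ensemble spread as a function of time is just ens_sd
print(f"mean ensemble spread: {ens_sd.mean():.3f} K")
```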

Their other statement:

“A unique feature of the 20CR is that estimates of uncertainty are derived using a 56 member ensemble.”

This certainly sounds like an ensemble of different computer climate models, but in the new language of climate scientists it means 56 simulations run on the same climate model, each set to different starting parameters. So they are using 56 simulations of a model that does not fully capture the underlying science of the planet, taking the spread across them as one estimate of uncertainty, and calling this a strength of their project!
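For what an “Ensemble Kalman Filter with a 56-member ensemble” means mechanically, here is a toy sketch of a perturbed-observation EnKF update for a single scalar. This illustrates the general idea only, not the 20CR implementation (which assimilates whole surface-pressure fields), and every number below is invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Prior: 56 model simulations of one scalar quantity (say, a surface
# pressure anomaly in hPa). Values are invented for illustration.
n_members = 56
prior = rng.normal(2.0, 1.5, size=n_members)

# One observation with a known error standard deviation
obs, obs_sd = 3.0, 0.8

# Kalman gain from the ensemble statistics: K = P / (P + R)
prior_var = prior.var(ddof=1)
gain = prior_var / (prior_var + obs_sd**2)

# Perturbed-observation update: each member assimilates a noisy copy of
# the observation so the posterior spread stays statistically consistent
perturbed = obs + rng.normal(0.0, obs_sd, size=n_members)
posterior = prior + gain * (perturbed - prior)

# The spread of the updated ensemble is the "estimate of uncertainty"
print(f"prior:     {prior.mean():.2f} +/- {prior.std(ddof=1):.2f} hPa")
print(f"posterior: {posterior.mean():.2f} +/- {posterior.std(ddof=1):.2f} hPa")
```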

Is the objective here to tell you at what hours of the day it rained on June 21, 1852, in Mobile, Alabama? Or is it something grander than that? Your tax dollars, folks, being spent here.

What national security, national economic, or national pride reasons would we have to fund studies of past weather/climate that go back only as far as 1850? Oh, I can certainly see that the long-range goal of this is to wipe out any warming that ever appeared without massive amounts of CO2 in the atmosphere. Once they have fiddled their way back to 1850, why stop there? The next target will be the Medieval Warm Period. It certainly looks like the climate scientists want to put the paleoclimatologists out of business. Computer models are always “sexier” than climate proxies, and so much faster at data generation. Whenever you read the word “reanalysis”, always remember that at some point it is computer-generated data, even if some real-world data is mixed in with it.

On another page of the site I found this under Key Limitations:

“Does not provide the best estimate of the atmospheric state since ~1979, when more complete observations and more comprehensive reanalyses are available”

Duhhhhh, 1979 was the year when the UAH satellite record led by Christy started to provide real data.

PS: I obtained a graph of one of their temperature reanalyses: US average annual temperature in degrees Celsius at 2 metres above the surface, from 1870 to 2010, over 25°N–50°N and 55°W–114°W.

The graph looked like a long, gently sloping sine/cosine curve, varying between 12.5 °C and 15 °C, with no upward trend. Interestingly, the highest values were in the 1930s. I guess NOAA hadn’t gotten around to adjusting this computer-generated reanalysis data yet.
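For anyone who wants to check a “no upward trend” claim for themselves, here is a minimal sketch fitting an ordinary least-squares trend to a series like the one described (the numbers below are a made-up stand-in, not the reanalysis values):

```python
import numpy as np

# Made-up stand-in for the 1870-2010 US-average 2 m temperature series
# described above; real values would come from the reanalysis files.
rng = np.random.default_rng(3)
years = np.arange(1870, 2011)
temps = (13.75
         + 1.25 * np.sin(2 * np.pi * (years - 1870) / 70.0)
         + rng.normal(0.0, 0.3, size=years.size))

# OLS trend in degrees C per decade, with a crude standard error
slope, intercept = np.polyfit(years, temps, 1)
resid = temps - (slope * years + intercept)
se = resid.std(ddof=2) / (years.std() * np.sqrt(years.size))
print(f"trend: {10 * slope:+.3f} +/- {10 * se:.3f} C per decade")
```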


For the open-access study online, see the link below.

https://link.springer.com/article/10.1007/s00382-018-4343-8

via Watts Up With That? https://ift.tt/1Viafi3
