By Joseph D’Aleo, CCM
Virtually every month and year we see stories in the once reliable media and from formerly unbiased data centers that proclaim the warmest such period in the entire record back to 1895 or earlier.
In the ADDENDUM to the Research Report entitled On the Validity of NOAA, NASA and Hadley CRU Global Average Surface Temperature Data & The Validity of EPA’s CO2 Endangerment Finding, Abridged Research Report (Dr. James P. Wallace III, Dr. (Honorary) Joseph S. D’Aleo, Dr. Craig D. Idso, June 2017, here), we provided ample evidence that the Global Average Surface Temperature (GAST) data are invalid for use in climate modeling and for any other climate change policy analysis purpose.
“The conclusive findings of this research are that the three GAST data sets are not a valid representation of reality. In fact, the magnitude of their historical data adjustments, which removed their cyclical temperature patterns, is totally inconsistent with published and credible U.S. and other temperature data. Thus, it is impossible to conclude from the three published GAST data sets that recent years have been the warmest ever – despite current claims of record setting warming.”
That is made even more true given that 71% of the earth’s surface is ocean, and ocean data prior to the satellite era, which began in the 1970s, were limited to ship routes mainly near land in the Northern Hemisphere.
“According to overseers of the long-term instrumental temperature data, the Southern Hemisphere record is “mostly made up”. This is due to an extremely limited number of available measurements both historically and even presently from Antarctica to the equatorial regions.
In 1981, NASA’s James Hansen et al. reported that “Problems in obtaining a global temperature history are due to the uneven station distribution, with the Southern Hemisphere and ocean areas poorly represented.” (Science, 28 August 1981, Volume 213, Number 4511 (link))
In 1978, the New York Times reported there was too little temperature data from the Southern Hemisphere to draw any reliable conclusions. The report, prepared by German, Japanese and American specialists, appeared in the Dec. 15 issue of the British journal Nature and stated that “Data from the Southern Hemisphere, particularly south of latitude 30 south, are so meager that reliable conclusions are not possible.”
“Ships travel on well-established routes, so vast areas of ocean are simply not traversed by ships at all, and even ships on those routes may not return weather data en route.
This finding was amplified recently by MIT graduate Dr. Mototaka Nakamura in a book on “the sorry state of climate science” titled Confessions of a climate scientist: the global warming hypothesis is an unproven hypothesis.
He wrote: “The supposed measuring of global average temperatures from 1890 has been based on thermometer readouts barely covering 5 per cent of the globe until the satellite era began 40-50 years ago. We do not know how global climate has changed in the past century, all we know is some limited regional climate changes, such as in Europe, North America and parts of Asia.”
For the entire record the best data quality was limited to some land areas in North America, Europe and Australia. The vast southern oceans were mainly data void.
Even so, see how few land stations were used in the databases in the early decades of the data window.
The National Academy of Sciences recognized this in its first attempt at determining a temperature trend in the 1970s, which it limited to Northern Hemisphere land areas. It showed a dramatic warming from the 1800s to around 1940, then a reversal ending in a matching cooling by the late 1970s, when even the CIA wrote that the consensus of scientists was that we may be heading toward a dangerous new ice age.
Even as the stations increased in number and coverage, their reliability became a challenge, with many large continents having a high percentage of missing months in the station data. That required the data centers to estimate the missing data to get a monthly and then an annual average. That is done with models.
You may be surprised to see that this continues today. This required guesswork gives those whose job is to validate their models the opportunity to make adjustments in ways that confirm their biases. See the regions in the initial September 2018 data that were filled in by algorithms; one large data-void region was assigned a record warmth assessment (Heller 2018).
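A minimal sketch of the infilling problem described above, with invented numbers (the real data centers use far more elaborate models): the published annual average depends directly on how the missing months are filled, so any bias in the infill model propagates into the final figure.

```python
# Monthly means in F for one hypothetical station; None = missing report.
months = [20.0, 25.0, 35.0, 48.0, None, 70.0,
          75.0, 73.0, None, 50.0, 38.0, 25.0]

# Assumed climatological guesses for the two missing months (illustrative).
climatology = {4: 60.0, 8: 62.0}

# Option 1: average only the months actually observed.
observed = [m for m in months if m is not None]
mean_observed = sum(observed) / len(observed)

# Option 2: infill the gaps from a model/climatology, then average all 12.
filled = [climatology[i] if m is None else m for i, m in enumerate(months)]
mean_filled = sum(filled) / 12

# The two "annual averages" differ; the choice of infill method
# decides which number gets published.
```

With these made-up values the two approaches disagree by about 2.5F for the same station-year, which is larger than the century-scale trends being debated.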
In our assessments, we found that each update cooled past years more and more, which over time makes the trends more consistent with their model scenarios.
Here is the NASA GISS adaptation of the NOAA GHCN data. Each update cools the past to make the upward trend more significant.
Note how even in areas with better data, station data are adjusted (corrupted) by the analysts to turn a cooling trend into the desired warming. We picked just three of many examples: one in Australia, the second in Iceland, and the plot for the state of Maine.
For Australia, many examples have been uncovered including Darwin and here Amberley. Blue was the original data plot, red is the one after adjustment in Australia.
The NASA GISS plots for the Iceland raw and adjusted data show a cycle replaced by a linear warming ramp. The adjusted data were refuted by the Icelandic met department.
NOAA’s Maine temperature trend was accessed in 2011 and again after 2013. The first showed no statistically significant trend from 1895 (-0.01F/decade), with the warmest year 1913. The second had a trend of +0.23F/decade, with 1913 adjusted down almost 5F.
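To make the mechanism concrete, here is an illustrative sketch (the Maine-like values are invented stand-ins, not the actual NOAA station data) of how a trend in F/decade is computed by ordinary least squares, and how adjusting a single warm early-record year downward moves the slope upward:

```python
def trend_per_decade(years, temps):
    """Ordinary least-squares slope of temps vs. years, scaled to per decade."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    slope = sum((y - my) * (t - mt) for y, t in zip(years, temps)) \
        / sum((y - my) ** 2 for y in years)
    return slope * 10.0  # per year -> per decade

years = list(range(1895, 2011))
# Flat synthetic record with one warm early year (a 1913-style spike).
temps = [40.0] * len(years)
temps[years.index(1913)] = 43.0

before = trend_per_decade(years, temps)  # slightly negative

# Cool the early warm year by 5 F, as in the adjusted series described above.
temps[years.index(1913)] -= 5.0
after = trend_per_decade(years, temps)   # now positive

# The slope flips sign purely from the single-point adjustment.
```

The point of the sketch is only that early-record adjustments have outsized leverage on a least-squares trend, because points far from the mean year carry the most weight.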
THE UN’S DATA CHOICE
The UN uses the Hadley CRU data, the earliest and thought to be the most reliable and best-constructed global data set. It too shows past temperatures adjusted downward in later versions.
Climategate emails exposed the true state of the databases used to drive global policy decisions. Their own developers and their chief scientist were exposed and forced to acknowledge the data flaws. Ian ‘Harry’ Harris, the lead CRU climate data programmer and analyst in the ‘Climategate’ emails, admitted to “[The] hopeless state of their (CRU) data base. No uniform data integrity, it’s just a catalogue of issues that continues to grow as they’re found… There are hundreds if not thousands of pairs of dummy stations…and duplicates… Aarrggghhh! There truly is no end in sight. This whole project is SUCH A MESS. No wonder I needed therapy!!” http://www.di2.nu/foia/HARRY_READ_ME-0.html
Phil Jones, the CRU scientist at the center of the Climategate scandal at East Anglia University, after he thought the jig was up, made a candid admission on the BBC: his surface temperature data are in such disarray they probably cannot be verified or replicated, there had been no statistically significant global warming for the last 15 years, and it had cooled at a rate of 0.12C/decade from 2002 to 2009. Jones specifically disavowed the “science-is-settled” slogan.
Attempting to compile a ‘global mean temperature’ from such fragmentary, disorganized, error-ridden, ever-changing and geographically unbalanced data is more guesswork than science.
BAD SITING AFTER MODERNIZATION
During recent decades there has been a migration away from old instruments read by trained observers. These instruments were generally in shelters that were properly located over grassy surfaces and away from obstacles to ventilation and heat sources. Today we have many more automated sensors (the MMTS) located on poles and cabled to an electronic display in the observer’s home or office, or at airports near the runway where the primary mission is aviation safety.
Pielke and Davey (2005) found a majority of stations, including climate stations in eastern Colorado, did not meet WMO requirements for proper siting. They extensively documented poor siting and land-use change issues in numerous peer-reviewed papers, many summarized in the landmark paper Unresolved issues with the assessment of multi-decadal global land surface temperature trends (2007).
In a volunteer survey project, Anthony Watts and his more than 650 volunteers (http://www.surfacestations.org) found that over 900 of the first 1,067 stations surveyed in the 1,221-station US climate network did not come close to meeting the specifications. Only about 3% met the ideal siting specification (see Fall et al. here).
They found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat. They found 68 stations located at wastewater treatment plants, where the process of waste digestion causes temperatures to be higher than in surrounding areas.
The GAO was asked to review the situation and found, in a report issued in 2011, that “NOAA does not centrally track whether USHCN stations adhere to siting standards…nor does it have an agency-wide policy regarding stations that don’t meet standards.” The report concluded that 42% of the network in 2010 failed to meet siting standards: “Many of the USHCN stations have incomplete temperature records; very few have complete records. 24 of the 1,218 stations (about 2 percent) have complete data from the time they were established.” The GAO goes on to state that most stations with long temperature records are likely to have undergone multiple changes in measurement conditions. The issue and the report were largely ignored in the media.
In 2008, Joe D’Aleo asked NOAA’s Tom Karl about the problems with siting and about the plans for a higher quality Climate Reference Network (CRN at that time called NERON). Karl said he had presented a case for a more complete CRN network to NOAA but NOAA said it was unnecessary because they had invested in the more accurate satellite monitoring. The Climate Reference Network was capped at 114 stations and would not provide meaningful trend assessment for about 10 years.
In monthly press releases no satellite measurements are ever mentioned, although NOAA claimed that was the future of observations.
THE INCONVENIENT PAUSE
Confounding the warmist claims, the satellites, which are not under the climate centers’ control, and increasingly some of the data centers’ own data provided contradictory results for almost two decades.
Nature and IPCC Lead Author Kevin Trenberth acknowledged the ‘pause’ and the cyclic influence of natural factors such as El Nino and ocean cycles on global climate.
The American Meteorological Society Annual Meeting in 2015 had three panels attempting to explain away ‘the pause’.
BUOYS TO THE RESCUE
Satellites starting in the late 1970s began to provide full ocean coverage though they could only measure the ‘skin’ temperature, subject to diurnal variations.
Around 2004, a global network of floating, diving-capable ARGO buoys (3,833 as of October 2020) began providing coverage of ocean temperature and heat content that had been largely missing for the previous century.
Initially, they inconveniently agreed with the lack of warming.
MAKE THE PAUSE GO AWAY
In 2015, pressure from the politicians funding the sciences pushed the scientists to fix the inconvenient facts.
John Bates, a data quality officer with NOAA, detailed how Tom Karl, in a paper in Science in June 2015, just a few months before world leaders were to meet in Paris to agree on a costly Paris Climate Accord, removed the inconvenient pause by altering ocean temperatures.
“They had good data from buoys…and “corrected” it by using the bad data from ships. You never change good data to agree with bad, but that’s what they did — so as to make it look as if the sea was warmer.” Remember with the oceans covering 71% of the globe, even small adjustments could have a major impact.
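The leverage of that 71% figure is simple arithmetic, sketched below with a hypothetical uniform adjustment (the 0.1C offset is illustrative, not the actual correction applied): weighting the global mean by surface area means any shift in the ocean component carries through almost one-for-one.

```python
# Area weights: oceans cover ~71% of the earth's surface.
OCEAN_FRACTION = 0.71
LAND_FRACTION = 0.29

def global_mean(ocean_anomaly, land_anomaly):
    """Area-weighted global mean anomaly from ocean and land components."""
    return OCEAN_FRACTION * ocean_anomaly + LAND_FRACTION * land_anomaly

baseline = global_mean(0.0, 0.0)
adjusted = global_mean(0.1, 0.0)  # hypothetical +0.1 C applied to ocean only
shift = adjusted - baseline       # 0.071 C change in the global figure
```

So a tenth-of-a-degree adjustment to sea surface temperatures alone moves the headline global number by about 0.07C, a substantial fraction of the decadal trends at issue.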
Bates here noted “the evidence kept mounting that Tom Karl constantly had his ‘thumb on the scale’—in the documentation, scientific choices, and release of datasets—in an effort to discredit the notion of a global warming hiatus and rush to time the publication of the paper to influence national and international deliberations on climate policy.”
CLIMATE REFERENCE NETWORK IGNORED
Though not ever mentioned, the Climate Reference Network showed no warming in the period of record.
IT IS ALL IN THE NOISE
Even the most extreme interpretations in the models based on flawed data and failed theory are in the category of noise relative to the normal daily, seasonal and year-to-year variance. Daytime highs in mid-latitudes are on average 30F higher in the afternoon than in the early morning. The warmest month often averages 50F higher than the coldest month. The highest temperature ever recorded is over 100F higher than the lowest ever (a range as high as 187F in Montana).
Record state highs and lows, the most pristine and unaltered data set, show the 1930s were by far the warmest years, while recent decades have been benign.
As shown above, most of the warming was nighttime associated with urbanization and local airport heat retention.
Tom Karl, Director of the National Climatic Data Center, had warned in 1989: “The average difference between trends [urban siting vs. rural] amounts to an annual warming rate of 0.34°C/decade (6F/century)… The reason why the warming rate is considerably higher is that the rate may have increased after the 1950s, commensurate with the large recent growth in and around airports.”
Precipitation this year has been above normal in the Southeast and north central states, with tropical help; it was dry in the West and Northeast.
Overall for the U.S., we see year-to-year variance but no clear long-term trends.
The number of named tropical storms for 2020 is the second highest for the North Atlantic, though globally activity was 66.7% of normal.
Conversely, the Atlantic Basin ACE at 126.3 is just 39th highest.
The last decade was the second quietest for landfalling hurricanes and major hurricanes.
We had a big April, but the tornado season is running below the 25th percentile.
The number of fires has been relatively flat since 2010. Acreage is similar to recent big years.
We can see that prior to 1880, wildfires were more common. Swetnam looked at the long-term incidence of wildfires in North America and found they have declined over the last century.
The arctic ice continues at lower levels as the warm phases of the Atlantic and Pacific deliver warmth with the currents that flow under the ice. The International Arctic Research Center at the University of Alaska Fairbanks showed how this cycle is similar to that of the 1920s to 1950s.
See how arctic temperatures (Polyakov) match the ocean cycles.
Soon showed solar (TSI) tracked with arctic temperatures better than CO2.
The impact on major metros over the last decade rocketed to new highs, with 10-year running means at record levels going back to the 1870s.