
Thursday, January 18, 2018

GWPF Newsletter - Global Temperature In 2017: Not A Resurgence Of Global Warming

In this newsletter:

1) Global Temperature In 2017: Not A Resurgence Of Global Warming
The GWPF Observatory, 17 January 2018 
 
2) Hockey Stick Goes To Court: Global Warming Lawsuit Is Riddled With Factual Errors
Michael Bastasch, Daily Caller, 16 January 2018
 
3) Andrew Urban: Manufacturing Climate Consensus
The Spectator (Australian Edition), 9 January 2018
 
4) Europe’s Green Energy Burning Is Killing 40,000 People Per Year, Study Claims
Linde Zuidema, Euractiv, 8 January 2018
 
5) UK Green Energy Investment Drops By 56% After Policy Changes
Adam Vaughan, The Guardian, 16 January 2018
 
6) The Price Of Green Madness: Why Is California The Poverty Capital Of America?
Kerry Jackson, Los Angeles Times, 14 January 2018


Full details:

1) Global Temperature In 2017: Not A Resurgence Of Global Warming
The GWPF Observatory, 17 January 2018 
Dr David Whitehouse, GWPF Science Editor

Global temperature data for the last 12 months refutes the idea that the warmth of 2017 was due to a resurgence of global warming. In fact, the world has been cooling.

It is clear that 2017 was a very warm year. Tomorrow, NOAA, NASA and the UK Met Office will announce by how much. It won’t be a record-breaker, but it will be in the top five, and that has already prompted speculation about why it has been so hot. After all, the record-setting El Niño temperatures of 2015-16 are over – so why did it remain so hot?

The reason, according to some, is clear: the resurgence of global warming. The year 2017 was the hottest non-El Niño year on record and therefore, the argument goes, signifies a dramatic resurgence of global warming after 20 years or so in which the global temperature has not done very much.

The Guardian says that climate scientists predicted the rapid rise in global surface temperatures that we saw in 2017.

Unfortunately for the Guardian it’s not that simple. Firstly, it is rather suspicious that immediately after a record-setting year due to an El Niño there is a record-setting year for other reasons. There are questions about how much of the heat from the 2015-2016 El Niño has been dispersed, especially since that event was unique in many ways, and we have relatively little understanding of such extreme events. The year 2017 was a complicated year, shaped by the interplay of a small La Niña cooling event and a manifest attempt by El Niño to rise again. That is its story, and the answer to why it was so warm: the reason lies in the strange decay of a monster El Niño, not in a dramatic upsurge of global warming. To establish the latter we would need more than one year’s data following an unusual event.

Prediction of El Niños and La Niñas is at an early stage, and it is fair to say that scientists have had limited success. In the boreal summer of 2012, an El Niño event was predicted to arise, only for the already warmed ocean in the eastern equatorial Pacific to revert suddenly to a neutral condition. Something similar occurred in the boreal summer of 2014: an exceptionally strong El Niño event was predicted, but the rapid growth of the anomalous warming in the central-eastern equatorial Pacific stalled during the summer, and no El Niño event formed until the end of that year.

Unsurprisingly, predicting what would happen after the 2015-2016 monster has also been a problem. Since April 2016, the National Oceanic and Atmospheric Administration (NOAA) Climate Prediction Center (CPC) repeatedly said that a La Niña event would occur the following winter. However, this La Niña failed to reach its expected strength and was short-lived. It did not dissipate the heat left over from the El Niño.

El Niño Comeback?

Following this weak La Niña in late 2016, something remarkable happened: there were good signs that another El Niño was on its way. National forecasters said there was up to a 50% chance of it happening as the global temperature started to rise once more; NOAA put the probability at 50-60%. It was noted at the time that if another El Niño did materialise, it would be only the second time in the records that the Pacific had gone from the hot phase of an El Niño to the cold phase of a La Niña and back to an El Niño within three years. Given the relatively limited nature of those records, though, researchers could not be certain that such a combination is all that rare.

But the rise stopped just a fraction short of the increase required to declare an El Niño formally, although some Japanese scientists said we did have one, and conditions in one El Niño reference region certainly suggest as much. It was this warming event, together with the preceding La Niña’s weakness, that contributed to 2017’s warmth.


Chart courtesy of @JKempEnergy.

We know little about the timing and intensity of El Niño events. Scientists classify El Niños into Eastern-Pacific and Central-Pacific types based on where the peak sea surface temperatures occur. Thus the 1982-1983 and 1997-1998 events were strongly of the Eastern-Pacific type, while La Niñas and moderate El Niños tend to be classified as Central-Pacific events. The 2015-2016 El Niño seems to have been a mixture of the two. This shows that El Niños exhibit a wide range of behaviours. Some scientists postulate that the latest event showed the influence of ocean warming trends. There is some indication that El Niños might be changing in character since we started observing them in detail in the satellite era: after 2000, for example, more and more events showed spatial patterns centred in the central Pacific, unlike the classical events centred in the eastern Pacific. Whether this is significant, cyclic or related to global warming is unknown.
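To make the distinction concrete, here is a minimal sketch in Python of how such a classification might work. It assumes peak sea-surface-temperature anomalies for the standard Niño-3 (eastern Pacific) and Niño-4 (central Pacific) index regions; the thresholds and the example anomaly values are illustrative assumptions, not official definitions or exact observations.

```python
# Illustrative classifier for El Niño "flavours". The 0.5 °C event threshold
# and the 0.2 °C "mixed" margin are assumptions for illustration only.

def classify_event(nino3_peak: float, nino4_peak: float,
                   threshold: float = 0.5) -> str:
    """Label an event by where its sea-surface-temperature anomaly peaks.

    nino3_peak -- peak anomaly (°C) in the Niño-3 region (eastern Pacific)
    nino4_peak -- peak anomaly (°C) in the Niño-4 region (central Pacific)
    """
    if max(nino3_peak, nino4_peak) < threshold:
        return "neutral / no El Niño"
    if abs(nino3_peak - nino4_peak) < 0.2:  # comparable anomalies in both regions
        return "mixed type"
    return "Eastern-Pacific type" if nino3_peak > nino4_peak else "Central-Pacific type"

# Rough, illustrative peak anomalies for three well-known events:
for event, n3, n4 in [("1997-98", 3.6, 1.0), ("2009-10", 1.0, 1.5), ("2015-16", 2.9, 2.8)]:
    print(event, "->", classify_event(n3, n4))
```

On these illustrative numbers, 1997-98 comes out as Eastern-Pacific, 2009-10 as Central-Pacific and 2015-16 as a mixed type, matching the qualitative picture described above.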

So where are we now? In October, the U.S. Climate Prediction Center declared the presence of a La Niña, and it is thought it might persist through the first half of this year. This La Niña, however, is weak, and some scientists, and even some national meteorological bureaus, do not recognise it.

The suggestion that the warmth of 2017, a “non-El Niño” year, is due to an increase in global warming doesn’t stand up. The temperature structure of the year doesn’t support that idea. Hadcrut4 shows this quite clearly.



Hadcrut4 global temperature data (2014-2017) refutes the idea that the warmth of 2017 was due to a resurgence of global warming. The world has been cooling.
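For readers who want to inspect the temperature structure themselves, a minimal sketch follows, assuming the Met Office’s published HadCRUT4 monthly global-mean time-series file; the URL, filename and column layout are assumptions that may differ between releases and should be checked against the HadCRUT4 download page.

```python
# Sketch: plot recent HadCRUT4 monthly global-mean anomalies with a
# 12-month running mean to inspect the 2014-2017 temperature structure.
# URL and column layout are assumptions; check the Met Office HadCRUT4 pages.
import pandas as pd
import matplotlib.pyplot as plt

URL = ("https://www.metoffice.gov.uk/hadobs/hadcrut4/data/current/"
       "time_series/HadCRUT.4.6.0.0.monthly_ns_avg.txt")

# Assumed layout: first column "YYYY/MM", second the median anomaly in °C
# relative to the 1961-1990 baseline; remaining columns are uncertainty bounds.
df = pd.read_csv(URL, delim_whitespace=True, header=None, usecols=[0, 1],
                 names=["date", "anomaly"])
df["date"] = pd.to_datetime(df["date"], format="%Y/%m")
recent = df[df["date"] >= "2014-01-01"].copy()
recent["smoothed"] = recent["anomaly"].rolling(12, center=True).mean()

plt.plot(recent["date"], recent["anomaly"], label="monthly anomaly")
plt.plot(recent["date"], recent["smoothed"], label="12-month running mean")
plt.ylabel("°C relative to 1961-90")
plt.legend()
plt.show()
```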

Besides, as I have said, such a claim must rest on more than one year’s data. If the global-warming-resurgence idea is correct, we can expect to see its signature in the temperatures of subsequent years. But the evidence is off to a shaky start: the UK Met Office prediction for 2018 is not a new warming record, because of La Niña. This is consistent with the explanation given here for why 2017 was so warm.

Feedback: david.whitehouse@thegwpf.com

2) Hockey Stick Goes To Court: Global Warming Lawsuit Is Riddled With Factual Errors
Michael Bastasch, Daily Caller, 16 January 2018 

It’s only been about a week since New York City filed suit against five major oil companies, and one economist has already found basic factual errors in the city’s legal filing.

The city’s narrative of a grand conspiracy by fossil fuel companies to silence well-meaning climate scientists has run into reality. One of the climate researchers Mayor Bill de Blasio’s legal team attacked has fired back, pointing out factual errors in the suit.

Canadian economist Ross McKitrick said the “complaint contains numerous untrue statements about matters on which I have direct personal knowledge,” according to a lengthy statement he published online.

“The idea that this work was done at the instigation of, or under payment from, Exxon or any other corporation either directly or indirectly through the Fraser Institute or any other group is wholly false,” McKitrick said.

De Blasio’s administration filed a lawsuit in January that seeks monetary damages from five oil companies, including ExxonMobil, for the alleged damage to the city caused by man-made global warming.

De Blasio, a Democrat, is joining a growing movement of local officials suing energy companies for the alleged damages from global warming. He also pledged to divest the city’s pension fund from fossil fuels over the next five years. […]

At one point in the suit, city lawyers claim “Exxon sponsored its own bogus scientific research by paying $120,000 over the course of two years” to the Fraser Institute, a conservative Canadian think tank.

The city claims that money went towards debunking the “hockey stick” graph created by climate scientist Michael Mann. The rebuttal “was rushed into print, without peer review and, in a departure from the standard scientific practice, without offering Dr. Mann and his co-authors an opportunity to respond prior to publication,” the suit claims.

City lawyers claimed the “McIntyre and McKitrick paper was subsequently debunked, but the smear of Dr. Mann’s work remains available on the web today and continues to be cited by climate deniers. Exxon’s promotion by deception thus lives on.”

Mann made waves in the late 1990s when he published his “hockey stick” study that purportedly showed unprecedented global warming in the 20th Century.

McKitrick and mining executive Stephen McIntyre published a rebuttal to the study in 2003.

McKitrick, an economist at the University of Guelph, took issue with New York City’s false statements and wrote a lengthy rebuttal. He argued he was not paid by Exxon for the research, and that it was, in fact, published in peer-reviewed journals.

“The Fraser Institute was not involved with the hockey stick project and to the best of my knowledge they knew nothing about it until after it was published,” he said. “While I was a Senior Fellow of the Fraser Institute in 2003-04 this was an unpaid affiliation.”

“Our 2003 Energy and Environment paper was peer reviewed,” McKitrick said. “Regarding the issue of offering Mann a prior chance to respond, we had corresponded with Mann regarding problems we encountered replicating his results but he cut off the correspondence.”

Full story

3) Andrew Urban: Manufacturing Climate Consensus
The Spectator (Australian Edition), 9 January 2018

 

Just out: the first book to be published by the London-based Global Warming Policy Foundation, Searching For The Catastrophe Signal by the Australian writer and researcher Bernie Lewin, is a detailed – and damning – historical deconstruction of the origins of the Intergovernmental Panel on Climate Change (IPCC). Acclaimed climate scientist Dr Judith Curry calls it an “important new book” and goes on to provide targeted commentary “in context of the theme of ‘detection and attribution’, ‘policy cart in front of the scientific horse’ and ‘manufacturing consensus’.”

Dr Curry begins by noting that “In a connection that I hadn’t previously made, Lewin provides historical context for the focus on CO2 research in the 1970s, motivated by the ‘oil crisis’ and concerns about energy security. There was an important debate surrounding whether coal or nuclear power should be the replacement for oil.”

As she points out, this seeded what became the focus on carbon dioxide, but the focus was not driven by atmospheric scientists. This point helps us understand how things went so horribly wrong as the science and the politics mixed to become a stinking cocktail – and CO2 became its evil emission.

Lewin, a Melbourne-based writer, blogger and environmental activist since the early 1980s, was amazed at how the global warming scare had come to overwhelm all other environmental campaigns. In 2008 he started to investigate how there came to be such a widespread belief that it had a firm foundation in the authority of science.

His book provides ample detail of the (sorry) history of the IPCC and the policies that were based on its reports – the world over. In Chapter 8, he identifies just how things worked in this pseudo-scientific world. Comments Dr Curry: “The key scientific issue at the time (the mid-1990s) was detection and attribution”:

In 1995, the IPCC was stuck between its science and its politics. The only way it could save itself from the real danger of political oblivion would be if its scientific diagnosis could shift in a positive direction and bring it into alignment with policy action.  

The writing of Chapter 8 (the chapter concerned with detection and attribution) got off to a delayed start due to the late assignment of its coordinating lead author. It was not until April that someone agreed to take on the role. This was Ben Santer, a young climate modeller at Lawrence Livermore Laboratory.

The chapter that Santer began to draft was greatly influenced by a paper principally written by Tim Barnett, but it also listed Santer as an author. It was this paper that held, in a nutshell, all the troubles for the ‘detection’ quest. It was a new attempt to get beyond the old stumbling block of ‘first detection’ research: to properly establish the ‘yardstick’ of natural climate variability. The paper describes how this project failed to do so, and fabulously so.

The detection chapter that Santer drafted for the IPCC makes many references to this study. More than anything else cited in Chapter 8, it is the spoiler of all attribution claims, whether from pattern studies or from the analysis of the global mean. It is the principal basis for the Chapter 8 conclusion that “…no study to date has both detected a significant climate change and positively attributed all or part of that change to anthropogenic causes.”

But by the time the draft report had gone through the process, “a final version of a ‘bottom line’ detection claim was decided”:

The balance of evidence suggests a discernible human influence on global climate.

Frederick Seitz was furious and wrote an op-ed for The Wall Street Journal:

In my more than 60 years as a member of the American scientific community, including service as president of both the NAS and the American Physical Society, I have never witnessed a more disturbing corruption of the peer-review process than the events that led to this IPCC report.

Searching for the Catastrophe Signal: The Origins of The Intergovernmental Panel on Climate Change, available in paperback and Kindle 

4) Europe’s Green Energy Burning Is Killing 40,000 People Per Year, Study Claims
Linde Zuidema, Euractiv, 8 January 2018

Domestic wood burning has become more widespread in Europe in recent years. Exposure to smoke from domestic biomass use caused 40,000 deaths across the EU28 in 2014, a new study claims.



The European Union’s dependence on burning solid biomass – most of it wood – to meet its renewable energy targets makes no sense environmentally. It harms the climate, and damages forests and biodiversity.

Because of this, opposition to the policy has swelled over the past year among the public and scientists.

Next week the European Parliament will vote on a proposed revision to the Renewable Energy Directive, which will determine the EU’s future use of biomass. If approved, it will inevitably mean the continued burning of vast quantities of biomass, mainly in the form of wood.

Quite apart from its disastrous environmental impact, there’s another reason any legislation which increases biomass burning for heating and power should be strenuously resisted.

And it’s one that – until now – has been largely overlooked.

New research for Fern by Dr Mike Holland, a leading independent air pollution expert, reveals the perilous cost to EU citizens’ health from burning solid biomass.

It indicates that tens of thousands of EU citizens are dying prematurely every year as a result of exposure to air pollution from burning solid biomass.

Dr Holland’s report also reviews the evidence of the health impact of air pollution from the use of biomass in domestic heating in the EU.

This has become more widespread in recent years driven partly by renewable energy policies, but also because wood is often cheaper than alternative heating fuels such as coal and oil. Domestic biomass burning increased in the wake of the 2008 economic crisis.

A study by Sigsgaard and others estimates that exposure to smoke from domestic biomass use led to 40,000 deaths across the EU in 2014. The authors say this is a conservative figure.

Dr Holland extends Sigsgaard’s analysis to produce a fuller picture of the range of health impacts from domestic biomass burning. In a single year, he estimates that in addition to the 40,000 deaths across the EU, there were more than 130,000 cases of bronchitis, more than 20,000 respiratory and cardiac hospital admissions, a million asthma symptom days for children aged 5-19, 43 million restricted activity days and 10 million working days lost, all because of exposure to fine particles from domestic biomass emissions.

Full post

See also: Climate the loser as the European Parliament fails to ensure wood is burnt sustainably

5) UK Green Energy Investment Drops By 56% After Policy Changes
Adam Vaughan, The Guardian, 16 January 2018

Investment in green energy fell 56% in UK in 2017 – biggest fall of any country – after ‘stop-start’ support from government



Investment in clean energy plunged further in Britain than in any other country last year because of government policy changes, new figures show.

The amount companies spent on green energy in the UK rose during the years of the coalition government (2010-2015) but has now fallen for two years in a row under the Conservatives, according to analysis by Bloomberg New Energy Finance (BNEF).

While investment in wind, solar and other renewable sources slumped by 56% to $10.3bn (£7.5bn) in the UK, worldwide spending climbed 3% to $333.5bn (£242.4bn), the second-highest level on record.
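As a quick sanity check on those figures, the quoted percentage changes imply approximate 2016 totals; a back-of-envelope sketch, rounded and using only the numbers above:

```python
# Back-of-envelope: infer approximate 2016 investment levels from the
# 2017 totals and percentage changes quoted above (all figures rounded).
uk_2017, uk_change = 10.3, -0.56          # $bn; UK fell 56%
world_2017, world_change = 333.5, 0.03    # $bn; worldwide rose 3%

uk_2016 = uk_2017 / (1 + uk_change)            # ≈ $23.4bn
world_2016 = world_2017 / (1 + world_change)   # ≈ $323.8bn
print(f"Implied 2016: UK ≈ ${uk_2016:.1f}bn, worldwide ≈ ${world_2016:.1f}bn")
```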

Mary Creagh, Labour MP and chair of the environmental audit committee, warned that the fall would put the UK’s climate targets at risk.

Full story

6) The Price Of Green Madness: Why Is California The Poverty Capital Of America?
Kerry Jackson, Los Angeles Times, 14 January 2018

Guess which state has the highest poverty rate in the country? Not Mississippi, New Mexico, or West Virginia, but California, where nearly one out of five residents is poor.



That’s according to the Census Bureau’s Supplemental Poverty Measure, which factors in the cost of housing, food, utilities and clothing, and which includes noncash government assistance as a form of income.

Given robust job growth and the prosperity generated by several industries, it’s worth asking why California has fallen behind, especially when the state’s per-capita GDP increased approximately twice as much as the U.S. average over the five years ending in 2016 (12.5%, compared with 6.27%).

It’s not as though California policymakers have neglected to wage war on poverty. Sacramento and local governments have spent massive amounts in the cause. Several state and municipal benefit programs overlap with one another; in some cases, individuals with incomes 200% above the poverty line receive benefits. California state and local governments spent nearly $958 billion from 1992 through 2015 on public welfare programs, including cash-assistance payments, vendor payments and “other public welfare,” according to the Census Bureau. California, with 12% of the American population, is home today to about one in three of the nation’s welfare recipients.


In the late 1980s and early 1990s, some states — principally Wisconsin, Michigan, and Virginia — initiated welfare reform, as did the federal government under President Clinton and a Republican Congress. Tied together by a common thread of strong work requirements, these overhauls were a big success: Welfare rolls plummeted and millions of former aid recipients entered the labor force.

The state and local bureaucracies that implement California’s antipoverty programs, however, resisted pro-work reforms. In fact, California recipients of state aid receive a disproportionately large share of it in no-strings-attached cash disbursements. It’s as though welfare reform passed California by, leaving a dependency trap in place. Immigrants are falling into it: 55% of immigrant families in the state get some kind of means-tested benefits, compared with just 30% of natives.

Self-interest in the social-services community may be at fault. As economist William A. Niskanen explained back in 1971, public agencies seek to maximize their budgets, through which they acquire increased power, status, comfort and security. To keep growing its budget, and hence its power, a welfare bureaucracy has an incentive to expand its “customer” base. With 883,000 full-time-equivalent state and local employees in 2014, California has an enormous bureaucracy. Many work in social services, and many would lose their jobs if the typical welfare client were to move off the welfare rolls.

Further contributing to the poverty problem is California’s housing crisis. More than four in 10 households spent more than 30% of their income on housing in 2015. A shortage of available units has driven prices ever higher, far above income increases. And that shortage is a direct outgrowth of misguided policies.

“Counties and local governments have imposed restrictive land-use regulations that drove up the price of land and dwellings,” explains analyst Wendell Cox. “Middle-income households have been forced to accept lower standards of living while the less fortunate have been driven into poverty by the high cost of housing.” The California Environmental Quality Act, passed in 1971, is one example; it can add $1 million to the cost of completing a housing development, says Todd Williams, an Oakland attorney who chairs the Wendel Rosen Black & Dean land-use group. CEQA costs have been known to shut down entire homebuilding projects. CEQA reform would help increase housing supply, but there’s no real movement to change the law.

Extensive environmental regulations aimed at reducing carbon dioxide emissions make energy more expensive, also hurting the poor. By some estimates, California energy costs are as much as 50% higher than the national average.

Jonathan A. Lesser of Continental Economics, author of a 2015 Manhattan Institute study, “Less Carbon, Higher Prices,” found that “in 2012, nearly 1 million California households faced … energy expenditures exceeding 10% of household income. In certain California counties, the rate of energy poverty was as high as 15% of all households.” A Pacific Research Institute study by Wayne Winegarden found that the rate could exceed 17% of median income in some areas.

Full post


The London-based Global Warming Policy Forum is a world-leading think tank on global warming policy issues. The GWPF newsletter is prepared by Director Dr Benny Peiser. For more information, please visit the website at www.thegwpf.com.
