Big Data Finds The Medieval Warm Period - No Denial Here
In this newsletter:
1) Big Data Finds The Medieval Warm Period – No Denial Here
Jennifer Marohasy, The Spectator, 22 August 2017
2) Most Of The Recent Warming Could Be Natural
Jennifer Marohasy, 21 August 2017
3) New Survey: Only 28% Of Americans Believe Climate Scientists Understand Causes Of Climate Change ‘Very Well’
Scott Rasmussen, 21 August 2017
4) By Farming Fish In The Ocean, The World Could Produce 100 Times Current Seafood Consumption
Emma Bryce, Anthropocene, 18 August 2017
5) Lawrence J McQuillan: The Inconvenient Truth About Al Gore And The Climate 'Experts'
Investor’s Business Daily, 22 August 2017
Full details:
1) Big Data Finds The Medieval Warm Period – No Denial Here
Jennifer Marohasy, The Spectator, 22 August 2017
After deconstructing 2,000-year old proxy-temperature series back to their most basic components, and then rebuilding them using the latest big data techniques, John Abbot and I show what global temperatures might have done in the absence of an industrial revolution.
According to author Leo Tolstoy, born at the very end of the Little Ice Age, in quite a cold country:
The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he already knows, without a shadow of a doubt, what is laid before him.
So, our new technical paper in GeoResJ (vol. 14, pages 36-46) will likely be ignored. Because after applying the latest big data technique to six 2,000 year-long proxy-temperature series we cannot confirm that recent warming is anything but natural – what might have occurred anyway, even if there was no industrial revolution.
Over the last few years, I’ve worked with Dr John Abbot using artificial neural networks (ANN) to forecast monthly rainfall. We now have a bunch of papers in international climate science journals showing these forecasts to be more skilful than output from general circulation models.
During the past year, we’ve extended this work to estimating what global temperatures would have been during the twentieth century in the absence of human emissions of carbon dioxide.
We began by deconstructing the six proxy series from different geographic regions – series already published in the mainstream climate science literature. One of these, the Northern Hemisphere composite series, begins in 50 AD, ends in the year 2000, and is derived from studies of pollen, lake sediments, stalagmites and boreholes.
Typical of most such temperature series, it zigzags up and down while showing two rising trends: the first peaks at about 1200 AD and corresponds with a period known as the Medieval Warm Period (MWP), while the second peaks in 1980 and then shows decline. In between is the Little Ice Age (LIA), which according to the Northern Hemisphere composite bottomed out in 1650 AD. (Of course, the MWP corresponded with a period of generally good harvests in England – when men dressed in tunics and built grand cathedrals with tall spires. It preceded the LIA, when there was famine and the Great Plague of London.)
Ignoring the MWP and LIA for the moment, you might be tempted simply to dismiss this temperature series on the basis that it peaks in 1980 and doesn’t continue to rise to the very end of the record, the year 2000.
In fact, this decline is typical of most such proxy reconstructions – derived from pollen, stalagmites, boreholes, coral cores and especially tree rings. Within mainstream climate science the decline after 1980 is referred to as “the divergence problem”, and then hidden.
In denial of this problem, leading climate scientists have even been known to graft temperature measurements from thermometers onto the proxy record after 1980 to literally ‘hide the decline’. Phil Jones, head of the Climatic Research Unit at the University of East Anglia, aptly described the technique as a ‘trick’.
Grafting thermometer data onto the end of the proxy record generally ‘fixes’ the problem after 1980, while remodelling effectively flattens the Medieval Warm Period.
There are, however, multiple lines of evidence indicating it was about a degree warmer across Europe during the MWP – corresponding with the 1200 AD rise in our Northern Hemisphere composite. In fact, there are oodles of published technical papers based on proxy records that provide a relatively warm temperature profile for this period. This was before the Little Ice Age, when it was too cold to inhabit Greenland.
The modern inhabitation of Upernavik, in north west Greenland, only began in 1826, which corresponds with the beginning of the industrial age. So, the end of the Little Ice Age corresponds with the beginning of industrialisation. But did industrialisation cause the global warming? Tolstoy’s ‘intelligent man’ would immediately reply: But yes!
In our new paper in GeoResJ, we make the assumption that an artificial neural network – remember our big data/machine learning technique – trained on proxy temperatures up until 1830 would be able to forecast the combined effect of natural climate cycles through the twentieth century.
Taking the proxy record from the Northern Hemisphere composite, decomposing it through signal analysis, and then using the resulting component sine waves as input to an ANN, John Abbot and I generated forecasts for the period from 1830 to 2000.
Our results show up to 1°C of warming. The average divergence between the proxy temperature record and our ANN projection is just 0.09 degrees Celsius. This suggests that even if there had been no industrial revolution and burning of fossil fuels, there would still have been warming through the twentieth century – to at least 1980, and of almost 1°C.
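To make the described pipeline concrete, the sketch below shows the general shape of such an approach in Python: Fourier-decompose a long series into its dominant sine components, train a small neural network on the pre-1830 portion, and project the ‘natural’ signal forward. The synthetic input series, the six-component cut-off and the network configuration are all illustrative assumptions, not the settings used in the GeoResJ paper.

```python
# Illustrative sketch only: decompose a long temperature series into its
# dominant sine components, train a small ANN on the pre-1830 record, and
# project the "natural" signal forward to 2000. Synthetic data and all
# parameters here are assumptions, not the paper's actual settings.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
years = np.arange(50, 2001)                       # 50 AD to 2000 AD
# Synthetic stand-in for a proxy reconstruction: slow cycles plus noise.
temps = (0.4 * np.sin(2 * np.pi * years / 1000)
         + 0.2 * np.sin(2 * np.pi * years / 210)
         + 0.05 * rng.standard_normal(years.size))

# Fourier-decompose the series and keep the six strongest components.
N = years.size
n = np.arange(N)
spectrum = np.fft.rfft(temps - temps.mean())
freqs = np.fft.rfftfreq(N, d=1.0)                 # cycles per year
strongest = np.argsort(np.abs(spectrum[1:]))[::-1][:6] + 1

# Each component sine wave, evaluated over the full record, becomes one
# input feature for the network.
features = np.column_stack([
    (2.0 / N) * np.abs(spectrum[k])
    * np.cos(2 * np.pi * freqs[k] * n + np.angle(spectrum[k]))
    for k in strongest
])

train = years <= 1830
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
ann.fit(features[train], temps[train])

forecast = ann.predict(features[~train])          # "natural" 1831-2000 path
divergence = np.mean(np.abs(forecast - temps[~train]))
print(f"mean absolute divergence: {divergence:.3f} °C")
```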
Full post
For more information, including charts and a link to the full paper read Jennifer Marohasy’s latest blog post.
2) Most Of The Recent Warming Could Be Natural
Jennifer Marohasy, 21 August 2017
The science is far from settled. In reality, some of the data is ‘problematic’, the underlying physical mechanisms are complex and poorly understood, the literature voluminous, and new alternative techniques can give very different answers to those derived from General Circulation Models and remodelled proxy-temperature series.
After deconstructing 2,000-year old proxy-temperature series back to their most basic components, and then rebuilding them using the latest big data techniques, John Abbot and I show what global temperatures might have done in the absence of an industrial revolution. The results from this novel technique, just published in GeoResJ [1], accord with climate sensitivity estimates from experimental spectroscopy but are at odds with output from General Circulation Models.
According to mainstream climate science, most of the recent global warming is our fault – caused by human emissions of carbon dioxide. The rationale for this is a speculative theory about the absorption and emission of infrared radiation by carbon dioxide that dates back to 1896. It’s not disputed that carbon dioxide absorbs infrared radiation; what is uncertain is the sensitivity of the climate to increasing atmospheric concentrations.
This sensitivity may have been grossly overestimated by Svante Arrhenius more than 120 years ago, with these overestimations persisting in the computer-simulation models that underpin modern climate science [2]. We just don’t know; in part because the key experiments have never been undertaken [2].
What I do have are whizz-bang gaming computers that can run artificial neural networks (ANN), which are a form of machine learning: think big data and artificial intelligence.
My colleague, Dr John Abbot, has been using this technology for over a decade to forecast the likely direction of particular stocks on the share market – for the following day.
Since 2011, I’ve been working with him to use this same technology for rainfall forecasting – for the next month and season [4,5,6]. We now have a bunch of papers in international climate science journals showing this technique to be more skilful than the Australian Bureau of Meteorology’s General Circulation Models for forecasting monthly rainfall.
During the past year, we’ve extended this work to build models to forecast what temperatures would have been in the absence of human emissions of carbon dioxide – for the last hundred years.
We figured that if we could apply the latest data mining techniques to mimic natural cycles of warming and cooling – specifically, to forecast twentieth-century temperatures in the absence of an industrial revolution – then the difference between the temperature profile forecast by the models and actual temperatures would give an estimate of the human contribution from industrialisation. […]
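The attribution step itself is simple arithmetic: subtract the modelled ‘natural’ profile from the observed temperatures and read the residual as the human contribution. A toy illustration, with entirely made-up numbers:

```python
import numpy as np

# Made-up anomalies (°C) for four late-20th-century intervals: observed
# temperatures versus an ANN forecast trained only on pre-industrial data.
observed = np.array([0.10, 0.25, 0.55, 0.90])
natural = np.array([0.08, 0.22, 0.50, 0.81])

# Whatever the natural-cycles model fails to explain is, on this logic,
# attributed to industrialisation.
human_contribution = observed - natural
print(human_contribution)   # [0.02 0.03 0.05 0.09]
```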
Full post
3) New Survey: Only 28% Of Americans Believe Climate Scientists Understand Causes Of Climate Change ‘Very Well’
Scott Rasmussen, 21 August 2017
Twenty-eight percent (28%) of Americans think that climate scientists understand the causes of global climate change “very well.” A Pew Research study found that only 19% believe that the climate scientists have a very good understanding of the best ways to address the issue.
In general, the study found that Americans trust climate scientists more than politicians on the topic. Two-thirds (67%) believe scientists should play a major role in addressing policy issues on the matter. Most also believe that energy industry leaders (56%) and the general public (56%) should have a major say in such policy topics.
The Pew study, however, also found that people believe there are differences of opinion among the climate scientists. Only 27% believe that there is a consensus on the issue and that just about all climate scientists believe human behavior is mostly responsible for global climate change. Another 35% think more than half hold this view.
The survey also explored the degree of trust and confidence in those researching climate science. Thirty-six percent (36%) believe that, most of the time, scientists’ research findings are motivated by a desire to advance their own careers. Only 32% say that they mostly rely on the best scientific evidence. Twenty-seven percent (27%) believe that political views of the scientists generally influence their work.
Liberal Democrats tend to express high levels of confidence in the climate scientists and their motives. Conservative Republicans are often quite skeptical. Most other Americans have mixed views.
Full story
4) By Farming Fish In The Ocean, The World Could Produce 100 Times Current Seafood Consumption
Emma Bryce, Anthropocene, 18 August 2017
If we started farming fish in all the suitable offshore habitat available globally, we could technically produce 100 times our current seafood needs, a new study reveals. Writing in Nature Ecology and Evolution, a group of researchers makes the case that this form of food production has the potential to solve some of our most pressing food security challenges, and to take the pressure off wild fisheries, too.
Fish and bivalve farming – called aquaculture – has gained attention in recent years as one possible way to secure enough protein for the planet’s burgeoning population. “Oceans represent an immense opportunity for food production, yet the open ocean environment is largely untapped as a farming resource,” the University of California, Santa Barbara-led researchers write in their paper. Theirs is the first study to put an estimate on that untapped potential.
To do so, they first selected 180 consumable species of finfish and bivalves such as oysters and mussels. Then they chose suitable aquaculture habitat based on the conditions those species require to thrive, such as depth and water temperature. Next, the researchers eliminated habitat unsuitable for fish farming – areas dedicated to marine reserves, high-traffic shipping zones, and waters deeper than 200 metres, where it would be too costly to anchor fish farms.
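As a rough sketch of that screening logic – with placeholder thresholds and toy grid cells rather than the study’s actual data – the exclusion filters compose along these lines:

```python
# Toy version of the habitat-screening step: every cell of an ocean grid
# is kept or excluded by simple filters. All values are placeholders.
from dataclasses import dataclass

@dataclass
class Cell:
    area_km2: float
    depth_m: float
    sst_c: float            # sea-surface temperature, °C
    marine_reserve: bool
    shipping_lane: bool

def suitable(cell: Cell, min_t: float, max_t: float) -> bool:
    """Apply the study's style of exclusion filters to a single grid cell."""
    return (cell.depth_m <= 200                  # deeper water: anchoring too costly
            and min_t <= cell.sst_c <= max_t     # species' thermal range
            and not cell.marine_reserve
            and not cell.shipping_lane)

cells = [
    Cell(2500, 80, 18, False, False),
    Cell(2500, 350, 20, False, False),   # excluded: too deep
    Cell(2500, 120, 22, True, False),    # excluded: marine reserve
    Cell(2500, 60, 25, False, False),
]

# Hypothetical species that thrives between 15 and 28 °C.
usable = [c for c in cells if suitable(c, 15.0, 28.0)]
print(sum(c.area_km2 for c in usable), "km2 of suitable habitat")  # 5000 km2
```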
Through this process of elimination, the researchers were able to identify aquaculture ‘hotspots’ around the world – finding that there are almost 13 million square kilometres of suitable open-ocean habitat globally available for fish and bivalve farming. Together, these productive waters could produce about 15 billion tons of seafood each year, surpassing our current seafood needs 100-fold. The study also found that every coastal country on earth is capable of meeting its domestic seafood needs by using just a small percentage of its coastal waters to farm fish – which could ensure food security and also reduce reliance on imports. If just one percent of the Indonesian coastline were reserved for aquaculture, for instance, it could produce 24 million tons of fish a year, more than enough to satisfy the national appetite for seafood.
Farming fish on this scale could also take the strain off wild fisheries, which are threatened by climate change and by stock declines through overfishing. Aquaculture isn’t necessarily a fault-free solution, however: fish farms have developed a negative reputation in recent decades for causing large-scale nutrient pollution and the spread of diseases and parasites from farmed to wild fish. With this in mind, the researchers caution that aquaculture development would need to be closely aligned with policies and practices that protect the environment. Fish farming is now the fastest-growing food sector; it’s important, they say, to consider how to take its progress forward sustainably to limit the considerable environmental risks.
Full post
5) Lawrence J McQuillan: The Inconvenient Truth About Al Gore And The Climate 'Experts'
Investor’s Business Daily, 22 August 2017
Former Vice President Al Gore stars in his second documentary, "An Inconvenient Sequel: Truth to Power."
"Sooner or later," Gore tweeted, "climate deniers in the GOP will have to confront their willful blindness to the climate crisis." But skeptics of climate alarmism have their eyes wide open and don't like what they see.
Donald Trump won the popular vote among people 45 years and older. Many in these ranks have followed grassroots environmentalism since it began with the publication of Rachel Carson's book, "Silent Spring," in 1962. Over time they've learned that celebrated environmental experts make false and wildly exaggerated predictions. A prime example is Stanford University biologist Paul Ehrlich, a longtime environmental icon and author of the 1968 book "The Population Bomb."
"Population will inevitably and completely outstrip whatever small increases in food supplies we make," Ehrlich confidently predicted in a 1970 issue of Mademoiselle. "The death rate will increase until at least 100-200 million people per year will be starving to death during the next 10 years."
He assured readers of The Progressive in 1970 that between 1980 and 1989, 4 billion people, including 65 million Americans, would perish in the "Great Die-Off."
In a 1969 essay titled "Eco-Catastrophe!" Ehrlich said "most of the people who are going to die in the greatest cataclysm in the history of man have already been born."
Undeterred, the celebrity doomsayer and his cohorts now offer a new theory, claiming in a July 2017 issue of "Proceedings of the National Academy of Sciences" that human civilization stands in peril from an ongoing mass extinction on Earth:
"Dwindling population sizes and range shrinkages (of vertebrates) amount to a massive anthropogenic erosion of biodiversity and of the ecosystem services essential to civilization. This 'biological annihilation' underlines the seriousness for humanity of Earth's ongoing sixth mass extinction event." And so on.
Ehrlich has cried "The sky is falling!" so many times that anyone with common sense and a memory rightfully dismisses his apocalyptic rhetoric.
If the environmental movement's so-called experts had been correct, nearly all animal species would be extinct today, as S. Dillon Ripley, longtime head of the Smithsonian Institution, predicted. As Nigel Calder and Kenneth Watt had it, the Earth would likely be in another ice age today. According to geochemist Harrison Brown, copper, lead, zinc, tin, gold and silver would now be gone. Likewise, Watt and U.S. government analysts predicted that U.S. oil and natural gas reserves would be depleted by now. Instead, we're drowning in the stuff.
Hearing these spectacularly wrong predictions for decades, a large segment of the population has lost confidence in environmental research, regardless of its potential merits. Climate and natural resource scientists have only themselves to blame.
The failure to enforce rigorous scientific standards and to publicly denounce alarmists and charlatans has left many Americans feeling hoodwinked and disregarding all environmental research, which is a shame.
But truth and accuracy don't seem to matter to many environmentalists.
The late Stanford University climatologist Stephen Schneider told Discover magazine in 1989, "we have to offer up scary scenarios, make simplified, dramatic statements, and make little mention of any doubts we might have. ... Each of us has to decide what the right balance is between being effective and being honest."
Rather than pursuing scientific truth, the goal is to win political battles. In 1988, then-Sen. Timothy Wirth, D-Colo., said, "We've got to ... try to ride the global warming issue. Even if the theory of global warming is wrong ... we will be doing the right thing anyway in terms of economic policy and environmental policy." And ride the issue they have.
Environmental "grievance hustlers" have used deception, lies, hype and hysteria, often covered with the gauze of taxpayer-funded research, to score political victories that expand Big Government.
Here's the real inconvenient truth. If Al Gore wants the people he denounces as "climate deniers" to take him seriously, his next documentary should target the intellectual dishonesty of many of his pals in the environmental movement. Don't hold your breath.
McQuillan is a senior fellow and director of the Center on Entrepreneurial Innovation at the Independent Institute, Oakland, Calif.
The London-based Global Warming Policy Forum is a world-leading think tank on global warming policy issues. The GWPF newsletter is prepared by Director Dr Benny Peiser - for more information, please visit the website at www.thegwpf.com.