No Convincing Evidence That Greenland Ice Sheet Is Melting Rapidly

Greenland1.jpg

When Greenland’s ice sheet lost an estimated 11.3 billion tonnes (12.5 billion tons) of ice on a single warm day this August, following back-to-back heat waves in June and July in western Europe, the climate doomsday machine went into overdrive. Predictions abounded of ever-accelerating melting of both Greenland and Antarctic ice sheets, and the imminent drowning of cities such as London and Miami by rising seas.

But all this hype is unnecessary fear-mongering. As discussed in the previous post, the massive Antarctic ice sheet may not be melting at all as the world warms. And the much smaller Greenland ice sheet isn’t melting any faster on average now than it was 15 years ago. While this year’s summer melt was above average, it was no greater than the melt seen seven years earlier, in 2012.

Melting takes place only during the short Greenland summer, when meltwater runs over the ice sheet surface into the ocean and funnels down through thick glaciers, speeding their flow toward the sea. Ice lost in summer is normally offset by ice gained over the long winter from the accumulation of compacted snow, but last winter’s low snowfall quickly disappeared this year in an early start to the melt season.

Greenland melt.jpg

The figure to the left shows that at the peak of the 2019 event, over 60% of the ice sheet surface melted; on July 30 and 31, NOAA (the U.S. National Oceanic and Atmospheric Administration) reported that even the very highest point of the ice sheet liquefied briefly – a rare occurrence. The ice sheet, 2-3 km (6,600-9,800 feet) thick, consists of layers of compressed snow built up over at least hundreds of thousands of years. In addition to summer melting, the sheet loses ice by calving of icebergs at its edges.

The figure below depicts the daily variation, over a full year, of the estimated mass of ice in the Greenland ice sheet, relative to its average value from 1981 to 2010. The loss of ice during the summer months of June, July and August is clearly visible, the biggest recent losses having occurred in 2012 and 2019.

Greenland ice loss 2019.jpg

This year’s total ice loss has been estimated at 329 billion tonnes (363 billion tons), somewhat lower than the record 458 billion tonnes (505 billion tons) that melted in 2012. Despite the high 2012 loss, however, the average loss from 2012 through 2016 of 247 billion tonnes (272 billion tons) per year was essentially the same as the 2002 through 2011 average loss of 263 billion tonnes (290 billion tons) per year, according to the IPCC (Intergovernmental Panel on Climate Change).

So, while the average annual loss of 258 billion tonnes (284 billion tons) between 2002 and 2016 is a big jump from the average 75 billion tonnes (83 billion tons) of ice lost yearly during the 20th century, it appears that the Greenland ice sheet has at least stabilized since 2002. The 21st-century increase in summer melt rate may arise partly from dominance of the negative phase of the natural North Atlantic Oscillation since about 2000.
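The 2002-2016 average quoted here is consistent with the two sub-period averages given above, as a quick duration-weighted calculation shows:

```python
# Consistency check on the Greenland ice-loss averages quoted in the text
# (billion tonnes per year).
loss_2002_2011 = 263   # 10-year average, 2002-2011
loss_2012_2016 = 247   # 5-year average, 2012-2016

# Duration-weighted average over the full 2002-2016 span
combined = (loss_2002_2011 * 10 + loss_2012_2016 * 5) / 15
print(round(combined))  # 258, matching the 2002-2016 figure quoted in the text
```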

It is little known that Greenland’s ice sheet was smaller than it is today over most of the period since the end of the last ice age. During this long interglacial epoch, as human civilization developed and thrived, there were several periods when it was warmer on average in Greenland than at present, as illustrated in the next figure. This has been deduced by analyzing ice cores extracted from the ice sheet; the cores carry a record of past temperatures and atmospheric composition.

Greenland temp(2) 5,000 yrs.jpg

One of the warm spells was the Medieval Warm Period, an era when Scandinavian Vikings colonized Greenland – growing crops, raising animals and hunting seals for meat and walruses for ivory. The Vikings are thought to have abandoned the island after temperatures dropped with the onset of the Little Ice Age. But there’s little doubt that what made it possible for the Vikings to settle in Greenland at all was a relatively hospitable climate and less ice than exists today.

To put everything in perspective, the present loss of 247 billion tonnes (272 billion tons) of ice every year represents only about 0.01% of the total mass of the ice sheet. At the current rate, therefore, it would take another 10,000 years for all Greenland’s ice to melt.
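The arithmetic behind the 10,000-year figure is easy to reproduce. Note that the total ice-sheet mass below is an assumption on my part – a commonly cited estimate of roughly 2.6 million billion tonnes, not a number taken from this post:

```python
annual_loss_gt = 247    # billion tonnes per year, from the text
total_mass_gt = 2.6e6   # assumed total ice-sheet mass, ~2.6 million billion
                        # tonnes (a commonly cited estimate, not from the post)

fraction = annual_loss_gt / total_mass_gt
years_to_melt = total_mass_gt / annual_loss_gt

print(f"{fraction:.4%}")      # about 0.0095%, i.e. roughly 0.01%
print(round(years_to_melt))   # about 10,500 years at the current rate
```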

Next: Ocean Warming: How the IPCC Distorts the Evidence

No Convincing Evidence That Antarctic Ice Sheet is Melting

Antarctica Dome A.jpg

Of all the observations behind mass hysteria over our climate, none induces as much panic as melting of the earth’s two biggest ice sheets, covering the polar landmasses of Antarctica and Greenland. As long ago as 2006, Al Gore’s environmental documentary “An Inconvenient Truth” proclaimed that global warming would melt enough ice to cause a 6-meter (20-foot) rise in sea level “in the near future.” Today, every calving of a large iceberg from an ice shelf or glacier whips the mainstream media into a frenzy.

The huge Antarctic ice sheet alone would raise global sea levels by about 60 meters (200 feet) were it to melt completely. But there’s little evidence that the kilometers-thick ice sheet, which contains about 90% of the world’s freshwater ice, is melting at all.

Antarctica ice shelf.jpg

Any calving of large icebergs – a natural process unrelated to warming – from an ice shelf, or even disintegration into small icebergs, barely affects sea level. This is because the ice that breaks off was already floating on the ocean. Although a retreating ice shelf can contribute to sea level rise by accelerating the downhill flow of glaciers that feed the shelf, current breakups of Antarctic ice shelves are adding no more than about 0.1 mm (about 4/1000ths of an inch) per year to global sea levels, according to NOAA (the U.S. National Oceanic and Atmospheric Administration).

Global warming has certainly affected Antarctica, though not by as much as the Arctic. East Antarctica, by far the largest region, covering two thirds of the continent, heated up by only 0.06 degrees Celsius (0.11 degrees Fahrenheit) per decade between 1958 and 2012. At the South Pole, which lies in East Antarctica, temperatures actually fell in recent decades.

For comparison, global temperatures over this period rose by 0.11 degrees Celsius (0.20 degrees Fahrenheit) per decade, and Arctic temperatures shot up at an even higher rate. Antarctic warming from 1958 to 2012 is illustrated in the figure below, based on NOAA data. East Antarctica is to the right, West Antarctica to the left of the figure.

Antarctic temps 1958-2012.jpg

You can see, however, that temperatures in West Antarctica and the small Antarctic Peninsula, which points toward Argentina, increased more rapidly than in East Antarctica, by 0.22 degrees Celsius (0.40 degrees Fahrenheit) and 0.33 degrees Celsius (0.59 degrees Fahrenheit) per decade, respectively – faster than the global average. Still, the Peninsula has cooled since 2000.

It’s not surprising, therefore, that all the hype about imminent collapse of the Antarctic ice sheet centers on events in West Antarctica, such as glaciers melting at rapid rates. The Fifth Assessment Report of the UN’s IPCC (Intergovernmental Panel on Climate Change) maintained with high confidence that, between 2005 and 2010, the ice sheet was shedding mass and causing sea levels to rise by 0.41 mm per year, contributing about 24% of the measured rate of 1.7 mm (1/16th of an inch) per year between 1900 and 2010.
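The IPCC’s 24% figure follows directly as the ratio of the two quoted rates:

```python
antarctic_rate = 0.41   # mm per year, IPCC AR5 estimate for 2005-2010
global_rate = 1.7       # mm per year, measured 1900-2010 global average

share = antarctic_rate / global_rate
print(f"{share:.0%}")   # 24%
```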

On the other hand, a 2015 NASA study reported that the Antarctic ice sheet was actually gaining rather than losing ice in 2008, and that ice thickening was making sea levels fall by 0.23 mm per year. The study authors found that ice loss from thinning glaciers in West Antarctica and the Antarctic Peninsula was outweighed by new ice formation in East Antarctica resulting from warming-enhanced snowfall. Across the continent, Antarctica averages roughly 5 cm (2 inches) of precipitation per year. The same authors say the trend continued until at least 2018, despite a recent research paper by an international group of polar scientists endorsing the IPCC narrative of diminishing Antarctic ice caused by human global warming.

The two studies are both based on satellite altimetry – the same method used to measure sea levels, but in this case measuring the height of the ice sheet. Both studies also depend on models to correct the raw data for factors such as snowdrift, ice compaction and motion of the underlying bedrock. It’s differences in the models that give rise to the diametrically opposite results of the studies, one finding that Antarctic ice is melting away but the other concluding that it’s really growing.

Such uncertainty, even in the satellite era, shouldn’t be surprising. Despite the insistence of many climate scientists that theirs is a mature field of research, much of today’s climate science depends on models to interpret the empirical observations. These correction models, just like computer climate models, aren’t always good representations of reality.

Al Gore’s 6-meter (20-foot) rise hasn’t happened yet, and isn’t likely to happen even by the end of this century. Global panic over the impending meltdown of Antarctica is totally unwarranted.

(This post has also been kindly reproduced in full on the Climate Depot blog.)

Next: No Convincing Evidence That Greenland Ice Sheet Is Melting Rapidly

Shrinking Sea Ice: Evaluation of the Evidence

Most of us know about the loss of sea ice in the Arctic due to global warming. The dramatic reduction in summer ice cover, which has continued for almost 40 years, is frequently hyped by the mainstream media and climate activists as an example of what we’re supposedly doing to the planet.

But the loss is nowhere near as much as predicted, and in fact was no greater in the summer of 2019 than in 2007. It’s also little known that Arctic sea ice melted before, during the record heat of the 1930s. And the sea ice around Antarctica, at the other end of the globe, has been expanding since at least 1979.

Actual scientific observations of sea ice in the Arctic and Antarctic have only been possible since satellite measurements began in 1979. The figure below shows satellite-derived images of Arctic sea ice extent in the summer of 1979 (left image), and the summer (September) and winter (March) of 2018 (right image). Sea ice expands to its maximum extent during the winter and shrinks during summer months.   

Arctic ice 1979.jpg
Arctic ice 2018.jpg

Arctic summer ice extent decreased by approximately 33% over the interval from 1979 to 2018; while it still encases northern Greenland, it no longer reaches the Russian coast.

However, there has been no net ice loss since 2007, with the year-to-year minimum extents fluctuating around a plateau. An exception was 2012, when a powerful August storm known as the Great Arctic Cyclone tore off a large chunk of ice from the main sea ice pack. Clearly, the evidence refutes numerous prognostications by advocates of catastrophic human-caused warming that Arctic ice would be completely gone by 2016. 

Before 1979, the only data available on Arctic sea ice were scattered observations from sources such as ship reports, aircraft reconnaissance and drifting buoys – observations recorded and synthesized by the Danish Meteorological Institute and the Russian Arctic and Antarctic Research Institute. Analyses of this spotty data have produced numerous reconstructions of Arctic sea ice extent in the pre-satellite era.

One such recent reconstruction is shown in the next figure, depicting reconstructed Arctic summer ice area, in millions of square kilometers, from 1900 to 2013. The reconstruction was based on the strong correlation of Arctic sea ice extent with Arctic air temperatures during the satellite era, especially in the summer, a correlation assumed to be the same in earlier years as well. This assumption then enabled the researchers to reconstruct the sea ice area before 1979 from observed temperatures in that era.  

Ice.jpg

What this graph reveals is that summer ice cover in the Arctic, apart from its present decline since about 1979, contracted previously in the 1920s and 1930s. According to the researchers, the biggest single-year decrease in area, which occurred in 1936, was about 26% – not much less than the 33% drop by 2018. Although this suggests that the relatively low sea ice extents in recent years are comparable to the 1930s, the reconstruction doesn’t incorporate any actual pre-satellite observations. Other reconstructions that do incorporate the earlier data show a smaller difference between the 1930s and today.

It’s the opposite story for sea ice in the Antarctic, which is at its lowest extent during the southern summer in February, as shown in the satellite-derived image below for 2018-19.

Antarctic ice 2018-2019.jpg

Despite the contraction in the Arctic, the sea ice around Antarctica has been expanding during the satellite era. As can be seen from the following figure, Antarctic sea ice has gained in extent by an average of 1.8% per decade (the dashed line represents the trend), though the ice extent fluctuates greatly from year to year. Antarctic sea ice covers a larger area than Arctic ice but occupies a smaller overall volume, because it’s only about half as thick.
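A per-decade percentage trend like the 1.8% figure is typically estimated by a least-squares linear fit to the yearly series. The sketch below is illustrative only – the data are synthetic (a small drift plus noise standing in for the 1979-2018 satellite record), not the real measurements:

```python
import numpy as np

# Synthetic yearly Antarctic sea-ice extent series (million km^2):
# a small upward drift plus noise, standing in for the satellite record.
rng = np.random.default_rng(0)
years = np.arange(1979, 2019)
extent = 11.5 + 0.02 * (years - 1979) + rng.normal(0, 0.3, years.size)

# Ordinary least-squares linear fit; express the slope as a percentage
# of the mean extent per decade, the same convention as the post's figure.
slope, intercept = np.polyfit(years, extent, 1)
pct_per_decade = 100 * slope * 10 / extent.mean()
print(f"{pct_per_decade:+.1f}% per decade")
```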

Antarctic ice.jpg

Another fallacious claim about disappearing sea ice in the Arctic, one that has captured the public imagination like no other, is that the polar bear population is diminishing along with the ice. But, while this may yet happen in the future, current evidence shows that the bear population has been stable for the whole period that the ice has been decreasing and may even be growing, according to the native Inuit.

In summary, Arctic sea ice shrank from about 1979 to 2007 because of global warming, but has remained at the same extent on average in the 12 years since then, while Antarctic sea ice has expanded slightly over the whole period. So there’s certainly no cause for alarm.

Next: No Convincing Evidence That Antarctic Ice Sheet is Melting

No Evidence That Climate Change Is Accelerating Sea Level Rise

Malé, Maldives Capital City

By far the most publicized phenomenon cited as evidence for human-induced climate change is rising sea levels, with the media regularly trumpeting the latest prediction of the oceans flooding or submerging cities in the decades to come. Nothing instills as much fear in low-lying coastal communities as the prospect of losing one’s dwelling to a hurricane storm surge or even slowly encroaching seawater. Island nations such as the Maldives in the Indian Ocean and Tuvalu in the Pacific are convinced their tropical paradises are about to disappear beneath the waves.

There’s no doubt that the average global sea level has been increasing ever since the world started to warm after the Little Ice Age ended around 1850. But there’s no reliable scientific evidence that the rate of rise is accelerating, or that the rise is associated with any human contribution to global warming.   

A comprehensive 2018 report on sea level and climate change by Judith Curry, a respected climate scientist and global warming skeptic, emphasizes the complexity of both measuring and trying to understand recent sea level rise. Because of the switch in 1993 from tide gauges to satellite altimetry as the principal method of measurement, the precise magnitude of sea level rise as well as projections for the future are uncertain.

According to both Curry and the UN’s IPCC (Intergovernmental Panel on Climate Change), the average global rate of sea level rise from 1901 to 2010 was 1.7 mm (about 1/16th of an inch) per year. In the latter part of that period, from 1993 onward, the rate of rise was 3.2 mm per year, almost double the long-term average – though some experts consider this estimate too high. But while the sudden jump may seem surprising and indicative of acceleration, the rate of globally averaged sea level rise in fact fluctuates considerably over time. This is illustrated in the IPCC’s figure below, which shows estimates from tide gauge data of the rate of rise from 1900 to 1993.

cropped.jpg

It’s clear that the rate of rise was much higher than its 20th-century average during the 30 years from 1920 to 1950, and much lower than the average from 1910 to 1920 and again from 1955 to 1980. Strong regional differences exist too. Actual rates of sea level rise range from negative in Stockholm – where sea level is falling as the region continues to rebound from the weight of the last ice age’s heavy ice sheet – to rates three times higher than average in the western Pacific Ocean.

The regional variation is evident in the next figure, showing the average rate of sea level rise across the globe, measured by satellite, between 1993 and 2014.

Sea level rise rate 1993-2014.jpg

You can see that during this period sea levels increased fastest in the western Pacific as just noted, and in the southern Indian and Atlantic Oceans. At the same time, the sea level fell near the west coast of North America and in the Southern Ocean near Antarctica.

The reasons for such a jumbled picture are several. Because water expands and occupies more volume as it gets warmer, higher ocean temperatures raise sea levels. Yet the seafloor is not static and can sink under the weight of the extra water in the ocean basin that comes from melting glaciers and ice caps, and can be altered by underwater volcanic eruptions. Land surfaces can also sink (as well as rebound), as a result of groundwater depletion in arid regions or landfilling in coastal wetlands. For example, about 50% of the much hyped worsening of tidal flooding in Miami Beach, Florida is due to sinking of reclaimed swampland.

Historically, sea levels have been both lower and higher in the past than at present. Since the end of the last ice age, the average level has risen about 120 meters (400 feet), as depicted in the following figure. After it reached a peak in at least some regions about 6,000 years ago, however, the sea level has changed relatively little, even when industrialization began boosting atmospheric CO2. Over the 20th century, the worldwide average rise was about 15-18 cm (6-7 inches).
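These figures are mutually consistent: the 1.7 mm per year rate quoted earlier, sustained over a century, gives a rise within the 15-18 cm range:

```python
rate_mm_per_year = 1.7                   # 20th-century average from the text
rise_cm = rate_mm_per_year * 100 / 10    # 100 years, then mm -> cm
print(f"{rise_cm:.1f} cm")               # 17.0 cm, within the 15-18 cm range
```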

Sea level rise 24,000 yr.jpg

That the concerns of islanders are unwarranted despite rising seas is borne out by recent studies revealing that low-lying coral reef islands in the Pacific are actually growing in size by as much as 30% per century, not shrinking. The growth is due to a combination of coral debris buildup, land reclamation and sedimentation. Another study found that the Maldives – the world’s lowest-lying country – formed when sea levels were even higher than they are today. Studies such as these belie the popular claim that islanders will become “climate refugees,” forced to leave their homes as sea levels rise.

Next: Shrinking Sea Ice: Evaluation of the Evidence

No Evidence That Heat Kills More People than Cold

The irony in the recent frenzy over heat waves is that many more humans die each year from cold than they do from heat. But you wouldn’t know that from sensational media headlines reporting “killer” heat waves and conditions “as hot as hell.” In reality, cold weather worldwide kills 17 times as many people as heat.

This conclusion was reached by a major international study in 2015, published in the prestigious medical journal The Lancet. The study analyzed more than 74 million deaths in 384 locations across 13 countries including Australia, China, Italy, Sweden, the UK and USA, over the period from 1985 to 2012. The results are illustrated in the figure below, showing the average daily rate of premature deaths from heat or cold as a percentage of all deaths, by country.

World heat vs cold deaths.jpg

Perhaps not surprisingly, moderate cold kills people far more often than extreme cold across a wide range of climates. Extreme cold was defined by the study authors as temperatures below the 2.5th percentile at each location – a limit that varied from as low as -11 degrees Celsius (12 degrees Fahrenheit) in Toronto, Canada to as high as 25 degrees Celsius (77 degrees Fahrenheit) in Bangkok, Thailand. Moderate cold covers all temperatures from this lower limit up to the so-called optimum, the temperature at which the daily death rate at that location is lowest.

Likewise, extreme heat was defined as temperatures above the 97.5th percentile at each location, and moderate heat as temperatures from the optimum up to the 97.5th percentile. But unlike cold, extreme and moderate heat cause approximately equal numbers of excess deaths.
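The percentile bands are straightforward to compute for any location’s temperature record. A minimal sketch, using synthetic daily temperatures rather than the study’s actual data:

```python
import numpy as np

# Synthetic stand-in for one location's daily mean temperatures (deg C);
# the Lancet study used real records from 384 locations.
rng = np.random.default_rng(1)
daily_temps = rng.normal(15, 8, 365 * 28)   # ~28 years of daily values

# The study's band edges: 2.5th and 97.5th percentiles
extreme_cold_limit = np.percentile(daily_temps, 2.5)
extreme_heat_limit = np.percentile(daily_temps, 97.5)

print(f"extreme cold below {extreme_cold_limit:.1f} C")
print(f"extreme heat above {extreme_heat_limit:.1f} C")
```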

The study found that on average, 7.71% of all deaths could be attributed to hot or cold – to temperatures above or below the optimum – with 7.29% being due to cold, but only 0.42% due to heat. That single result puts the lie to the popular belief that heat waves are deadlier than cold spells. Hypothermia kills a lot more of us than heat stroke. And though both high and low temperatures can increase the risk of exacerbating cardiovascular, respiratory and other conditions, it’s cold that is the big killer.
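The headline “17 times” ratio follows directly from these attributable fractions:

```python
total_pct = 7.71   # % of all deaths attributable to non-optimal temperature
cold_pct = 7.29    # % attributable to cold
heat_pct = 0.42    # % attributable to heat

assert abs((cold_pct + heat_pct) - total_pct) < 1e-9  # components add up
print(round(cold_pct / heat_pct))  # 17 - cold kills ~17 times as many as heat
```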

This finding is further borne out by seasonal mortality statistics. France, for instance, recorded 700 excess deaths attributed to heat in the summer of 2016, 475 in 2017 and 1,500 in 2018. Yet excess deaths from cold in the French winter from December to March average approximately 24,000. Even the devastating summer heat wave of 2003 claimed only 15,000 lives in France.

Similar statistics come from the UK, where an average of 32,000 more deaths occur during each December to March period than in any other four-month interval of the year. Flu epidemics boosted this total to 37,000 in the winter of 2016-17, and to 50,000 in 2017-18. Just as in France, these numbers for deaths from winter cold far exceed summer mortalities in the UK due to heat, which reached only 860 in 2018 and just 2,200 in the heat-wave year of 2003.

Even more evidence that cold kills a lot more people than heat is seen in an earlier study, published in the BMJ (formerly the British Medical Journal) in 2000. This study, restricted to approximately 3 million deaths in western Europe from 1988 to 1992, found that annual cold related deaths were much higher than heat related deaths in all seven regions studied – with the former averaging 2,000 per million people and the latter only 220 per million. Additionally, no more deaths from heat occurred in hotter regions than colder ones.

A sophisticated statistical approach was necessary in both studies, because of differences between regions and individuals, and because, while death from heat is typically rapid, occurring within a few days, death from cold can be delayed by up to three or four weeks. The larger Lancet study used more advanced statistical modeling than the BMJ study.

And despite the finding that more than 50% of published papers in biomedicine are not reproducible, the fact that two independent papers reached essentially the same result gives their conclusions some credibility.

Next: No Evidence That Climate Change Is Accelerating Sea Level Rise

No Evidence That Climate Change Causes Weather Extremes: (6) Heat Waves

This Northern Hemisphere summer has seen searing, supposedly record high temperatures in France and elsewhere in Europe. According to the mainstream media and climate alarmists, the heat waves are unprecedented and a harbinger of harsh, scorching hot times to come.

But this is absolute nonsense. In this sixth and final post in the present series, I’ll examine the delusional beliefs that the earth is burning up and may shortly be uninhabitable, and that this is all a result of human-caused climate change. Heat waves are no more linked to climate change than any of the other weather extremes we’ve looked at.

The brouhaha over two almost back-to-back heat waves in western Europe is a case in point. In the second, which occurred toward the end of July, the WMO (World Meteorological Organization) claimed that the mercury in Paris reached a new record high of 42.6 degrees Celsius (108.7 degrees Fahrenheit) on July 25, besting the previous record of 40.4 degrees Celsius (104.7 degrees Fahrenheit) set back in July, 1947. And a month earlier during the first heat wave, temperatures in southern France hit a purported record 46.0 degrees Celsius (114.8 degrees Fahrenheit) on June 28.

How convenient to ignore the past! Australian and New Zealand newspapers from August 1930 carry accounts of an earlier French heat wave, in which the temperature soared to a staggering 50 degrees Celsius (122 degrees Fahrenheit) in the Loire valley in central France. That’s a full 4.0 degrees Celsius (7.2 degrees Fahrenheit) above the so-called record just mentioned in southern France – a region where the 1930 temperature may well have equaled or exceeded the Loire valley’s towering mark.

And the same newspaper articles reported a temperature in Paris that day of 38 degrees Celsius (100 degrees Fahrenheit), noting that back in 1870 the thermometer had reached an even higher, unspecified level there – quite possibly above the July 2019 “record” of 42.6 degrees Celsius (108.7 degrees Fahrenheit).
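The Celsius-Fahrenheit conversions in these paragraphs check out:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(50.0))   # 122.0 - the 1930 Loire valley reading
print(c_to_f(42.6))   # 108.68 - reported as 108.7 F for Paris, July 2019
print(c_to_f(46.0))   # 114.8 - southern France, June 2019
# A temperature difference scales by 9/5: 4.0 C * 9/5 = 7.2 F, as stated.
```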

The same duplicity can be seen in proclamations about past U.S. temperatures. Although it’s frequently claimed that heat waves are increasing in both intensity and frequency, there’s simply no scientific evidence for such a bold assertion. The following figure charts official data from NOAA (the U.S. National Oceanic and Atmospheric Administration) showing the yearly number of days, averaged over all U.S. temperature stations, from 1895 to 2018 with extreme temperatures above 38 degrees Celsius (100 degrees Fahrenheit) and 41 degrees Celsius (105 degrees Fahrenheit).

ac-rebuttal-heat-waves-081819.jpg

The next figure shows NOAA’s data for the year in which the record high temperature in each U.S. state occurred. Of the 50 state records, a total of 32 were set in the 1930s or earlier, but only seven since 1990.

US high temperature records.jpg

It’s obvious from these two figures that there were more U.S. heat waves in the 1930s, and they were hotter, than in the present era of climate hysteria. Indeed, the annual number of days on which U.S. temperatures reached 100, 95 or 90 degrees Fahrenheit has been falling steadily since the 1930s. The Heat Wave Index compiled by the EPA (Environmental Protection Agency) for the 48 contiguous states also clearly shows that the 1930s were the hottest decade.

Globally, it’s exactly the same story, as depicted in the figure below.

World record high temperatures 500.jpg

Of the seven continents, six recorded their all-time high temperatures before 1982, with three of those records dating from the 1930s or earlier; only Asia has set a record more recently (the WMO hasn’t acknowledged the 1930 Loire valley reading of 50 degrees Celsius, or 122 degrees Fahrenheit). And yet the worldwide baking of the 1930s didn’t set the stage for more and worse heat waves in the years ahead, even as CO2 kept pouring into the atmosphere – the scenario we’re told, erroneously, that we face today. In fact, the sweltering 1930s were followed by global cooling from 1940 to 1970.

Contrary to the climate change narrative, the recent European heat waves came about not because of global warming, but rather a weather phenomenon known as jet stream blocking. Blocking results from an entirely different mechanism than the buildup of atmospheric CO2, namely a weakening of the sun’s output that may portend a period of global cooling ahead. A less active sun generates less UV radiation, which in turn perturbs winds in the upper atmosphere, locking the jet stream in a holding or blocking pattern. In this case, blocking kept a surge of hot Sahara air in place over Europe for extended periods.

It should be clear from all the evidence presented above that mass hysteria over heat waves and climate change is completely unwarranted. Current heat waves have as little to do with global warming as floods, droughts, hurricanes, tornadoes and wildfires.

Next: No Evidence That Heat Kills More People than Cold

No Evidence That Climate Change Causes Weather Extremes: (5) Wildfires

Probably the most fearsome of the weather extremes commonly blamed on human-caused climate change are tornadoes – the previous topic in this series – and wildfires. Both can arrive with little or no warning, making it difficult or impossible to flee; both are often deadly; and both typically destroy hundreds of homes and other structures. But just as with tornadoes, there is no scientific evidence that the frequency or severity of wildfires is on the rise in a warming world.

You wouldn’t know that, however, from the mass hysteria generated by the mainstream media and climate activists almost every time a wildfire breaks out, especially in naturally dry climates such as those in California, Australia or Spain. While it’s true that the number of acres burned annually in the U.S. has gone up over the last 20 years or so, the present burned area is still only a small fraction of what it was back in the record 1930s, as seen in the figure below, showing data compiled by the U.S. National Interagency Fire Center.

Wildfires US-acres-burned 1926-2017 copy.jpg

Because modern global warming was barely underway in the 1930s, climate change clearly has nothing to do with the incineration of U.S. forests. Exactly the same trend is apparent in the next figure, which depicts the estimated area worldwide burned by wildfires, by decade from 1900 to 2010. Clearly, wildfires have diminished globally as the planet has warmed.

Global Burned Area, 1900-2010

Wildfires global-acres-burned JPG.jpg

In the Mediterranean, although the annual number of wildfires has more than doubled since 1980, the burned area over three decades has mimicked the global trend and declined:

Mediterranean Wildfire Occurrence & Burnt Area, 1980-2010

Wildfires Mediterranean_number_and_area 1980-2010 copy.jpg

The contrast between the Mediterranean and the U.S., where wildfires are becoming fewer but larger in area, has been attributed to different forest management policies on the two sides of the Atlantic – despite the protestations of U.S. politicians and firefighting officials in western states that climate change is responsible for the uptick in fire size. The next figure illustrates the timeline from 1600 onwards of fire occurrence at more than 800 different sites in western North America. 

Western North America Wildfire Occurrence, 1600-2000

Western North American wildfires JPG.jpg

The sudden drop in wildfire occurrence around 1880 has been ascribed to the expansion of American livestock grazing in order to feed a rapidly growing population. Intensive sheep and cattle grazing after that time consumed most of the grasses that previously constituted the fuel for wildfires. This depletion of fuel, together with the firebreaks created by the constant movement of herds back and forth to water sources, and by the arrival of railroads, drastically reduced the incidence of wildfires. And once mechanical equipment for firefighting such as fire engines and aircraft became available in the 20th century, more and more emphasis was placed on wildfire prevention.

But wildfire suppression in the U.S. has led to considerable increases in forest density and the buildup of undergrowth, both of which greatly enhance the potential for bigger and sometimes hotter fires – the latter characterized by a growing number of terrifying, superhot “firenadoes” or fire whirls occasionally observed in today’s wildfires.

Intentional burning – long used by native tribes and early settlers, and even advocated by some environmentalists, who point out that fire is in fact a natural part of forest ecology, as seen in the preceding figure – has become a thing of the past. Only now, after several devastating wildfires in California, is the idea of controlled burning being revived in the U.S. In Europe, on the other hand, prescribed burning has been supported by land managers for many years.

Combined with overgrowth, global warming does play a role by drying out vegetation and forests more rapidly than before. But there’s no evidence at all for the notion peddled by the media that climate change has amplified the impact of fires on the ecosystem, known technically as fire severity. Indeed, at least 10 published studies of forest fires in the western U.S. have found no recent trend in increasing fire severity.

You may think that the ever-rising level of CO2 in the atmosphere would exacerbate wildfire risk, since CO2 promotes plant growth. But at the same time, higher CO2 levels reduce plant transpiration: plants’ stomata or breathing pores open less, so the leaves lose less water and more moisture is retained in the soil. Increased soil moisture has led to a worldwide greening of the planet.

In summary, the mistaken belief that the “new normal” of devastating wildfires around the globe is a result of climate change is not supported by the evidence. Humans, nevertheless, are the primary reason that wildfires have become larger and more destructive today. Population growth has caused more people to build in fire-prone areas, where fires are frequently sparked by an aging network of power lines and other electrical equipment. Coupled with poor forest management, this constitutes a recipe for disaster.

Next: No Evidence That Climate Change Causes Weather Extremes: (6) Heat Waves

No Evidence That Climate Change Causes Weather Extremes: (4) Tornadoes

tornadoes.jpg

Tornadoes are smaller and claim fewer lives than hurricanes. But the roaring twisters can be more terrifying because of their rapid formation and their ability to hurl objects such as cars, structural debris, animals and even people through the air. Nonetheless, the narrative that climate change is producing stronger and more deadly tornadoes is as fallacious as the nonexistent links between climate change and other weather extremes previously examined in this series.

Again, the UN’s IPCC (Intergovernmental Panel on Climate Change), whose assessment reports constitute the bible for the climate science community, has dismissed any connection between global warming and tornadoes. While the panel concedes that escalating temperatures and humidity may create atmospheric instability conducive to tornadoes, it also points out that other factors governing tornado formation, such as wind shear, diminish in a warming climate. In fact, declares the IPCC, the apparent increasing trend in tornadoes simply reflects better reporting, now that more people live in once-remote areas.

A tornado is a rapidly rotating column of air, usually visible as a funnel cloud, that extends like a dagger from a parent thunderstorm to the ground. Demolishing homes and buildings in its often narrow path, it can travel many kilometers before dissipating. The most violent EF5 tornadoes attain wind speeds up to 480 km per hour (300 mph).

The U.S. endures by far the most tornadoes of any country, mostly in so-called Tornado Alley extending northward from central Texas through the Plains states. The annual incidence of all U.S. tornadoes from 1954 to 2017 is shown in the figure below. It’s obvious that no trend exists over a period that included both cooling and warming spells, with net global warming of approximately 0.7 degrees Celsius (1.3 degrees Fahrenheit) during that time.

US Tornadoes (NOAA) 1954-2017.jpg

But, as an illustration of how U.S. tornado activity can vary drastically from year to year, 13 successive days of tornado outbreaks in 2019 saw well over 400 tornadoes touch down in May, with June a close second – and this following seven quiet years ending in 2018, which was the quietest year in the entire record since 1954. The tornado surge, however, had nothing to do with climate change; it resulted instead from an unusually cold winter and spring in the West that, combined with heat from the Southeast and late rains, provided the ingredients for severe thunderstorms.

The next figure depicts the number of strong (EF3 or greater) tornadoes observed in the U.S. each year during the same period from 1954 to 2017. Clearly, the trend is downward instead of upward; the average number of strong tornadoes annually from 1986 to 2017 was 40% less than from 1954 to 1985. Once more, global warming cannot have played a role. 

US strong tornadoes (NOAA) 1954-2017.jpg

In the U.S., tornadoes cause about 80 deaths and more than 1,500 injuries per year. The deadliest single-day episode on record was the “Tri-State” outbreak in 1925, which killed 747 people and caused more damage than any other tornado outbreak in U.S. history. The most ferocious outbreak ever recorded, spawning a total of 30 EF4 or EF5 tornadoes, occurred in 1974.

Tornadoes also occur, though far more rarely, in other parts of the world such as South America and Europe; the earliest known tornado in history was recorded in Ireland in 1054. The human toll from tornadoes in Bangladesh actually exceeds that in the U.S., at an estimated 179 deaths per year, partly due to the region’s high population density. It’s population growth and expansion outside urban areas that have caused the cost of property damage from tornadoes to mushroom in the last few decades, especially in the U.S.

Next: No Evidence That Climate Change Causes Weather Extremes: (5) Wildfires

No Evidence That Climate Change Causes Weather Extremes: (3) Hurricanes

This third post in our series on the spurious links between climate change and extreme weather examines the incidence of hurricanes – powerful tropical cyclones that all too dramatically demonstrate the fury nature is capable of unleashing.

Although the UN’s IPCC (Intergovernmental Panel on Climate Change) has noted an apparent increase in the strongest (Category 4 and 5) hurricanes in the Atlantic Ocean, there’s almost no evidence for any global trend in hurricane strength. And the IPCC has found “no significant observed trends” in the number of global hurricanes each year.

Hurricanes occur in the Atlantic and northeastern Pacific Oceans, especially in and around the Gulf of Mexico; their cousins, typhoons, occur in the northwestern Pacific. Hurricanes can be hundreds of miles in extent with wind speeds up to 240 km per hour (150 mph) or more, and often exact a heavy toll in human lives and personal property. The deadliest U.S. hurricane in recorded history struck Galveston, Texas in 1900, killing an estimated 8,000 to 12,000 people. In the Caribbean, the Great Hurricane of 1780 killed 27,500 and winds exceeded an estimated 320 km per hour (200 mph). The worst hurricanes and typhoons worldwide have each claimed hundreds of thousands of lives.

How often hurricanes have occurred globally since 1981 is depicted in the figure below.

Hurricane frequency global (Ryan Maue).jpg

You can see immediately that the annual number of hurricanes overall (upper graph) is dropping. But, while the number of major hurricanes of Category 3, 4 or 5 strength (lower graph) seems to show a slight increase over this period, the trend has been ascribed to improvements in observational capabilities, rather than warming oceans that provide the fuel for tropical cyclones.

The lack of any trend in major global hurricanes is borne out by the number of Category 3, 4 or 5 global hurricanes that make landfall, illustrated in the next figure. 

Hurricanes - global landfalls 1970-2018.png

It’s clear that the frequency of landfalling hurricanes of any strength (Categories 1 through 5) hasn’t changed in the nearly 50 years since 1970 – during a time when the globe warmed by approximately 0.6 degrees Celsius (1.1 degrees Fahrenheit). So the strongest hurricanes today aren’t any more extreme or devastating than those in the past. If anything, major landfalling hurricanes in the U.S. are tied to La Niña cycles in the Pacific Ocean, not to global warming.

Data for the North Atlantic basin, the best-quality hurricane data available in the world, do, however, show heightened hurricane activity over the last 20 years. The figure below illustrates the frequency of all North Atlantic hurricanes (top graph) and major hurricanes (bottom graph) for the much longer period from 1851 to 2018.

Hurricanes - North Atlantic & major 1850-2020.png

What the data reveal is that the annual number of major North Atlantic hurricanes during the 1950s and 1960s was at least comparable to that of the last two decades, which, as can be seen, took a sudden upward hike from the levels of the 1970s, 1980s and 1990s. But, because the earth was cooling in the 1950s and 1960s, the present enhanced hurricane activity in the North Atlantic is highly unlikely to be a result of global warming.

Even though it appears from the figure that major North Atlantic hurricanes were less frequent before about 1940, the lower numbers simply reflect the relative lack of observations in early years of the record. Aircraft reconnaissance flights to gather data on hurricanes didn’t begin until 1944, while satellite coverage dates from only 1966. While the data shown in the figure above has been adjusted to compensate for these deficiencies, it’s probable that the number of major North Atlantic hurricanes before 1944 is still undercounted.

The true picture is much more complicated, and any explanation of changing hurricane behavior needs to account as well for other factors, such as the now more rapid intensification of these violent storms and their slower tracking than before, both of which result in heavier rain following landfall.

The short duration of the observational record, and the even shorter record from the satellite era, makes it impossible to assess whether recent hurricane activity is unusual for the present interglacial period. Paleogeological studies of sediments in North Atlantic coastal waters suggest that the current boosted hurricane activity is not at all unusual, with several periods of frequent intense hurricane strikes having occurred thousands of years ago.

Next: No Evidence That Climate Change Causes Weather Extremes: (4) Tornadoes

No Evidence That Climate Change Causes Weather Extremes: (2) Floods

Widespread flooding and devastating tornadoes in the U.S. Midwest this May only served to amplify the strident voices of those who claim that climate change has intensified the occurrence of major floods, droughts, hurricanes, heat waves and wildfires. Like-minded voices in other countries have also fallen into the same trap of linking weather extremes to global warming.  

Apart from the dismissal of such hysterical beliefs by the IPCC (Intergovernmental Panel on Climate Change), an increasing number of research studies are helping to dispel the notion that a warmer world is necessarily accompanied by more severe weather.

A 2017 Australian study of global flood risk concluded that very little evidence exists that worldwide flooding is becoming more prevalent. Despite average rainfall getting heavier as the planet warms, the study authors point out that excessive precipitation is not the only cause of flooding. Less obvious is that alterations to the catchment area – such as land-use changes, deforestation and the building of dams – also play a major role.

Yet the study found that the biggest influence on flood trends is not more intense precipitation, changes in forest cover or the presence of dams, but the size of the catchment area. Previous studies had emphasized small catchment areas, as these were thought less likely to have been extensively modified. However, the new study discovered that, while smaller catchments do show a trend in flood risk that’s increasing over time, larger catchments exhibit a decreasing trend.

Globally, larger catchments dominate, so the trend in flood risk is actually decreasing rather than increasing in most parts of the globe, if there’s any trend at all. This is illustrated in the figure below, the data coming from 1,907 different locations over the 40 years from 1966 to 2005. Additional data from other locations and for a longer (93-year) period show the same global trend.

Flood1.jpg

But while the overall trend is decreasing, the local trend in regions where smaller catchments are more common, such as Europe, eastern North America and southern Africa, is toward more flooding. The study authors suggest that the lower flood trend in larger catchment areas is due to the expanding presence of agriculture and urbanization.

Another 2017 study, this time restricted to North America and Europe, found “no compelling evidence for consistent changes over time” in the occurrence of major floods from 1930 to 2010.  Like the first study described above, this research included both small and large catchment areas. But the only catchments studied were those with minimal alterations and less than 10% urbanization, so as to focus on any trends driven by climate change.

The two figures below show the likelihood of a 100-year flood occurring in North America or Europe in any given year, during two slightly different periods toward the end of the 20th century. A 100-year flood is a massive flood that occurs on average only once a century, and so has a 1 in 100 or 1% chance of occurring or being exceeded in any given year – although the actual interval between 100-year floods is often less than 100 years.
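The “1 in 100” definition is easy to misread: it does not mean such a flood arrives once per century like clockwork. A quick arithmetic sketch, assuming each year is independent with a fixed 1% chance, shows how likely at least one 100-year flood is over a span of years:

```python
# Chance of at least one "100-year" flood in a span of years,
# assuming independent years with a fixed annual probability.
def prob_at_least_one(annual_p: float, years: int) -> float:
    return 1 - (1 - annual_p) ** years

print(round(prob_at_least_one(0.01, 30), 2))   # 0.26 -> ~26% over a 30-year mortgage
print(round(prob_at_least_one(0.01, 100), 3))  # 0.634 -> ~63%, not 100%, over a century
```

So even with a perfectly stationary climate, a given floodplain has roughly a one-in-four chance of a 100-year flood within 30 years – which is why individual flood events say little about trends.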

Flood2.jpg
Flood3.jpg

You can see that for both periods studied, the probability of a 100-year flood in North America or Europe hovers around the 1% (0.01) level or below, implying that 100-year floods were no more or less likely to occur during those intervals than at any other time. The straight lines drawn through the data points are meaningless. Similar results were obtained for 50-year floods.

Although the international study authors concluded that major floods in the Northern Hemisphere between 1931 and 2010 weren’t caused by global warming and were no more likely than expected from chance alone, they did find that floods were influenced by the climate. The strongest influence is the naturally occurring Atlantic Multidecadal Oscillation, an ocean cycle that causes heavier than normal rainfall in Europe and lighter rainfall in North America during its positive phase – leading to an increase in major European floods and a decrease in North American ones.

The illusion that major floods are becoming more common is due in part to the world’s growing population and the appeal, in the more developed countries at least, of living near water. This has led to people building their dream homes in harm’s way on river or coastal floodplains, where rainfall-swollen rivers or storm surges result in intermittent flooding and subsequent devastation. It’s changing human wants rather than climate change that are responsible for disastrous floods.

Next: No Evidence That Climate Change Causes Weather Extremes: (3) Hurricanes

No Evidence That Climate Change Causes Weather Extremes: (1) Drought

Weather extremes are a commonly cited line of evidence for human-caused climate change. Despite the UN’s IPCC (Intergovernmental Panel on Climate Change) having found little to no evidence that global warming triggers extreme weather, the mainstream media and more than a few climate scientists don’t hesitate to trumpet their beliefs to the contrary at every opportunity.

In this and subsequent blog posts, I’ll show how the quasi-religious belief linking extreme weather events to climate change is badly mistaken and at odds with the actual scientific record. We’ll start with drought.

Droughts have been a continuing feature of the earth’s climate for millennia. Although generally caused by a severe fall-off in precipitation, droughts can be aggravated by other factors such as elevated temperatures, soil erosion and overuse of available groundwater. The consequences of drought, which can be catastrophic for human and animal life, include crop failure, starvation and mass migration. A major exodus of early humans out of Africa about 135,000 years ago is thought to have been driven by drought.  

Getting a good handle on drought has only been possible since the end of the 19th century, when the instrumentation needed to measure extreme weather accurately was first developed. The most widely used gauge of dry conditions is the Palmer Drought Severity Index that measures both dryness and wetness and classifies them as “moderate”, “severe” or “extreme.” The figure below depicts the Palmer Index for the U.S. during the past century or so, for all three drought or wetness classifications combined.

US drought index 1900-2012 JPG.jpg

What jumps out immediately is the lack of any long-term trend in either dryness or wetness in the U.S. With the exception of the 1930s Dust Bowl years, the pattern of drought (upper graph) looks boringly similar over the entire 112-year period, as does the pattern of excessive rain (lower graph).

Much the same is true for the rest of the world. The next figure illustrates two different drought indices during the period 1910-2010 for India, a country subject to parching summer heat followed by drenching monsoonal rains; negative values denote drought and positive values wetness. The two indices are a version of the Palmer Drought Severity Index (sc-PDSI, top graph), and the Standardized Precipitation Index (SPI, bottom graph). The SPI, which relies on rainfall data only, is easier to calculate than the PDSI, which depends on both rainfall and temperature. While both indices are useful, the SPI is better suited to making comparisons between different regions.
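Since the SPI relies on rainfall alone, its calculation is straightforward: fit a probability distribution (conventionally a gamma distribution) to the precipitation record, then express each observation as a standard-normal z-score. The sketch below illustrates the idea with made-up monthly totals, not real Indian data:

```python
# Minimal sketch of a Standardized Precipitation Index (SPI) calculation.
# A gamma distribution is the conventional fit for precipitation; the
# values below are illustrative, not observations.
import numpy as np
from scipy import stats

precip = np.array([55., 80., 120., 30., 95., 60., 140., 75., 45., 110.])

# Fit a gamma distribution to the record (location fixed at zero,
# since precipitation totals can't be negative).
shape, loc, scale = stats.gamma.fit(precip, floc=0)

# SPI = standard-normal z-score of each value's cumulative probability.
cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)
print(np.round(spi, 2))  # negative = drier than typical, positive = wetter
```

Because the transformation normalizes away each region’s own rainfall climatology, SPI values of, say, −1.5 mean the same degree of dryness in Rajasthan as in Kerala – which is why the SPI suits cross-regional comparisons better than the temperature-dependent PDSI.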

India Drought Index

1910-2010

SPEI index India JPG TOP.jpg
SPEI index India JPG BOTTOM.jpg

You’ll see that the SPI in India shows no particular tendency over the 100-year period toward either dryness or wetness, though there are 20-year intervals exhibiting one of the two conditions; the apparent trend of the PDSI toward drought since 1990 is an artifact of the index. Similar records for other countries around the globe all show the same thing – no drying of the planet as a whole over more than 100 years.

Recently, the mainstream media created false alarm over drought by mindlessly broadcasting the results of a new study, purporting to demonstrate that global warming will soon result in “unprecedented drying.” By combining computer models with long-term observations, the study authors claim to have definitively connected global warming to drought.

But this claim doesn’t hold up, even in the study’s results. Although the authors were able to match warming to drought conditions during the first half of the 20th century, their efforts were a dismal failure after that. From 1950 to 1980, the “fingerprint” of human-caused global warming completely disappeared, in spite of ever-increasing CO2 in the atmosphere. And from 1981 onward, the fingerprint was so faint that it couldn’t be distinguished from background noise. So the assertion by the authors that global warming causes drought is nothing but wishful thinking.

As further evidence that climate change isn’t exacerbating drought, the final figure below shows the Palmer Index for the U.S. since 1996. Just like the record for the period from 1900 up to 2012 illustrated in the first figure above, there is no discernible trend in either dryness or wetness. While the West and Southwest have both experienced lengthy spells of drought during this period, extreme dry conditions now appear to have abated in both Texas and California.

US drought index 1996-2018 JPG.jpg

In summary, the scientific evidence simply doesn’t support any link between drought and climate change. The IPCC was right to express low confidence in any global-scale observed trend.

Next: No Evidence That Climate Change Causes Weather Extremes: (2) Floods

Are UFO Sightings a Threat to Science?

Credit: CoolCatGameStudio from Pixabay


Do UFO sightings threaten science? The short answer is that UFO observations don’t in themselves – as long as one separates true observations from the questionable claims of alien abduction and other supposed extraterrestrial activity on Earth.    

Unlike pseudosciences such as astrology or crystal healing, UFOs belong to the realm of science, even if we don’t know exactly what some of them are. Sightings of ethereal objects in the sky have been reported throughout recorded history, although there’s been a definite uptick since the advent of air travel in the 20th century. According to recently released records, UK wartime prime minister Winston Churchill colluded with General Dwight Eisenhower to suppress the alleged observation of a UFO by a British bomber crew toward the end of World War II, out of fear that reporting it would cause mass panic.

Since then, numerous incidents have been reported in countries across the globe, by scientists and nonscientists alike. The U.S. Air Force, which coined the term UFO, undertook a series of studies from 1947 to 1969 that included more than 12,000 claimed UFO sightings. The project concluded that the vast majority of sightings could be explained as misidentified conventional objects or natural phenomena, such as spy planes, helium balloons, clouds or meteors – or occasionally, hoaxes. Nonetheless, there was no explanation for 701 (about 6%) of the sightings investigated. 

Only in the last several months has the existence of a new U.S. program to study UFOs been disclosed, this time under the aegis of the Pentagon. Begun in 2007, the secret program apparently continues to this day, though its government funding ended in 2012. One of the few publicized incidents examined involved two Navy F/A-18F fighter pilots who, off the coast of southern California in 2004, chased an oval object that appeared to move at speeds impossible for human-made craft.

Perhaps the most famous American event was the so-called Roswell incident in 1947, when an Air Force balloon designed for nuclear test monitoring crashed at a ranch near Roswell, New Mexico. The official but deceptive statement by the military that it was a high-altitude weather balloon only served to generate ever-escalating conspiracy theories about the crash. The theories postulated that the military had covered up the crash landing of an alien spacecraft, and that bodies of its extraterrestrial crew had been recovered and preserved. Over the years, details of the story became embellished to the point where more than one candidate for U.S. President promised to unlock the secret government files on Roswell.

Belief in alien activity is where UFO lore departs from science. While it’s possible that some of the small percentage of unexplained UFO sightings have been spaceships piloted by extraterrestrial beings, there’s currently no credible evidence that aliens actually exist, nor that they’ve ever visited planet Earth.

In particular, it’s belief in alien abductions that constitutes a threat to science, the hallmarks of which are empirical evidence and logic. In the U.S., the phenomenon began with the mysterious case of Betty and Barney Hill in 1961. The Hills claim to have encountered a UFO while driving home on an isolated rural road in New Hampshire, and to have been seized by humanoid figures with large eyes who took them onto their spaceship, where invasive experiments were performed on the terrified pair. Afterwards, both the Hills’ watches stopped working and they had no recollection of two hours of their bewildering drive.

Although the alien abduction narrative captured the American imagination during the next two decades, the Air Force ultimately dismissed the story and determined that the alien craft was a “natural” object. Indeed, there’s no reliable empirical evidence that any of the millions of other reported abductions have been real.  

Psychologists attribute the episodes to false memories and fantasies created by a human brain that we’re still struggling to understand. Possible physical causes of the abduction phenomenon include epilepsy, hallucinations and sleep paralysis, a condition in which a person is half-awake — conscious, though unable to move.

But while abduction stories may be entertaining, they qualify as irrational pseudoscience because they can’t be falsified. Pseudoscience is frequently based on faith in a belief, instead of scientific evidence, and makes vague and often grandiose claims that can’t be tested. One of the clear-cut ways to differentiate real science from pseudoscience is the falsifiability criterion formulated by 20th-century philosopher Sir Karl Popper: a genuine scientific theory or law must be capable in principle of being invalidated – of being disproved – by observation or experiment. That’s not possible with alien abductions, which can’t be either proved or disproved.

Next: No Evidence That Climate Change Causes Weather Extremes: (1) Drought

UN Species Extinction Report Spouts Unscientific Hype, Dubious Math

An unprecedented decline in nature’s animal and plant species is supposedly looming, according to a UN body charged with developing a knowledge base for preservation of the planet’s biodiversity. In a dramatic announcement this month, the IPBES (Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services) claimed that more species are currently at risk of extinction than at any time in human history and that the extinction rate is accelerating. But these claims are nonsensical hype, based on wildly exaggerated numbers that can’t be corroborated.

Credit: Ben Curtis, Associated Press


The IPBES report summary, which is all that has been released so far, states that “around 1 million of an estimated 8 million animal and plant species (75% of which are insects), are threatened with extinction.” Apart from the as-yet-unpublished report, there’s little indication of the source for these estimates, which are as mystifying as the classic magician’s rabbit produced from an empty hat.

It appears from the report summary that the estimates are derived from a much more reliable set of numbers – the so-called Red List of threatened species, compiled by the IUCN (International Union for Conservation of Nature). The IUCN, not affiliated with the UN, is an international environmental network highly regarded for its assessments of the world’s biodiversity, including evaluation of the extinction risk of thousands of species. The network includes a large number of biologists and conservationists.

Of an estimated 1.7 million species in total, the IUCN’s Red List has currently assessed just 98,512 species, of which it lists 27,159 or approximately 28% as threatened with extinction. The IUCN’s “threatened” description includes the categories “critically endangered,” “endangered” and “vulnerable.”

A close look at the IUCN category definitions reveals that “vulnerable” represents a probability of extinction in the wild of merely “at least 10% within 100 years,” and “endangered” an extinction probability of “at least 20% within a maximum of 100 years.” Neither category is much cause for concern, yet together they embrace 78% of the IUCN’s compilation of threatened species. That leaves just 22%, or about 5,900 critically endangered species, whose probability of extinction in the wild is assessed at more than 50% over the next 100 years – high enough for these species to be genuinely at risk of becoming extinct.
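The percentages quoted here follow directly from the IUCN’s own counts; a back-of-envelope check:

```python
# Reproducing the IUCN Red List arithmetic quoted above.
assessed = 98_512
threatened = 27_159
print(round(threatened / assessed, 2))   # 0.28 -> about 28% of assessed species

# 78% of the threatened list is "vulnerable" or "endangered"; the
# remaining 22% are the critically endangered species.
critically_endangered = round(0.22 * threatened)
print(critically_endangered)             # 5975 -> the "about 5,900" in the text
```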

But while the IUCN presents these numbers matter-of-factly without fanfare, the much more political IPBES resorts to unashamed hype by extrapolating the statistics beyond the 98,512 species that the IUCN has actually investigated, and by assuming a total number of species far in excess of the IUCN’s estimated 1.7 million. Estimates of just how many species the Earth hosts vary considerably, from the IUCN number of 1.7 million all the way up to 1 trillion. The IPBES number of 8 million species appears to be plucked out of nowhere, as does the 1 million threatened with extinction, despite the IPBES report being the result of a “systematic review” of 15,000 scientific and government sources.

According to IPBES chair Sir Robert Watson, the 1 million number was derived from the 8 million by what appears to be an arbitrary calculation based on the IUCN’s much lower numbers. The IPBES assumes a global total of 5.5 million insects – compared with the IUCN’s Red List estimate of 1.0 million – which, when subtracted from the 8 million grand total, leaves 2.5 million non-insect species. This 2.5 million is then multiplied by the IUCN 28% threatened rate, and the 5.5 million insects multiplied by a mysterious unspecified lower rate, to arrive at the 1 million species in danger. That far exceeds the IUCN’s estimate of 27,159.
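Running the IPBES numbers backward exposes just how arbitrary the calculation is. The insect-threat rate below is not a figure the IPBES states; it is simply whatever value makes the arithmetic come out at 1 million:

```python
# Back-calculating the IPBES "1 million" figure from the numbers quoted above.
total_species = 8_000_000    # IPBES assumed global total
insects = 5_500_000          # IPBES assumed insect total
non_insects = total_species - insects              # 2,500,000

# Apply the IUCN's 28% threatened rate to the non-insects:
threatened_non_insects = round(non_insects * 0.28)
print(threatened_non_insects)                      # 700000

# The unspecified lower insect rate implied by a 1 million grand total:
implied_insect_rate = (1_000_000 - threatened_non_insects) / insects
print(round(implied_insect_rate, 3))               # 0.055 -> about 5.5%
```

Nothing in the released summary justifies either the 5.5 million insect total or the roughly 5.5% rate the arithmetic implies, which is the crux of the complaint above.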

Not only does the IPBES take unjustified liberties with the IUCN statistics, but its extinction rate projection bears no relationship whatsoever to actual extinction data. A known 680 vertebrate species have been driven to extinction since the 16th century, with 66 known insect extinctions recorded over the same period – or approximately 1.5 extinctions per year on average. The IPBES report summary states that the current rate of global species extinction is tens to hundreds of times higher than this and accelerating, but without explanation except for the known effect of habitat loss on animal species.

Maybe we should give the IPBES the benefit of the doubt and suspend judgment until the full report is made available. But with such a disparity between its estimates and the more sober assessment of the IUCN, it seems that the IPBES numbers are sheer make-believe. One million species on the brink of extinction is nothing but fiction, when the true number could be as low as 5,900.

Next: Are UFO Sightings a Threat to Science?

Science, Political Correctness and the Great Barrier Reef

A recent Australian court case highlights the intrusion of political correctness into science to bolster the climate change narrative. On April 16, a federal judge ruled that Australian coral scientist Dr. Peter Ridd had been unlawfully fired from his position at North Queensland’s James Cook University, for questioning his colleagues’ research on the impact of climate change on the Great Barrier Reef. In his ruling, the judge criticized the university for not respecting Ridd’s academic freedom.

Great Barrier Reef.jpg

The Great Barrier Reef is the world's largest coral reef system, 2,300 km (1,400 miles) long and visible from outer space. Labeled by CNN as one of the seven natural wonders of the world, the reef is a constant delight to tourists, who can view the colorful corals from a glass-bottomed boat or by snorkeling or scuba diving.

Rising temperatures, especially during the prolonged El Niño of 2016-17, have severely damaged portions of the Great Barrier Reef – so much so that the reef has become the poster child for global warming. Corals are susceptible to overheating and undergo bleaching when the water gets too hot, losing their vibrant colors. But exactly how much of the Great Barrier Reef has been affected, and how quickly it’s likely to recover, are controversial issues among reef researchers.

Ridd’s downfall came after he authored a chapter on the resilience of Great Barrier Reef corals in the book, Climate Change: The Facts 2017. In his chapter and subsequent TV interviews, Ridd bucked the politically correct view that the reef is doomed to an imminent death by climate change, and criticized the work of colleagues at the university’s Centre of Excellence for Coral Reef Studies. He maintained that his colleagues’ findings on the health of the reef in a warming climate were flawed, and that scientific organizations such as the Centre of Excellence could no longer be trusted.  

Ridd had previously been censured by the university for going public with a dispute over a different aspect of reef health. This time, his employer accused Ridd of “uncollegial” academic misconduct and warned him to remain silent about the charge. When he didn’t, the university fired him after a successful career of more than 40 years.

The crux of the bleaching issue is whether or not it’s a new phenomenon. The politically correct view of many of Ridd’s fellow reef scientists is that bleaching didn’t start until the 1980s as global warming surged, and so is an entirely man-made spectacle. But Ridd points to scientific records that reveal multiple coral bleaching events around the globe throughout the 20th century.

The fired scientist also disagrees with his colleagues over the extent of bleaching from the massive 2016-17 El Niño. Ridd estimates that just 8% of Great Barrier Reef coral actually died; much of the southern end of the reef didn’t suffer at all. But his politically correct peers maintain that the die-off was anywhere from 30% to 95%.

Such high estimates, however, are for very shallow water coral – less than 2 meters (7 feet) below the surface, which is only a small fraction of all the coral in the reef. A recent independent study found that deep water coral – down to depths of more than 40 meters (130 feet) – saw far less bleaching. And while Ridd’s critics claim that warming has reduced the growth rate of new coral by 15%, he finds that the growth rate has increased slightly over the past 100 years.

Ridd explains the adaptability of corals to heating as a survival mechanism, in which the multitude of polyps that constitute a coral exchange the microscopic algae that normally live inside the polyps and give coral its striking colors. Hotter-than-normal water causes the algae to poison the coral, which then expels them, turning the polyps white. But to survive, the coral needs resident algae, which supply it with energy through photosynthesis. So from the surrounding water the coral selects a different species of algae better suited to hot conditions, a process that enables it to recover within a few years, says Ridd.

Ridd attributes what he believes are the erroneous conclusions of his reef scientist colleagues to a failure of the peer review process in scrutinizing their work. To support his argument, he cites the so-called reproducibility crisis in contemporary science – the vast number of peer-reviewed studies that can’t be replicated in subsequent investigations and whose findings turn out to be false. Although it’s not known how severe irreproducibility is in climate science, it’s a serious problem in the biomedical sciences, where as many as 89% of published results in certain fields can’t be reproduced.

In Ridd’s opinion, as well as mine, studies predicting that the Great Barrier Reef is in imminent peril are based more on political correctness than good science.

Next: UN Species Extinction Report Spouts Unscientific Hype, Dubious Math

Grassroots Climate Change Movement Ignores Actual Evidence

Earth Day 2019 is marked by the recent launch of several grassroots organizations whose ostensible aim is to combat climate change. The crusades include the UK’s Extinction Rebellion, the Swedish WeDontHaveTime, and the pied-piper-like campaign sparked by striking Swedish schoolgirl Greta Thunberg. What’s most disturbing about them all is not their intentions or methods, but their ignorance and their disregard of scientific evidence.

Common to the entire movement is the delusional belief that climate Armageddon is imminent – a mere 12 years away, according to U.S. congresswoman Alexandria Ocasio-Cortez. The WeDontHaveTime manifesto declares that “climate change is killing us” and that we’re already experiencing catastrophe. Trumpets Extinction Rebellion: “The science is clear … we are in a life or death situation … ,” a sentiment echoed by the Sunrise Movement in the U.S. And a proclamation of the youth climate strikers insists that “The climate crisis … is the biggest threat in human history.”

But despite the climate hysteria, these activists show almost no knowledge of the science that supposedly underlies their doomsday claims. Instead, they resort to logically fallacious appeals to authority. Apart from the UN’s IPCC (Intergovernmental Panel on Climate Change), which is as much a political body as a scientific one, the authorities include the former head of NASA’s Goddard Institute for Space Studies, James Hansen – known for his hype on global warming – and the UK Met Office, an agency with a dismal track record of predicting even the coming season’s weather.

Among numerous mistaken assertions by the would-be crusaders is the constant drumbeat of extreme weather events attributed to human emissions of greenhouse gases. The sadly uninformed protesters seem completely unaware that anomalous weather has been part of the earth’s climate from ancient times, long before industrialization bolstered the CO2 level in the atmosphere. They don’t bother to check the actual evidence that reveals no long-term trend whatsoever in hurricanes, heat waves, floods, droughts and wildfires in more than 100 years. Linking weather extremes to global warming or CO2 is empty-headed ignorance.

Another fallacy is that the huge Antarctic ice sheet, containing about 90% of the freshwater ice on the earth’s surface, is losing ice and causing sea-level rise to accelerate. But while it’s true that glaciers in West Antarctica and the Antarctic Peninsula are thinning, there’s evidence, albeit controversial, that the ice loss is outweighed by new ice formation in East Antarctica from warming-enhanced snowfall. The much smaller Greenland ice sheet is indeed losing ice by melting, but not at an alarming rate.

The cluelessness of the climate change movement is also exemplified by its embrace of false predictions of the future, such as the claim that climate change will cause shortfalls in food production. If anything, exactly the reverse is true. Higher temperatures and the fertilizing effect of CO2, which helps plants grow, boost crop yields and make plants more resistant to drought.

Participation in the movement runs in the hundreds of thousands around the world, especially among school climate strikers. The eco-anarchist Extinction Rebellion, formed last year, promotes acts of nonviolent civil disobedience to achieve its goals, harking back to “Ban the Bomb” and US civil rights protests of the 1950s and 1960s. To “save the planet”, the organization is calling for greenhouse gas emissions to be reduced to net zero as soon as 2025.

The newly created WeDontHaveTime subscribes to the widely held political, but unscientific belief that climate change is an existential crisis, and that catastrophe lurks around the corner. Its particular focus is on building a global social media network dedicated to climate change, with the initial phase being launched today, April 22.

The school strike for climate has similar aims, to be achieved by children around the globe playing hooky from school. An estimated total of more than a million pupils in 125 countries demonstrated in strikes on March 15.

The movement’s lack of scientific knowledge extends to the origin of CO2 emissions as well. Extinction Rebellion and WeDontHaveTime, at least, appear oblivious to the fact that the lion’s share of the world’s CO2 emissions comes from China and India alone – 34% in 2019, by preliminary estimates, and increasing yearly. If the climate change catastrophists were really serious about their objectives, they’d be directing their efforts against the governments of these two countries instead of wasting time on the West.

Next: Science, Political Correctness and the Great Barrier Reef

The Sugar Industry: Sugar Daddy to Manipulated Science?

Industry funding of scientific research often comes with strings attached. There’s plenty of evidence that industries such as tobacco and lead have been able to manipulate sponsored research to their advantage, in order to create doubt about the deleterious effects of their product. But has the sugar industry, currently in the spotlight because of concern over sugary drinks, done the same?

suger large.jpg

This charge was recently leveled at the industry by a team of scientists at UCSF (University of California, San Francisco), who accused the industry of funding research in the 1960s that downplayed the risks of consuming sugar and overstated the supposed dangers of eating saturated fat. Both saturated fat and sugar had been linked to coronary heart disease, which was surging at the time.

The UCSF researchers claim to have discovered evidence that an industry trade group secretly paid two prominent Harvard scientists to conduct a literature review refuting any connection between sugar and heart disease, and making dietary fat the villain instead. The published review made no mention of sugar industry funding.

A year after the review came out, the trade group funded an English researcher to conduct a study on laboratory rats. Initial results seemed to confirm other studies indicating that sugars, which are simple carbohydrates, were more detrimental to heart health than complex or starchy carbohydrates like grains, beans and potatoes. This was because sugar appeared to elevate the blood level of triglyceride fats, today a known risk factor for heart disease, through its metabolism by microbes in the gut.

Perhaps more alarmingly, preliminary data suggested that consumption of sugar – though not starch – produced high levels of an enzyme called beta-glucuronidase that other contemporary studies had associated with bladder cancer in humans. Before any of this could be confirmed, however, the industry trade organization shut the research project down; the results already obtained were never published.

The UCSF authors say in a second paper that the literature review’s dismissal of contrary studies, together with the suppression of evidence tying sugar to triglycerides and bladder cancer, show how the sugar industry has attempted for decades to bury scientific data on the health risks of eating sugar. If the findings of the laboratory study had been disclosed, they assert, sugar would probably have been scrutinized as a potential carcinogen, and its role in cardiovascular disease would have been further investigated. Added one of the UCSF team, “This is continuing to build the case that the sugar industry has a long history of manipulating science.”

Marion Nestle, an emeritus professor of food policy at New York University, has commented that the internal industry documents unearthed by the UCSF researchers were striking “because they provide rare evidence that the food industry suppressed research it did not like, a practice that has been documented among tobacco companies, drug companies and other industries.”

Nonetheless, the current sugar trade association disputes the UCSF claims, calling them speculative and based on questionable assumptions about events that took place almost 50 years ago. The association also considers the research itself tainted, because it was conducted and funded by known critics of the sugar industry. The industry has consistently denied that sugar plays any role in promoting obesity, diabetes or heart disease.

And despite a statement by the trade association’s predecessor that it was created “for the basic purpose of increasing the consumption of sugar,” other academics have defended the industry. They point out that, at the time of the industry review and the rat study in the 1960s, the link between sugar and heart disease was supported by only limited evidence, and the dietary fat hypothesis was deeply entrenched in scientific thinking, being endorsed by the AHA (American Heart Association) and the U.S. NHI (National Heart Institute).

But, says Nestle, it’s déjà vu today, with the sugar and beverage industries now funding research to let the industries off the hook for playing a role in causing the current obesity epidemic. As she notes in a commentary in the journal JAMA Internal Medicine:

"Is it really true that food companies deliberately set out to manipulate research in their favor? Yes, it is, and the practice continues.”

Next: Grassroots Climate Change Movement Ignores Actual Evidence

Measles Rampant Again, Thanks to Anti-Vaccinationists

Measles is on the march once more, even though vaccination against the disease has cut the number of worldwide deaths from an estimated 2.6 million per year in the mid-20th century to 110,000 in 2017. But thanks to the anti-scientific, anti-vaccination movement and the ever-expanding reach of social media, measles cases are now at a 20-year high in Europe, and as many U.S. cases were reported in the first two months of 2019 as in the first six months of 2018.

measles large.jpg

Highly contagious, measles is not a malady to be taken lightly. One in 1,000 people who catch it die of the disease; most of the victims are children under five. Even those who survive are at high risk of falling prey to encephalitis, an often debilitating infection of the brain that can lead to seizures and mental retardation. Other serious complications of measles include blindness and pneumonia.

It’s not the first time that measles has reared its ugly head since the widespread introduction of measles vaccination in 1963 (the combined MMR – measles-mumps-rubella – vaccine followed in 1971). Although laws mandating vaccination for schoolchildren were in place in all 50 U.S. states by 1980, sporadic outbreaks of the disease have continued to occur. Before the surge in 2018-19, a record 667 cases of measles from 23 outbreaks were reported in the U.S. in 2014. And major epidemics are currently raging in countries such as Ukraine and the Philippines.

The primary reason for all these outbreaks is that more and more parents are choosing not to vaccinate their children. The WHO (World Health Organization), for the first time, has listed vaccine hesitancy as one of the top 10 global threats of 2019.

While some parents oppose immunization on religious or philosophical grounds, by far the greatest number of objections comes from those who insist that all vaccines cause disabling side effects or other diseases – even though the available scientific data don’t support such claims. As discussed in a previous post, there’s absolutely no scientific evidence for the once widely held belief that MMR vaccination results in autism, for example.

Anti-vaccinationists, when accused of exposing their children to unnecessary risk by refusing immunization because of unjustified fears about vaccine safety, rationalize their stance by appealing to herd immunity. Herd immunity is the mass protection from an infectious disease that results when enough members of the community become immune to the disease through vaccination, just as sheer numbers protect a herd of animals from predators. Once a sufficiently large number of people have been vaccinated, viruses and bacteria can no longer spread in that community.

For measles, herd immunity requires up to 94% of the populace to be immunized. That the threshold is lower than 100%, however, enables anti-vaccinationists to hide their children in the herd. By not vaccinating their offspring but choosing to live among the vaccinated, anti-vaxxers avoid the one in one million risk of their children experiencing serious side effects from the vaccine, while simultaneously not exposing them to infection – at least not in their own community.  
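The herd-immunity threshold quoted here follows from the standard epidemiological relation: the immune fraction must exceed 1 − 1/R₀, where R₀ is the basic reproduction number (the average number of secondary infections one case causes in a fully susceptible population). A minimal sketch in Python, assuming the commonly cited R₀ range of 12-18 for measles (the function name is my own):

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune to halt spread.

    Derived from requiring each infection to cause, on average,
    fewer than one new case: threshold = 1 - 1/R0.
    """
    if r0 <= 1:
        return 0.0  # an outbreak dies out on its own
    return 1.0 - 1.0 / r0

# Measles is among the most contagious diseases known; its R0 is
# commonly estimated at between 12 and 18.
for r0 in (12, 15, 18):
    print(f"R0 = {r0:2d} -> immune fraction needed: "
          f"{herd_immunity_threshold(r0):.1%}")
```

For R₀ = 18, the threshold works out to about 94.4%, consistent with the "up to 94%" figure for measles.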

But hiding in the herd takes advantage of others and is morally indefensible. Certain vulnerable groups can’t be vaccinated at all, including those with weakened immune systems such as children undergoing chemotherapy for cancer or the elderly on immunosuppressive therapy for rheumatic diseases. If too many people choose not to vaccinate, the percentage vaccinated will fall below the threshold, herd immunity will break down and those whose protection depends on those around them being vaccinated will suffer.

Another contentious issue is exemptions from mandatory vaccination for religious or philosophical reasons. While some American parents regard the denial of schooling to unvaccinated children as an infringement of their constitutional rights, supreme courts in several U.S. states have ruled that the right to practice religion freely doesn’t include liberty to expose the community or a child to communicable disease. And ever since it was found in 2006 that the highest incidence of diseases such as whooping cough occurred in the states most generous in granting exemptions, more and more states have abolished nonmedical exemptions altogether.

But other countries are not so vigilant. In Madagascar, for instance, less than an estimated 60% of the population has been immunized against measles – because of which an epidemic there has caused more than 900 deaths in six months, according to the WHO. Although the WHO says that the reasons for the global rise in measles cases are complex, there’s no doubt that resistance to vaccination is a major factor. It’s not helped by the extensive dissemination of anti-vaccination misinformation by Russian propagandists.

Next: The Sugar Industry: Sugar Daddy to Manipulated Science?

Does Climate Change Threaten National Security?

Earth new.jpg

The U.S. White House’s proposed Presidential Committee on Climate Security (PCCS) is under attack – by the mainstream media, Democrats in Congress and military retirees, among others. The committee’s intended purpose is to conduct a genuine scientific assessment of climate change.

But the assailants’ claim that the PCCS is a politically motivated attempt to overthrow science has it backwards. The Presidential Committee will undertake a scientifically motivated review of climate change science, in the hope of eliminating the subversive politics that have taken over the scientific debate.

It’s those opposed to the committee who are playing politics and abusing science. The whole political narrative about greenhouse gases and dangerous anthropogenic (human-caused) warming, including the misguided Paris Agreement that the U.S. has withdrawn from, depends on faulty computer climate models that failed to predict the recent slowdown in global warming, among other shortcomings. The actual empirical evidence for a substantial human contribution to global warming is flimsy.

And the supposed 97% consensus among climate scientists that global warming is largely man-made is a gross exaggeration, mindlessly repeated by politicians and the media.

The 97% number comes primarily from a study of approximately 12,000 abstracts of research papers on climate science published over a 20-year period. What is rarely revealed is that nearly 8,000 of the abstracts expressed no opinion at all on human-caused warming. When that and a subsidiary survey are taken into account, the consensus percentage among climate scientists falls to somewhere between 33% and 63%. So much for an overwhelming majority!
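The arithmetic behind these two ways of counting can be sketched directly. The figures below are approximate counts from the abstract survey in question (roughly 12,000 abstracts, of which roughly 8,000 took no position); treat them as illustrative rather than exact:

```python
# Approximate counts from the widely cited abstract survey.
total_abstracts = 11944   # abstracts examined
no_position = 7930        # expressed no opinion on the cause of warming
expressed = total_abstracts - no_position

# About 97% of the abstracts that DID take a position endorsed
# human-caused warming.
endorsing = int(expressed * 0.971)

# The headline figure counts only position-taking abstracts:
share_of_position_takers = endorsing / expressed      # ~97%

# Counting all surveyed abstracts gives a much lower figure:
share_of_all = endorsing / total_abstracts            # ~33%

print(f"Share of position-taking abstracts: {share_of_position_takers:.0%}")
print(f"Share of all surveyed abstracts:    {share_of_all:.0%}")
```

The difference between the two denominators is the entire dispute: ~97% of position-taking abstracts versus ~33% of all abstracts surveyed.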

Blatant exaggeration like this for political purposes is all too common in climate science. An example that permeates current news articles and official reports on climate change is the hysteria over extreme weather. Almost every hurricane, major flood, drought, wildfire or heat wave is ascribed to global warming.

But careful examination of the actual scientific data shows that if there’s a trend in any of these events, it’s downward rather than upward. Even the UN’s Intergovernmental Panel on Climate Change has found little to no evidence that global warming increases the occurrence of many types of extreme weather.

Polar bear JPG 250.jpg

Another over-hyped assertion about climate change is that the Arctic polar bear population is shrinking because of diminishing sea ice, and that the bears are facing extinction. Yet, despite numerous articles in the media and photos of apparently starving bears, current evidence shows that the polar bear population has remained steady for the whole period that the ice has been decreasing – and may even be growing, according to the native Inuit.

All these exaggerations falsely bolster the case for taking immediate action to combat climate change, supposedly by pulling back on fossil fuel use. But the mandate of the PCCS is to cut through the hype and assess just what the science actually says.  

A specific PCCS goal is to examine whether climate change impacts U.S. national security, a connection that the defense and national security agencies have strongly endorsed.

A recent letter of protest to the President from a group of former military and civilian national security professionals expresses their deep concern about “second-guessing the scientific sources used to assess the threat … posed by climate change.” The PCCS will re-evaluate the criteria employed by the national agencies to link national security to climate change.

The protest letter also claims that less than 0.2% of peer-reviewed climate science papers dispute that climate change is driven by humans. This is nonsense. In solar science alone during the first half of 2017, the number of peer-reviewed papers affirming a strong link between the sun and our climate, independent of human activity, represented approximately 4% of all climate science papers during that time – and there are many other fields of study apart from the sun.

Let’s hope that formation of the new committee will not be thwarted and that it will uncover other truths about climate science.

(This post was published previously on March 7, on The Post & Email blog.)

Next: Measles Rampant Again, Thanks to Anti-Vaccinationists

Nature vs Nurture: Does Epigenetics Challenge Evolution?

A new wrinkle in the traditional nature vs nurture debate – whether our behavior and personalities are influenced more by genetics or by our upbringing and environment – is the science of epigenetics. Epigenetics describes the mechanisms for switching individual genes on or off in the genome, which is an organism’s complete set of genetic instructions.

epigenetics.jpg

A controversial question is whether epigenetic changes can be inherited. According to Darwin’s 19th-century theory, evolution is governed entirely by heritable variation of what we now know as genes, a variation that usually results from mutation; any biological changes to the whole organism during its lifetime caused by environmental factors can’t be inherited. But recent evidence from studies on rodents suggests that epigenetic alterations can indeed be passed on to subsequent generations. If true, this implies that our genes record a memory of our lifestyle or behavior today that will form part of the genetic makeup of our grandchildren and great-grandchildren.

So was Darwin wrong? Is epigenetics an attack on science? At first blush, epigenetics is reminiscent of Lamarckism – the pre-Darwinian notion that acquired characteristics are heritable, promulgated by French naturalist Jean-Baptiste Lamarck. Lamarck’s most famous example was the giraffe, whose long neck was thought at the time to have come from generations of its ancestors stretching to reach foliage in high trees, with longer and longer necks then being inherited.

Darwin himself, when his proposal of natural selection as the evolutionary driving force was initially rejected, embraced Lamarckism as a possible alternative to natural selection. But the Lamarckian view was later discredited, as more and more evidence for natural selection accumulated, especially from molecular biology.

Nonetheless, the wheel appears to have turned back to Lamarck’s idea over the last 20 years. Several epidemiological studies have established an apparent link between 20th-century starvation and the current prevalence of obesity in the children and grandchildren of malnourished mothers. The most widely studied event is the Dutch Hunger Winter, the name given to a 6-month winter blockade of part of the Netherlands by the Germans toward the end of World War II. Survivors, who included Hollywood actress Audrey Hepburn, resorted to eating grass and tulip bulbs to stay alive.

The studies found that mothers who suffered malnutrition during early pregnancy gave birth to children who were more prone to obesity and schizophrenia than children of well-fed mothers. More unexpectedly, the same effects showed up in the grandchildren of the women who were malnourished during the first three months of their pregnancy. Similarly, an increased incidence of Type II diabetes has been discovered in adults whose pregnant mothers experienced starvation during the Ukrainian Famine of 1932-33 and the Great Chinese Famine of 1958-61.

All this data points to the transmission from generation to generation of biological effects caused by an individual’s own experiences. Further evidence for such epigenetic, Lamarckian-like changes comes from laboratory studies of agouti mice, so called because they carry the agouti gene that not only makes the rodents fat and yellow, but also renders them susceptible to cancer and diabetes. By simply altering a pregnant mother’s diet, researchers found they could effectively silence the agouti gene and produce offspring that were slender and brown, and no longer prone to cancer or diabetes.  

The modified mouse diet was rich in methyl donors, small molecules that attach themselves to the DNA string in the genome and switch off the troublesome gene, and are found in foods such as onions and beets. In addition to its DNA, any genome in fact contains an array of chemical markers and switches that constitute the instructions for the estimated 21,000 protein-coding genes in the genome. That is, the array is able to turn the expression of particular genes on or off.

However, the epigenome, as this array is called, can’t alter the genes themselves. A soldier who loses a limb in battle, for example, will not bear children with shortened arms or legs. And while there’s limited evidence – such as the starvation studies described above – that epigenetic changes in humans can be transmitted between generations, the possibility isn’t yet fully established and further research is needed.

One line of thought, for which an increasing amount of evidence exists in animals and plants, is that epigenetic change doesn’t come from experience or use – as in the case of Lamarck’s giraffe – but actually results from Darwinian natural selection. The idea is that in order to cope with an environmental threat or need, natural selection may choose the variation in the species that has an epigenome favoring the attachment to its DNA of a specific type of molecule such as a methyl donor, capable of expressing or silencing certain genes. In other words, epigenetic changes can exploit existing heritable genetic variation, and so are passed on.

Is this explanation correct or, as creationists would like to think, did Darwin’s theory of evolution get it wrong? Time will tell.

How the Scientific Consensus Can Be Wrong

consensus wrong 250.jpg

Consensus is a necessary step on the road from scientific hypothesis to theory. What many people don’t realize, however, is that a consensus isn’t necessarily the last word. A consensus, whether newly proposed or well-established, can be wrong. In fact, the mistaken consensus has been a recurring feature of science for many hundreds of years.

A recent example of a widespread consensus that nevertheless erred was the belief that peptic ulcers were caused by stress or spicy foods – a dogma that persisted in the medical community for much of the 20th century. The scientific explanation at the time was that stress or poor eating habits resulted in excess secretion of gastric acid, which could erode the digestive lining and create an ulcer.

But two Australian doctors discovered evidence that peptic ulcer disease was caused by a bacterial infection of the stomach, not stress, and could be treated easily with antibiotics. Yet overturning such a longstanding consensus would not be simple. As one of the doctors, Barry Marshall, put it:

“…beliefs on gastritis were more akin to a religion than having any basis in scientific fact.”

To convince the medical establishment the pair were right, Marshall resorted in 1984 to the drastic measure of infecting himself with a potion containing the bacterium in question (known as Helicobacter pylori). Despite this bold and risky act, the medical world didn’t finally accept the new doctrine until 1994. In 2005, Barry Marshall and Robin Warren were awarded the Nobel Prize in Physiology or Medicine for their discovery.

Earlier in the 20th century, an individual fighting established authority had overthrown conventional scientific wisdom in the field of geology. Acceptance of Alfred Wegener’s revolutionary theory of continental drift, proposed in 1912, was delayed for many decades – even longer than resistance to the infection explanation for ulcers persisted – because the theory was seen as a threat to the geological establishment.

Geologists of the day refused to take seriously Wegener’s circumstantial evidence of matchups across the ocean in continental coastlines, animal and plant fossils, mountain chains and glacial deposits, clinging instead to the consensus of a contracting earth to explain these disparate phenomena. The old consensus of fixed continents endured among geologists even as new, direct evidence for continental drift surfaced, including mysterious magnetic stripes on the seafloor. But only after the emergence in the 1960s of plate tectonics, which describes the slow sliding of thick slabs of the earth’s crust, did continental drift theory become the new consensus.

A much older but well-known example of a mistaken consensus is the geocentric (earth-centered) model of the solar system that held sway for 1,500 years. This model was originally developed by the ancient Greek philosophers Plato and Aristotle, and later elaborated by the astronomer Ptolemy in the 2nd century. The Italian mathematician and astronomer Galileo Galilei fought to overturn the geocentric consensus, advocating instead the rival heliocentric (sun-centered) model of Copernicus – the model which we accept today, and for which Galileo gathered evidence in the form of unprecedented telescopic observations of the sun, planets and planetary moons.

Although Galileo was correct, his endorsement of the heliocentric model brought him into conflict with university academics and the Catholic Church, both of which adhered to Ptolemy’s geocentric model. A resolute Galileo insisted that:

“In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”

But to no avail: Galileo was called before the Inquisition, forbidden to defend Copernican ideas, and finally sentenced to house arrest for publishing a book that did just that and also ridiculed the Pope.

These are far from the only cases in the history of science of a consensus that was wrong. Others include the widely held 19th-century religious belief in creationism that impeded acceptance of Darwin’s theory of evolution, and the 20th-century paradigm linking saturated fat to heart disease.

Consensus is built only slowly, so belief in the consensus tends to become entrenched over time and is not easily abandoned by its devotees. This is certainly the case for the current consensus that climate change is largely a result of human activity – a consensus, as I’ve argued in a previous post, that is most likely mistaken.

Next: Nature vs Nurture: Does Epigenetics Challenge Evolution?