Nature vs Nurture: Does Epigenetics Challenge Evolution?

A new wrinkle in the traditional nature vs nurture debate – whether our behavior and personalities are influenced more by genetics or by our upbringing and environment – is the science of epigenetics. Epigenetics describes the mechanisms for switching individual genes on or off in the genome, which is an organism’s complete set of genetic instructions.


A controversial question is whether epigenetic changes can be inherited. According to Darwin's 19th-century theory, evolution is governed entirely by heritable variation of what we now know as genes, variation that usually results from mutation; biological changes acquired by an organism during its lifetime in response to environmental factors can't be inherited. But recent evidence from studies on rodents suggests that epigenetic alterations can indeed be passed on to subsequent generations. If true, this implies that our epigenomes record a memory of our lifestyle or behavior today that will form part of the biological makeup of our grandchildren and great-grandchildren.

So was Darwin wrong? Is epigenetics an attack on science? At first blush, epigenetics is reminiscent of Lamarckism – the pre-Darwinian notion that acquired characteristics are heritable, promulgated by French naturalist Jean-Baptiste Lamarck. Lamarck’s most famous example was the giraffe, whose long neck was thought at the time to have come from generations of its ancestors stretching to reach foliage in high trees, with longer and longer necks then being inherited.

Darwin himself, when his proposal of natural selection as the evolutionary driving force was initially rejected, embraced Lamarckism as a possible alternative to natural selection. But the Lamarckian view was later discredited, as more and more evidence for natural selection accumulated, especially from molecular biology.

Nonetheless, the wheel appears to have turned back to Lamarck’s idea over the last 20 years. Several epidemiological studies have established an apparent link between 20th-century starvation and the current prevalence of obesity in the children and grandchildren of malnourished mothers. The most widely studied event is the Dutch Hunger Winter, the name given to a 6-month winter blockade of part of the Netherlands by the Germans toward the end of World War II. Survivors, who included Hollywood actress Audrey Hepburn, resorted to eating grass and tulip bulbs to stay alive.

The studies found that mothers who suffered malnutrition during early pregnancy gave birth to children who were more prone to obesity and schizophrenia than children of well-fed mothers. More unexpectedly, the same effects showed up in the grandchildren of the women who were malnour­ished during the first three months of their pregnancy. Similarly, an increased incidence of Type II diabetes has been discovered in adults whose pregnant mothers experienced starvation during the Ukrainian Famine of 1932-33 and the Great Chinese Famine of 1958-61.  

All this data points to the transmission from generation to generation of biological effects caused by an individual’s own experiences. Further evidence for such epigenetic, Lamarckian-like changes comes from laboratory studies of agouti mice, so called because they carry the agouti gene that not only makes the rodents fat and yellow, but also renders them susceptible to cancer and diabetes. By simply altering a pregnant mother’s diet, researchers found they could effectively silence the agouti gene and produce offspring that were slender and brown, and no longer prone to cancer or diabetes.  

The modified mouse diet was rich in methyl donors – small molecules, found in foods such as onions and beets, that supply chemical tags which attach to the DNA string in the genome and switch off the troublesome gene. In addition to its DNA, any genome in fact contains an array of chemical markers and switches that constitute the instructions for the estimated 21,000 protein-coding genes in the genome. That is, the array can turn the expression of particular genes on or off.
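The on/off logic can be caricatured in a few lines of code – a deliberately crude sketch, not a biochemical model, in which the gene names and the "methylated" flag are purely illustrative:

```python
# Toy model: the epigenome switches genes on or off without altering the DNA.
genome = {"agouti": "ATGCCA...", "coat_color": "GGCATT..."}  # sequences are fixed
epigenome = {"agouti": "methylated", "coat_color": "unmethylated"}  # marks can change

def expressed(gene):
    # A methyl mark silences the gene; the underlying DNA sequence is untouched.
    return epigenome[gene] != "methylated"

print(expressed("agouti"))      # prints False: silenced, as in the agouti-mouse study
print(expressed("coat_color"))  # prints True: this gene is still expressed
```

The point of the sketch is simply that two dictionaries are involved: changing the second (the marks) alters which genes are read, while the first (the sequences) never changes.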

However, the epigenome, as this array is called, can't alter the genes themselves. A soldier who loses a limb in battle, for example, will not bear children with shortened arms or legs. And while there's limited evidence that epigenetic changes in humans can be transmitted between generations, as in the starvation studies described above, the possibility isn't yet fully established and further research is needed.

One line of thought, for which an increasing amount of evidence exists in animals and plants, is that epigenetic change doesn’t come from experience or use – as in the case of Lamarck’s giraffe – but actually results from Darwinian natural selection. The idea is that in order to cope with an environmental threat or need, natural selection may choose the variation in the species that has an epigenome favoring the attachment to its DNA of a specific type of molecule such as a methyl donor, capable of expressing or silencing certain genes. In other words, epigenetic changes can exploit existing heritable genetic variation, and so are passed on.

Is this explanation correct or, as creationists would like to think, did Darwin’s theory of evolution get it wrong? Time will tell.

How the Scientific Consensus Can Be Wrong


Consensus is a necessary step on the road from scientific hypothesis to theory. What many people don’t realize, however, is that a consensus isn’t necessarily the last word. A consensus, whether newly proposed or well-established, can be wrong. In fact, the mistaken consensus has been a recurring feature of science for many hundreds of years.

A recent example of a widespread consensus that nevertheless erred was the belief that peptic ulcers were caused by stress or spicy foods – a dogma that persisted in the medical community for much of the 20th century. The scientific explanation at the time was that stress or poor eating habits resulted in excess secretion of gastric acid, which could erode the digestive lining and create an ulcer.

But two Australian doctors discovered evidence that peptic ulcer disease was caused by a bacterial infection of the stomach, not stress, and could be treated easily with antibiotics. Yet overturning such a longstanding consensus would not be simple. As one of the doctors, Barry Marshall, put it:

“…beliefs on gastritis were more akin to a religion than having any basis in scientific fact.”

To convince the medical establishment the pair were right, Marshall resorted in 1984 to the drastic measure of infecting himself with a potion containing the bacterium in question (known as Helicobacter pylori). Despite this bold and risky act, the medical world didn't fully accept the new explanation until 1994. In 2005, Barry Marshall and Robin Warren were awarded the Nobel Prize in Physiology or Medicine for their discovery.

Earlier in the 20th century, an individual fighting established authority had overthrown conventional scientific wisdom in the field of geology. Acceptance of Alfred Wegener's revolutionary theory of continental drift, proposed in 1912, was delayed for many decades – even longer than resistance to the infection explanation for ulcers persisted – because the theory was seen as a threat to the geological establishment.

Geologists of the day refused to take seriously Wegener’s circumstantial evidence of matchups across the ocean in continental coastlines, animal and plant fossils, mountain chains and glacial deposits, clinging instead to the consensus of a contracting earth to explain these disparate phenomena. The old consensus endured among geologists even as new, direct evidence for continental drift surfaced, including mysterious magnetic stripes on the seafloor. But only after the emergence in the 1960s of plate tectonics, which describes the slow sliding of thick slabs of the earth’s crust, did continental drift theory become the new consensus.

A much older but well-known example of a mistaken consensus is the geocentric (earth-centered) model of the solar system that held sway for 1,500 years. This model was originally developed by the ancient Greek philosophers Plato and Aristotle, and later elaborated by the astronomer Ptolemy in the 2nd century. The Italian mathematician and astronomer Galileo Galilei fought to overturn the geocentric consensus, advocating instead the rival heliocentric (sun-centered) model of Copernicus – the model we accept today, and for which Galileo gathered evidence in the form of unprecedented telescopic observations of the sun, planets and planetary moons.

Although Galileo was correct, his endorsement of the heliocentric model brought him into conflict with university academics and the Catholic Church, both of which adhered to Ptolemy’s geocentric model. A resolute Galileo insisted that:

 “In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”

But to no avail: Galileo was called before the Inquisition, forbidden to defend Copernican ideas, and finally sentenced to house arrest for publishing a book that did just that and also ridiculed the Pope.

These are far from the only cases in the history of science of a consensus that was wrong. Others include the widely held 19th-century religious belief in creationism that impeded acceptance of Darwin’s theory of evolution, and the 20th-century paradigm linking saturated fat to heart disease.

Consensus is built only slowly, so belief in the consensus tends to become entrenched over time and is not easily abandoned by its devotees. This is certainly the case for the current consensus that climate change is largely a result of human activity – a consensus, as I’ve argued in a previous post, that is most likely mistaken.

Next: Nature vs Nurture: Does Epigenetics Challenge Evolution?

Use and Misuse of the Law in Science

Aside from patent law, science and the law are hardly bosom pals. But there are many parallels between them: above all, they’re both crucially dependent on evidence and logic. However, while the legal system has been used to defend science and to settle several scientific issues, it has also been misused for advocacy by groups such as anti-evolutionists and anti-vaccinationists.


In the U.S., the law played a major role in keeping the teaching of creationism out of schools during the latter part of the 20th century. Creationism, discussed in previous posts on this blog, is a purely religious belief that rejects the theory of evolution. Because of the influence of the wider Protestant fundamentalist movement earlier in the century, which culminated in the infamous Scopes Monkey Trial of 1925, little evolution was taught in American public schools and universities for decades.

All that changed in 1963, when the U.S., as part of an effort to catch up to the rival Soviet Union in science, issued a new biology text, making high-school students aware for the first time of their apelike ancestors. And five years later, the U.S. Supreme Court struck down the last of the old state laws banning the teaching of evolution in schools.

In 1987 the Supreme Court went further, in upholding a ruling by a Louisiana judge that a state law, mandating that equal time be given to the teaching of creation science and evolution in public schools, was unconstitutional. Creationism suffered another blow in 2005 when a judge in Dover, Pennsylvania ruled that the school board’s sanctioning of the teaching of intelligent design in its high schools was also unconstitutional. The board had angered teachers and parents by requiring biology teachers to make use of an intelligent design reference book in their classes.

All these events show how the legal system was repeatedly misused by anti-evolutionists to argue that creationism should be taught in place of, or alongside, evolution in public schools – and how, at the same time, the law was used successfully to quash those creationist efforts and bolster science.

Much the same pattern can be seen with anti-vaccine advocates, who have misused lawsuits and the courtroom to maintain that their objections to vaccination are scientific and that vaccines are harmful. But judges in many different courts have found the evidence presented for all such contentions to be unscientific.

The most notable example was a slew of cases – 5,600 in all – that came before the U.S. Vaccine Court in 2007. Alleged in these cases was that autism, the often devastating neurological disorder in children, is caused by vaccination with the measles-mumps-rubella (MMR) vaccine, or by a combination of the vaccine with a mercury-based preservative. To handle the enormous caseload, the court chose three special masters to hear just three test cases on each of the two charges.

In 2009 and 2010, the Vaccine Court unanimously rejected both contentions. The special masters called the evidence weak and unpersuasive, and chastised doctors and researchers who “peddled hope, not opinions grounded in science and medicine.”

Likewise, the judge in a UK court case alleging a link between autism and the combination diphtheria-tetanus-pertussis (DTP) vaccine found that the “plaintiff had failed to establish … that the vaccine could cause permanent brain damage in young children.” The judge excoriated a pediatric neurologist whose testimony at the trial completely contradicted assertions the doctor had made in a previous research paper that had triggered the litigation, along with other lawsuits, in the first place.

But, while it took a court of law to establish how unscientific the evidence for the claims about vaccines was, and it was the courts that kept the teaching of unscientific creationism out of school science classes, the court of public opinion has not been greatly swayed in either case. As many as 40% of the general public worldwide believe that all life forms, including ourselves, were created directly by God out of nothing, and that the earth is only 6,000 to 10,000 years old. And more and more parents are choosing not to vaccinate their children, insisting that vaccines cause disabling side effects or even other diseases.

Although the law has done its best to uphold science, the attack on science continues.

Next week: Subversion of Science: The Low-Fat Diet

On Science Skeptics and Deniers

Do all climate change skeptics also question the theory of evolution? Do anti-vaccinationists also believe that GMO foods are unsafe? As we’ll see in this post, scientific skepticism and “science denial” are much more nuanced than most people think.


To begin with, scientific skeptics on hot-button issues such as climate change, vaccination and GMOs (genetically modified organisms) are often linked together as anti-science deniers. But the simplistic notion that skeptics and deniers are one and the same – the stance taken by the mainstream media – is mistaken. And the evidence shows that skeptics or deniers in one area of science aren’t necessarily so in other areas.

The split between outright deniers of the science and skeptics who merely question some of it varies markedly, surveys show: on evolution there are approximately twice as many deniers as skeptics, while on climate change there are only about half as many deniers as skeptics.

In evolution, approximately 32% of the American public are creationists who deny Darwin’s theory of evolution entirely, while another 14% are skeptical of the theory. In climate change, the numbers are reversed with about 19% denying any human role in global warming, and a much larger 35% (averaged from here and here) accepting a human contribution but being skeptical about its magnitude. In GMOs, on the other hand, the percentages of skeptics and deniers are about the same.

The surveys also reveal that anti-science skepticism or denial doesn't carry over from one issue to another. For example, only about 65% of evolution skeptics or deniers are also climate change skeptics or deniers: the remaining 35% of those who doubt or reject evolution believe in the climate change narrative of largely human-caused warming. So the two groups of skeptics or deniers don't consist of the same individuals, although there is some overlap.

In the case of GMO foods, approximately equal percentages of the public reject the consensus among scientists that GMOs are safe to eat and are skeptical about climate change. Once more, however, the two groups don't consist of the same people. And while most U.S. farmers accept the consensus on the safety of GMO crops yet are climate change skeptics, some environmentalists are GMO deniers or skeptics who nevertheless accept the prevailing belief on climate change. Prince Charles is a well-known example of the latter.

Social scientists who study such surveys have identified two main influences on scientific skepticism and denial: religion and politics. As we might expect, opinions about evolution are strongly tied to religious identity, practice and belief. And, while Evangelicals are much more likely to be skeptical about climate change than those with no religious affiliation, climate skepticism overall seems to be driven more by politics – specifically, political conservatism – than by religion.

In the political sphere, U.S. Democrats are more inclined than Republicans to believe that human actions are the cause of global warming, that the theory of evolution is valid, and that GMO foods are safe to eat. However, other factors influence the perception of GMO food safety, such as corporate control of food production and any government intervention. Variables like demographics and education come into the picture too, in determining skeptical attitudes on all issues.

Lastly, a striking aspect of skepticism and denial in contemporary science is the gap in opinion between scientists and the general public. Although skepticism is an important element of the scientific method, a far larger percentage of the general population questions the prevailing wisdom on scientific issues than do scientists, with the possible exception of climate change. The precise reasons for this gap are complex, according to a recent study, and include religious and political influences as well as differences in cognitive functioning and in education. While scientists may possess more knowledge of science, the public may exhibit more common sense.

Next week: Use and Misuse of the Law in Science

Why Creation Science Isn’t Science

According to so-called creation science – the widely held religious belief that the world and all its living creatures were created by God in just six days – the earth is only 6,000 to 10,000 years old. The faith-based belief rejects Darwin’s scientific theory of evolution, which holds that life forms evolved over a long period of time through the process of natural selection. In resorting to fictitious claims to justify its creed, creation science only masquerades as science.    


Creation science has its roots in a literal interpretation of the Bible. To establish a biblical chronology, various scholars have estimated the lifespans of prominent figures and the intervals between significant historical events described in the Bible. The most detailed chronology was drawn up in the 1650s by the Irish archbishop James Ussher, who calculated that exactly 4,004 years elapsed between the creation and the birth of Jesus. It's this dubious calculation that underlies the 6,000-year lower limit for the age of the earth.

Scientific evidence, however, tells us that the earth’s actual age is 4.5 to 4.6 billion years. Even when Darwin proposed his theory, the available evidence at the time indicated an age of at least a few hundred thousand years. Darwin himself believed that the true number was more like several hundred million years, based on his forays into geology. 

By the early 1900s, the newly developed method of radiometric dating dramatically boosted estimates of the earth's age into the billion-year range – a far cry from the several thousand years that young-Earth creationists allow, derived from their literal reading of the Bible. Radiometric dating relies on the radioactive decay of certain chemical elements, such as uranium, carbon or potassium, whose decay rates are accurately known.
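The arithmetic behind the method is simple exponential decay, and can be sketched in a few lines of code. This is a minimal illustration, not a geochronologist's workflow: the function name is mine, and the potassium-40 half-life of roughly 1.25 billion years is a standard textbook figure.

```python
import math

def radiometric_age(parent_fraction, half_life_years):
    # Exponential decay: fraction remaining = (1/2) ** (t / half_life),
    # so solving for t gives t = (half_life / ln 2) * ln(1 / fraction).
    return half_life_years / math.log(2) * math.log(1.0 / parent_fraction)

# A rock retaining half of its original potassium-40 is one half-life old.
age = radiometric_age(0.5, 1.25e9)
print(round(age / 1e9, 2))  # prints 1.25 (billion years)
```

A sample retaining only a quarter of its parent isotope would date to two half-lives, and so on: each halving adds another 1.25 billion years, which is how decay clocks reach the ages creationists dispute.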

To overcome the vast discrepancy between the scientifically determined age of the earth and the biblical estimate, young-Earth creationists – who, surprisingly, include hundreds of scientists with advanced degrees in science or medicine – twist science in a futile effort to discredit radiometric dating. Absurdly, they object that the method can't be trusted because of a handful of instances in which radiometric dating has been incorrect. But such an argument in no way proves a young earth, and in any case fails to invalidate a technique that has yielded correct results, as established independently by other methods, tens of thousands of times.

Another, equally ridiculous claim is that somehow the rate of radioactive decay underpinning the dating method was billions of times higher in the past, which would considerably shorten radiometrically measured ages. Some creationists even maintain that radioactive decay sped up more than once. What they don’t realize is that any significant change in decay rates would imply that fundamental physical constants (such as the speed of light) had also changed. If that were so, we’d be living in a completely different type of universe. 

Among other wild assertions that creationists use as evidence that the planet is no more than 10,000 years old are a supposed rapid exponential decay of the earth's magnetic field, which is a spurious claim, and the low level of helium in the atmosphere, which merely reflects how easily the gas escapes from the earth and has nothing to do with the earth's age.

Apart from such futile attempts to shorten the earth’s longevity, young-Earth creationists also rely on the concept of flood geology to prop up their religious beliefs. Flood geology, which I’ve discussed in detail elsewhere, maintains that the planet was reshaped by a massive worldwide flood as described in the biblical story of Noah’s ark. It’s as preposterously unscientific as creationist efforts to uphold the idea of a young earth.

The depth of the attack on modern science can be seen in polls showing that a sizable 38% of the U.S. adult public, and a similar percentage globally, believe that God created humans in their present form within the last 10,000 years. The percentage may be higher yet for those who identify with certain religions, and perhaps a further 10% believe in intelligent design, the form of creationism discussed in last week’s post. The breadth of disbelief in the theory of evolution is astounding, especially considering that it’s almost universally accepted by mainstream Churches and the overwhelming majority of the world’s scientists.

Next week: On Science Skeptics and Deniers

What Intelligent Design Fails to Understand About Evolution


One of the threats to modern science is the persistence of faith-based beliefs about the origin of life on Earth, such as the concept of intelligent design which holds that the natural world was created by an intelligent designer – who may or may not be God or another deity. Intelligent design, like other forms of creationism, is incompatible with the theory of evolution formulated by English naturalist Charles Darwin in the 19th century. But, in asserting that complex biological systems defy any scientific explanation, believers in intelligent design fail to understand the basic features of evolution.

The driving force in biological evolution – descent from a common ancestor through cumulative change over time – is the process of natural selection. The essence of natural selection is that, as in human society, nature produces more offspring than can survive, and that variation within a species means some offspring have a slightly greater chance of survival than others. These offspring are more likely to reproduce and pass the survival trait on to the next generation than those who lack it.

A common misconception about natural selection is that it is an entirely random process. But this is not so. Genetic variation within a species, which distinguishes individuals from one another and usually results from mutation, is indeed random. However, the selection aspect isn’t random but rather a snowballing process, in which each evolutionary step that selects the variation best suited to reproduction builds on the previous step.
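The difference between pure chance and cumulative selection can be demonstrated with a toy program in the spirit of Richard Dawkins' well-known "weasel" illustration. To be clear, this is a caricature, not a biological model: the target phrase, mutation rate and population size are arbitrary choices of mine, and real selection has no fixed target.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def mutate(s, rate=0.05):
    # Variation is random: each character may independently change.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

def score(s):
    # Selection is NOT random: fitness is closeness to the target.
    return sum(a == b for a, b in zip(s, TARGET))

def evolve(pop_size=100, seed=0):
    random.seed(seed)
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        # Each generation keeps the fittest variant, so every step
        # builds on the last – the "snowballing" described above.
        offspring = [mutate(parent) for _ in range(pop_size)]
        parent = max([parent] + offspring, key=score)
        generations += 1
    return generations

print(evolve())  # converges in a modest number of generations
```

Blind random search would need on the order of 27^28 attempts to hit the phrase; cumulative selection finds it in a few hundred generations at most, because successful variations are retained rather than discarded. That retention, not luck, is what the step-by-step mechanism relies on.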

Intelligent design proponents often argue that the “astonishing complexity” of living cells and biological complexes such as the bacterial flagellum – a whip-like appendage on a bacterial cell that rotates like a propeller – precludes their evolution via the step-by-step mechanism of natural selection. Such complex systems, they insist, can only be created as an integrated whole and must therefore have been designed by an intelligent entity.

There are several sound scientific reasons why this claim is fallacious: for example, natural selection can work on modular units already assembled for another purpose. But the most telling argument is simply that evolution is incremental and can take millions or even hundreds of millions of years – a length of time that is incomprehensible to us as humans, to whom even a few thousand years seems an eternity. The laborious, trial-and-error, one-step-at-a-time assembly of complex biological entities may indeed not be possible in a few thousand years, but is easily accomplished in a time span that's beyond our imagination.

However, evolution aside, intelligent design can’t lay any claim to being science. Most intelligent design advocates do accept the antiquity of life on earth, unlike adherents to the deceptively misnamed creation science, the topic for next week’s post. But neither intelligent design nor creation science offers any scientific alternative to Darwin’s mechanism of natural selection. And they both distort or ignore the vast body of empirical evidence for evolution, which includes the fossil record and biodiversity as well as a host of modern-day observations from fields such as molecular biology and embryology.

That intelligent design and creation science aren't science at all is apparent from the almost total lack of peer-reviewed papers published in the scientific literature. Apart from a few articles (such as this one) in educational journals on the different forms of creationism, the only known paper on creationism itself – an article, based on intelligent design, about the burst of evolutionary diversification known as the Cambrian explosion – was published in an obscure biological journal in 2004. But one month later, the journal's publishing society reprimanded the editor for mishandling peer review and repudiated the article. In its formal explanation, the society emphasized that no scientific evidence exists to support intelligent design.

A valid scientific theory must, at least in principle, be capable of being invalidated or disproved by observation or experiment. Along with other brands of creationism, intelligent design is a pseudoscience that can’t be falsified because it depends not on scientific evidence, but on a religious belief based on faith in a supernatural creator. There’s nothing wrong with faith, but it’s the very antithesis of science. Science requires evidence and a skeptical evaluation of claims, while faith demands unquestioning belief, without evidence.

Next week: Why Creation Science Isn’t Science