Nature vs Nurture: Does Epigenetics Challenge Evolution?

A new wrinkle in the traditional nature vs nurture debate – whether our behavior and personalities are shaped more by genetics or by our upbringing and environment – is the young science of epigenetics. Epigenetics describes the mechanisms for switching individual genes on or off in the genome, an organism’s complete set of genetic instructions.

epigenetics.jpg

A controversial question is whether epigenetic changes can be inherited. According to Darwin’s 19th-century theory, evolution is governed entirely by heritable variation of what we now know as genes, a variation that usually results from mutation; any biological changes to the whole organism during its lifetime caused by environmental factors can’t be inherited. But recent evidence from studies on rodents suggests that epigenetic alterations can indeed be passed on to subsequent generations. If true, this implies that our epigenome records a memory of our lifestyle or behavior today, a memory that will form part of the biological makeup of our grandchildren and great-grandchildren.

So was Darwin wrong? Does epigenetics challenge the theory of evolution? At first blush, epigenetics is reminiscent of Lamarckism – the pre-Darwinian notion, promulgated by French naturalist Jean-Baptiste Lamarck, that acquired characteristics are heritable. Lamarck’s most famous example was the giraffe, whose long neck was thought at the time to have come from generations of its ancestors stretching to reach foliage in high trees, with longer and longer necks then being inherited.

Darwin himself, when his proposal of natural selection as the driving force of evolution was initially rejected, embraced Lamarckism as a possible alternative. But the Lamarckian view was later discredited as more and more evidence for natural selection accumulated, especially from molecular biology.

Nonetheless, the wheel appears to have turned back to Lamarck’s idea over the last 20 years. Several epidemiological studies have established an apparent link between 20th-century starvation and the current prevalence of obesity in the children and grandchildren of malnourished mothers. The most widely studied event is the Dutch Hunger Winter, the name given to a 6-month winter blockade of part of the Netherlands by the Germans toward the end of World War II. Survivors, who included Hollywood actress Audrey Hepburn, resorted to eating grass and tulip bulbs to stay alive.

The studies found that mothers who suffered malnutrition during early pregnancy gave birth to children who were more prone to obesity and schizophrenia than children of well-fed mothers. More unexpectedly, the same effects showed up in the grandchildren of the women who were malnourished during the first three months of their pregnancy. Similarly, an increased incidence of Type II diabetes has been discovered in adults whose pregnant mothers experienced starvation during the Ukrainian Famine of 1932-33 and the Great Chinese Famine of 1958-61.

All this data points to the transmission from generation to generation of biological effects caused by an individual’s own experiences. Further evidence for such epigenetic, Lamarckian-like changes comes from laboratory studies of agouti mice, so called because they carry the agouti gene that not only makes the rodents fat and yellow, but also renders them susceptible to cancer and diabetes. By simply altering a pregnant mother’s diet, researchers found they could effectively silence the agouti gene and produce offspring that were slender and brown, and no longer prone to cancer or diabetes.  

The modified mouse diet was rich in methyl donors: small molecules, found in foods such as onions and beets, that supply methyl groups which attach to the DNA in the genome and switch off the troublesome gene. In addition to its DNA, every genome in fact contains an array of chemical markers and switches that constitutes the instructions for its estimated 21,000 protein-coding genes. That is, the array is able to turn the expression of particular genes on or off.

However, the epigenome, as this array is called, can’t alter the genes themselves. A soldier who loses a limb in battle, for example, will not bear children with shortened arms or legs. And, while there’s limited evidence that epigenetic changes in humans can be transmitted between generations, such as the starvation studies described above, the possibility isn’t yet fully established and further research is needed.

One line of thought, for which an increasing amount of evidence exists in animals and plants, is that epigenetic change doesn’t come from experience or use – as in the case of Lamarck’s giraffe – but actually results from Darwinian natural selection. The idea is that, in order to cope with an environmental threat or need, natural selection may favor the variant of a species whose epigenome encourages the attachment to its DNA of a specific type of molecule, such as a methyl donor, capable of expressing or silencing certain genes. In other words, epigenetic changes can exploit existing heritable genetic variation, and so are passed on.

Is this explanation correct or, as creationists would like to think, did Darwin’s theory of evolution get it wrong? Time will tell.

Why Creation Science Isn’t Science

According to so-called creation science – the widely held religious belief that the world and all its living creatures were created by God in just six days – the earth is only 6,000 to 10,000 years old. The faith-based belief rejects Darwin’s scientific theory of evolution, which holds that life forms evolved over a long period of time through the process of natural selection. In resorting to fictitious claims to justify its creed, creation science only masquerades as science.    

creation science.jpg

Creation science has its roots in a literal interpretation of the Bible. To establish a biblical chronology, various scholars have estimated the lifespans of prominent figures and the intervals between significant historical events described in the Bible. The most detailed chronology was drawn up in the 1650s by the Irish archbishop James Ussher, who calculated that exactly 4,004 years elapsed between the creation and the birth of Jesus. It’s this dubious calculation that underlies the 6,000-year lower limit for the age of the earth.

Scientific evidence, however, tells us that the earth’s actual age is 4.5 to 4.6 billion years. Even when Darwin proposed his theory, the available evidence at the time indicated an age of at least a few hundred thousand years. Darwin himself believed that the true number was more like several hundred million years, based on his forays into geology. 

By the early 1900s, the newly developed method of radiometric dating dramatically boosted estimates of Earth’s age into the billion-year range – a far cry from the several thousand years that young-Earth creationists allow, derived from their literal reading of the Bible. Radiometric dating relies on the radioactive decay of certain chemical elements, such as uranium, carbon or potassium, whose decay rates are accurately known.
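For readers curious about the arithmetic behind the method, here is a minimal sketch of the standard age equation, using purely illustrative numbers rather than data from any real measurement: given a known half-life and the measured ratio of daughter to parent atoms in a mineral, the sample’s age follows directly.

```python
import math

def radiometric_age(parent_atoms, daughter_atoms, half_life_years):
    """Estimate a sample's age from its daughter/parent atom ratio.

    Assumes the simplest textbook case: every daughter atom was produced by
    decay of the parent, and none were present when the mineral formed.
    """
    decay_constant = math.log(2) / half_life_years      # lambda = ln(2) / half-life
    return math.log(1 + daughter_atoms / parent_atoms) / decay_constant

# Illustrative only: a mineral containing equal numbers of uranium-238 atoms
# (half-life ~4.47 billion years) and atoms of its lead end product is about
# one half-life old, i.e. roughly 4.5 billion years.
print(f"{radiometric_age(1.0, 1.0, 4.47e9):.2e} years")
```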

To overcome the vast discrepancy between the scientifically determined age of the earth and the biblical estimate, young-Earth creationists – who, surprisingly, include hundreds of people holding advanced degrees in science or medicine – twist science in a futile effort to discredit radiometric dating. Absurdly, they object that the method can’t be trusted because of a handful of instances when radiometric dating has been incorrect. But such an argument in no way proves a young earth, and in any case fails to invalidate a technique that has yielded correct results, verified independently by other methods, tens of thousands of times.

Another, equally ridiculous claim is that somehow the rate of radioactive decay underpinning the dating method was billions of times higher in the past, which would considerably shorten radiometrically measured ages. Some creationists even maintain that radioactive decay sped up more than once. What they don’t realize is that any significant change in decay rates would imply that fundamental physical constants (such as the speed of light) had also changed. If that were so, we’d be living in a completely different type of universe. 

Other wild assertions that creationists present as evidence that the planet is no more than 10,000 years old include a supposedly rapid exponential decay of the earth’s magnetic field, a spurious claim, and the low level of helium in the atmosphere, which merely reflects how easily the gas escapes from the earth and has nothing to do with the planet’s age.

Apart from such futile attempts to compress the earth’s history, young-Earth creationists also rely on the concept of flood geology to prop up their religious beliefs. Flood geology, which I’ve discussed in detail elsewhere, maintains that the planet was reshaped by a massive worldwide flood, as described in the biblical story of Noah’s ark. It’s as preposterously unscientific as creationist efforts to uphold the idea of a young earth.

The depth of the attack on modern science can be seen in polls showing that a sizable 38% of the U.S. adult public, and a similar percentage globally, believe that God created humans in their present form within the last 10,000 years. The percentage may be even higher among followers of certain religions, and perhaps a further 10% believe in intelligent design, the form of creationism discussed in last week’s post. The breadth of disbelief in the theory of evolution is astounding, especially considering that it’s almost universally accepted by mainstream churches and the overwhelming majority of the world’s scientists.

Next week: On Science Skeptics and Deniers

What Intelligent Design Fails to Understand About Evolution

ID.jpg

One of the threats to modern science is the persistence of faith-based beliefs about the origin of life on Earth, such as the concept of intelligent design, which holds that the natural world was created by an intelligent designer – who may or may not be God or another deity. Intelligent design, like other forms of creationism, is incompatible with the theory of evolution formulated by English naturalist Charles Darwin in the 19th century. But, in asserting that complex biological systems defy any scientific explanation, believers in intelligent design fail to understand the basic features of evolution.

The driving force in biological evolution, or descent from a common ancestor through cumulative change over time, is the process of natural selection. The essence of natural selection is that, as in human society, nature produces more offspring than can survive, and that variation in a species means some offspring have a slightly greater chance of survival than others. These offspring are then more likely to reproduce and pass the survival trait on to the next generation than those that lack it.

A common misconception about natural selection is that it is an entirely random process. But this is not so. Genetic variation within a species, which distinguishes individuals from one another and usually results from mutation, is indeed random. However, the selection step isn’t random at all: it’s a snowballing process in which each evolutionary step, by favoring the variants best suited to survival and reproduction, builds on the step before.
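To get a feel for how powerful this snowballing effect is, here is a toy simulation in the spirit of Richard Dawkins’ well-known “weasel” program; the target phrase, mutation rate and brood size are arbitrary choices for illustration, not anything drawn from this post. Random mutation supplies the variation, but keeping the fittest variant in each generation lets the process converge on a target that blind random guessing would essentially never reach.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"    # arbitrary illustrative target
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
MUTATION_RATE = 0.04                       # chance that each letter is copied imperfectly
BROOD_SIZE = 100                           # offspring produced per generation

def fitness(candidate):
    # Number of positions that already match the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(parent):
    # Copy the parent, occasionally substituting a random letter.
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
                   for c in parent)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)   # start from pure noise
generations = 0
while parent != TARGET:
    brood = [mutate(parent) for _ in range(BROOD_SIZE)] + [parent]
    parent = max(brood, key=fitness)       # selection: keep the fittest variant
    generations += 1

# Cumulative selection typically hits the target within a few hundred generations,
# whereas unguided random search would need on the order of 27**28 attempts.
print(f"Matched the target after {generations} generations")
```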

Intelligent design proponents often argue that the “astonishing complexity” of living cells and biological complexes such as the bacterial flagellum – a whip-like appendage on a bacterial cell that rotates like a propeller – precludes their evolution via the step-by-step mechanism of natural selection. Such complex systems, they insist, can only be created as an integrated whole and must therefore have been designed by an intelligent entity.

There are several sound scientific reasons why this claim is fallacious: for example, natural selection can work on modular units already assembled for another purpose. But the most telling argument is simply that evolution is incremental and can take millions or even hundreds of millions of years – a length of time that is incomprehensible and meaningless to us as humans, to whom even a few thousand years seems an eternity. The laborious, trial-and-error, one-step-at-a-time assembly of complex biological entities may indeed not be possible in a few thousand years, but is easily accomplished in a time span that’s beyond our imagination.

However, evolution aside, intelligent design can’t lay any claim to being science. Most intelligent design advocates do accept the antiquity of life on earth, unlike adherents to the deceptively misnamed creation science, the topic for next week’s post. But neither intelligent design nor creation science offers any scientific alternative to Darwin’s mechanism of natural selection. And they both distort or ignore the vast body of empirical evidence for evolution, which includes the fossil record and biodiversity as well as a host of modern-day observations from fields such as molecular biology and embryology.

That intelligent design and creation science aren’t science at all is apparent from the almost total lack of peer-reviewed papers published in the scientific literature. Apart from a few articles (such as this one) in educational journals on the different forms of creationism, the only known paper on creationism itself – an article, based on intelligent design, about the burst of new life forms known as the Cambrian explosion – was published in an obscure biological journal in 2004. But one month later, the journal’s publishing society reprimanded the editor for not handling peer review properly and repudiated the article. In its formal explanation, the society emphasized that no scientific evidence exists to support intelligent design.

A valid scientific theory must, at least in principle, be capable of being invalidated or disproved by observation or experiment. Along with other brands of creationism, intelligent design is a pseudoscience that can’t be falsified because it depends not on scientific evidence, but on a religious belief based on faith in a supernatural creator. There’s nothing wrong with faith, but it’s the very antithesis of science. Science requires evidence and a skeptical evaluation of claims, while faith demands unquestioning belief, without evidence.

Next week: Why Creation Science Isn’t Science