The Sugar Industry: Sugar Daddy to Manipulated Science?

Industry funding of scientific research often comes with strings attached. There’s plenty of evidence that industries such as tobacco and lead have been able to manipulate sponsored research to their advantage, in order to create doubt about the deleterious effects of their products. But has the sugar industry, currently in the spotlight because of concern over sugary drinks, done the same?


This charge was recently leveled at the industry by a team of scientists at UCSF (University of California, San Francisco), who accused the industry of funding research in the 1960s that downplayed the risks of consuming sugar and overstated the supposed dangers of eating saturated fat. Both saturated fat and sugar had been linked to coronary heart disease, which was surging at the time.

The UCSF researchers claim to have discovered evidence that an industry trade group secretly paid two prominent Harvard scientists to conduct a literature review refuting any connection between sugar and heart disease, and making dietary fat the villain instead. The published review made no mention of sugar industry funding.

A year after the review came out, the trade group funded an English researcher to conduct a study on laboratory rats. Initial results seemed to confirm other studies indicating that sugars, which are simple carbohydrates, were more detrimental to heart health than complex or starchy carbohydrates like grains, beans and potatoes. This was because sugar appeared to elevate the blood level of triglyceride fats, today a known risk factor for heart disease, through its metabolism by microbes in the gut.

Perhaps more alarmingly, preliminary data suggested that consumption of sugar – though not starch – produced high levels of an enzyme called beta-glucuronidase that other contemporary studies had associated with bladder cancer in humans. Before any of this could be confirmed, however, the industry trade organization shut the research project down; the results already obtained were never published.

The UCSF authors say in a second paper that the literature review’s dismissal of contrary studies, together with the suppression of evidence tying sugar to triglycerides and bladder cancer, show how the sugar industry has attempted for decades to bury scientific data on the health risks of eating sugar. If the findings of the laboratory study had been disclosed, they assert, sugar would probably have been scrutinized as a potential carcinogen, and its role in cardiovascular disease would have been further investigated. Added one of the UCSF team, “This is continuing to build the case that the sugar industry has a long history of manipulating science.”

Marion Nestle, an emeritus professor of food policy at New York University, has commented that the internal industry documents unearthed by the UCSF researchers were striking “because they provide rare evidence that the food industry suppressed research it did not like, a practice that has been documented among tobacco companies, drug companies and other industries.”

Nonetheless, the current sugar trade association disputes the UCSF claims, calling them speculative and based on questionable assumptions about events that took place almost 50 years ago. The association also considers the research itself tainted, because it was conducted and funded by known critics of the sugar industry. The industry has consistently denied that sugar plays any role in promoting obesity, diabetes or heart disease.

And despite a statement by the trade association’s predecessor that it was created “for the basic purpose of increasing the consumption of sugar,” other academics have defended the industry. They point out that, at the time of the industry review and the rat study in the 1960s, the link between sugar and heart disease was supported by only limited evidence, and the dietary fat hypothesis was deeply entrenched in scientific thinking, being endorsed by the AHA (American Heart Association) and the U.S. NHI (National Heart Institute).

But, says Nestle, it’s déjà vu today, with the sugar and beverage industries now funding research to let the industries off the hook for playing a role in causing the current obesity epidemic. As she notes in a commentary in the journal JAMA Internal Medicine:

"Is it really true that food companies deliberately set out to manipulate research in their favor? Yes, it is, and the practice continues.”

Next: Grassroots Climate Change Movement Ignores Actual Evidence

Nature vs Nurture: Does Epigenetics Challenge Evolution?

A new wrinkle in the traditional nature vs nurture debate – whether our behavior and personalities are influenced more by genetics or by our upbringing and environment – is the science of epigenetics. Epigenetics describes the mechanisms for switching individual genes on or off in the genome, which is an organism’s complete set of genetic instructions.


A controversial question is whether epigenetic changes can be inherited. According to Darwin’s 19th-century theory, evolution is governed entirely by heritable variation of what we now know as genes, a variation that usually results from mutation; any biological changes to the whole organism during its lifetime caused by environmental factors can’t be inherited. But recent evidence from studies on rodents suggests that epigenetic alterations can indeed be passed on to subsequent generations. If true, this implies that our genes record a memory of our lifestyle or behavior today that will form part of the genetic makeup of our grandchildren and great-grandchildren.

So was Darwin wrong? Is epigenetics an attack on science? At first blush, epigenetics is reminiscent of Lamarckism – the pre-Darwinian notion that acquired characteristics are heritable, promulgated by French naturalist Jean-Baptiste Lamarck. Lamarck’s most famous example was the giraffe, whose long neck was thought at the time to have come from generations of its ancestors stretching to reach foliage in high trees, with longer and longer necks then being inherited.

Darwin himself, when his proposal of natural selection as the evolutionary driving force was initially rejected, embraced Lamarckism as a possible alternative to natural selection. But the Lamarckian view was later discredited, as more and more evidence for natural selection accumulated, especially from molecular biology.

Nonetheless, the wheel appears to have turned back to Lamarck’s idea over the last 20 years. Several epidemiological studies have established an apparent link between 20th-century starvation and the current prevalence of obesity in the children and grandchildren of malnourished mothers. The most widely studied event is the Dutch Hunger Winter, the name given to a 6-month winter blockade of part of the Netherlands by the Germans toward the end of World War II. Survivors, who included Hollywood actress Audrey Hepburn, resorted to eating grass and tulip bulbs to stay alive.

The studies found that mothers who suffered malnutrition during early pregnancy gave birth to children who were more prone to obesity and schizophrenia than children of well-fed mothers. More unexpectedly, the same effects showed up in the grandchildren of the women who were malnourished during the first three months of their pregnancy. Similarly, an increased incidence of type 2 diabetes has been discovered in adults whose pregnant mothers experienced starvation during the Ukrainian Famine of 1932-33 and the Great Chinese Famine of 1958-61.

All this data points to the transmission from generation to generation of biological effects caused by an individual’s own experiences. Further evidence for such epigenetic, Lamarckian-like changes comes from laboratory studies of agouti mice, so called because they carry the agouti gene that not only makes the rodents fat and yellow, but also renders them susceptible to cancer and diabetes. By simply altering a pregnant mother’s diet, researchers found they could effectively silence the agouti gene and produce offspring that were slender and brown, and no longer prone to cancer or diabetes.  

The modified mouse diet was rich in methyl donors, small molecules that attach themselves to the DNA string in the genome and switch off the troublesome gene, and are found in foods such as onions and beets. In addition to its DNA, any genome in fact contains an array of chemical markers and switches that constitute the instructions for the estimated 21,000 protein-coding genes in the genome. That is, the array is able to turn the expression of particular genes on or off.

However, the epigenome, as this array is called, can’t alter the genes themselves. A soldier who loses a limb in battle, for example, will not bear children with shortened arms or legs. And, while there’s limited evidence that epigenetic changes in humans can be transmitted between generations, such as in the starvation studies described above, the possibility isn’t yet fully established and further research is needed.

One line of thought, for which an increasing amount of evidence exists in animals and plants, is that epigenetic change doesn’t come from experience or use – as in the case of Lamarck’s giraffe – but actually results from Darwinian natural selection. The idea is that, in order to cope with an environmental threat or need, natural selection may favor those variants in a species whose epigenome promotes the attachment to their DNA of a specific type of molecule, such as a methyl donor, capable of expressing or silencing certain genes. In other words, epigenetic changes can exploit existing heritable genetic variation, and so are passed on.

Is this explanation correct or, as creationists would like to think, did Darwin’s theory of evolution get it wrong? Time will tell.

How the Scientific Consensus can be Wrong


Consensus is a necessary step on the road from scientific hypothesis to theory. What many people don’t realize, however, is that a consensus isn’t necessarily the last word. A consensus, whether newly proposed or well-established, can be wrong. In fact, the mistaken consensus has been a recurring feature of science for many hundreds of years.

 A recent example of a widespread consensus that nevertheless erred was the belief that peptic ulcers were caused by stress or spicy foods – a dogma that persisted in the medical community for much of the 20th century. The scientific explanation at the time was that stress or poor eating habits resulted in excess secretion of gastric acid, which could erode the digestive lining and create an ulcer.

But two Australian doctors discovered evidence that peptic ulcer disease was caused by a bacterial infection of the stomach, not stress, and could be treated easily with antibiotics. Yet overturning such a longstanding consensus would not be simple. As one of the doctors, Barry Marshall, put it:

“…beliefs on gastritis were more akin to a religion than having any basis in scientific fact.”

To convince the medical establishment the pair were right, Marshall resorted in 1984 to the drastic measure of infecting himself with a potion containing the bacterium in question (known as Helicobacter pylori). Despite this bold and risky act, the medical world didn’t finally accept the new doctrine until 1994. In 2005, Barry Marshall and Robin Warren were awarded the Nobel Prize in Physiology or Medicine for their discovery.

Earlier in the 20th century, another individual fighting established authority had overthrown conventional scientific wisdom in the field of geology. Acceptance of Alfred Wegener’s revolutionary theory of continental drift, proposed in 1912, was delayed for many decades – resistance lasted even longer than it did to the infection explanation for ulcers – because the theory was seen as a threat to the geological establishment.

Geologists of the day refused to take seriously Wegener’s circumstantial evidence of matchups across the ocean in continental coastlines, animal and plant fossils, mountain chains and glacial deposits, clinging instead to the consensus of a contracting earth to explain these disparate phenomena. The old consensus endured among geologists even as new, direct evidence for continental drift surfaced, including mysterious magnetic stripes on the seafloor. But only after the emergence in the 1960s of plate tectonics, which describes the slow sliding of thick slabs of the earth’s crust, did continental drift theory become the new consensus.

A much older but well-known example of a mistaken consensus is the geocentric (earth-centered) model of the solar system that held sway for 1,500 years. This model was originally developed by the ancient Greek philosophers Plato and Aristotle, and later elaborated by the astronomer Ptolemy in the 2nd century AD. The Italian mathematician and astronomer Galileo Galilei fought to overturn the geocentric consensus, advocating instead the rival heliocentric (sun-centered) model of Copernicus – the model which we accept today, and for which Galileo gathered evidence in the form of unprecedented telescopic observations of the sun, planets and planetary moons.

Although Galileo was correct, his endorsement of the heliocentric model brought him into conflict with university academics and the Catholic Church, both of which adhered to Ptolemy’s geocentric model. A resolute Galileo insisted that:

 “In questions of science, the authority of a thousand is not worth the humble reasoning of a single individual.”

But to no avail: Galileo was called before the Inquisition, forbidden to defend Copernican ideas, and finally sentenced to house arrest for publishing a book that did just that and also ridiculed the Pope.

These are far from the only cases in the history of science of a consensus that was wrong. Others include the widely held 19th-century religious belief in creationism that impeded acceptance of Darwin’s theory of evolution, and the 20th-century paradigm linking saturated fat to heart disease.

Consensus is built only slowly, so belief in the consensus tends to become entrenched over time and is not easily abandoned by its devotees. This is certainly the case for the current consensus that climate change is largely a result of human activity – a consensus, as I’ve argued in a previous post, that is most likely mistaken.

Next: Nature vs Nurture: Does Epigenetics Challenge Evolution?

How Hype is Hurting Science

The recent riots in France over a proposed carbon tax, aimed at supposedly combating climate change, were a direct result of blatant exaggeration in climate science for political purposes. It’s no coincidence that the decision to move forward with the tax came soon after an October report from the UN’s IPCC (Intergovernmental Panel on Climate Change), claiming that drastic measures to curtail climate change are necessary by 2030 in order to avoid catastrophe. President Emmanuel Macron bought into the hype, only to see his people rise up against him.

Exaggeration has a long history in modern science. In 1977, the select U.S. Senate committee drafting new low-fat dietary recommendations wildly exaggerated its message by declaring that excessive fat or sugar in the diet was as much of a health threat as smoking, even though a reasoned examination of the evidence revealed that wasn’t true.

About a decade later, the same hype infiltrated the burgeoning field of climate science. At another Senate committee hearing, astrophysicist James Hansen, who was then head of GISS (NASA’s Goddard Institute for Space Studies), declared he was 99% certain that the 0.4 degrees Celsius (0.7 degrees Fahrenheit) of global warming from 1958 to 1987 was caused primarily by the buildup of greenhouse gases in the atmosphere, and wasn’t a natural variation. This assertion was based on a computer model of the earth’s climate system.

At a previous hearing, Hansen had presented climate model predictions of U.S. temperatures 30 years in the future that were three times higher than they turned out to be. This gross exaggeration makes a mockery of his subsequent claim that the warming from 1958 to 1987 was all man-made. His stretching of the truth stands in stark contrast to the caution and understatement of traditional science.

But Hansen’s hype only set the stage for others. Similar computer models have also exaggerated the magnitude of more recent global warming, failing to predict the pause in warming from the late 1990s to about 2014. During this interval, the warming rate dropped to below half the rate measured from the early 1970s to 1998. Again, the models overestimated the warming rate by two or three times.

An exaggeration mindlessly repeated by politicians and the mainstream media is the supposed 97% consensus among climate scientists that global warming is largely man-made. The 97% number comes primarily from a study of approximately 12,000 abstracts of research papers on climate science over a 20-year period. But what is never revealed is that almost 8,000 of the abstracts expressed no opinion at all on anthropogenic (human-caused) warming. When that and a subsidiary survey are taken into account, the consensus among climate scientists falls to somewhere between 33% and 63%. So much for an overwhelming majority!
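The arithmetic behind the competing figures can be sketched using the rounded counts quoted above. This is a back-of-the-envelope illustration only; the variable names and rounding are mine, not the study's exact data:

```python
# Back-of-the-envelope illustration using the rounded counts quoted above;
# these are approximations, not the study's exact figures.
total_abstracts = 12_000   # abstracts surveyed (approximate)
no_opinion = 8_000         # abstracts expressing no opinion (approximate)

expressing_position = total_abstracts - no_opinion  # ~4,000 abstracts

# The familiar 97% figure uses only position-taking abstracts as denominator:
endorsing = round(0.97 * expressing_position)       # ~3,880 abstracts

# Counted against all abstracts surveyed, the share is far smaller:
share_of_all = endorsing / total_abstracts
print(f"{share_of_all:.0%} of all abstracts endorse")
```

With these rounded inputs, the choice of denominator alone moves the figure from 97% down to roughly a third, which is where the lower bound cited above comes from.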

A further over-hyped assertion about climate change is that the polar bear population is shrinking because of diminishing sea ice in the Arctic, and that the bears are facing extinction. For global warming alarmists, this claim has become a cause célèbre. Yet, despite numerous articles in the media and photos of apparently starving bears, current evidence shows that the polar bear population has actually been steady for the whole period that the ice has been decreasing – and may even be growing, according to the native Inuit.

It’s not just climate data that’s exaggerated (and sometimes distorted) by political activists. Apart from the historical example in nutritional science cited above, the same trend can be found in areas as diverse as the vaccination debate and the science of GMO foods.

Exaggeration is a common, if frowned-upon, marketing tool in the commercial world: hype helps draw attention in the short term. But its use for the same purpose in science only tarnishes the discipline. And, just as exaggeration eventually turns off commercial customers interested in a product, so too does it make the general public wary if not downright suspicious of scientific proclamations. The French public has recognized this on climate change.

Subversion of Science: The Low-Fat Diet


Remember the low-fat diet? Highly popular in the 1980s and 1990s, it was finally pushed out of the limelight by competing regimens such as the Mediterranean diet. That the low-fat diet wasn’t particularly healthy hadn’t yet been discovered. But its official blessing for decades by the governments of both the U.S. and UK represents a subversion of science by political forces that overlook evidence and abandon reason.

The low-fat diet was born in a 1977 report from a U.S. government committee chaired by Senator George McGovern, which had become aware of research purportedly linking excessive fat in the diet to killer diseases such as coronary heart disease and cancer. The committee hoped that its report would do as much for diet and chronic disease as the earlier Surgeon General’s report had done for smoking and lung cancer.

The hypothesis that eating too much saturated fat results in heart disease, caused by narrowing of the coronary arteries, was formulated by American physiologist Ancel Keys in the 1950s. Keys’ own epidemiological study, conducted in seven different countries, initially confirmed his hypothesis. But many other studies failed to corroborate the diet-heart hypothesis, and Keys’ own data no longer substantiated it 25 years later. Double-blind clinical trials, which unlike epidemiological studies are able to establish causation, also gave results in conflict with the hypothesis.

Although it was found that eating less saturated fat could lower cholesterol levels, a growing body of evidence showed that it didn’t help to ward off heart attacks or prolong life spans. Yet Senator McGovern’s committee forged ahead regardless. The results of all the epidemiological studies and major clinical trials that refuted the diet-heart hypothesis were simply ignored – a classic case of science being trampled on by politics.

The McGovern committee’s report turned the mistaken hypothesis into nutritional dogma by drawing up a detailed set of dietary guidelines for the American public. After heated political wrangling with other government agencies, the USDA (U.S. Department of Agriculture) formalized the guidelines in 1980, effectively sanctioning the first-ever official low-fat diet. The UK followed suit a few years later.

While the guidelines erroneously linked high consumption of saturated fat to heart disease, they did concede that what constitutes a healthy level of fat in the diet was controversial. The guidelines recommended lowering intake of high-fat foods such as eggs and butter; boosting consumption of fruits, vegetables, whole grains, poultry and fish; and eating fewer foods high in sugar and salt.

With government endorsement, the low-fat diet quickly became accepted around the world. It was difficult back then even to find cookbooks that didn’t extol the virtues of the diet. Unfortunately for the public, the diet promoted to conquer one disease contributed to another – obesity – because it replaced fat with refined carbohydrates. And it wasn’t suitable for everyone.

This first became evident in the largest-ever long-term clinical trial of the low-fat diet, known as the Women’s Health Initiative. But, just like the earlier studies that led to the creation of the diet, the trial again showed that the diet-heart hypothesis didn’t hold up, at least for women. After eight years, the low-fat diet was found to have had no effect on heart disease or deaths from the disease. Worse still, in a short-term study of U.S. Boeing employees, women who had followed a low-fat diet appeared to have actually increased their risk for heart disease.

A UN review of available data in 2008 concluded that several clinical trials of the diet “have not found evidence for beneficial effects of low-fat diets,” and commented that there wasn’t any convincing evidence either for any significant connection between dietary fat and coronary heart disease or cancer.

Today the diet-heart hypothesis is no longer widely accepted and nutritional science is beginning to regain the ground taken over by politics. But it has taken over 60 years for this attack on science to be repulsed.

Next week: How Hype is Hurting Science

When No Evidence is Evidence: GMO Food Safety

The twin hallmarks of genuine science are empirical evidence and logic. But in the case of foods containing GMOs (genetically modified organisms), it’s the absence of evidence to the contrary that provides the most convincing testament to the safety of GMO foods. Although almost 40% of the public in the U.S. and UK remain skeptical, there simply isn’t any evidence to date that GMOs are deadly or even unhealthy for humans.

Absence of evidence doesn’t prove that GMO foods are safe beyond all possible doubt, of course. Such proof is impossible in practice, as harmful effects from some as-yet unknown GMO plant can’t be categorically ruled out. But a committee of the U.S. NAS (National Academies of Sciences, Engineering, and Medicine) undertook a study in 2016 to examine any negative effects as well as potential benefits of both currently commercialized and future GMO crops.


The study authors found no substantial evidence that the risk to human health was any different for current GMO crops on the market than for their traditionally crossbred counterparts. Crossbreeding or artificial hybridization refers to the conventional form of plant breeding, first developed in the 18th century and continually refined since then, which revolutionized agriculture before genetic engineering came on the scene in the 1970s. The evidence evaluated in the study included presentations by 80 people with diverse expertise on GMO crops; hundreds of comments and documents from individuals and organizations; and an extensive survey by the committee of published scientific papers.

The committee reexamined the results of several types of testing conducted in the past to evaluate genetically engineered crops and the foods derived from them. Although they found that many animal-feeding studies weren’t optimal, the large number of such experimental studies provided “reasonable evidence” that eating GMO foods didn’t harm animals (typically rodents). This conclusion was reinforced by long-term data on livestock health before and after GMO feed crops were introduced.

Two other informative tests involved analyzing the composition of GMO plants and testing for allergens. The NAS study found that while there were differences in the nutrient and chemical compositions of GMO plants compared to similar non-GMO varieties, the differences fell within the range of natural variation for non-GMO crops. 
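The compositional comparison described above boils down to a range check: does each measured nutrient or chemical in the GMO variety fall inside the spread observed across non-GMO varieties? A minimal sketch of that logic follows, with entirely invented crop values; none of these numbers or nutrient names come from the NAS report:

```python
# Entirely hypothetical values for illustration; not data from the NAS report.
# Each key maps to the (low, high) range observed across non-GMO varieties.
natural_variation = {
    "protein_pct": (8.0, 12.0),
    "oil_pct": (3.0, 5.5),
}

# Measured composition of a hypothetical GMO sample.
gmo_sample = {"protein_pct": 10.1, "oil_pct": 4.2}

def within_natural_variation(sample, ranges):
    """True if every measured value lies inside its natural non-GMO range."""
    return all(lo <= sample[key] <= hi for key, (lo, hi) in ranges.items())

print(within_natural_variation(gmo_sample, natural_variation))  # prints True
```

A value outside any one range would make the check fail, flagging the variety for closer scrutiny; this is the sense in which the observed differences "fell within the range of natural variation."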

In the case of specific health problems such as allergies or cancer possibly caused by eating genetically modified foods, the committee relied on epidemiological studies, since long-term randomized controlled trials have never been carried out. The results showed no difference between studies conducted in the U.S. and Canada, where the population has consumed GMO foods since the late 1990s, and similar studies in the UK and Europe, where very few GMO foods are eaten. The committee acknowledged, however, that biases may exist in the epidemiological data available on certain health problems.

The NAS report also recommended a tiered approach to future safety testing of GMOs. The recommendation was to use newly available DNA analysis technologies to evaluate the risks to human health or the environment of a new plant variety – whether grown by conventional hybridization or by genetic engineering – and then to do safety testing only on those varieties that show signs of potential hazards.

While there is documentation that the NAS committee listened to both sides of the GMO debate and made an honest attempt to evaluate the available evidence fairly, this hasn’t always been so in other NAS studies. Just as politics have interfered in the debate over Roundup and cancer, as discussed in last week’s post, the NAS has been accused of substituting politics for science. Further accusations include insufficient attention to conflicts of interest among committee and panel members, and even turning a blind eye to scientific misconduct (including falsification of data). Misconduct is an issue I’ll return to in future posts.

Next week: What Intelligent Design Fails to Understand About Evolution