Behind The Headlines: What Laymen Should Know About Everyday Issues In Science And Health
Presentation by Gregory Conko, Director of Food Safety Policy, Competitive Enterprise Institute
International Association of Culinary Professionals, 24th Annual International Conference
San Diego, California, April 19, 2002
Every day, we see newspaper headlines about the latest scientific findings. Just in the last few years, we’ve been deluged with scare stories about everything from allergy drugs to vasectomies causing cancer. And, naturally, many people change their behavior to avoid these new-found demons. Typically, though, there is very little substance to these reports. But, while our attraction to sensationalism means that scary health reports are bound to get press attention, the bigger problem is that most laymen – journalists included – do not have the basic tools needed to see through the scary headlines and truly understand the relevance of scientific findings. The end result is that, in the absence of a better understanding of the issues, our innate biases tend to get in the way of making rational decisions. Take the example of a story made famous a couple of years ago, when a junior high school student named Nathan Zohner surveyed a group of classmates for a school science project. Zohner told them about a chemical called dihydrogen monoxide. It is colorless, odorless, tasteless, and causes thousands of deaths every year. Prolonged exposure to its solid form causes severe tissue damage, exposure to its gaseous form causes severe burns, and it has been found in excised tumors of terminal cancer patients. Of the 50 people Nathan surveyed, 43 said that dihydrogen monoxide should be banned, 6 weren’t sure what to do, and only one person correctly identified dihydrogen monoxide as plain old water, or H2O. Now, the thought of reasonable people wanting to ban water may seem like a bit of an exaggeration. But this story makes two good points: First, it illustrates how even educated laymen will often fall back on a reflexive opposition to things that seem strange or new to them – especially things with complicated chemical names. And second, it serves as a reminder that nothing – not even clean, pure water – is ever totally safe.
Thus, it is important for laymen to become familiar with some basic scientific concepts, because bad science – or a poor understanding of good science – can cause consumers, policymakers, and businesses to misallocate attention and resources, diverting them from where they are needed most and wasting them on things of little importance. Unfortunately, the public tends to have unrealistic expectations about what science can deliver. A recent poll in England found that 61 percent of the public expect scientists to provide 100 percent guarantees about the safety of products. But science cannot deliver that. The scientific method can only prove that things are dangerous. It can never prove them to be safe. Let me give you an example. If you wanted to see whether or not a new food additive could be safely used, you would conduct a number of experiments. And your hypothesis that the additive was safe could be proven false by observing a harmful effect. However, if the ingredient wasn’t dangerous at all, you could never conduct enough experiments to show that there were no harmful effects under every possible circumstance. You cannot prove a negative. Internalizing this fact is especially important in the context of interpreting scientific findings and making decisions based upon them. In addition – for reasons that I’ll get into in a moment – it’s not always clear that the result of any one experiment is correct. So, making reasonable judgments about scientific reports becomes even more difficult. Now, the task of giving you all the knowledge you need to adequately judge scientific information is a pretty daunting one – especially given the limited time I have today. So, I want to concentrate on just three things that you should consider when thinking about questions of health and safety: Validity, Context, and Trade-Offs.
- First, Validity – Was the study conducted properly, and are its conclusions generalizable to broader populations? Is it true?
- Next, Context – Any scientific study is just a little snapshot of data. So, what broader ramifications do the findings have for real-life situations?
- And finally, Trade-Offs – Since even the safest products are not totally safe, there is no simple solution to any health problem. Be aware that sometimes the cure can be worse than the disease.
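Before digging into each of these, it is worth putting the "you cannot prove a negative" point in rough numbers. If a study observes zero harmful effects in n subjects, the data are still statistically consistent with a true risk of roughly 3-in-n – what statisticians call the "rule of three." A minimal sketch (my own illustration, not drawn from any particular study):

```python
# If zero harms are observed in n independent trials, the data remain
# consistent (at 95% confidence) with any per-trial risk p satisfying
# (1 - p)^n >= 0.05.  The upper bound p = 1 - 0.05**(1/n) is roughly
# 3/n -- the "rule of three."  It shrinks as n grows, but never reaches
# zero, which is why no finite study can prove perfect safety.

def risk_upper_bound(n_trials: int, confidence: float = 0.95) -> float:
    """95% upper confidence bound on risk when zero harms are seen."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n_trials)

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9,} clean trials -> risk could still be as high as {risk_upper_bound(n):.1e}")
```

However many clean trials you run, the bound only shrinks; it never hits zero – which is the "cannot prove a negative" point in quantitative dress.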
Now, let’s take a look at each of these in a little more detail. First, Validity. The best kind of science for determining whether something is hazardous or beneficial is the "Randomized Trial." You set up an experiment in which you assign subjects at random to a test group or a control group, and in which every single variable – apart from the one you’re testing – is held constant. But when testing substances that are suspected of being toxic or carcinogenic, including enough people in a randomized trial over a long enough time to be of any use would be extremely slow and hugely expensive. In many cases, you would literally have to test thousands of people over their entire lifetimes to gauge whether an effect was caused by the substance you’re testing. Moreover, randomly exposing human subjects to a substance that is suspected of being dangerous would be unethical. So, alternatives must be found. They are epidemiology and animal experiments. Epidemiology is the examination of population statistics to see if any differences in lifestyle, diet, or some environmental factor happen to be correlated with a particular disease or disorder. The problem with epidemiology is that it can only tell you whether two things happen to be correlated with one another. It cannot provide a biological explanation for the coincidence of two data sets, so it cannot prove causation. In theory, epidemiology can be very useful – but only if you understand its limitations, and if you control for all the other variables. But it is extremely difficult to actually control for all those relevant factors. And where the purported effect is small, it is very difficult to distinguish between an actual relationship and a random difference between populations. Even epidemiologists admit that random differences often get confused with cause and effect relationships. Michael Thun of the American Cancer Society says, "With epidemiology you can tell a little thing from a big thing. 
What’s very hard to do is tell a little thing from nothing at all." In most cases, though, the only real alternative is to test substances or behaviors on lab animals. But toxicologists have a saying: "lab rats are not little people." Take the example of saccharin, reported in the early 1970s to cause bladder cancer in rats. Scientists pointed out that the cancer developed due to specific biological characteristics of rats that humans don’t share. Nevertheless, consumer safety advocates started scaring people about saccharin. Products with saccharin in them had to be labeled. And a useful product that could have helped address the very real problem of obesity was all but forced out of the marketplace. A few years ago, saccharin was finally vindicated when the UN’s World Health Organization acknowledged that saccharin doesn’t cause cancer in humans – only in rats. But that came thirty years too late. By themselves, neither epidemiology nor animal testing is really conclusive. Epidemiology asks the right questions, but answers them poorly. And in most cases, animal tests answer questions very well, but ask the wrong questions in the first place. In the end, strong conclusions should never be based upon the results of a single study. The best conclusions are drawn only on the basis of multiple studies, using a variety of different methodologies, that compile a large body of evidence. Next, let’s talk about Context. Given that scientific studies only produce little snapshots of data, even well-established results often have limited applicability outside the laboratory. For example, another shortcoming of animal tests is that, because normal exposure to most test substances isn’t enough to produce a large enough effect to study, toxicologists frequently use what’s known as the "Maximum Tolerable Dose" methodology. This is just what it sounds like: the lab animals are fed or injected with the maximum possible amount of the test substance that won’t kill them outright. 
But at such high doses, about half of all substances will eventually cause cancer. So, those results are not remotely related to real-world exposure to the chemicals that are tested. A good illustration is the scare about dioxin. A couple of years ago, ice cream maker Ben & Jerry’s was going to stop using bleached paper for its cartons because chlorine from the bleaching process creates dioxin as a by-product. According to the Ben & Jerry’s advertising, the "only safe level of dioxin exposure is no exposure at all." That sounds pretty scary, doesn’t it? No safe level. But, 500 years ago, the Swiss doctor Philippus Aureolus Paracelsus coined a saying that is still a fundamental principle of toxicology: "It’s the dose that makes the poison." Lots of things are dangerous at very high levels, but totally harmless at lower levels. And many things, like iron, zinc, even arsenic, and other minerals and chemicals are both potentially dangerous and essential for sustaining a healthy body. Paracelsus himself developed a treatment for syphilis that used mercury, which is toxic in high doses. Arsenic, the source of a big controversy lately because it shows up in drinking water, has been a component in lots of different medicines over the years. In fact, arsenic is a component of some "organic" pesticides, and was one of the primary anti-fungal agents used in French vineyards until it was phased out two years ago. And while it is a common reaction among laymen to presume that natural things are generally good and man-made things are generally bad, even synthetic chemicals – like industrial pollutants, pesticides, and dioxin – have safe and dangerous levels. In her book, Silent Spring, Rachel Carson wrote, "For the first time in the history of the world, every human being is now subjected to contact with dangerous chemicals, from the moment of conception until death." She prophesied a future where modern synthetic pesticides and other chemicals would cause epidemics of cancer. 
But, Rachel Carson was wrong. First, because age-adjusted rates of cancer are falling, not rising. And second, because every human being who ever lived has been exposed to a background dose of potentially dangerous chemicals, and has eaten lots of them in every meal since the beginning of time. You see, plants are essentially little factories – making proteins, micronutrients, and other chemicals. They’ve evolved over time to produce certain chemicals as a way of fending off insect pests, bacteria, and fungi. There are more than a thousand natural chemicals in coffee. Of the 25 that have been tested, 19 have been found to cause cancer in rats. The same can be said about every other plant in the food chain – from apples to zucchinis. A single ounce of potato has about as much carcinogenic potency as a dose of dioxin at the Environmental Protection Agency’s "safe level." According to cancer researcher Bruce Ames at the University of California at Berkeley, 99.99 percent of all pesticides in the human diet occur naturally in plants. And, ounce for ounce, natural pesticides are at least as potent as synthetic ones – if not more so. Interestingly enough, plants that are grown organically tend to have even higher total levels of carcinogenic chemicals than plants grown with synthetic pesticides – because plants produce more of their own defensive chemicals when they’re bitten by insects. Don’t worry, though. I suspect that many of you will be happy to know that Bruce Ames is a friend of Alice Waters and has been eating at her organic restaurant Chez Panisse since it opened 30 years ago. I’m not trying to scare you away from organic food. The point I’m trying to make is that the risk from these background levels of chemicals, in both organic and conventional produce, is so small as to be inconsequential. Now, let’s go back to Ben & Jerry’s ice cream for a moment, and to what I think is a particularly amusing bit of irony. 
Dioxins are produced any time things with chlorine in them are burned, and even by some natural processes like forest fires and volcanic eruptions. They don’t break down easily in the environment. And they tend to accumulate in the fatty tissue of animals over time. Consequently, dioxins exist throughout the food chain, and are present in all dairy and meat products, including Ben & Jerry’s ice cream. A sample of Ben & Jerry’s vanilla ice cream tested at an independent lab contained 0.79 parts per trillion of dioxin, or about 200 times more than the "virtually safe dose" of dioxin determined by the EPA. So, if you really believed what Ben & Jerry say about dioxin, you should eat their cartons and throw away the ice cream. But that level of dioxin doesn’t make Ben & Jerry’s ice cream dangerous. For nearly every chemical it regulates, the Environmental Protection Agency sets safety levels far smaller than the dose that would reasonably be expected to cause any harm. Permissible levels of most synthetic chemicals are typically one thousand times lower than what’s been determined by scientific research to have no health effect. So, I’m certainly not suggesting that anyone give up Ben & Jerry’s ice cream. Except for the fat and the sugar, it’s perfectly safe. But this makes a good point about worrying too much about exposure to chemicals. Most people tend to think, as Rachel Carson did, that pesticides and other pollutants are the leading causes of cancer. But the number one cause of cancer – responsible for over 30 percent of all cases – is a poor diet, characterized by too many high-glycemic carbohydrates and too few fruits and vegetables. Next is cigarette smoking, responsible for about 30 percent of all cancer cases. Pollution and pesticides account for considerably less than one percent of all cancers – and those are mostly from heavy air pollution in large underdeveloped cities like Calcutta and Shanghai. 
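The Ben & Jerry’s numbers lend themselves to a quick back-of-the-envelope check. A minimal sketch – note that the "virtually safe dose" figure below is simply inferred from the 200-fold ratio quoted a moment ago, not taken from an EPA document:

```python
# Back-of-the-envelope check on the figures quoted in the talk: a sample
# of ice cream measured at 0.79 parts per trillion of dioxin, said to be
# about 200 times the EPA's "virtually safe dose" (VSD).  The VSD here
# is inferred from that ratio -- an illustration, not an official number.

measured_ppt = 0.79               # dioxin in the tested sample, parts per trillion
vsd_ppt = measured_ppt / 200      # implied VSD, roughly 0.004 ppt

exceedance = measured_ppt / vsd_ppt
print(f"sample exceeds the virtually safe dose by about {exceedance:.0f}x")

# If, as noted above, permissible levels are typically set about 1,000
# times below the level found to have no health effect, then a 200x
# exceedance of the VSD still sits ~5x below the no-effect level itself.
safety_factor = 1_000
cushion = safety_factor / exceedance
print(f"remaining cushion below the no-effect level: about {cushion:.0f}x")
```

The arithmetic is trivial, but it makes the carton-versus-contents irony concrete: even 200 times the "virtually safe" level can remain well below any dose associated with harm.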
Of course, one might be tempted to say, "So what if pesticides are less dangerous than I thought? Why should we settle for any exposure at all?" To answer that question, let’s move on to the third and last point: Trade-Offs. As I mentioned at the very beginning of my talk, nothing is ever totally safe – if by safe you mean completely devoid of risk. Remember dihydrogen monoxide? Things are only more or less safe than other things. The point is that lots of things to which we’re totally accustomed have considerable risk associated with them. And in many cases, they’re far more dangerous than the new products or practices that could replace them. By over-estimating the riskiness of new things, and under-estimating the riskiness of old things, you can actually trap yourself – and society – in a world that is unnecessarily dangerous. Probably the best example of how "erring on the side of caution" can actually lead to lower safety is the Food and Drug Administration’s pharmaceutical review process. On the one hand, FDA can approve a product that later turns out to be dangerous. This is the kind of problem we’re all familiar with and, consequently, the kind of mistake that FDA tries very hard never to make. On the other hand, FDA can deny approval to a product that actually is safe and effective – or even just sit on its hands doing nothing. When the FDA fails to approve a drug that could save lives – or waits for a complete assurance of safety – real people stay sick longer, and many of them die. But because regulatory agencies are so intent upon "erring on the side of safety," this type of mistake happens all the time. Even delaying the approval of things like new food additives, pesticides, and genetically-engineered crops can be just as detrimental to public health as putting them on the market too quickly. 
Pesticides, for example, boost farm productivity by as much as 30 to 40 percent, and consequently make the fruits and vegetables that are necessary for a healthy diet much more affordable. Given all the safety factors that are built into the system, banning products that boost farm productivity is usually much riskier than the products themselves. In decision-making, regulatory agencies generally try to balance the benefits and drawbacks of putting products on the market, in what is called a Cost-Benefit analysis. One problem is that this makes it sound like regulators are trying to put money on an equal footing with people’s lives. But it would be more accurate to describe good regulatory decision-making as a Risk-Risk analysis, where the risks of approval are weighed against the risks of refusal, and a decision is made that leads society in the direction of increased overall safety. Critics of this approach argue that it’s impossible to foresee all potential risks. As an alternative, they advocate what’s called the Precautionary Principle, which many of you may have heard about. The precautionary principle is the belief that, where there is some uncertainty about the safety of new products, regulators should ban or restrict them, even if the risks have not been demonstrated scientifically. It sounds intuitively appealing, doesn’t it? Look before you leap, and all that. But the caveat about risks not being demonstrated scientifically means that any hypothesized risk is enough to keep products from being used. In effect, advocates of the precautionary principle believe that new technologies should be kept off the market until they are proven to have no risk. But, as I discussed earlier, the limitations of science are such that the absence of danger can never be proven. Precautionary Principle advocates respond that they’re just trying to ensure that products aren’t rushed onto the market without thinking through all the potential negative effects. 
But that’s exactly what traditional risk analysis does. If what you really want is to see that the potential dangers of a technology or behavior are taken into consideration before adopting it, you don’t need anything more than what’s already standard procedure for practically every government around the world. In the end, it is Risk-Risk analysis that is the true "look before you leap" policy. It might even be said that advocates of the precautionary principle support a "look, but never leap" policy. After all, most of them support restrictions on products like genetically-engineered crops, new generations of pesticides, and even the chlorination of drinking water – all cases where there are reams and reams of scientific evidence indicating no real hazard, and where the products they would replace are demonstrably less safe. Consider chlorination. Greenpeace activists convinced the Peruvian government to stop chlorinating its drinking water because by-products of the chlorination process pose a potential cancer risk. Now, on the one hand, Peruvians no longer had to worry about the one-in-ten-million chance of getting cancer from chlorine by-products. But on the other hand, that decision contributed to the spread of a major cholera epidemic in the early 1990s, which afflicted more than 1.3 million people and killed at least 11,000. Or consider genetically-engineered plants, which critics call Frankenfood. We hear over and over about how moving genes between organisms could unwittingly introduce toxins or allergens into the food supply. Others are concerned about the possibility that new genes in plants could have negative environmental impacts. Here again, though, the fear arises because genetic technology is poorly understood by people who express an unwarranted preference for what they believe is natural. What skeptics often don’t realize is that all types of plant breeding move genes from one organism to another. 
And even the wholly "natural" process of sexual reproduction gives rise to exactly the same potential risks. Take tomato breeding as an example. Wild tomatoes can be toxic to human beings. But wild tomatoes generally have better natural resistance to viruses and fungi than food-grade tomatoes. So, it is common for plant breeders to mate standard garden-variety tomato plants with wild relatives in order to move the genes coding for those resistances from the wild plant to the cultivated one. Unfortunately, the "natural" process of mating the two plants can just as easily transfer the toxin genes into the offspring as it can transfer the resistance genes. Furthermore, ordinary sexual reproduction can also disrupt the normal functioning of other genes, and it is routinely used to introduce entirely new genes into the food supply – just like genetic engineering. Yet no activists or regulators seem to be concerned about an impending "Attack of the Conventionally-Bred Killer Tomatoes." Why? Because over the years, plant breeders have developed ways of making sure that potentially harmful products never make it to market. On the other hand, genetic engineering – or gene splicing – lets plant breeders identify exactly which genes they want to transfer. It lets them isolate those genes from potentially harmful ones and insert them into another plant. And then it lets them test to see that all the relevant genes are in normal working order. None of these assurances can be made with conventional breeding, which is why most scientists actually believe genetic engineering to be "safer," not less safe, than conventional methods. 
But the expanded range of modifications that can be made with genetic techniques means that agricultural productivity can be improved, farming practices can be changed to have a lighter impact on the environment, and the staple crops grown in less developed nations can be fortified with additional nutrients and made more resistant to special problems that plague tropical regions, like extremes of heat and drought. A friend and colleague of mine is a plant geneticist at Tuskegee University in Alabama. His research has helped produce a genetically engineered sweet potato variety that has five times the normal level of essential amino acids. And many other scientists at other universities and public-sector research labs are working on engineering sweet potatoes for improved virus resistance. Once these varieties clear all the regulatory red tape, they’ll be used to help poor farmers in central Africa, for whom the sweet potato is the primary source of dietary starch and protein. But it could take quite a long time for improved sweet potatoes to ever help anyone. Two years ago, one of the key research facilities in this effort – the lab of Catherine Ives at Michigan State University – was firebombed by anti-biotechnology activists, setting back the entire project by years. In my opinion, this is the worst manifestation of misunderstood science. It is based on Invalid criticisms of the technology that are too often not placed in the Context of broader knowledge about plant physiology, and it pays no attention whatsoever to the Trade-Offs involved in giving up the technology. All too often, there are very real, human costs to our failure to consider all these factors when making relevant decisions. So, I hope I’ve alerted you to the importance of looking behind the headlines and considering these factors the next time you read about a new scientific study. 
And, hopefully, I’ve helped you move a little farther down the road toward a better understanding of – and a better appreciation for – science. Thank you.