February 8, 2016 11:17 AM
Are you a woman of childbearing age? Do you binge drink constantly and have unprotected sex on the reg? Well, the CDC wants you to know that you’re putting a potential child’s life at risk with your irresponsible behavior. This message was at the center of a mini-firestorm last week after the Centers for Disease Control and Prevention issued a press release telling women between 15 and 44 years old that they need to either quit drinking entirely or get on birth control. Some couldn’t understand why seemingly reasonable advice was so offensive. Luckily, there are many smart, snarky, female writers out there who leapt at the task of explaining why looking at women as existing in a state of “pre-pregnancy” was condescending, paternalistic, and tone-deaf. It was all of those things, but it was also bad advice.
The point of the warning was to alert women to the fact that, if they’re not on birth control, they could be pregnant and that their drinking may cause an undetected fetus to develop Fetal Alcohol Spectrum Disorders (FASDs), very serious conditions that are entirely preventable.
Was the message well-intentioned? Of course, as are all of the CDC’s hyperbolic and sometimes incorrect guidelines and warnings. But it illustrates a fundamental problem with government health advice generally: take a complicated issue, for which the science is not “settled,” and turn it into a population-wide categorical directive. Rather than giving us nuanced information about the actual level of risk (the Fetal Alcohol Spectrum Disorders Center for Excellence puts the rate at 10 in 1,000 births, not 1 in 20 as the CDC claimed, by the way), and allowing women to make their own calculation about what risks are worth taking, they’d rather scare us into completely avoiding an activity that has any risk at all. Or put more succinctly, they rely on the precautionary principle.
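The gap between those two rates is worth making explicit. A quick back-of-the-envelope comparison, using only the figures quoted above:

```python
# FASD prevalence estimates cited above, expressed as fractions of births
center_estimate = 10 / 1000   # FASD Center for Excellence: 10 in 1,000 births
cdc_estimate = 1 / 20         # figure the CDC claimed: 1 in 20

print(f"Center for Excellence: {center_estimate:.1%}")  # 1.0%
print(f"CDC claim:             {cdc_estimate:.1%}")     # 5.0%
print(f"The CDC figure is {cdc_estimate / center_estimate:.0f}x higher")  # 5x
```

In other words, the CDC’s claimed risk is five times the rate reported by its own center of excellence.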
Broadly stated, the precautionary principle holds that when the risks of an action are unknown it is best to avoid it entirely. But every choice we make has risks—even choosing to avoid something—and those risks must be weighed against potential benefits. For example, a woman may prefer to accept the small risk of a condom breaking on one of the six days a month when she could become pregnant (necessitating the use of emergency contraception or pregnancy termination) rather than increase her risk of blood clots and cancer by using hormonal birth control, or give up her social life and the possible cardiovascular benefits of consuming alcohol.
February 4, 2016 11:29 AM
The spread of the mosquito-transmitted Zika virus should be yet another wake-up call for public officials around the world. As a relatively new threat, Zika has captured headlines in a world where many insect-transmitted diseases continue to wreak havoc on public health. Unfortunately, the ability to control all such vector-borne diseases is hindered by more than our limited scientific understanding. Disease control is limited by the lack of political will to use all tools in our arsenal, including politically incorrect pesticides.
Zika has long been known to cause mild infections and rashes, but health officials are now investigating the possibility that it can cause birth defects when mothers are infected during pregnancy. The disease appeared in Brazil last spring, and during 2015 the nation experienced a dramatic increase in babies born with neurodevelopmental problems associated with unusually small heads, a defect called microcephaly. Researchers are investigating whether the two phenomena are connected. They are also investigating the possibility that Zika caused an increase in Guillain–Barré syndrome, an autoimmune disease.
Regardless of what they find, we already know that mosquito-borne diseases cause a wide range of health effects that include neurological problems as well as immediately deadly infections. The impact in impoverished nations is devastating, with diseases like malaria and dengue sickening hundreds of millions of people and killing hundreds of thousands every year.
January 19, 2016 1:56 PM
Most people accept as gospel the nutritional limits set by government organizations. So, when the Centers for Disease Control releases a report saying that 89 percent of Americans are consuming almost twice the daily recommended limit for sodium, we tend to pay attention. In last week’s Morbidity and Mortality Weekly Report, CDC researchers found that adult men and women in this country are eating about 50 to 100 percent more sodium than the recommended 2,300 mg daily limit, despite more than a decade of being told to cut back. And they suggest the way to finally get us to change our sodium-munching ways is to convince food manufacturers to do it for us by lowering the sodium content of processed foods. Considering we get on average 70 percent of our sodium from processed or prepared foods, this might reduce the amount of sodium we eat—might. But the question that few seem to be asking is: will it make us healthier?
The problem is that, unlike salt and pepper, determining what constitutes a “healthy” sodium consumption range isn’t black and white. In fact, when you look at the levels of sodium consumed around the world, across cultures and economic levels, it becomes apparent that almost nobody on the planet is staying below the maximum sodium consumption levels set by the CDC and other health organizations. In 2014, the World Health Organization found that people in 181 out of 187 countries surveyed consume at least twice as much sodium as the WHO’s recommended 2 gram (2,000 mg) daily limit. So, is this a species-wide pandemic? Or is it possible that the government guidelines are just wrong?
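Putting the various limits and intakes quoted above side by side makes the point plainly (a rough sketch using only the figures in the text, not additional data):

```python
cdc_limit = 2300   # mg/day, CDC recommended maximum
who_limit = 2000   # mg/day, WHO recommended maximum

# CDC report: Americans eat roughly 50 to 100 percent more than the CDC limit
us_intake_low = cdc_limit * 1.5    # 3450 mg/day
us_intake_high = cdc_limit * 2.0   # 4600 mg/day

# WHO survey: people in 181 of 187 countries average at least twice the WHO limit
world_floor = who_limit * 2        # 4000 mg/day

print(f"Typical US intake: {us_intake_low:.0f}-{us_intake_high:.0f} mg/day")
print(f"Global norm: at least {world_floor} mg/day")
```

By these numbers, the American “excess” is simply the global norm, which is exactly why the guideline itself deserves scrutiny.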
January 14, 2016 9:38 AM
Growing up lactose intolerant, I was fond of saying that drinking milk post-infanthood was unnatural. Then I found out that humans aren’t the only ones in the animal kingdom to keep and care for another species in order to take its produce. This week, a writer at Jezebel wrote an amusing clickbait article—and an effective one if my Facebook feed is any indicator—echoing my childhood sentiment that adults drinking milk is weird and they shouldn’t do it.
Setting aside her really bad arguments (e.g., adult humans shouldn’t drink milk because no other animals do so), her “best,” or at least most reasonable, argument centers on the nutritional quality of milk. She contends that its high-fat, nutrient-dense nature makes milk perfect for babies in need of rapid weight gain and nutrition, but inappropriate for adults, especially since most of us already eat too much fat and fat consumption causes heart disease. “If you enjoy living, put the milk down,” she says. It’s funny, but based on current scientific evidence, dead wrong.
For decades the theory—or rather dogma—was that saturated fats (SFA) in foods caused cardiovascular disease (CVD). This idea came from observational studies that linked consumption of foods high in SFA to increased CVD risk. And this is at the heart of the nutrition argument Jezebel and even the most recent USDA Dietary Guidelines make. While there is some research to back up the idea that consumption of foods containing saturated fat might increase heart health risks, emerging research has begun to cast doubt on the old wisdom. You may have seen articles with titles like “The Questionable Link Between Saturated Fat and Heart Disease” and “The Government’s Bad Diet Advice” in major news outlets asserting that the saturated fat myth has been “debunked.” It turns out that it’s not so simple and that not all saturated fats are created equal.
January 7, 2016 4:09 PM
As expected, the nutritional guidelines for 2015-2020 thankfully excised the long-standing warning against cholesterol-laden food in the wake of several decades of research demonstrating that the original warning was never based on solid scientific evidence. However, the updated guidelines still advise Americans to limit saturated fat and, in an attempt to push Americans toward a plant-based diet, to limit meat consumption. Such advice may not only fail to improve Americans’ diets but also exacerbate the obesity problem in America.
While stopping short of recommending that Americans eat a plant-based diet for the health of our bodies and the environment (a proposed recommendation that set off a bit of a firestorm), the recommendations only implicitly advise people to eat less meat (using the euphemisms of “saturated fat” and protein) and explicitly advise us to eat more vegetables and other “under-consumed food groups.” While the recommendations aren’t as strong as some would like, there’s a definite message within them: animal products and processed foods are bad, vegetables and fruits are good.
That message isn’t terrible. Americans could definitely stand to add more vegetables to their diet. But there’s a fundamental calculus that the dietary guidelines, and in fact most government nutritional advice, seem not to understand. There are only three fundamental macronutrients: fat, protein, and carbohydrate. To reduce one without cutting total calories, you must increase one or both of the others. Vegetables are awesome (I’m a pescatarian myself), but they are also expensive and time-consuming to prepare compared with other kinds of meals. If HHS and USDA are sending the message that animal products (fat and protein) should be reduced, what are Americans most likely to replace those calories with? The hope, of course, is that plant-based foods will replace meat calories, but it is easier and cheaper for families to replace them with carbohydrates.
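The substitution calculus is simple arithmetic. A sketch using the standard Atwater energy values (9 kcal per gram of fat, 4 kcal per gram of protein or carbohydrate; the 500 kcal figure is an arbitrary illustration, not from the guidelines):

```python
# Standard Atwater energy densities, kcal per gram
KCAL_PER_GRAM = {"fat": 9, "protein": 4, "carbohydrate": 4}

def replacement_grams(kcal_removed, replacement):
    """Grams of a replacement macronutrient needed to keep total calories constant."""
    return kcal_removed / KCAL_PER_GRAM[replacement]

# Cutting roughly 56 g of fat (about 500 kcal) while holding calories steady
# means adding about 125 g of carbohydrate somewhere else in the diet.
print(replacement_grams(500, "carbohydrate"))  # 125.0
```

Because fat is more than twice as energy-dense as carbohydrate, every gram of fat removed invites more than two grams of something else, and for most families that something is the cheap, shelf-stable carbohydrate.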
While many assume that Americans eat more meat today than ever before and that this is a driving factor in obesity, this is incorrect. Our ancestors in the 19th century ate almost twice as much meat as we do today. In 1851, Americans ate between 150 and 200 pounds of meat per person per year (even slaves were allocated an average of 150 pounds of meat a year). Compare that to the 100 pounds of meat the average American adult eats now. The dietary recommendations advise that when we do eat dairy, it should be reduced-fat, and when we eat meat, we should eat “lean,” meaning poultry or meat with the fat trimmed off. However, Americans in 1851 almost never ate chicken or turkey, which were seen as “luxury” meats eaten only on special occasions. On the other hand, of the 100 pounds the average American eats today, about half is poultry. So, Americans now are eating half (or less) the amount of red meat as Americans in the 19th century, and yet we are obese.
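The red-meat comparison works out as follows (pounds per person per year, using only the figures quoted above):

```python
# 1851: 150-200 lbs of meat per person per year, essentially all of it red meat
meat_1851_low, meat_1851_high = 150, 200

# Today: about 100 lbs total, of which roughly half is poultry
meat_today = 100
red_meat_today = meat_today * 0.5   # about 50 lbs of red meat

ratio_low = red_meat_today / meat_1851_high   # 0.25
ratio_high = red_meat_today / meat_1851_low   # about 0.33
print(f"Today's red-meat intake is {ratio_low:.0%}-{ratio_high:.0%} of 1851 levels")
```

If anything, "half or less" understates the decline: by these figures, today's red-meat consumption is a quarter to a third of 1851 levels.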
September 23, 2015 2:39 PM
In 2012, the Australian government instituted a plain tobacco packaging requirement—that is, a generic package that removes all stylistic aspects of packaging: colors, imagery, corporate logos, and trademarks. In addition to legally required warnings, the only brand-specific print allowed on the package is the brand name in a mandated font size. The purpose of the Soviet-style packaging is to help reduce tobacco consumption by neutralizing any advertising techniques used by the companies to woo customers who’d otherwise avoid tobacco products. Unfortunately, it seems that plain packaging has failed to reduce tobacco use in Australia and might have even slowed the reduction that was already underway.
In the year following implementation of the plain packaging requirement, reports began to surface that Aussies purchased more cigarettes—59 million more—than in the previous year. Of course, others claimed that those numbers were wrong and that tobacco consumption had fallen post-plain packaging.
Christopher Snowdon recently shed light on how the interventionist policies are actually affecting consumer behavior. As he explains, the Australian Bureau of Statistics’ sales figures don’t show how many cigarettes were sold or how many people were smoking, but they do show the trend in (legal) tobacco sales. There was a long-term decline over the last few decades, which appears to have slowed in the first year of plain packaging. As for those claiming that plain packaging worked, Snowdon says this:
September 16, 2015 10:42 AM
Public health advocates love to make the case that “sinners,” those folks who drink, smoke, or eat “unhealthy” foods, cost society money and that gives bureaucrats the right to interfere in their lives. Users and abusers of these products cost taxpayers billions of dollars, they say. If we have to pay for it, that’s argument enough to justify tax increases, advertising restrictions, and sometimes outright product bans. But is it true?
A new study by Christopher Snowdon, director of lifestyle economics at the Institute of Economic Affairs, found that while drinkers in England cost the government around 3.9 billion pounds per year, they provide 10.4 billion pounds in annual tax revenue. Therefore, drinkers in England are actually subsidizing non-drinkers to the tune of 6.5 billion pounds per year!
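The headline figure is just the difference between the two sums (figures from the study as quoted above, in billions of pounds per year):

```python
costs_to_government = 3.9    # £bn/year: public costs attributed to drinkers
alcohol_tax_revenue = 10.4   # £bn/year: alcohol duties and related taxes paid

net_contribution = alcohol_tax_revenue - costs_to_government
print(f"Net contribution: £{net_contribution:.1f} billion/year")  # £6.5 billion/year
```

The arithmetic only holds, of course, if the cost figure counts genuinely public costs, which is exactly the accounting question the study takes up next.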
As with Snowdon’s previous study, The Wages of Sin Taxes, a good portion of the paper is dedicated to parsing out the difference between private costs and public costs. While advocates will claim that research shows alcohol use costs England up to 20 billion pounds a year, such statistics include items like lost productivity due to absenteeism, emotional distress, and the money spent on the product itself—none of which are costs borne by taxpayers.
August 24, 2015 3:53 PM
Last week a very interesting and, by all accounts, very well-done study made waves in the nutritional science community. For many years, the idea that reducing carbohydrates is the most effective way to lose body fat, owing to carbohydrates’ effect on insulin, has been rapidly gaining in popularity.
Prominent researchers like Dr. Robert Lustig (who famously called sugar a “poison”), and Gary Taubes (author of Good Calories, Bad Calories) have promoted the idea that it’s not just about how much you eat, but what you eat, that leads to obesity. Specifically, that carbohydrates and sugar cause a cascade of problems including insulin resistance, obesity, and type 2 diabetes. This new study, however, casts serious doubt on the hypothesized mechanism by which consumption of carbohydrates, in particular, would lead to these problems.
The study, led by Dr. Kevin D. Hall, was published in the highly respected journal Cell Metabolism on August 13 and found that, in obese adults, restricting dietary fat led to 68 percent more body fat loss than a diet that cut the same number of calories from carbohydrates. The study was small, with only 19 participants, and short, lasting only four weeks.
However, it was a well-designed and tightly controlled study. As neurobiologist Stephan Guyenet put it, “this study's methods were downright obsessive. The overall study design and diets were extremely tightly controlled, and the researchers took a large number of measurements using gold-standard methods.” The participants were randomly assigned to either the low-carb group or the low-fat group. After five days of baseline eating, the participants had their calories restricted by reducing either fat or carbohydrates by 30 percent (sugar was the same in both groups).
August 19, 2015 3:15 PM
It’s back to school season, which for many parents means spending money on new clothes, shuttling young people from sports games to ballet, and, increasingly, worrying about the kind of nutrition their kids are getting when they’re away from home.
This is understandable, since they are inundated with hyperbolic headlines like “sugary drinks kill,” “death by salt,” and “processed meat causes cancer.” It’s enough to add a few gray hairs to any parent’s head. While it’s important to teach kids about proper nutrition and make sure they’re eating a balanced diet in and outside of the home, this kind of inflammatory rhetoric doesn’t help parents make healthy and realistic choices for their children.
So, here are a few tips to help you relax as you send your kids off into the great wide nutritional unknown.
Soda won’t kill your kids. There is no doubt that excessive consumption of sugary drinks, whether soda or fruit juice, can easily lead to a calorie surplus and weight gain. However, the occasional sports drink after a soccer game isn’t likely to cause any damage.
You may have seen the headline announcing that a study says “Sugary Drinks May Kill 184,000 People Each Year.” It’s pretty scary, but it’s also pretty speculative, and its methodology is questionable. The researchers used data from 62 self-reported surveys from only 51 countries between 1980 and 2010. They used “sugar availability” to calculate consumption, presumably to account for the countries without adequate data. Rebecca Goldin, a professor of mathematical sciences at George Mason University and director of STATS.org (a group of researchers who work to evaluate and interpret statistical research for accurate reporting in the media), pointed out the many reasons people should be skeptical of this study, including a lack of transparency about how the researchers accounted for missing data, such as sugar sources in the diet other than sugary drinks.
They also failed to say how they addressed the uncertainty in the proportion of diabetes and cardiovascular disease caused by sugary drink consumption, or the uncertainty in the proportion of deaths caused by these diseases. When someone goes into the hospital with a heart attack and dies, it’s very difficult to say whether it was his five decades of smoking, his sedentary lifestyle, or the liter of Coke he drank every week.
As Harry Cheadle over at Vice put it, “‘X behavior causes Y deaths’ headlines are always popular because people like numbers, and statements like that at least appear to quantify bad behaviors. Never mind if the numbers don't really make any sense.”
August 10, 2015 3:40 PM
Add it to the list of things that the government got wrong when it comes to nutrition: skipping breakfast may not make you fat. It turns out this apparent truism isn’t so true and the idea has only been in circulation for the last five years or so:
The notion that skipping breakfast might cause weight gain entered the Dietary Guidelines in 2010, during one of the reviews conducted every five years by experts to update its findings… [They] collected research on skipping breakfast. Some of it did, indeed, suggest that breakfast skippers may be more likely to gain weight.
But the evidence the experts on the Dietary Guidelines Advisory Committee relied on was observational. As Peter Whoriskey commented, “observational studies in nutrition are generally cheaper and easier to conduct. But they can suffer from weaknesses that can lead scientists astray.” And astray they went. When the Advisory Committee decided to enshrine their “breakfast-weight hypothesis” into the Dietary Guidelines, they cited only one randomized controlled trial, which found no relationship between “breakfast alone” and weight gain.
Last year, however, a team of researchers from Columbia University did a controlled trial to examine this breakfast hypothesis. They divided a large number of people into “oatmeal breakfast,” “frosted corn flakes breakfast,” and “no breakfast” groups. At the end of the trial they found that the breakfast-skippers lost more weight than the other groups.