Headlines continue to appear claiming that a recent study has shown that the chemical bisphenol A increases the risk of miscarriage, a claim I addressed in a Forbes article last week. There are many problems with this research, such as the fact that it is not available in a published, peer-reviewed format. Check out my piece here for more details.
This issue raises a bigger concern about the state of science today, particularly when the research is related to chemical safety. Reliance on hard facts, scientific standards, and cautious conclusions seems to be withering away. Even well-schooled researchers have become involved in the game of activism and alarmism, using carefully chosen rhetoric to generate headlines and fear based on inconclusive and largely meaningless studies and even unpublished research.
There are some terms that should make you wary. Key among them are headlines that condemn a chemical because a study “links it to” or “suggests” it’s a problem, or simply because the study is “consistent with” other equally unimpressive studies or even mere theories. Researchers increasingly use these phrases to describe weak statistical associations and weak studies that are often too small to provide much value.
In these studies researchers measure the strength of associations by assessing the “relative risk” of a chemical. This process compares groups of individuals with relatively high chemical exposures to groups of individuals with low or no exposures. If the high-exposure group(s) experiences more health ailments, researchers then report an association between the chemical and the illnesses they discover. They then engage in calculations to express the strength of that association numerically as a risk ratio. If the risk ratio is 1, then the study reports no difference between the groups. A relative risk of 2 suggests that the exposed group’s risk is twice that of the other group, a relative risk of 3 suggests that its risk is three times as high, and so on. However, relative risks of 3 and below are generally considered weak associations and potentially the result of a mere statistical accident or researcher bias. Such associations do not establish cause-and-effect relationships and do not warrant alarm.
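The arithmetic behind a risk ratio is straightforward. Here is a minimal sketch with entirely hypothetical counts (none of these numbers come from the studies discussed here):

```python
# Illustrative sketch: computing a relative risk from hypothetical
# exposure-group counts. All numbers are made up for demonstration.

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Risk ratio: incidence in the exposed group divided by
    incidence in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical example: 30 of 1,000 exposed individuals fall ill,
# versus 15 of 1,000 unexposed individuals.
rr = relative_risk(30, 1000, 15, 1000)
print(rr)  # 2.0 -- still a "weak" association by the rule of thumb above
```

Even a doubling of risk, as in this toy example, falls below the threshold of 3 that epidemiologists commonly treat as the floor for a noteworthy association in observational studies.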
The news related to research on BPA and miscarriages is just one of many examples of such weak statistical studies being overblown in the press and even by their own authors. Consider a couple more.
In 2010 numerous headlines appeared because of a study claiming that breast cancer is “linked to” synthetic fibers. News and blog headlines sounded the alarm: “Chemical Exposure Could Triple Breast Cancer Risk,” “Study Links Chemical Exposure to Breast Cancer,” and “Chemical Exposure: Science Takes it Seriously Where Breast Cancer Is Concerned.” Reuters reported:
In a study in Occupational and Environmental Medicine, a British Medical Journal title, the researchers found that women exposed to synthetic fibers and petrol products during the course of their work appeared to be most at risk.
“Occupational exposure to acrylic and nylon fibers, and to polycyclic aromatic hydrocarbons may increase the risk of developing postmenopausal breast cancer,” they wrote. But some experts commenting on the study expressed caution, saying such links can crop up by chance. “In a study of this sort positive associations often occur simply by chance,” said David Coggon, a professor of occupational and environmental medicine at Britain’s Southampton University. “They carry little weight in the absence of stronger supportive evidence from other research.”
The Canadian scientists conceded their findings could be due to chance, but also said they were consistent with [emphasis added] the theory that breast tissue is more sensitive to harmful chemicals if the exposure occurs when breast cells are still active — in other words, before a woman reaches her 40s.
As the story’s one critic points out, a relative risk level of 3 isn’t particularly strong and has a high probability of being merely an accidental statistical association. And the Canadian researchers basically admit that reality but then move on to rationalize their claims with yet another tricky phrase: they say their findings are “consistent with” a theory about breast tissue being sensitive to the chemicals. So what? That’s just a theory, and their data do little to validate it. You don’t even have to look at the study to see that the authors here are mincing words to increase confusion about the study result, which merely reports a weak association that is at best consistent with a theory about breast cancer risk.
Similarly, “suggestive” findings represent nothing much at all. Yet researchers use the term all the time to generate interest in a study whose findings are too weak to really matter. For example, in one study of the chemical bisphenol A, researchers conclude: “Our study suggests that BPA could be a potential new environmental obesogen. Widespread exposure to BPA in the human population may also be contributing to the worldwide obesity epidemic.” Their relative risk number was very low: 2.32. And this “finding” was only discovered within a subset of their sample: girls aged 9–12. The entire sample was 1,326 male and female children in school grades 4–12 in Shanghai, China. The study measured the BPA levels in these children’s urine and correlated that with obesity levels.
It looks like they had to really work the data to find a subset with a positive relative risk, and that risk number is very low—leaving a good probability that it is a mere statistical accident. The authors rationalize their claim by noting: “Other anthropometric measures of obesity showed similar results.” And they note that, although they could not find any association between BPA and obesity in boys, the “gender difference of BPA effect was consistent [emphasis added] with findings from experimental studies and previous epidemiological studies.”
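The subgroup problem is easy to demonstrate. Below is a minimal null simulation (all parameters hypothetical, not the study's actual data): exposure and obesity are generated completely independently, yet slicing a sample of similar size into sex-by-grade subgroups still produces a spread of relative risks, some inflated well above 1 purely by chance.

```python
# Hypothetical null simulation: exposure has NO real effect on the outcome,
# yet subgroup slicing still yields relative risks that drift away from 1.
import random

random.seed(42)
N = 1300  # roughly the size of the sample described in the article

# Independent draws: "high_bpa" and "obese" are unrelated by construction.
children = [
    {
        "sex": random.choice(["girl", "boy"]),
        "grade_band": random.choice(["4-6", "7-9", "10-12"]),
        "high_bpa": random.random() < 0.5,
        "obese": random.random() < 0.1,
    }
    for _ in range(N)
]

def subgroup_rr(rows):
    """Relative risk of obesity for high- vs. low-BPA children in a subgroup."""
    exposed = [r for r in rows if r["high_bpa"]]
    unexposed = [r for r in rows if not r["high_bpa"]]
    if not exposed or not unexposed:
        return None
    risk_e = sum(r["obese"] for r in exposed) / len(exposed)
    risk_u = sum(r["obese"] for r in unexposed) / len(unexposed)
    return risk_e / risk_u if risk_u else None

rrs = {}
for sex in ["girl", "boy"]:
    for band in ["4-6", "7-9", "10-12"]:
        rows = [r for r in children if r["sex"] == sex and r["grade_band"] == band]
        rr = subgroup_rr(rows)
        if rr is not None:
            rrs[(sex, band)] = rr

# The largest subgroup RR found, despite a true relative risk of exactly 1.
print(max(rrs.values()))
```

The more subgroups a researcher examines, the more likely at least one will show a "positive" association by accident — which is why a low relative risk found only in a sliced subset deserves extra skepticism.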
But the mere existence of “other studies” with “similar” or “consistent” results does not make this study any more compelling. And there are plenty of studies that are inconsistent with this one. For example, another study found a link between BPA and obesity in non-Hispanic white boys, but not girls or other boys. Those researchers, too, cherry-picked a subset of their data to find that association.
There are many additional reasons to doubt this study, such as the considerable evidence that the human body excretes BPA quickly, before it can have any effect. But these authors don’t mention those studies.
Here are some more examples highlighted elsewhere: