3 Statistical Errors Misleading Health Perceptions

Adrian Esterman, Professor of Biostatistics and Epidemiology, University of South Australia

Conducting scientific studies is never easy, and there are often major disasters along the way. A researcher accidentally spills coffee on a keyboard, destroying the data. Or one of the chemicals used in the analysis turns out to be contaminated. The list goes on.

However, when we read the results of a study in a scientific paper, everything always looks pristine: the study went smoothly, there were no hiccups, and here are our results.

But studies can contain errors, not all of which independent experts or "peer reviewers" weed out before publication.

Statistical stuff-ups can be difficult to find, as it usually takes someone trained in statistics to notice something is wrong.

And when statistical mistakes are made and later discovered, the consequences can be profound for people who changed their lifestyle on the basis of the flawed study.

Here are three examples of inadvertent statistical mistakes that have had major consequences for our health and shopping habits.

1. Did you throw out your black plastic spoons?

Late last year, I came across a news article about how black plastic kitchen utensils were dangerous, as they could potentially leach toxic flame-retardant chemicals into your food.

Being a natural sceptic, I looked up the original paper, which was published in the journal Chemosphere. The article looked genuine and the journal was reputable. So, like perhaps many other people, I threw out my black plastic kitchen utensils and replaced them with silicone ones.

In the study, the authors screened 203 household products (about half were kitchen utensils) made from black plastic.

The authors found toxic flame retardants in 85% of the products tested, with levels approaching the maximum daily limits set by the Environmental Protection Agency in the United States.

Unfortunately, the authors made a mistake when calculating the safe daily limit, putting it out by a factor of ten: multiplying the reference dose of 7,000 nanograms per kilogram of body weight per day by a 60-kilogram body weight gives 420,000 nanograms per day, not the 42,000 the paper used. This meant the levels of toxic chemicals detected were actually well under the daily safety limit.
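For readers who want to check the corrected arithmetic, here is a minimal sketch in Python. The reference dose and body weight are the figures reported in the paper's correction; the variable names are mine.

```python
# Figures reported in the paper's correction.
reference_dose_ng_per_kg_day = 7_000  # EPA reference dose: ng per kg of body weight per day
body_weight_kg = 60                   # body weight assumed in the paper

# The safe daily limit is the reference dose scaled by body weight.
safe_daily_limit_ng = reference_dose_ng_per_kg_day * body_weight_kg
print(safe_daily_limit_ng)  # 420,000 ng/day; the paper mistakenly used 42,000
```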

In recent weeks, the authors apologised and corrected their paper.

2. Did you avoid HRT?

A landmark study that raised safety concerns about hormone replacement therapy, or HRT (now also known as menopausal hormone therapy), highlights a different type of statistical error.

The Women's Health Initiative (WHI) study involved 16,608 postmenopausal women aged 50-79, recruited from 40 clinical centres in the US. It compared the health of women randomised to take HRT with that of women who took a placebo. Neither the researchers nor the women knew which treatment had been given.

In their 2002 paper, the authors reported higher rates of invasive breast cancers in the HRT group. They used a unit called "person-years". Person-years is a way to measure the total time a group of people spends in a study. For example, if 100 people are in a study for one year each, that makes 100 person-years. If someone leaves the trial after only six months, only that half-year is counted for them.
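To make the bookkeeping concrete, here is a minimal sketch in Python. The follow-up times are invented purely for illustration:

```python
# Hypothetical follow-up times, in years, for five trial participants.
# Someone who leaves after six months contributes only 0.5 person-years.
follow_up_years = [1.0, 1.0, 0.5, 2.0, 1.5]

# Total observation time across the whole group.
person_years = sum(follow_up_years)
print(person_years)  # 6.0 person-years
```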

The authors showed a rate of 38 invasive breast cancers per 10,000 person-years in the HRT group, compared to 30 per 10,000 person-years in the placebo group. This gives a rate ratio of about 1.26 (one rate divided by the other).

This fairly large relative increase, also expressed as a 26% increase, caused widespread panic around the world and led to thousands of women stopping HRT.

But the actual risk of breast cancer in each group was low. The rate of 38 per 10,000 person-years is equivalent to an annual rate of 0.38%. With rates this small, the authors should really have used the rate difference rather than the rate ratio. The rate difference is one rate subtracted from the other, rather than divided by it. Here it equates to an annual increase of just 0.08 percentage points in breast cancer cases in the HRT group, a much more modest figure.
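To see how the same two rates can sound alarming or reassuring depending on how they are compared, here is a minimal sketch in Python using the figures reported above (the variable names are mine):

```python
# Invasive breast cancer rates from the 2002 WHI paper, per 10,000 person-years.
hrt_rate = 38 / 10_000      # HRT group
placebo_rate = 30 / 10_000  # placebo group

# Relative comparison: one rate divided by the other.
rate_ratio = hrt_rate / placebo_rate
print(f"Rate ratio: {rate_ratio:.2f}")  # ~1.27, close to the reported 1.26

# Absolute comparison: one rate subtracted from the other.
rate_difference = hrt_rate - placebo_rate
print(f"Rate difference: {rate_difference:.2%} per year")  # 0.08% per year
```

The ratio sounds dramatic; the absolute difference, eight extra cases per 10,000 women per year, is far less so.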

The authors of the 2002 paper also pointed out that the 26% increase in the rate of breast cancer "almost reached nominal statistical significance". But almost is not statistical significance: formally, it means no statistically significant difference in breast cancer rates was found between the two groups. In other words, the difference between the two groups could have happened by chance.

The authors should have been more careful when describing their results.

3. Did Popeye's spinach change your meals?

Cartoon character Popeye is a one-eyed, pipe-smoking sailor with mangled English, in love with the willowy Olive Oyl. He is constantly getting into trouble, and when he needs extra energy, he opens a can of spinach and swallows the contents. His biceps immediately bulge, and off he goes to sort out the problem.

But why does Popeye eat spinach?

The story begins in about 1870, with a German chemist, Erich von Wolf or Emil von Wolff, depending on which version of events you read.

He was measuring the amount of iron in different types of leafy vegetables. According to legend, which some dispute, he was writing down the iron content of spinach in a notebook and got the decimal point wrong, recording 35 milligrams instead of 3.5 milligrams per 100-gram serve of spinach. The error was found and corrected in 1937.

By then, the Popeye character had been created and spinach had become incredibly popular with children. Apparently, consumption of spinach in the US went up by a third as a result of the cartoon.

This story has gained legendary status, but it has one tiny flaw. In a 1932 cartoon, Popeye explains exactly why he eats spinach, and it's nothing to do with iron. He says, in his garbled English:

Spinach is full of Vitamin A. An'tha's what makes hoomans strong an' helty!


Adrian Esterman receives funding from the NHMRC, MRFF and ARC.
