At a time when anti-vaccination campaigns and climate change sceptics are gaining attention, it has become more important than ever to distinguish the truth from the myths.
Sir David Spiegelhalter has built a career sorting the lies from the statistics. In his new book, The Art of Statistics, the Cambridge Professor shares his passion for data, scientific evidence and risk, and how it's reported to us.
Back in 2005, a Stanford professor of medicine and statistics famously claimed that most published research findings are in fact false. However, Prof Spiegelhalter says ‘false’ is too strong a claim and ‘exaggerated’ would be the more appropriate term.
“As a statistician, I would say that a lot of that is due to misuse of statistics. And it doesn't mean fraud, people maybe for the best possible reasons want to sort of egg the results a bit and turn it in their favour, because everybody likes having an impact.”
There are quite a lot of questionable research practices going on, he says.
“And these are really being identified - searching through the data for some results, twiddling the studies as you go along a bit to make them look more favourable, writing it up in a way that wasn't quite what you thought of when you started the study, and so on.”
There are some improvements under way, he says, with the field of psychology pre-registering the objectives of studies before researchers conduct them, so that the data is not meddled with afterwards to prove a point that was never the study’s aim.
Just about anything can be turned into a research finding. For example, one study looked at the responses of a dead fish to pictures.
“The point is that if you search hard enough, you'll find something. And that was a wonderful example of that really searching hard enough and finding something completely absurd,” Prof Spiegelhalter says.
“My take on this is we have to be critical of what happens in science, but not to be cynical about what happens in science and to try to improve it.”
Clickbait and misleading headlines
Spinning research results into attention-grabbing headlines is a well-known phenomenon. But Prof Spiegelhalter does his best to combat misleading claims – as when the UK Food Standards Agency urged people not to eat burnt toast because it was carcinogenic.
“It made a great headline - a headline which lasted about four hours before the campaign was withdrawn due to the enormous amount of ridicule they received, quite a lot of it from my team.
“With the burnt toast, there is a suggestion it could lead to an increased risk of cancer, but no actual evidence and certainly no idea of the magnitude of it. In fact, the best estimates are that the actual risk, if there is any at all, is absolutely tiny. And so the idea of wagging their finger and telling us all how to cook our toast, I thought, was completely absurd.”
One ‘clickbait’ type of headline went so far as to claim that going to university increased the risk of brain tumours, he says. The study’s authors, in fact, made no such claim; they noted that wealthier men were known to get better health care and were therefore more likely to be diagnosed.
“The press officer said, ‘higher levels of education linked to brain tumours’. So even the press office didn't say they caused brain tumours, they said linked to, which is a useful journalistic phrase.
“But by the time it got into popular newspapers wanting a clickbait headline it had become ‘why going to university causes more brain tumours’.
“It's like whispering tales going down the line, something that started off quite reasonable but by the time it got to the clickbait headline it turned into complete nonsense.”
Unpacking research claims
Misleading headlines can often cause people to panic and become anxious, but it’s all about putting that risk into perspective, Prof Spiegelhalter says, in a way that distinguishes the absolute from the relative.
For example, he says, studies claiming that eating bacon every day increases bowel cancer risk by about 20 percent – a relative risk – should be weighed against the fact that about 6 percent of people will get bowel cancer in their lifetime.
“That means that that 20 percent relative increase over those six percentage points … So that means that the absolute risk goes from six in 100, up to seven in 100.
“So 100 of us would have to eat a greasy bacon sandwich every day of our lives to get one extra case of bowel cancer. And that's putting the risk in perspective, it's not saying it's not there. It's just saying 'well, maybe we might be prepared to have an occasional bacon sammie and not be so worried'.”
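To see how that arithmetic works, here is a minimal sketch in Python using only the figures quoted above; it is an illustration of the relative-versus-absolute distinction, not anything from the book itself.

```python
# Minimal sketch of the relative-vs-absolute risk arithmetic described above.
# The figures are the ones quoted in the interview; rounding to "one extra
# case per 100" follows Prof Spiegelhalter's own simplification.

baseline_risk = 0.06        # roughly 6 in 100 people get bowel cancer in their lifetime
relative_increase = 0.20    # roughly 20 percent higher risk reported for daily bacon eaters

absolute_risk = baseline_risk * (1 + relative_increase)        # 0.072, i.e. about 7 in 100
extra_cases_per_100 = (absolute_risk - baseline_risk) * 100    # about 1.2 extra cases

print(f"Lifetime risk rises from {baseline_risk:.0%} to about {absolute_risk:.1%}")
print(f"Roughly {extra_cases_per_100:.1f} extra cases per 100 daily bacon eaters")
```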
Another factor that should be taken into account when weighing up the significance of a study’s result is the population versus individual level, he says.
“If people stopped eating processed meat completely then the rates of bowel cancer would go down at the population level. However, at an individual level, you wouldn't notice any change whatsoever. You'd never know whether the fact that you did or didn't get cancer was related to your eating bacon.”
One example is studies claiming that getting pregnant earlier in the ovulation cycle favours having a male baby - and while such a shift has been observed at the population level, for instance in post-war baby booms, it’s not exactly a useful sex selection method, he says.
“At the population level, when you average over the hundreds of thousands of babies being born each year, that's a substantial number of extra baby boys being born, and you can notice it when you take it over a whole country, but you wouldn't notice it over a small group of people. You certainly wouldn't notice it at the individual level.
“If you had a coin, and instead of it being 50-50, it was 51-49 or 52-48, or even 53-47 chance of coming up heads you wouldn't know. You'd flip away for hours and you'd start seeing an imbalance at some point, but it would take a long time. And a single flip is like a single pregnancy - you'd never notice any influence at all.”
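A quick simulation makes the coin point concrete; this sketch is mine rather than anything from the book, and the 52-48 bias simply mirrors the figure in the quote.

```python
import random

# Toy simulation (not from the book): how many flips before a 52-48 coin
# looks any different from a fair one?
random.seed(1)

def heads_fraction(p_heads, n_flips):
    """Fraction of heads in n_flips of a coin that lands heads with probability p_heads."""
    return sum(random.random() < p_heads for _ in range(n_flips)) / n_flips

for n in (10, 100, 1_000, 100_000):
    print(f"{n:>7} flips of a 52-48 coin -> {heads_fraction(0.52, n):.3f} heads")

# With a handful of flips the imbalance is invisible; only over very large
# numbers does the 52 percent signal emerge from the noise - which is why a
# single pregnancy tells you nothing about a small population-level effect.
```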
Similarly, a study in the UK observed that some areas had bowel cancer rates three times higher than others. Prof Spiegelhalter says these were areas with small populations - meaning only a handful of cases in each - and rather than showing a long-term trend, the differences were probably based largely on chance.
“So it's very easy, just by chance alone in one year, for there to be three times as many in one area than another. The next year it might very well have reversed: it was just chance variability, because when you’ve got small numbers of people - it's just like the individual in the population idea - there’s a huge amount of variability.”
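The same point can be shown with a small simulation; the expected count of three cases per area is a hypothetical figure of mine, not taken from the study he mentions.

```python
import math
import random

# Illustrative sketch (figures are hypothetical, not from the UK study): when
# each small area expects only a few cases a year, chance alone easily produces
# three-fold differences between areas with identical underlying risk.
random.seed(2)

def poisson_sample(mean):
    """Draw one Poisson-distributed count using Knuth's multiplication method."""
    threshold = math.exp(-mean)
    count, product = 0, 1.0
    while True:
        product *= random.random()
        if product <= threshold:
            return count
        count += 1

expected_cases = 3  # hypothetical expected annual cases in each small area
counts = [poisson_sample(expected_cases) for _ in range(20)]
print("Simulated annual case counts in 20 equally risky areas:", counts)
print("Lowest:", min(counts), " Highest:", max(counts))
```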
Dispelling myths from stats
Ultimately, the misuse of statistics and distrust of science can lead to real harm for communities. Anti-vaccination campaigns in online communities and elsewhere are an illustration of this, Prof Spiegelhalter says.
“I think it's a classic example of ‘correlation is not causation’ … just by chance alone, there are going to be numerous instances of diagnoses happening fairly soon after a vaccination; that's just going to happen over a large number of people.
“So they are correlated, but it doesn't mean that one causes the other. And there's no good evidence that there is a causal chain in that way. But it's very easy to pick and choose statistics which might suggest that it does, and that's what people have done. But the harm being caused by the measles epidemic is undoubted and it definitely could get worse.”
Some people may avoid the subject with anti-vaxxers for fear that it will backfire and only make them more adamant in their beliefs. Prof Spiegelhalter says the best way to combat the spread of misinformation is to correct people with delicacy.
“There are ways and means of going around trying to change their mind, [but] just saying, ‘oh, you're wrong’ essentially or ‘you're stupid’ is not going to help.
“The first thing, I think, is to understand other people's perspectives and have sympathy for their perspectives and to feel that you share their values about what is important in society and so on, to understand that and acknowledge their concerns.
“Listen to them, listen to people's concerns … but you have to find ways of saying, well, this is a misinterpretation of what's going on.”
Education on deciphering statistics is also key to protecting societies from the spread of false data and stories, he says.
“So often statistics are used to try to manipulate our emotions - to make us anxious, or to reassure us - and we need to be able to critique them. So, a critical view is important but what I desperately don't want is for that to turn into a cynical view … which is completely inappropriate, because then what have we got left? Just the whim of our emotions and whatever a demagogue can persuade us is the case.
“We all need education so as not to just believe everything we see in every feed we get online. And some basic tricks for pulling apart a story, I think, are extremely valuable. The biggest thing is just to pause before pressing that like button.”