Here's the next in our series of blogposts following on from our SITP talk about the harms of homeopathy. This is a guest post written by the marvellous @Skanky_fish of Evidence-Based Skepticism fame. Now, I'm going to be a bit bossy here and insist that, if you haven't already bookmarked her blog, you do so right this minute. Future blogposts in this series will just be posted on our respective blogs, but we'll be sure to post links to all of them so you don't miss any installments. So, without further ado, I shall hand you over to Nancy:
We often harp on about the evidence for homeopathy working or
otherwise, and I’m not going to touch on that here, because it’s been
covered beautifully by many more eloquent writers than me. What you
don’t often see though, is comment on the evidence for homeopathy doing
harm. In the last post in this series the lovely @SparkleWildfire
touched on medicalisation, an indirect harm that’s very real but tough
to quantify; but what about direct harms? I’m glad you asked…
In conventional medicine, randomised controlled trials are the best
kind of study we can do of a drug to see if it works and if it's
safe. What maybe doesn't get mentioned quite so often is that there's an
even *better* form of evidence – the systematic review. These are
produced when someone sits down to do the very tough but remarkably
important job of finding every single scrap of evidence they can on a
given topic, and pooling it all together to try and get closer to the
definitive answer. The result is a document that represents the best
evidence possible for how well a drug (or anything else, for that
matter) works, and how safe it is.
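If you're curious what the "pooling it all together" bit actually involves, here's a rough sketch in Python of the kind of sum a simple fixed-effect meta-analysis does: each trial's result gets weighted by how precise it is, and the weighted results are combined into one overall estimate. The numbers below are entirely invented for illustration, not taken from any real trial.

```python
import math

# Each tuple is (events_treatment, n_treatment, events_control, n_control).
# These figures are made up purely to show the mechanics of pooling.
fake_trials = [
    (12, 100, 15, 100),
    (30, 250, 33, 250),
    (7, 80, 9, 85),
]

weights_sum = 0.0
weighted_log_rr_sum = 0.0
for a, n1, c, n2 in fake_trials:
    log_rr = math.log((a / n1) / (c / n2))   # log risk ratio for this trial
    var = 1/a - 1/n1 + 1/c - 1/n2            # approximate variance of the log risk ratio
    weight = 1 / var                         # inverse-variance weight: precise trials count for more
    weights_sum += weight
    weighted_log_rr_sum += weight * log_rr

pooled_log_rr = weighted_log_rr_sum / weights_sum
se = math.sqrt(1 / weights_sum)
print(f"Pooled risk ratio: {math.exp(pooled_log_rr):.2f} "
      f"(95% CI {math.exp(pooled_log_rr - 1.96*se):.2f} to {math.exp(pooled_log_rr + 1.96*se):.2f})")
```

Real reviews like Cochrane's do a great deal more than this (checking for bias, heterogeneity and so on), but the core idea is just that: combine everything, weighted by how much each study can be trusted to tell us.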
One of the biggest and most respected sources of these systematic reviews is the Cochrane Collaboration,
who cover all areas of medicine. Happily, they also have a few reviews
related to homeopathy, and that seems as good a place to start as any.
The most recently published is:
Homeopathic Oscillococcinum® for preventing and treating influenza and influenza-like illness
The authors searched multiple databases of medical literature,
covering a time period dating back to the mid-60s and all the way up
until August 2012. That’s a lot of literature. Out of all the results
they found six randomised, placebo-controlled trials of Oscillococcinum
that were similar enough to be directly compared. Since we’re not
really interested in efficacy in this review, I’ll skip straight to the
safety part: out of these six trials, including a total of 1,523 people,
there was one reported adverse event. One. It happened to be a
headache. Let’s stop and think about that for a moment.
A good quality randomised controlled trial collects every single
adverse event that happens to every single patient. And the use of the
term “adverse event” is very deliberate, because it includes absolutely
everything unexpected and unwelcome that happens (and here’s the key
part) whether or not it’s likely to be related to taking the drug.
That might sound counter-intuitive, but the reason is simple – we want
to pick up every possible side effect of drugs, and sometimes side
effects are…weird. So it might sound odd to include as an adverse event
that someone got hit by a bus, but what if the drug they were taking
made them dizzy, or confused, or clumsy? It’s not unreasonable to
suggest that any one of those things could end up getting you
involved in a traffic accident. So every single little thing is
recorded, and once the trial is over you do some sums to work out the
key question – are these things *more likely to happen in the people who
took the drug*? If 20 people broke a leg but they were equally spread
out among the trial groups then nothing further needs to be said; if 19
of them were on the drug being studied then there might be something to
worry about. The flip side of that of course is that if 19 were in the
placebo group, you might want to wonder if the drug is (perhaps
unintentionally) promoting better balance and co-ordination, for example
(or if everyone in the placebo group was a keen but inept snowboarder).
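If you fancy seeing what those sums look like in practice, here's a quick sketch in Python using the broken-leg example above. The group sizes (500 people in each arm) are made up purely for illustration.

```python
from scipy.stats import fisher_exact

# Hypothetical counts: 19 broken legs among 500 people on the drug,
# 1 broken leg among 500 people on placebo.
drug_events, drug_n = 19, 500
placebo_events, placebo_n = 1, 500

# 2x2 table of events vs non-events in each group
table = [
    [drug_events, drug_n - drug_events],
    [placebo_events, placebo_n - placebo_events],
]

odds_ratio, p_value = fisher_exact(table)
risk_ratio = (drug_events / drug_n) / (placebo_events / placebo_n)
print(f"Broken legs were {risk_ratio:.0f} times as common on the drug (p = {p_value:.4f})")
```

With a split that lopsided, the test tells you the difference is very unlikely to be down to chance, and the drug's effect on legs would deserve a much closer look.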
Is that one single adverse event out of over 1,500 people taking
Oscillococcinum starting to look fishy yet? What about if I drop in the
snippet that some of the people involved (327, to be precise) took the
remedy every day for four weeks, to see if it stopped them from getting
flu in the first place? How many times in four weeks would an average,
healthy person experience something that you could call an adverse event
– a headache, a tummy upset, indigestion, a strained ankle, a touch of
insomnia? I’ve had three of those things in the last 24 hours, and I
wouldn’t say I’m a particularly remarkable individual.
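To put a very rough number on that, here's a back-of-the-envelope sum, assuming (purely for the sake of argument, not from any real data) just one minor niggle per person per week:

```python
# Entirely illustrative assumption: one minor adverse event per person per week
assumed_rate_per_week = 1
people = 327   # the prevention arm described in the review
weeks = 4

expected_events = assumed_rate_per_week * people * weeks
print(f"Expected adverse events: {expected_events}")   # over a thousand, versus the one reported
```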
So hopefully you can see from this that there’s simply a huge,
yawning hole in the evidence about safety in homeopathy. There are ways
and means to address this (though they're far from perfect), and I'll
cover one of those in my next post in this series.