The Evidence Explosion

26 April 2013 by Tania Browne, posted in Evidence Based Medicine

When I was young, British TV often showed the old hospital comedies of the 1950s and ’60s. They were a big hit when Britain still regarded its National Health Service as quite new, the envy of health systems across the world. A perfect setting for hilarious japes, suave doctors and pretty nurses. It was during this time that I first met Sir Lancelot Spratt.

Sir Lancelot was an exaggeration of course, but he embodied everything that was wrong and ever-so-slightly scary about hospitals at the time.

He relied only on his years of experience, and his memories of patients he’d treated in the past, to diagnose and treat people.

Nobody dared question his authority. He knew what was best, and anyone who challenged him would be shouted down. He was Emperor of The Wards.

He relied on the way he’d been taught, so very long ago, even though knowledge and best practice had probably changed. A lot. There were no continuous training programmes in Sir Lancelot’s day.

At one time, the National Health Service was full of Sir Lancelots. In the early 1970s, a man called Archie Cochrane decided he wasn’t very happy with that. Director of the Medical Research Council Epidemiology Research Unit in Cardiff, Cochrane published a book called Effectiveness and Efficiency: Random Reflections on Health Services. And in it, he expressed a pretty revolutionary idea:

That the treatments we make available in a health service, the procedures we follow, and the way we treat patients should actually be based on proper evidence that they work, gathered from testing in the wider population.

I know it sounds incredible, but that hasn't always been the case, and to some extent it still isn't. In 1990, two researchers made a historical review of the US healthcare system and found that in the 1970s only around 10-20% of procedures were evidence based. Thankfully, by the time of their study the proportion had improved - to 21%.

Archie Cochrane's ideas were developed into a working model by teams in North Carolina and Toronto in the late 1980s and early 1990s. Today, Evidence Based Medicine is finally standard practice in many health care systems around the world. But it still has its detractors.

The definition of EBM, in the words of How to Read a Paper author Trish Greenhalgh, is

  “the use of mathematical estimates of the risk of benefit and harm, derived from high quality research on population samples, to inform clinical decision making in the diagnosis, investigation or management of individual patients.”

Put that way it sounds a bit dry (though the book is far from it), but it makes sense that the experience of lots of people is going to be more helpful than the personal experience of one doctor. EBM is about the methods for collecting all that information together, deciding which evidence we should pay the most attention to, and then working out how to use it most effectively in a particular case.
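To make those "mathematical estimates of the risk of benefit and harm" a little more concrete, here's a minimal sketch in Python. The trial, the numbers and the function name are entirely invented for illustration, not taken from any real study, but they show the kind of quantities (relative risk, absolute risk reduction, number needed to treat) that EBM asks us to weigh up when deciding on a treatment.

    # A toy two-arm trial: entirely made-up numbers, for illustration only.
    def risk_estimates(events_treated, n_treated, events_control, n_control):
        """Return basic risk estimates for a simple two-arm trial."""
        risk_treated = events_treated / n_treated
        risk_control = events_control / n_control
        relative_risk = risk_treated / risk_control
        arr = risk_control - risk_treated            # absolute risk reduction
        nnt = 1 / arr if arr > 0 else float("inf")   # number needed to treat
        return risk_treated, risk_control, relative_risk, arr, nnt

    # Hypothetical figures: 1,000 patients per arm, 30 bad outcomes on the
    # new treatment versus 50 on the comparison treatment.
    rt, rc, rr, arr, nnt = risk_estimates(30, 1000, 50, 1000)
    print(f"Risk on treatment {rt:.1%}, on control {rc:.1%}")
    print(f"Relative risk {rr:.2f}, absolute risk reduction {arr:.1%}")
    print(f"Number needed to treat: {nnt:.0f}")

On these invented figures you'd need to treat about 50 people to prevent one bad outcome, which is exactly the sort of number a doctor and a patient can sit down and actually discuss.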

One of those who whipped Cochrane’s ideas into practical shape, Dave Sackett, summarised EBM in five essential steps:

1) We need to turn our problem into a question we can answer (there's a small sketch of this step just after the list)

2) We need to track down the best evidence to answer the question with

3) We need to take a look at our evidence to judge a) how close to the truth it is, and b) how useful it will be for our particular problem

4) We need to actually do it

5) We need to evaluate our performance - how well did we do?

We’ll be coming back to look at all these points in more detail in future posts.
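And as a tiny, purely illustrative taste of step 1, here's how a vague clinical worry might be turned into a question structured enough to search the literature with. The PICO breakdown (Patient, Intervention, Comparison, Outcome) is a standard convention in EBM teaching, though it isn't named in this post, and the example question below is invented.

    # Step 1 sketch: structure a clinical problem as a PICO-style question.
    from dataclasses import dataclass

    @dataclass
    class ClinicalQuestion:
        patient: str       # who is the question about?
        intervention: str  # what are we thinking of doing?
        comparison: str    # what is the alternative?
        outcome: str       # what result matters to the patient?

        def as_search_terms(self) -> str:
            """A crude search string for tracking down evidence (step 2)."""
            return " AND ".join([self.patient, self.intervention,
                                 self.comparison, self.outcome])

    question = ClinicalQuestion(
        patient="adults with mild hypertension",
        intervention="drug treatment",
        comparison="lifestyle advice alone",
        outcome="stroke within 10 years",
    )
    print(question.as_search_terms())

Once the question is pinned down like this, steps 2 and 3 (tracking down the evidence and judging how trustworthy and relevant it is) have something concrete to work with.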

The critics of EBM claim it’s little better than a cookbook, that it dismisses experience and reduces medicine, and patients along with it, to numbers. But in reality it’s a balance between intuition (experience) and science (trial data).

No doctor could deny that intuition plays a role in their everyday treatment decisions, yet surprisingly little research has been done into the way intuition actually works. When you first learn, you stick to the rules. You're worried that if you stray from the path something terrible might happen - and let's be honest, in medicine that's not a bad way for a new doctor to think. But as you gain experience you start to think of your actions in what we call "scripts" - stereotypical case histories where everything goes according to plan and the patient is nice and uncomplicated. Finally, you reach a level of expertise. You collect alternative stories, where the patient was more complicated or the procedure a bit different, either from your own experience or from other people telling you about theirs.

Evidence Based Medicine is still about these scripts and alternative stories - but on a massive, scientifically measured scale. We should never forget that every data point is a person - the woman who found a lump in her breast while trying on a dress in a shop changing room. The man whose wife nagged him to see the doctor when he tried to ignore the blood in his urine. The child who’s needed special care since birth due to the chance, cruel combination of genes. And it’s because it’s people, with loved ones and lives to lead, that we need to use the very best evidence that we can gather.

Yesterday the UK charity Sense About Science launched a new campaign, called Evidence Based Medicine Matters, to give EBM a higher profile in the public imagination. The campaign already has the support of 20 medical Royal Colleges and the Royal Pharmaceutical Society, who launched a new booklet at a House of Lords reception, saying they "will continue to strive towards a solid evidence base for treatments because this gives doctors and patients the best foundation on which to base decisions".

The booklet, launched by Sense About Science in association with the Academy of Medical Royal Colleges, gives 15 shining examples of where Evidence Based Medicine has made a big difference to real people - that woman in the changing room, that man peeing blood. I think Martin Astbury, President of the Royal Pharmaceutical Society, put it best:

"We believe evidence based medicine is the key to the success of modern healthcare. Modern medicine faces challenges every day from therapies that escape rigorous scrutiny: consider the scandal of £4M the NHS spends on homeopathy every year. By continually striving towards a solid evidence base for treatments we give doctors and patients the best foundation on which to base decisions. It is the ongoing process of testing treatments and collecting evidence that moves medicine forward. It is the basis for the extraordinary improvements in life expectancy and quality of life we have seen in the last century."

You can read a lot more in the Sense About Science PDF "Evidence Based Medicine Matters" here:

You might be wondering, if your only experience of healthcare is as a patient, why Evidence Based Medicine should mean something to you. It's quite simple. Firstly, you'll benefit from it - you want to know that when you're ill, you're going to have the best and most effective treatment. But there's also something a bit more subtle to be learned here. How many people, when diagnosed with an illness, now turn to the Internet? The proliferation of information (and blatant misinformation) is huge. By learning how to gather and weigh evidence as patients, we can make sense of the sheer volume of websites, PDFs and information sources out there. EBM can help us to make sense of our own health.

Evidence matters. Medicine matters. Combining the two can never be less than a good thing. And while the Sir Lancelots can still teach us valuable lessons with their years of experience, they should never be able to overrule the collective experience of science.


13 Responses to “The Evidence Explosion”

  1. Pebbles

    "In the early 1970s, a man called Archie Cochrane decided he wasn’t very happy with that. Director of the Medical Research Council Epidemiology Research Unit in Cardiff, Cochrane published a book called Effectiveness and Efficiency: Random Reflections on Health Services."

    Remarkable then that the 96 season study of flu vaccination by Cochrane, who looked at the 10 top claims for its efficacy, found that the evidence for efficacy was 'implausible at best'. The top claim for the flu jab by the NHS and proper doctors is that it halves winter deaths. Cochrane found from stats that flu-like illnesses, and that is not even lab confirmed, only account for about 10% of winter deaths. Cochrane concluded that for the NHS claim that the flu jab halves winter deaths to be true, the jab would have to have an impact on road traffic accidents!

    Oddly the NHS is still spending £15 million of taxpayers' money on flu jab woo, unless of course the vaccine-believer, carry-on mentality is still rife in the health service with the mantra 'of course we all know vaccines save lives'. I think not. What about the evidence, Tania? Or is doctors' experience overriding it?

    "Yesterday the UK charity Sense About Science launched a new campaign, called Evidence Based Medicine Matters, to give EBM a higher profile in the public imagination."

    NonSense about science is about as sensible and scientific as Donald Duck; how they got charitable status is a joke.

    If this blog is about what epidemiologists do, I wonder what you do. Make this all up from lots of anecdotes? Because I just can't see the science in it.

    • Tania Browne

      What I do is clearly on my "About" page. What YOU do, apart from spend a large amount of time picking apart my blog, promoting a shameful woo agenda, and sailing very close to the wind when it comes to personal insults, is a lot less clear because you are hiding behind a veil of anonymity. Come out! It would be fascinating! What are YOUR scientific qualifications, what peer reviewed published studies can you point me to on the topic, and what sites do you regularly post on where you discuss your beliefs?

  2. Heinz

    Hi Tania
    It's good to see that you are including empirical and anecdotal evidence within your description of EBM. The "gold standard" RCT falls short due to the decline effect, and health care providers need to have some freedom when evaluating individuals' health challenges so they can provide bespoke care packages for patients, allowing for the individuality that comes with human form and existence.
    Most RCT-based study is woeful when put into clinical practice, and the common-sense approach is replaced by what is "current thinking", which is then reversed several years down the line for what is then the new current thinking; meanwhile the patient suffers.
    Even the retired editor of the NEJM could take no more, as the articles written and published that health care providers base their information on were found to be woefully inaccurate and misleading; this means that basically doctors are basing their clinical decisions on advertising media.

    • Tania Browne

      Hi Heinz, I'd be really interested to read more about the failures of RCTs when going clinical, do you know of any OA material on that? I'd love to read more. As I've said in the piece, I think it's a fine balance between intuition and the numbers, which any EBM practitioner would agree with; it's never a case of either/or, and real patients involve one heck of a lot of grey areas. Hope that comes over clearly!

  3. pebbles

    "Hi Heinz, I'd be really interested to read more about the failures of RCTs when going clinical, do you know of any OA material on that?" Tania

    Well there isn't enough time to do this justice, but how about the flu vaccine and Vioxx for starters. The former is still being used despite the 96 season study from Cochrane telling us it's useless; the latter killed 260,000 people before insurance payouts overtook profits.

    Then of course there are antidepressants, most of which don't even beat placebos.

    • Tania Browne

      I've already commented on the flu vaccine issue, and I'm aware of the terrible situation with Vioxx which was mainly due to cherrypicking - in that case, burying bad results by stopping the trial early. What I mean is clinical trials where, for instance, sampling errors and bias in methodology (eg lab bench but no "real" outcomes) have led to a drug not having as great an effect as an RCT would suggest. I'm planning on doing a post on RCTs in future, Heinz, and would be very interested in any info you have on meta-analyses or systematic reviews that show areas where RCTs haven't actually been that great. Contrary to what Pebbles may think, I am open to contradictions as long as they are backed by the scientific method :)

  4. Pebbles

    SSRIs have consistently been shown to be expensive placebos.

  5. Pebbles

    " I'm aware of the terrible situation with Vioxx which was mainly due to cherrypicking - in that case, burying bad results by stopping the trial early. " Tania

    So what about the endemic cherrypicking in the vaccine issue, Tania: hyping up risks like in measles and then selling the 'solution'. The recent madness in Wales is a good example, a totally disproportionate response.

    • Tania Browne

      I have no doubt that cherry picking in vaccine trials happens; it does in a lot of clinical trials, and there's no real reason to suggest that vaccines would be an exception. I'd be very interested to read about studies where trial data is clearly missing from meta-analyses; could you cite me the studies you're pulling your info from? Thanks. I've written about AllTrials for The Guardian in the past and am interested in forest plots where outliers are clearly missing etc :)

      As for what you describe as "the madness" in Wales, I simply saw that as an outbreak-response vaccination campaign, correctly implemented based on the latest WHO guidelines and evidence that they're very effective in controlling outbreaks. Marinović et al (Emerging Infectious Diseases, Sep 2012, Vol. 18 Issue 9), Broutin et al (Microbes and Infection, April 2005, Vol. 7 Issue 4), Grais et al (Epidemiology and Infection, August 2006, Vol. 4 Issue 4), Grais (again) et al (JoRS Interface, January 2008, Vol. 5 Issue 18) and Alberti et al (International Health, March 2010, Vol. 2 Issue 1) all show that outbreak response vaccination campaigns are very effective in reducing the size of outbreaks. You can say it was a silly overreaction all you like, but the fact is that there are still 30m cases of measles worldwide every year and even in high income nations the mortality rate is between 0.1 and 0.2%. And if you think it was an excuse for pharma to sell sell sell, it's also well worth pointing out that vaccines only produce about 3% of drug companies' income. Big money to us, but small potatoes to them.

  6. Pebbles

    "but the fact is that there are still 30m cases of measles worldwide every year and even in high income nations the mortality rate is between 0.1 and 0.2%." Tania

    This is not true. The mortality rate in the third world cannot be added into a total worldwide mortality rate; that is fudging facts to make a point. In all the cases in Wales the mortality rate was zero; if we compared this to an equally sized outbreak in Africa, the mortality rate would not be so.

    Back to sanitation and nutrition, Tania; that's the distinguishing factor and you still have a problem with seeing that.

    I am not surprised you have written for the Guardian; they toe the party line on vaccination every time, like the BBC.

    • Tania Browne

      Pebbles, you are talking nonsense. You refuse to back up your wild claims with any links to where you're getting your "information" from, you're making silly accusations about me and you still refuse my invitations to come out from behind your anonymity and numerous IP addresses.

      As you won't back up your claims with evidence and any discussions we have are completely circular, I will now be deleting comments from you as and when I see them. As I've said before, no doubt you'll see this as a sign that you've "won" but the simple fact is, I've given you plenty of opportunity to be courteous and provide literature to back yourself up and you won't because you either can't, or you don't understand the scientific method and how it works. I have limited time to spend on my blog and I would rather spend it researching and writing new material than arguing with someone who doesn't seem to understand the terms of the discussion. Goodbye.
