The post-modern assault on evidence-based medicine

Steve Simon

2007-12-18

I never got around to submitting an article to Skeptic Magazine, but here is an update of something I wrote back in 2007 about post-modern philosophy and evidence based medicine.

Introduction.

There’s an old joke about philosophy: The famous French philosopher René Descartes walks into a bar. The bartender asks, “Do you want something to drink?” to which Descartes replies, “I think not.” Then poof! He disappears.

This is the sort of joke that appeals to those of us who see philosophy as a bunch of silly old fools talking about things that have no practical impact on the world. I am certainly one of the people Bertrand Russell had in mind when he wrote that

“many men, under the influence of science or of practical affairs, are inclined to doubt whether philosophy is anything better than innocent but useless trifling, hair-splitting distinctions, and controversies on matters concerning which knowledge is impossible." Source: The Skeptic’s Dictionary

But I am also inclined to agree when he later asserts that

“The man who has no tincture of philosophy goes through life imprisoned in the prejudices derived from common sense, from the habitual beliefs of his age or his nation, and from convictions which have grown up in his mind without the co-operation or consent of his deliberate reason."

So it is with both a sense of trepidation and a sense of excitement that I want to tackle a recent philosophical critique of evidence based practice.

Post-modern philosophers have been attacking many institutions that provide (from the post-modern perspective) a false sense of objectivity in their work. It was only a matter of time before they turned their attention to evidence based practice (EBP), an approach that tries to incorporate greater use of objective research into the practice of healthcare. Their rhetoric is surprisingly harsh, and unnecessarily so. All the criticisms that post-modern writers lay at the feet of EBP are criticisms that EBP itself has successfully identified. EBP is perfectly capable, on its own, of recognizing the areas where objectivity is an illusion and of taking corrective action.

What is evidence based practice?

Evidence Based Practice has radically changed our health care system. This is an umbrella term for Evidence Based Medicine, Evidence Based Nursing, Evidence Based Dentistry, Evidence Based Mental Health, and so forth. The term Evidence Based Practice is preferred by many because it incorporates the interests of all health care professionals.

There are many competing definitions for EBP. A classic definition from one of the most popular books on the topic is

“the integration of best research evidence with clinical expertise and patient values." [Source: Sackett 2000][sa00]

The three pillars of EBP (best evidence, clinical expertise, and patient values) are all important, but some definitions will fail to emphasize all three elements equally. Interestingly, the first edition of this popular reference uses

“integrating individual clinical expertise with the best available clinical evidence from systematic research," [Source: Sackett 1997][sa97]

a definition that fails to mention patient values at all.

The five steps in applying EBP in a clinical situation are:

1. Ask an answerable clinical question.
2. Acquire the best available evidence bearing on that question.
3. Critically appraise the evidence for validity and relevance.
4. Apply the evidence, integrating it with clinical expertise and patient values.
5. Assess your performance on the previous four steps.

Source: Ohio University (Link is broken)

Most proponents and critics of EBP focus only on the third step, critical appraisal. I often fall into this trap myself. It’s easy to do because this is the most visible and concrete aspect of EBP. This article may have the unfortunate tendency to perpetuate the reductionist notion that EBP is all about critical appraisal.

What was life like before EBP?

It is very important to place EBP in context by noting what it has replaced. Before EBP became prominent, changes in medicine occurred when a small group of respected experts opined that changes were needed. This practice is labeled eminence-based medicine.

“Eminence based medicine—The more senior the colleague, the less importance he or she placed on the need for anything as mundane as evidence. Experience, it seems, is worth any amount of evidence. These colleagues have a touching faith in clinical experience, which has been defined as ‘making the same mistakes with increasing confidence over an impressive number of years.’ The eminent physician’s white hair and balding pate are called the “halo” effect." Source: Isaacs 1999

The Cochrane Collaboration.

At the forefront of the EBP movement is the Cochrane Collaboration, an organization of a large number of medical professionals who donate their services to produce evidence based systematic reviews on a variety of health topics. Some readers may be more familiar with the term meta-analysis. Meta-analysis refers to the statistical tools used to quantitatively combine the results of multiple research studies. A systematic overview is a careful and reproducible method for gathering all available research on a particular topic, which may or may not include a quantitative pooling (meta-analysis) as part of the process. Thus, systematic overview is the more general term.
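To make the distinction concrete, here is a minimal sketch of the quantitative pooling step that turns a systematic overview into a meta-analysis. It assumes each study has already been reduced to an effect estimate (a log odds ratio, say) and a standard error; the numbers and the function name are hypothetical, chosen only to illustrate inverse-variance weighting, not to reproduce the Cochrane Collaboration’s actual software.

```python
import math

def pool_fixed_effect(effects, std_errors):
    # Weight each study by the inverse of its variance (1 / SE^2),
    # so that more precise studies contribute more to the pooled estimate.
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical log odds ratios and standard errors from three studies.
effects = [-0.40, -0.10, -0.25]
std_errors = [0.20, 0.15, 0.30]

estimate, se = pool_fixed_effect(effects, std_errors)
lower, upper = estimate - 1.96 * se, estimate + 1.96 * se
print(f"pooled log OR = {estimate:.3f}, 95% CI = ({lower:.3f}, {upper:.3f})")
```

Everything that precedes this calculation, framing the question, searching exhaustively, and judging which studies are sound enough to enter it, is the systematic overview proper, and that is where most of the hard work lies.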

The Cochrane Collaboration uses a fairly rigid set of guidelines, again developed by medical professionals, to ensure uniformity and a high level of quality for these systematic reviews.

The Cochrane Collaboration was named in honor of a famous physician, Archie Cochrane, who argued that

“because resources would always be limited, they should be used to provide equitably those forms of health care which had been shown in properly designed evaluations to be effective. In particular, he stressed the importance of using evidence from randomised controlled trials (RCT’s) because these were likely to provide much more reliable information than other sources of evidence." Source: Cochrane website

Post-modern critique of EBP.

Any movement that makes major changes to the way that health care is provided is going to have its share of supporters and critics. But recently, EBP has come under a vigorous assault from writers who apply post-modern techniques in their criticism. I’m a big fan of both EBP and post-modern philosophy, though both camps, particularly the latter, will sometimes take their arguments to ridiculous extremes. I also see EBP as a logical tool for investigating and validating some of the social, political, and historical influences on the process of combining best available evidence with clinician knowledge and patient values.

Some critics attempt to deconstruct EBP with a sense of dispassion

“more useful than either arguing for or against it [EBP], is to understand the policy background and sociological reasons for its emergence and spread” [Source: Traynor 2002][tr02]

“postmodernism fundamentally challenges the apparent ‘objectivity’ of evidence-based practice but it does not challenge the fundamental rules for acquiring and testing evidence. Rather it is the selection of questions to be asked and answered by evidence-based practice/practitioners that is the true limitation. This is the ground upon which fruitful argument can be had about the significance of evidence without undermining the requirement that there be evidence and standards to judge such evidence." [Source: Griffiths 2005][gr05]

Others are harshly critical. The most vicious of these assaults ([Holmes 2006][ho06]) describes EBP as “microfascism," an “ossifying discourse," and a “hegemony” that calls for “vigilant resistance.” This criticism misses the mark and totally mischaracterizes the EBP movement. This criticism does, however, provide an opportunity to understand how EBP fits into a post-modern view of the world.

The heart of the critique is that EBP is “dangerously exclusionary” because it relies on rules developed by the Cochrane Collaboration. And the Cochrane Collaboration (according to Holmes) will accept only randomized clinical trials and therefore rejects 98% of the available evidence.

The charge of fascism.

The analogy between EBP and fascism deserves to be put aside quickly. The problem with analogies to fascism is that they rapidly lose any sense of perspective. The prospect of an insurance company using an EBP argument to deny women under the age of 50 a free mammogram is indeed bad, but can it compare to life under a brutal dictator like Saddam Hussein? You can tell that the authors are nervous about using such a strong term. They immediately qualify it by pointing out that the “fascism of the masses” (as practiced by Hitler and Mussolini) has been replaced by microfascisms,

“polymorphous intolerances that are revealed in more subtle ways."

But micro doesn’t mean less serious. In fact the authors point out that microfascisms are

“less brutal, [but] they are nevertheless more pernicious."

You and I would probably look on charges of fascism with an air of bemusement, but at least one journalist noted the historical work of Archie Cochrane, who fought true fascists rather than inventing imaginary fascists.

“But Archie Cochrane, on the other hand, pioneering epidemiologist, inspiration for the Cochrane Library, a prisoner of war for four years in Nazi Germany, who has, from his abstracted position, probably saved more lives than any doctor you know, might see it differently, since in 1936, he went to Spain to join the International Brigade, and fight the fascists of General Franco. Now, what did you do with your summer holidays?" Source: Goldacre 2006

It’s unfortunate that the charge of fascism has gained so much prominence, because it immediately polarizes the debate about post-modern philosophy and EBP. There are serious charges hiding behind the nasty language, and it is these charges rather than the harsh rhetoric that deserve a careful review.

What is post-modern philosophy?

Postmodern philosophy is difficult to define briefly. One common theme among most post-modern writers is a distrust of the grand narrative, an all-encompassing system used to identify universal truths (see, for example, [Lyotard 1988][ly88]). Religious and superstitious beliefs characterize the grand narrative of the pre-modern world. Science is held up as the prime example of the grand narrative of the modern world, though Marxism and Freudian theory can also be cited as examples. In the post-modern world, grand narratives are eschewed. They represent an attempt by those in power to define ways of knowing in an exclusionary way that preserves their power by marginalizing dissenting voices. Postmodern philosophy has been derided as the belief that there are no universal truths, but a fairer characterization is the belief that there is no single path to establishing truth. Instead, truth is established in a way that is dependent on social context.

Post-modern philosophy is harshly critical of logical positivism, the belief

“that all meaningful statements must consist solely of empirically verifiable facts” Source: Dtd thesis weblog

and empiricism, the belief that

“all hypotheses and theories must be tested against observations of the natural world, rather than resting solely on a priori reasoning, intuition, or revelation” Source: Wikipedia

Post-modern analysis commonly relies on a process called deconstruction. This is a process that

“works to demonstrate how concepts or ideas are contingent upon historical, linguistic, social and political discourses, to name but a few." (Holmes 2006)

It is tempting to use this word negatively as a synonym for destruction, but that oversimplifies the notion of deconstruction.

Post-modern philosophers have noted that any text or narrative has absences or omissions, not necessarily because the writer is sloppy, but because writing about anything except the utterly trivial would require more space and time than any author can devote to producing a complete account. Deconstruction examines these absences and omissions. Much as Debussy defined music as the spaces between the notes, post-modern philosophers derive the meaning of a text from the unstated assumptions that it is built upon.

What do post-modern critics want to replace EBP with?

Holmes uses the analogy of the rhizome. Ginger is an example of a rhizome. The rhizome is a good analogy for post-modern nursing because

“the rhizome is open at both ends. It has no central or governing structure; it has neither beginning nor end. As a rhizome has no centre, it spreads continuously without beginning or ending and basically exists in a constant state of play. It does not conform to a unidirectional or linear reasoning. The rhizome challenges the sense of a unique direction because it emerges and grows in simultaneous, multiple ways."

“The rhizomatic thought permits to bear ambivalence, allegory, chaos and diversity because the thinker is not attached to an official structure, a rigid pattern, an imposed and straightforward stream of thought. It is postmodern, in its very essence. This type of ‘counter-thought’ offers new possibilities because it does not follow a logic characterized by dichotomy or binary positions."

“the rhizomatic thought would acknowledge, accept and promote multiple discourses within nursing, even if they compete with one another."

“We believe that the existing paralysing ‘nursing order’, as it pertains to knowledge development, must be replaced by ‘nursing chaos’. From chaos will emerge a brand new and fragmented order, one that will dare to tolerate multiplicities of thoughts." [Source: Holmes 2004][ho04]

Another post-modern critic uses the term “methodological pluralism.”

“Codes for the production of nursing knowledge have been skewed towards knowledge that is statistically verifiable, rupturing the methodological pluralism that the nursing community had previously accepted as suitable for the production of nursing knowledge." [Source: Winch 2002b][w20b]

No conspiracy theories, please.

First, it’s worth noting that the Cochrane Collaboration is not a shadowy organization that secretly pulls the strings of the entire research community. Cochrane Collaboration reviews are not immune from criticism (Alderson 2003, Eysenbach 2005). To the extent that the research community has adopted models developed by the Cochrane Collaboration, it is largely out of respect for the quality of the work that they produce (Oleson 1998).

Reliance solely on RCTs?

Reliance solely on RCTs is indeed exclusionary, but perhaps what is being exercised here is not exclusion but discretion. The belief of many in the EBP movement is that a single well-conducted randomized trial will trump any number of observational studies (research comparisons that do not use randomization).

Is this belief true?

The most common argument for the superiority of RCTs is (irony of ironies) anecdotal evidence. There are specific medical therapies (two commonly cited examples are mammary artery ligation and post-menopausal hormone replacement therapy) that achieved wide acceptance on the basis of observational research but were later discredited by carefully conducted RCTs.

If there is any strong evidence about RCTs, it is evidence that would indicate that RCTs really aren’t that much better than observational studies (Concato 2000). While there is still debate in the research community about the meaning of this study, I take it as evidence that the folks who conduct observational studies have learned how to do them with sufficient care that their credibility approaches that of randomized studies. Much of the anecdotal evidence against observational studies comes from decades ago. The mammary artery ligation studies, for example, were performed in 1959. Many observational studies back then relied on historical controls and did not incorporate complex adjustments to control for bias.

An article with the hilarious title “Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials” mocks the rejection of therapies not studied by randomized trials.

“As with many interventions intended to prevent ill health, the effectiveness of parachutes has not been subjected to rigorous evaluation by using randomised controlled trials. Advocates of evidence based medicine have criticised the adoption of interventions evaluated by using only observational data. We think that everyone might benefit if the most radical protagonists of evidence based medicine organised and participated in a double blind, randomised, placebo controlled, crossover trial of the parachute." (Smith 2003)

All of this would be a scathing indictment of EBP, except for the fact that EBP does not rely solely on RCTs. Holmes badly misreads one of his sources here. The 98% figure comes from a quote by David Sackett in his famous book about EBP (Sackett 2000). Sackett claimed that only 2% of the published research is sound with valid conclusions. If you read the whole book, however, you will see that nowhere does Sackett equate randomized with sound/valid nor observational with unsound/invalid. In fact, a careful reading of the book shows that there are criteria for both randomized and observational studies that you can use to gauge the validity of either type of study. Sackett does suggest that randomized studies are more appropriate for evaluating efficacy and observational studies for evaluating harm, but even here it is more a convention to simplify the presentation. Sackett recognizes that randomized trials are sometimes more appropriate for some studies of harm and observational studies are sometimes more appropriate for some studies of efficacy.

“Evidence based medicine is not restricted to randomised trials and meta-analyses. It involves tracking down the best external evidence with which to answer our clinical questions. To find out about the accuracy of a diagnostic test, we need to find proper cross sectional studies of patients clinically suspected of harbouring the relevant disorder, not a randomised trial. For a question about prognosis, we need proper follow up studies of patients assembled at a uniform, early point in the clinical course of their disease. And sometimes the evidence we need will come from the basic sciences such as genetics or immunology. It is when asking questions about therapy that we should try to avoid the non-experimental approaches, since these routinely lead to false positive conclusions about efficacy. Because the randomised trial, and especially the systematic review of several randomised trials, is so much more likely to inform us and so much less likely to mislead us, it has become the “gold standard” for judging whether a treatment does more good than harm. However, some questions about therapy do not require randomised trials (successful interventions for otherwise fatal conditions) or cannot wait for the trials to be conducted. And if no randomised trial has been carried out for our patient’s predicament, we must follow the trail to the next best external evidence and work from there." [Source: Sackett 1996][sa96]

What evidence does the Cochrane Collaboration accept?

Does the Cochrane Collaboration only accept randomized trials? Well yes and no. The Cochrane review relies on the best available evidence. Although there are exceptions to this rule (see below), the Cochrane Collaboration will generally exclude non-randomized studies from the systematic overview if good quality randomized trials are available. If randomized trials are not available, or if they are uniformly flawed, then they use non-randomized evidence. This is not unlike the coffee drinker who always chooses cream over milk for their coffee if both are available, but will tolerate milk as a last resort.

An interesting example where the Cochrane Collaboration did mix randomized and observational data occurs in a systematic overview that examines the benefits of breastfeeding.

“We selected all internally-controlled clinical trials and observational studies comparing child or maternal health outcomes with exclusive breastfeeding for six or more months versus exclusive breastfeeding for at least three to four months with continued mixed breastfeeding until at least six months. Studies were stratified according to study design (controlled trials versus observational studies), provenance (developing versus developed countries), and timing of compared feeding groups (three to seven months versus later)." Source: Cochrane website

If any group would reject evidence based on observational studies, it would be those very same post-modern philosophers. A study like the breastfeeding study mentioned above would suffer (from a post-modern perspective) from logical positivism since randomized and observational studies both represent an attempt to apply scientific methods to uncover an objective reality.

Does Cochrane exclude qualitative data?

Perhaps the dangerously exclusionary EBP is excluding qualitative research. There are methods to combine qualitative and quantitative research (Jick 1979), but it is unclear how to do this in the context of EBP. Although there are a few systematic overviews that summarize qualitative research studies (such as Munro 2007), this has been a self-acknowledged failing of EBP. In response, the Cochrane Collaboration has developed the Cochrane Qualitative Research Methods Group.

“The Cochrane Qualitative Research Methods Group develops and supports methodological work on the inclusion in systematic reviews of findings from studies using qualitative methods and disseminates this work within and beyond the Collaboration’s Review Groups. The Cochrane Qualitative Methods Group focuses on methodological matters arising from the inclusion of findings from qualitative studies into systematic reviews." Source: Joanna Briggs website

The role of individual patient preference.

Perhaps EBP is dangerously exclusive because it does not allow a role for individual patient preferences. This is belied by the classic definition of EBP, which incorporates individual patient preference, as well as by exhortations among EBP proponents not to ignore the individual patient (Sackett 2000). There is an effort to encourage value based medicine, which involves greater consideration of patient values (Brown 2005), but this is not a rejection of EBP; rather, it incorporates EBP under a broader umbrella.

“Values-based medicine can incorporate all the other paradigms of medicine, including scientific and evidence-based medicine, within it, because it can include anything that contributes to human security and flourishing.” (Source: www.mja.com.au/public/issues/177_06_160902/lit10253_fm.html)

The use of intuition.

Unmentioned by Holmes (2006) in his list of other ways of knowing is the word “intuition.” But intuition is mentioned by other post-modern critics.

“Methods that are likely to capture the intuitive base of nursing, such as the case study or conversational analysis, are not credible within the current evidence-based practice framework." [Source: Winch 2002a][w20a]

“Ultimately, the postmodern ironist reader of the research report must make a judgement without criteria, based on her own practical wisdom or ‘prudence’." Source: Rolfe 2006

What role should intuition play in the practice of health care? My hunch is that intuition is overrated as a way of knowing. I could cite a whole bunch of empirical studies that criticize our intuitive judgments, but that research would not be all that persuasive to someone who views intuition as an important way of knowing.

One irreconcilable problem is deciding how to deal with two people offering conflicting intuitions. You can choose the one that agrees with your current intuition (which makes you wonder why you bothered searching for other intuitive insights), or you can choose the intuitive insight from the person you respect and trust more. The latter choice brings us back to the days of eminence-based medicine.

I know of several ways to reconcile two conflicting empirical studies. You can split the difference, you can search for a methodological explanation of the disparity, you can use a third replication to break the tie, or you can select the results from the empirical study that is higher on the research hierarchy. Unlike intuition, empirical data lends itself to resolution of conflicting viewpoints.

The best argument I can make against reliance on intuition is a very pragmatic one. Think back to the last time that you were able to convince a skeptical audience that your viewpoint was correct. Were they persuaded by your powers of intuition? Or did it take something else? There’s a famous saying among statisticians:

“In God we trust, all others bring data." Source: Oxford Reference

That’s a quote that falls decidedly in the empiricist camp, but it rings true in many situations. Most of us do not have the persuasive power of Oprah Winfrey, who can send authors to skyrocketing fame and fortune with just a brief mention of their books. The rest of us need to present persuasive arguments, and most persuasive arguments rely on data rather than personal intuition. Perhaps the greatest empiricist of the 19th century was Florence Nightingale.

“Florence Nightingale had exhibited a gift for mathematics from an early age and excelled in the subject under the tutorship of her father. She had a special interest in statistics, a field in which her father, a pioneer in the nascent field of epidemiology, was an expert. She made extensive use of statistical analysis in the compilation, analysis and presentation of statistics on medical care and public health. Nightingale was a pioneer in the visual presentation of information. Among other things she used the pie chart, which had first been developed by William Playfair in 1801. After the Crimean War, Nightingale used the polar area chart, equivalent to a modern circular histogram or rose diagram, to illustrate seasonal sources of patient mortality in the military field hospital she managed. Nightingale called a compilation of such diagrams a “coxcomb”, but later that term has frequently been used for the individual diagrams. She made extensive use of coxcombs to present reports on the nature and magnitude of the conditions of medical care in the Crimean War to Members of Parliament and civil servants who would have been unlikely to read or understand traditional statistical reports. In her later life Nightingale made a comprehensive statistical study of sanitation in Indian rural life and was the leading figure in the introduction of improved medical care and public health service in India. In 1859 Nightingale was elected the first female member of the Royal Statistical Society and she later became an honorary member of the American Statistical Association." Source: Wikipedia

Florence Nightingale made major changes in health care in a time when women’s voices were routinely ignored and when the profession of nursing was in its infancy. She couldn’t make any serious headway by making intuitive arguments, but had to present hard facts. Today’s world is different, of course, but the persuasive nature of data still remains strong.

But the point to remember is that EBP does not force you to abandon your intuition. Your intuition is part of your clinical judgment, and clinical judgment is explicitly acknowledged in the definition of EBP. I wouldn’t want it any other way. If all of the research said that a certain therapy was good for you, but your doctor or nurse had a nagging suspicion that you were an exception, I’d want to hear about it, wouldn’t you?

“Good doctors use both individual clinical expertise and the best available external evidence, and neither alone is enough. Without clinical expertise, practice risks becoming tyrannised by evidence, for even excellent external evidence may be inapplicable to or inappropriate for an individual patient. Without current best evidence, practice risks becoming rapidly out of date, to the detriment of patients." (Sackett 1996)

Does EBP allow outsiders to dictate to the nursing profession?

Some postmodern critics see EBP as a way for outsiders to meddle in the practice of nursing.

“If nursing was to marginalize itself within initiatives promoting evidence-based practice …, the possibility exists that research on nursing practice would come to be led by others." Source: Bonell 1999

In contrast, other post-modern philosophers criticize such a concern as linear (the ultimate post-modern insult) and prescriptive and correctly point out that a true post-modern perspective would encourage

“transdisciplinarity, diversity and plurality." [Source: Holmes 2004][ho04]

After all, you can’t complain about dangerous exclusions out of one side of your mouth while excluding other voices from consideration because they are not nursing voices.

This fear of outsider influence is not new. When the practice of meta-analysis became popular in the mid-1990s, there was a surge of resentment among doctors because all their hard work designing and running clinical trials became a mere piece of data in a meta-analysis that often included (horror of horrors!) a statistician as the lead author.

“Meta-analysis has provoked acrimony in every discipline-from psychology to physics- where it has been applied. ‘To some people,’ say Richard Kronmal, a biostatistician at the University of Washington, ‘it seems like little more than an attempt by statisticians to put themselves on the top of the totem pole. Individual researchers with their individual experiments see themselves reduced to becoming a cog in the great statistical wheel. And they’re saying, well, no, that’s not how science works.'" [Source: Mann 1990][ma90]

This resentment subsided when statisticians recognized that they could not produce credible meta-analyses without substantial medical input, and the process of producing meta-analytic studies became a cross-disciplinary team effort.

Is EBP a surrogate for cost containment at the expense of quality of care?

There is certainly a lot of concern that EBP is a covert way of introducing cost containment into the health care system. One postmodern critic of EBP argues that it

“is driven by the economic imperatives of the healthcare system." [Source: Winch 2002b][w20b]

It is certainly debatable whether EBP encourages cost reduction at the expense of quality health care, but if this is happening, this clearly is an abuse of the EBP methodology.

“Some fear that evidence based medicine will be hijacked by purchasers and managers to cut the costs of health care. This would not only be a misuse of evidence based medicine but suggests a fundamental misunderstanding of its financial consequences. Doctors practising evidence based medicine will identify and apply the most efficacious interventions to maximise the quality and quantity of life for individual patients; this may raise rather than lower the cost of their care." (Sackett 1996)

Deconstructing EBP using EBP.

The post-modern criticisms that EBP excludes observational data, denigrates qualitative data, ignores individual patient preference, or substitutes cost containment for quality health care are all weak charges. Proponents of EBP have already recognized these issues and have taken steps to address them. It is worth noting that EBP does a better job of deconstructing itself than the postmodern critics have done.

For example, what influence does money have on EBP? Quite a bit. Researchers with financial ties to pharmaceutical companies are more likely to produce publications favorable to that company’s products (Stelfox 1998) and have a greater tendency to write conclusions that are not supported by the data. [[Reference needed]]. Published conflicts of interest also influence readers’ perceptions of the articles (Schroter 2004).

What influence does the dominance of the English language have on EBP? Articles published in English, compared to articles published in other languages, are more likely to report positive results (Egger 1997). Ignoring non-English-language publications is a dangerously exclusionary practice that was discovered through EBP.

What impact does the availability of information on the Internet have on EBP? Articles published with free full text on the net (FUTON) are more often cited than their counterparts without full text on the net. This is known as FUTON bias, and it can potentially lead to seriously flawed conclusions (Murali 2004; Wentz 2002).

I noted earlier the evidence based explorations of the relative value of randomized versus observational studies.

Is it fair to use a flawed tool, EBP, to evaluate the flaws of EBP?

There are limitations to any methodology, including EBP. EBP relies on the best available evidence, but the production of evidence is a social process that is influenced by economic considerations, social norms and expectations, political meddling, and so forth. If no one uses women as research subjects, then EBP cannot make intelligent recommendations about the treatment of women. But post-modern critics err when they claim that EBP proponents

“seldom question the authority of their own discourses, but deploy them unknowingly."

or that EBP

“entails an inability to appreciate (even tolerate) contrasting ideas and/or ‘see a bigger picture’." [Source: De Simone 2006][de06]

These statements are demonstrably false. EBP is a system with built-in self-correction. If EBP produces bad conclusions, one can study the problem and propose remedies using EBP itself.

In fact, EBP is, to a large extent, self-critical and self-correcting. It’s ironic that Holmes (2006) uses words like “hegemonic” to describe EBP. EBP actually disbanded the prior hegemony of eminence-based medicine, a hegemony, it is worth noting, that excluded for the most part the voices of women, the voices of people from developing countries, and the voices of non-physicians.

Today, thanks to EBP, the process of impacting clinical practice is much more democratic. If you don’t like the status quo, conduct your own research and publish it. Unless you conduct this research sloppily, it will get into the next systematic overview. If you don’t trust the people conducting systematic overviews, conduct your own systematic overview.

I won’t pretend that it’s easy to conduct your own systematic overview, but for the record, a systematic overview is scientific in that its methods are transparent and open to replication. The recipe is out there for anyone who has the stamina to follow it.

If you believe that a systematic overview is dangerously exclusionary, replicate it but with a broader set of inclusion criteria. And finally, if you don’t like the rules used by the Cochrane Collaboration, study the empirical impact of using a different set of rules. If EBP is as dangerous as claimed in Holmes, then there ought to be plenty of empirical studies out there that could demonstrate this. I hope that the post-modern critics don’t think that only non-quantitative tools can be used to critique EBP, because if they did, they would be dangerously exclusionary.

EBP is not perfect, but it is the wholesale rejection of this methodology by postmodern critics that is dangerously exclusionary. The threat to quality health care comes not from EBP, but from the

“association of quantitative/experimental methods with lots of unsavoury philosophical and methodological baggage [that] prevents them from giving these methods a full consideration." (Bonell 1999).

What can EBP learn from post-modern philosophy?

While I disagree strongly with many of the published post-modern criticisms of EBP, I still think that there is much that can be gained by viewing EBP from a post-modern perspective.

It is a mistake, for example, to presume that a systematic overview provides objective results. The Cochrane Collaboration produced a systematic overview of mammography screening (Issue 4, 2001). The original review examined a small number of studies and rejected others for poor methodological quality. The summary of these studies showed that mammograms did not produce an overall benefit for these women. Dissenting voices claimed that the exclusion of the additional studies was wrong, and when those studies were combined with the ones used in the Cochrane analysis, mammography was shown to be justified (Olson 2001). So which result is correct? You can argue back and forth, but I would point out that both approaches have merit and that if two reasonable competing approaches produce conflicting results, the only safe conclusion is that the research base is ambiguous. One commentary on this controversy notes that

“even when scientists tried very hard to be rigorous and methodologically sound they brought some subjectivity into their work. “Despite all the efforts we make even when we undertake rigorous systematic reviews, interpretations may differ. Different people faced with the same raw data will not necessarily come to the same conclusion,” Professor Liberati said." [Source: Mayor 2001][ma01]

and a Lancet editorial on the controversy points out that

“even in the best organisations raw evidence alone is sometimes insufficient to influence opinion." [Source: Horton 2001][ho01]

If one systematic overview produces such difficulty, it is safe to conclude that others may as well. I do believe that most systematic overviews provide a sufficient approximation to reality that they can be treated as objective. But one should never neglect the potential for ambiguity, even in an open, transparent, and repeatable process like a systematic overview.
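One way to put a number on that ambiguity is a sensitivity analysis: rerun the pooling with and without the disputed studies and compare the answers. The sketch below is purely illustrative. The log relative risks, standard errors, and trial labels are hypothetical (they are not the actual mammography trials), and the pooling function is the same inverse-variance idea sketched earlier.

```python
import math

def pool(effects, std_errors):
    # Fixed-effect, inverse-variance pooled estimate and its standard error.
    weights = [1.0 / se ** 2 for se in std_errors]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return est, math.sqrt(1.0 / sum(weights))

# Hypothetical log relative risks (negative = screening looks beneficial).
studies = {
    "trial A": (-0.05, 0.10),   # judged high quality by both reviews
    "trial B": (-0.02, 0.12),   # judged high quality by both reviews
    "trial C": (-0.25, 0.08),   # included only under broader criteria
    "trial D": (-0.30, 0.15),   # included only under broader criteria
}

strict = ["trial A", "trial B"]
broad = list(studies)

for label, included in [("strict criteria", strict), ("broad criteria", broad)]:
    effects = [studies[name][0] for name in included]
    ses = [studies[name][1] for name in included]
    est, se = pool(effects, ses)
    lo, hi = est - 1.96 * se, est + 1.96 * se
    print(f"{label}: pooled log RR = {est:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

With these made-up numbers, the strict criteria give a confidence interval that straddles zero while the broad criteria do not. The arithmetic cannot tell you which inclusion rule is right; that judgment call is exactly where the subjectivity enters.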

A commentary on a different controversy reminds us that the process of producing a systematic overview is not so simple that you just turn the crank and out pops the result.

“In the end there is no escape from a return to “the expert,” who tells us which trial to believe, not only on the basis of methodology but also on the basis of insights in pathophysiology, pharmacology, and perhaps type of publication (supplements, special interest or “throw away” journals, etc). All that we can ask from the expert is a careful explanation of what arguments he or she used in accepting or dismissing the evidence from certain trials." Jan P Vandenbroucke, Experts’ views are still needed, BMJ 1998;316:469 (7 February)

There is also a mistaken belief among some that if you can reduce something to a quantitative value, you have produced an objective evaluation. There is a famous quote by Lord Kelvin,

“In physical science the first essential step in the direction of learning any subject is to find principles of numerical reckoning and practicable methods for measuring some quality connected with it. I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science, whatever the matter may be." [Source: Zapoto Productions website][zap1]

and another by Leonardo da Vinci

“No human investigation can be called real science if it cannot be demonstrated mathematically."

but as much as I would love to believe these quotes, I know in my heart that I can’t. I always have to remind myself of the competing quote,

“The government is very keen on amassing statistics. They collect them, add them, raise them to the nth power, take the cube root and prepare wonderful diagrams. But you must never forget that every one of these figures comes in the first instance from the village watchman, who just puts down what he damn well pleases." Sir Josiah Stamp [Source: Broken link]

and

“Not everything that can be counted counts, and not everything that counts can be counted." - Albert Einstein

The tendency to view a statistic as an objective measure is best tempered by reading the excellent book “Damned Lies and Statistics: Untangling Numbers from the Media, Politicians, and Activists” (Best 2001).

“Every statistic must be created, and the process of creation always involves choices that affect the resulting number and therefore affect what we understand after the figures summarize and simplify the problem. People who create statistics must choose definitions-they must define what it is they want to count-and they must choose their methods-the ways they will go about their counting. Those choices shape every good statistic, and every bad one. Bad statistics simplify reality in ways that distort our understanding, while good statistics minimize that distortion. No statistic is perfect, but some are less imperfect than others. Good or bad, every statistic reflects its creators’ choices." Source: Best 2001, page 161

Finally, while EBP is a good methodology, it is always subject to being hijacked by groups with hidden agendas. There is a term

“Sackettisation … the artificial linkage of a publication to the evidence-based medicine movement in order to improve sales." Source: BMJ 2000;320:1283

that can apply equally to people with other motivations.

Conclusion.

The post-modern criticism of Evidence Based Practice is grossly overstated. EBP has flaws and an appreciation of post-modern philosophy can help you recognize these flaws. But EBP is a vast improvement over the previous reliance on a small group of experts. EBP is broadly democratic. It is open, transparent, and repeatable. It is able to recognize its own flaws, and can take actions to correct itself.

Bibliography

Phil Alderson, Iain Chalmers. Survey of claims of no effect in abstracts of Cochrane reviews. BMJ. 2003;326(7387):475. Available at: http://www.bmj.com/content/326/7387/475.short.

Joel Best. Damned Lies and Statistics: Untangling Numbers from the Media, Politicians, and Activists. 1st ed. University of California Press; 2001. More information at the publisher’s website.

C Bonell. Evidence-based nursing: a stereotyped view of quantitative and experimental research could work against professional autonomy and authority. J Adv Nurs. 1999;30(1):18-23. Article is behind a paywall

Melissa M. Brown, Gary C. Brown, Sanjay Sharma. Evidence-Based To Value-based Medicine. 1st ed. American Medical Association Press; 2005. Available through Amazon

John Concato, Nirav Shah, Ralph I. Horwitz. Randomized, Controlled Trials, Observational Studies, and the Hierarchy of Research Designs. N Engl J Med. 2000;342(25):1887-1892. Available in html format

John De Simone. Reductionist inference-based medicine, i.e. EBM. J Eval Clin Pract. 2006;12(4):445-449. Article is behind a paywall

M Egger, T Zellweger-Zähner, M Schneider, et al. Language bias in randomised controlled trials published in English and German. Article is behind a paywall

Gunther Eysenbach, Per Egil Kummervold. “Is Cybermedicine Killing You?” - The Story of a Cochrane Disaster. J Med Internet Res. 2005;7(2):e21. Available in html format

Ben Goldacre. Objectionable ‘objectives’. The Guardian. 2006. Available in html format

Dave Holmes, Amélie Perron, Gabrielle Michaud. Nursing in corrections: lessons from France. J Forensic Nurs. 2007;3(3-4):126-131. Article is behind a paywall

Dave Holmes, Dan Warner. The anatomy of a forbidden desire: men, penetration and semen exchange. Nurs Inq. 2005;12(1):10-20. Article is behind a paywall

David Isaacs, Dominic Fitzgerald. Seven alternatives to evidence based medicine. BMJ. 1999;319(7225):1618. Available in html format.

Todd D. Jick. Mixing Qualitative and Quantitative Methods: Triangulation in Action. Administrative Science Quarterly. 1979;24(4):602-11. Article is behind a paywall

Salla A Munro, Simon A Lewin, Helen J Smith, et al. Patient Adherence to Tuberculosis Treatment: A Systematic Review of Qualitative Research. PLoS Med. 2007;4(7):e238. Available in html format

Narayana S Murali, Hema R Murali, Paranee Auethavekiat, et al. Impact of FUTON and NAA bias on visibility of research. Mayo Clin. Proc. 2004;79(8):1001-1006. Article is behind a paywall

Ole Olsen, Philippa Middleton, Jeanette Ezzo, et al. Quality of Cochrane reviews: assessment of sample from 1998. BMJ. 2001;323(7317):829-832. Available in html format

Amélie Perron, Carol Fluet, Dave Holmes. Agents of care and agents of the state: bio-power and nursing practice. J Adv Nurs. 2005;50(5):536-544. Article is behind a paywall

G. Rolfe. Judgements without rules: towards a postmodern ironist concept of research validity. Nurs Inq. 2006;13(1):7-15. Article is behind a paywall

David L. Sackett. Pronouncements about the need for “generalizability” of randomized controlled trial results are humbug. Control Clin Trials. 2000;21:82S.

Gordon C S Smith, Jill P Pell. Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials. BMJ. 2003;327(7429):1459-1461. Article is behind a paywall

H T Stelfox, G Chua, K O’Rourke, A S Detsky. Conflict of interest in the debate over calcium-channel antagonists. N. Engl. J. Med. 1998;338(2):101-106. Available in html format

Sharon E. Straus, Finlay A. McAlister. Evidence-based medicine: a commentary on common criticisms. CMAJ. 2000;163(7):837-841. Available in html format

Sarah Winch, Debra Creedy, and Wendy Chaboyer. Governing nursing conduct: the rise of evidence-based practice. Nurs Inq. 2002;9(3):156-161. Article is behind a paywall

You can find an earlier version of this page on my old website.