Special Article

What the Surgeon of Tomorrow Needs to Know About Evidence-Based Surgery

Ronald V. Maier, MD

Author Affiliations: Division of Trauma and General Surgery, Harborview Medical Center, and Department of Surgery, University of Washington, Seattle.


Arch Surg. 2006;141(3):317-323. doi:10.1001/archsurg.141.3.317.

Professionalism is at the core of surgical practice. A major component of professionalism is a lifelong involvement in continuing medical education to ensure the physician is optimally prepared to make the decisions necessary to provide excellent patient care. However, with the explosion of information, new technology, and advanced surgical techniques, it has become virtually impossible to keep up with this wealth of new knowledge. Traditional didactic continuing education techniques have been shown to be inadequate to cause appropriate changes in practice management. It has become necessary to significantly alter our implementation of new knowledge in the clinical arena to ensure optimal patient care.

A major mechanism for translation of new knowledge to the bedside is evidence-based medicine (EBM). There is currently sufficient evidence that the use of EBM in the appropriate setting leads to an improvement in patient care and, in most cases, a decrease in misuse of expensive and critical medical resources. However, appropriate training is needed for optimal use of EBM. Not all EBM is of equal quality. Numerous analyses, particularly systematic surveys, have subsequently been shown to be inaccurate and/or incomplete, thus leading to inappropriate patient care. Evidence-based medicine is not cookbook medicine. Practicing EBM requires a highly trained, astute, and experienced clinician using insightful and objective analyses of the best clinical information available to synthesize the optimal patient care plan. It is imperative that the surgeon of today become familiar with the concepts of EBM. Physicians must be involved to ensure that the implementation of EBM is accomplished appropriately to achieve the best outcome for each individual patient.

A challenge for every surgical specialist is to meet the increasing expectations of quality of care from patients and society. With increasing recognition of these expectations, professional societies and organizations that oversee training, testing, and certification have developed an infrastructure of competencies required of any physician or surgeon. Among these is professionalism, with its long-standing tenet of providing only the highest quality care by any individual physician for each patient. An explicit component of professionalism is the lifelong commitment to continuing medical education and the application of this knowledge, along with extensive empirical knowledge, to the optimal care of the individual patient. The long-standing approach to educational updates via traditional continuing medical education has proven inadequate to significantly affect either physician practices or the health outcomes of their patients.1 The explosive development of new knowledge in care, technical applications, and holistic treatment of the patient, along with an ingrained, persistent resistance among physicians to adopting evidence-based data, means that traditional approaches to modifying clinical practice are insufficient. In addition, although many providers believe that they do apply the latest evidence in the care of their patients, there are ample data to conclude that they do not. Considerable lags exist between the confirmation that an evidence-based therapy is proven beneficial and the ultimate translation of the practice into the clinical care of the individual patient.24 Simultaneously, ineffective therapies continue to be used despite evidence that they should not be, often leading to potential harm for the patient and the use of expensive resources, producing unnecessary cost to society.5,6

The phrase evidence-based medicine was developed by a group of physicians at McMaster University in Hamilton, Ontario, in the early 1990s.7,8 But what is EBM? The meaning of EBM has changed since that time and continues to evolve.9 Currently, the practice of EBM involves the integration of clinical expertise or “professional wisdom” with the best available external clinical evidence in making decisions about how to deliver care to an individual patient. How well EBM will succeed in translating clinical research into practice or the ultimate impact of EBM on patient care and outcomes is not known.10,11 However, currently, EBM is not, by any measure, adequately integrated into today's surgical practice.12,13

Physicians frequently find themselves confronted with choices in the care of their patients but are unable to predict the ultimate consequences. Thus, it becomes imperative that physicians make conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. This approach to patient care is the practice of EBM, using a process of lifelong, self-directed learning.7,14,15 Practicing EBM involves identifying gaps in our knowledge, conducting efficient literature searches to fill those gaps, critically appraising the evidence, and applying the evidence to direct patient care, a process that outstanding physicians have employed for decades. Professional wisdom involves use of both clinical skills and clinical judgment, which are vital for determining whether the evidence (eg, the EBM guideline) applies to the individual patient at all and, if so, what the best evidence is. The evidence can range from a randomized controlled trial (RCT) or systematic review, to a case-control study, case series, a consensus of old professors, individual cases (anecdotes), or something "a friend (or grandma) once told me."
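As a concrete illustration of the "efficient literature search" step, the short Python sketch below queries Medline through PubMed's public E-utilities esearch interface and restricts the results to randomized controlled trials. It is only a minimal sketch: the clinical question, the query terms, and the helper function name are hypothetical and are not taken from this article.

```python
# Minimal sketch of a focused literature search via NCBI's public E-utilities
# "esearch" endpoint. The clinical question and query terms are hypothetical.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed_rcts(question_terms: str, retmax: int = 20) -> list[str]:
    """Return PubMed IDs of randomized controlled trials matching the terms."""
    query = f"({question_terms}) AND randomized controlled trial[pt]"
    params = urllib.parse.urlencode(
        {"db": "pubmed", "term": query, "retmode": "json", "retmax": retmax}
    )
    with urllib.request.urlopen(f"{EUTILS}?{params}") as resp:
        return json.load(resp)["esearchresult"]["idlist"]

if __name__ == "__main__":
    # Hypothetical focused question: perioperative beta-blockade in noncardiac surgery.
    pmids = search_pubmed_rcts("perioperative beta-blocker AND noncardiac surgery")
    print(f"{len(pmids)} trial records retrieved; first PMIDs: {pmids[:5]}")
```

The point of the sketch is simply that a well-formed, filtered query turns "the literature" into a manageable set of potentially relevant trials; critical appraisal of what is retrieved still requires the clinical judgment described above.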

Obviously, not all evidence is created equal. In assessing scientific evidence, methodologies have been developed to identify levels of evidence, or classes of quality, of clinically derived evidence.16,17 Randomized controlled trials (true experiments) or well-designed meta-analyses (systematic reviews) are considered best. Next are quasi-experimental cohort studies using comparison groups, typically with good case mix adjustment. Retrospective (pre-post) comparisons are less valid and are followed by noncausal correlational studies, case series, and single case anecdotes.18 Level 1 evidence consists predominantly of prospective RCTs and systematic reviews of RCTs and is considered the ideal evidence. Levels 2 and 3 involve prospective clinical trials, including cohort and case-control studies, and systematic reviews of each type. Level 4 evidence includes case series and poor quality cohort or case-control studies. Level 5 includes retrospective data collection analyses, case reviews, and consensus expert opinions or unproven physiologic or basic observations. These criteria are then used to develop grades of recommendation reflecting the level of confidence in the evidence: grade A, consistent level 1 studies; grade B, consistent level 2 and 3 studies or extrapolations from level 1; grade C, level 4 studies or extrapolations from level 2 and 3 studies; and grade D, level 5 evidence, primarily expert opinions or inconsistent data (Table 1 and Table 2).19

Table 1. Assessing Levels of Evidence19
Table 2. Recommendation Based on Evidence19
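Purely as an illustration of the hierarchy summarized in Tables 1 and 2, the Python sketch below treats the scheme as a simple mapping from study design to level of evidence and from a body of evidence to a grade of recommendation. It is a deliberately simplified assumption of how such a rule might be coded, not an official scoring algorithm.

```python
# Simplified, illustrative mapping of study designs to levels of evidence and
# of a body of evidence to a grade of recommendation (not an official tool).

EVIDENCE_LEVEL = {
    "systematic review of RCTs": 1,
    "randomized controlled trial": 1,
    "cohort study": 2,
    "case-control study": 3,
    "case series": 4,
    "expert opinion": 5,
}

def recommendation_grade(levels: list[int]) -> str:
    """Crudely grade a body of evidence from its individual study levels."""
    best = min(levels)                  # lower number = stronger evidence
    consistent = len(set(levels)) == 1  # very rough consistency check
    if best == 1 and consistent:
        return "A"  # consistent level 1 evidence
    if best <= 3:
        return "B"  # best evidence at level 1-3, but not consistently level 1
    if best == 4:
        return "C"  # best available evidence at level 4
    return "D"      # only level 5 evidence

studies = [EVIDENCE_LEVEL["randomized controlled trial"],
           EVIDENCE_LEVEL["systematic review of RCTs"]]
print(recommendation_grade(studies))  # -> "A"
```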

Well-conducted clinical trials form the cornerstone of evidence-based practice and have the potential to affect numerous patients and to modify medical care of entire segments of the population. However, sound clinical research, particularly RCTs, remains scarce, particularly in the surgical literature.10,20 Less than 5% of surgical interventions are currently supported by well-conducted RCTs. Case series reports and other retrospective data make up 80% to 85% of the published clinical evidence supporting practice standards.21 The view of society is represented by the cover of the New York Times Magazine, which stated, "half of what doctors know is wrong."22

To compound this lack of level 1 supporting evidence, surgery continues to evolve rapidly, with a near-overwhelming speed of advances in the basic understanding of disease processes, cutting-edge techniques, and pharmacologic therapy. It is no longer acceptable to practice surgery based on the age-old philosophy of "the way I learned it." Implementation of EBM leads to guidelines and protocols that minimize the recognized marked variations in care and the accompanying marked variations in patient outcome. Through the EBM process, minimizing systematic errors leads to an overall improvement in the acute and long-term outcome of our surgical patients.35

However, one must remember that EBM is not cookbook medicine. Evidence informs but does not replace clinical expertise. Evidence-based medicine requires extrapolation to the specific clinical setting and the patient's unique biology and disease process. Also, EBM is not cost-cutting medicine. In fact, when efficacy becomes paramount, costs may rise, not fall, particularly for select patient populations and in specific geographic practice patterns. Lastly, EBM is not restricted solely to randomized trials and meta-analyses. The practice of EBM requires tracking down the “best external evidence” available to apply to individual clinical questions.10 Some clinical questions do not require randomized trials. Surgical treatments may of necessity rely on observational or cohort studies or standards of empirical practice proven to be safe over time.23 The benefit of a parachute in surviving a jump from a high-flying aircraft has never been proven by an RCT.

The implementation of evidence-based clinical care is a challenging, multifaceted problem. Applying EBM to patient care represents a 4-step process: creating evidence, summarizing evidence, disseminating evidence, and implementing evidence into practice.7,15,24,25 Problems occur at each step that interfere with and delay this translation process. Evidence of treatment effectiveness comes in several formats. However, by far the best evidence of efficacy comes from RCTs.16,17 Unfortunately, there is a paucity of rigorous RCTs underlying clinically applied treatment plans. To a large degree, treatment plans are based on less rigorous observational case-control studies or small series and, in many cases, empirical observations, all of which have been shown to lead to erroneous conclusions or outright inappropriate treatment.10,18,24,26 In the realm of direct clinical care, the most common valid approach is large observational studies with sufficiently long follow-up periods to truly assess the impact on prognosis. In addition, large cross-sectional or cohort studies can evaluate the validity of variously applied clinical care paradigms. Because of the extreme expenses involved, RCTs have primarily focused on industry-funded testing of efficacy for new therapeutic agents. However, even in this setting, the population assessed is frequently tightly focused and limited. Duration, dosing, and other critical components are frequently inadequately explored because of the constraints of time, expense, and population acceptance. Subsequent off-label use frequently becomes a problem and a potential risk. Similarly, rigorous evaluations of the reliability and utility of diagnostic tests are rarely conducted; those that exist usually rely on surrogate end points and not truly on patient outcomes. These analyses are also infrequently blinded, thus allowing for the introduction of significant bias. The ultimate test of a diagnostic modality, its capacity to both modify therapy and improve outcome, is not explored and is rarely feasible in an RCT format.27,28

The second phase involves assessment of the available evidence on which to base clinical care. Because of the frequent absence of level 1 evidence, the surgeon must use other approaches to assess the quality of the evidence. This is usually accomplished with a rigorous, systematic review to locate and assess all relevant studies addressing a given question of clinical care.16,29-32 These reviews require a search strategy that is sensitive enough to ensure proper capture and retrieval of all relevant published trials. To be comprehensive, multiple strategies and mechanisms are used. Medline and the Cochrane Clinical Trials Registry are the major databases and summaries frequently used by the clinician or investigator.17,33 Randomized controlled trials have been specifically indexed and searchable in Medline since 1991, and computer searches can frequently identify RCTs published before this date. Despite these approaches, an ongoing problem for all systematic surveys is the inability to encompass and identify all pertinent studies regarding a clinical question.34 Publication bias frequently leads to an overestimate of benefit. The next major challenge is that, in the absence of a large RCT with narrow confidence intervals, any conclusion frequently requires the analytical handling of multiple studies, often with conflicting outcomes, using an evidence-based summary or a meta-analysis approach. Evidence-based summaries, using commonly accepted data quality scoring systems, present the current state of the evidence and the level of confidence in the data without combining results in a quantitative fashion.19,24 Numerous clinical guidelines have been developed based on this approach, and several major surgical specialty professional organizations have contributed to this process and product. Although these EBM guidelines have great appeal in theory, they unfortunately rapidly degrade into expert consensus, due primarily to the lack of high-quality clinical trials.10 In contrast, meta-analyses combine data from different studies to increase statistical power and arrive at a summary estimate (and degree of uncertainty) of the effectiveness of a particular intervention.24 The major problem with meta-analysis is that the rules of analysis are not always followed or strictly applicable; they depend on the studies included and thus can lead to conclusions that, at times, are found to be incorrect in subsequent RCTs or meta-analyses of additional data.35,36 Frequent problems with meta-analysis include heterogeneous patient populations resulting from widely varying inclusion criteria, inadequately sized populations, exclusions that alter the direction of the analysis, and a selective focus on particular findings and statistical significance, all of which can make the conclusions neither broadly applicable nor clinically relevant.
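For readers unfamiliar with how a meta-analysis "combines data from different studies" into a summary estimate with a degree of uncertainty, the sketch below shows the core arithmetic of one common approach, fixed-effect inverse-variance pooling of study effect estimates. The study values are invented for illustration; a real meta-analysis must also address the heterogeneity, publication bias, and study quality problems discussed above.

```python
# Core arithmetic of a fixed-effect, inverse-variance meta-analysis.
# The three "studies" below are fabricated for illustration only.
import math

def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance weighted pooled estimate with a 95% confidence interval.

    `estimates` are per-study effects on an additive scale (e.g., log odds
    ratios); `std_errors` are their standard errors.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

log_or = [-0.35, -0.10, -0.25]   # hypothetical log odds ratios (negative = benefit)
se = [0.20, 0.15, 0.30]          # hypothetical standard errors
pooled, pooled_se, (lo, hi) = pool_fixed_effect(log_or, se)
print(f"Pooled OR {math.exp(pooled):.2f} (95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```

The pooled confidence interval is narrower than that of any single study, which is precisely the gain in statistical power, and also precisely why errors in study selection or weighting propagate into a falsely confident conclusion.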

The process of meta-analysis continues to evolve, with methods being developed to allow more accurate evaluation despite the heterogeneity of the included studies.16,37 However, given the current inconsistencies and difficulties, it is difficult to use meta-analyses as the "gold standard." One must remember to always assess the quality of the data.26 A poorly designed study is worse than no study at all, and a poorly designed meta-analysis can be outright dangerous. Always ask whether your patient is similar to the study population.38 In assessing guidelines, one must identify the source and, importantly, recognize whether conflicts of interest, in addition to purely scientific assessment, underlie or drive the development of the guidelines, particularly when the available evidence appears to be minimal.29,30 In addition, in the realm of surgery, invasive techniques are not reliably tested by RCTs. The inability to truly compare and test rapidly evolving surgical techniques can lead to erroneous conclusions and actually delay implementation of the most effective therapy.23

Although guidelines historically relied primarily on consensus and expert opinion, most currently developed guidelines attempt to use the EBM approach, summarizing the published literature. Both evidence summaries and meta-analyses have been and will continue to be used to develop clinical practice guidelines.25,29-31,39 The Institute of Medicine uses similar mechanisms to systematically assess the literature and develop guidelines that assist practitioners in the care of their patients. Two other major sources for the development of evidence-based guidelines are the Agency for Healthcare Research and Quality (AHRQ) and the Cochrane Collaboration.33,40,41 The AHRQ is developing evidence-based practice guidelines focused on common, expensive, and significant clinical conditions, available at http://www.guideline.gov. The Cochrane Collaboration is an international effort to systematically identify, retrieve, and formally summarize, primarily in the form of meta-analyses, the evidence for all interventions currently known in medicine. These guideline derivatives are available at http://www.cochrane.org (Table 3).

Table 3. Evidence-Based Medicine Sources

A major stumbling block to implementing EBM is the dissemination of the summarized evidence and subsequent guidelines to physicians in a useful and acceptable format. Lack of easy and timely access to the evidence remains a significant hurdle for the practitioner. Various methods of dissemination have been attempted, including routine and highlighted publications from national organizations such as the American College of Surgeons (ACS), http://www.facs.org/tsi/index.html; specialty journal publications for specific fields; and web-based clearinghouses, such as http://www.guideline.gov/ from the AHRQ. In addition, traditional continuing medical education continues to be heavily relied on as a means to disseminate EBM to providers. Despite this variety of efforts, it is remarkable that, while evidence is increasingly available and documented as widely disseminated to the appropriate physician population, there has been a lack of change in practice based on this information.26 Furthermore, dissemination of this evidence by the traditional, time-honored mechanisms of continuing medical education has been shown to be incapable of changing practice patterns.1 For example, among patients with heart attacks who are considered "ideal candidates" for β-blockers, the proportion who actually received the drug in various settings ranged from 5% to 92%.4,42 Other guidelines, such as mammography once every 2 years for women aged 65 to 69 years and hemoglobin monitoring and annual eye examinations for diabetic patients, are largely ignored or unachievable.42

The ability to implement evidence-based guidelines is currently a critical issue. Improved education in searching for and appraising evidence during medical school and residency training, as well as full access to the information needed to assess and deliver evidence-based patient care, is lacking for all physicians, and for surgical specialists in particular.43 Necessary skills include defining the patient problem exactly, effectively searching and critically assessing current information, and appropriately applying the knowledge to the care of an individual patient.44,45

A lack of implementation of evidence-based care involves both commission and omission at the level of the physician and the institution. Changing physician behavior, especially in nonacademic environments, is particularly challenging, and no single method, even in an optimal environment, has ensured success. The major limitations have been the cost- and labor-intensive nature of these efforts. In addition, although aggressive intervention has been shown to lead to an increase in knowledge and temporary change, there is little documentation that changes in clinical practice persist. Currently, 3 techniques are thought to be most effective and have been used in various settings as part of a multifaceted intervention program (combining 2 or more of the mechanisms): data collection on the use of EBM guidelines with direct feedback of results to individual practitioners; academic detailing (in which an expert visits physicians individually to provide objective information on a particular topic in brief educational sessions); and reminders or prompting through electronic medical record flags, mass institutional mailings, and other mechanisms.43 To create an environment of easy access and ongoing point-of-practice education that helps maintain practice change, linkages to personal digital assistants and other direct-link electronic medical record constructs are being developed to tie EBM to the ongoing care of an individual patient.46 Eliminating the need for proactive searches for evidence and replacing them with automatic, passive linkage to EBM in the patient's electronic medical record at the clinical site might overcome a large portion of physician resistance by minimizing time and effort requirements. However, although demonstrated to be effective in varying scenarios, none of these interventions has been uniformly evaluated in a significant number of surgical specialist settings, and their overall effectiveness is far from conclusive. It is entirely possible that the efficacy of any of these approaches may vary with the patient's disease, the culture of the providers, and the clinical setting. Until effectiveness has been established across each of these confounding variables and settings, there will be a need for continuing evolution of mechanisms to optimize change in physician behavior in response to evidence-based patient care. With the explosion of knowledge and the ongoing variability of EBM results, highly filtered and regulated systems are needed to provide succinct, current, and thoroughly objective EBM summaries for clinician assimilation. In addition, well-controlled, prospective studies in the various institutional (academic vs community) settings and in each of the surgical specialties are required to ensure that optimal approaches are chosen for this potentially very expensive process of behavior modification. It is imperative that surgeons take the lead, not only in generating data that test clinical care, but also in accepting and implementing appropriate new guidelines: in addition to helping their patients, they can serve as role models for all physicians. Surgeons have a tradition of adopting new approaches and techniques to improve care, and implementing EBM is one more such goal.
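As a purely hypothetical illustration of the reminder-and-prompting mechanism described above, the sketch below shows how a passive rule attached to an electronic medical record might flag an applicable but unmet guideline at the point of care. The Patient fields, rule conditions, and guideline wording are assumptions for demonstration, not an actual EMR interface or an endorsed rule set.

```python
# Hypothetical sketch of passive EMR guideline reminders; fields and rules are
# invented for illustration and are not an actual EMR API or clinical rule set.
from dataclasses import dataclass, field

@dataclass
class Patient:
    age: int
    scheduled_procedure: str | None
    comorbidities: set[str] = field(default_factory=set)
    active_orders: set[str] = field(default_factory=set)

def guideline_flags(pt: Patient) -> list[str]:
    """Return reminder messages for guideline items that appear applicable and unmet."""
    flags = []
    if pt.scheduled_procedure:
        # Example rule: perioperative beta-blockade for a cardiac-risk patient.
        if ("coronary artery disease" in pt.comorbidities
                and "beta-blocker" not in pt.active_orders):
            flags.append("Consider perioperative beta-blocker (EBM guideline).")
        # Example rule: venous thromboembolism prophylaxis before major surgery.
        if "VTE prophylaxis" not in pt.active_orders:
            flags.append("No VTE prophylaxis ordered for the scheduled operation.")
    return flags

pt = Patient(age=68, scheduled_procedure="colectomy",
             comorbidities={"coronary artery disease"})
for msg in guideline_flags(pt):
    print("REMINDER:", msg)
```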

National surgical organizations have increasingly focused on using EBM to enhance the practice and outcomes of surgical care. The ACS is exemplary in this process but is not alone; efforts are occurring in all surgical specialties through national and regional societies and organizations. Since 1913, when Ernest Amory Codman, a founder of the ACS, urged the tracking of patient outcomes to assess the quality of surgical care, the college has increasingly focused on the impact of evidence-based practice on surgical outcomes.47 In 1922, Charles L. Scudder, as chair of the fracture committee, developed measurements of outcome after discharge from the hospital. In 1995, the ACS Board of Regents convened a task force on the use of outcomes in surgical practice in the maintenance of competency, and as a consequence of these efforts, in 2001 the ACS was reorganized so that 1 of its 4 major components became the Office of Evidence-Based Surgery. These efforts are mirrored by all surgical specialties and their supporting organizations, including residency review committee requirements and board certification eligibility, under the aegis of the American Board of Medical Specialties. Surgical specialties without American Board of Medical Specialties certification are guided by the efforts of oversight groups, such as the Society of American Gastrointestinal and Endoscopic Surgeons, which sets guidelines for laparoscopic training programs. Organizations such as the Eastern Association for the Surgery of Trauma have developed evidence-based guidelines for 17 different patient injury categories (Table 3). The Brain Trauma Foundation has, over the last 10 years, developed an evidence-based Neurotrauma Management Guideline (Table 3). In addition, voluntary organizations such as the Northern New England Cardiovascular Study Group and Intermountain Health Systems have been able to create an administrative infrastructure to implement best practices and have reduced morbidity and mortality.48

At the payor level, the use of evidence-based guidelines has been 1 attempt by the Leapfrog Group and other major payors to capture quality care.49 Although implementation is frequently linked to an expectation of simultaneous savings, there may not necessarily be a cost saving. However, not following guidelines has been shown to be wasteful, because unproven treatments offer no benefit or, at best, no proven benefit; examples include prescribing antibiotics for routine upper respiratory viral infections, using expensive brand-name proton pump inhibitors instead of generic H2 blockers for acid reflux, and using brand-name analgesics instead of generic over-the-counter agents. At the federal level, the AHRQ has developed experimental approaches in general medicine for the improvement of patient care based on EBM through the development of PORTs (patient outcomes research teams) and EPCs (evidence-based practice centers).

A major national EBM effort specific to all surgical practitioners is the Surgical Care Improvement Project, which was initiated in August 2005 (http://www.medqic.org/scip). This multiyear project is being launched nationally by the Centers for Medicare & Medicaid Services (CMS) to reduce morbidity and mortality from surgical complications. The focus of this project is the prevention of surgical site infections, building on the previous accomplishments of reducing perioperative cardiac, respiratory, and venous thromboembolic complications. The goal of this project is to reduce surgical complications by 25% nationally by the year 2010. The Surgical Care Improvement Project is a national quality partnership of numerous organizations committed to improving patient safety by reducing postoperative complications, based on data showing that of the 42 million operations performed in the United States each year, up to 40% have associated postoperative complications, such as infection, thromboembolic events, respiratory complications, and adverse cardiac events. Through previous evaluations, a significant percentage of these complications are known to be preventable. Target areas are surgical site infections, adverse cardiac events, venous thromboembolism, and postoperative pneumonia. Surgical site infections account for approximately 15% of all hospital-acquired infections and are among the most common complications of surgical care. The evidence-based interventions to prevent surgical site infections are (1) administering prophylactic antibiotics within 1 hour prior to surgery, (2) selecting appropriate prophylactic antibiotics according to clinical guidelines, (3) discontinuing prophylactic antibiotics within 24 hours after the end of surgery, and (4) controlling perioperative serum glucose in major cardiac surgical patients. Adverse cardiac events occur in 2% to 5% of patients undergoing noncardiac surgery and in as many as 34% of patients undergoing vascular surgery. Interventions consist of administration of β-blockers to (1) eligible major noncardiac surgical patients during the perioperative period; (2) patients with coronary artery disease and arteriosclerotic cardiovascular disease during the perioperative period; and (3) surgical patients who require chronic β-blocker therapy. Venous thromboembolism, which occurs as deep vein thrombosis after approximately 25% of all major surgical procedures performed without prophylaxis, carries a pulmonary embolism rate of 7%. The assessment of risk and application of appropriate perioperative prophylaxis for venous thromboembolism are also a major focus. Lastly, postoperative pneumonia occurs in 9% to 40% of patients, with an associated mortality of 30% to 45%. Elevation of the head of the bed to 30° or greater has been documented to prevent this complication following major surgical procedures requiring postoperative ventilator support. The Surgical Care Improvement Project partnership believes that a meaningful reduction in complications will require surgeons, along with the entire health care team, including hospital executives, to work together to make surgical care improvement a high priority.
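To make the process measures concrete, the hypothetical sketch below shows how the antibiotic-timing elements quoted above (prophylactic antibiotic within 1 hour before incision, discontinued within 24 hours after the end of surgery) might be checked against recorded timestamps for a single case. The function name, field names, and sample times are assumptions, not part of the Surgical Care Improvement Project specification.

```python
# Hypothetical compliance check for two antibiotic-timing process measures;
# names and sample data are invented for illustration.
from datetime import datetime, timedelta

def antibiotic_timing_compliance(antibiotic_start: datetime,
                                 incision: datetime,
                                 antibiotic_stop: datetime,
                                 surgery_end: datetime) -> dict[str, bool]:
    """Evaluate antibiotic timing for one surgical case."""
    return {
        "given_within_1_hour_before_incision":
            timedelta(0) <= incision - antibiotic_start <= timedelta(hours=1),
        "stopped_within_24_hours_after_surgery":
            antibiotic_stop - surgery_end <= timedelta(hours=24),
    }

case = antibiotic_timing_compliance(
    antibiotic_start=datetime(2006, 3, 1, 7, 40),
    incision=datetime(2006, 3, 1, 8, 15),
    antibiotic_stop=datetime(2006, 3, 1, 20, 0),
    surgery_end=datetime(2006, 3, 1, 11, 30),
)
print(case)  # both measures met for this hypothetical case
```

Aggregating such case-level checks across an institution is what produces the kind of compliance rates on which projects like this, and the payor report cards discussed below, are built.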

Another approach to enhancing patient care and stimulating behavior modification in physician practice is exemplified by the CMS development of publicly released "report cards" on patient care quality and on patient perspectives on the quality of care received during a hospital stay.50,51 A component of the assessment of quality of care will include use of, and compliance with, EBM guidelines such as those outlined earlier, including use of β-blockers, perioperative antibiotics, and prophylaxis for venous thromboembolism. Medicare is requesting public comment on the standard information to be released as a publicly reported assessment of the quality of health care. The CMS hopes to provide useful information about the quality of care patients receive from their physicians to help patients make informed decisions about their care. These comparisons will identify strengths and weaknesses of health care providers so they can improve the quality of care their patients receive. Currently, the CMS has submitted standardized measures called Ambulatory Care Measures to the National Quality Forum for review and comment (http://www.qualityforum.org). Commitment to these measures will be used to pay physicians to monitor, report on, and improve the care provided to Medicare beneficiaries. An additional set of survey questions to measure patient perspectives on the care they receive, called the Hospital Consumer Assessment of Healthcare Providers and Systems, was also submitted to the National Quality Forum and will be published in the Federal Register for public comment and input. Although the initial focus is primarily the care available in the nation's hospitals, there will also be a subsequent focus on the quality of care provided in doctors' offices by individual physicians. As part of the Hospital Quality Initiative, the CMS intends to publicly report a broad set of hospital clinical measures along with measures of patient perspectives regarding hospital care. The Hospital Consumer Assessment of Healthcare Providers and Systems has been pilot-tested in 3 states and in small-scale field tests, with extensive psychometric analysis and confirmation of validity. The focus of the survey is communication, responsiveness of staff, pain control, cleanliness, and discharge information. The CMS began data collection by hospitals using the Hospital Consumer Assessment of Healthcare Providers and Systems in 2005.

The Hospital Consumer Assessment of Healthcare Providers and Systems initiative is merely an extension of the already functioning National Hospital Quality Alliance begun in 2003, a public/private effort on quality reporting that provides the support for the development of Medicare's Hospital Quality Initiative. In December 2004, the CMS began posting updated quality information reported by nearly 4000 hospitals on 10 hospital measures (data available at http://www.cms.hhs.gov). The hospital quality data are now available on the CMS Web site for consumers at http://www.medicare.gov. The proposed additional focus on ambulatory care quality should measure improvements of care for such clinical conditions as coronary artery disease, heart failure, diabetes, high blood pressure, asthma, behavioral health, prenatal care, and preventive care by using EBM guidelines. The CMS expected the approved markers to be incorporated into ongoing quality improvement efforts and demonstrations in 2005. Subsequently, these data will be extended to capture specific physician outcomes, including all surgical specialties, and, ultimately, released as a physician-specific quality of care report card, including the use of EBM.

An additional major force that will be driving the implementation of EBM by physician providers will be payor-induced financial incentives (ie, the carrot). In December 2004, the Medicare Payment Advisory Commission agreed “to recommend Congress enact a pay-for-performance for hospitals, physicians. . . . ” Although this process is beginning in the adult medicine outpatient ambulatory setting, there is no doubt that it will quickly evolve to include bonus pay for delivering higher-quality surgical care, which will most likely be defined in a variety of ways. The Institute of Medicine, in To Err Is Human: Building a Safer Health System, identified that most third-party payment systems do not recognize and reward safety or quality.52 This concept of pay-for-performance is currently being tested in several scenarios. The premise is that providing financial bonuses for regularly following guidelines will diminish the variation in delivery of care for any given condition, and ultimately the payor will save money. Again, this leverage is felt necessary because of the ongoing inability to induce widespread changes in physician practice, even though there is increasing and overwhelming evidence that physicians who follow EBM guidelines provide care with better outcomes. Financial inducements are thought to provide the ultimate leverage against the extreme resistance to change in the physician community.

The implementation of pay-for-performance has initially centered on detailed expectations for ambulatory outpatient care, including breast and cervical cancer screening, diabetes care, well-child visits, use of asthma medications, and use of appropriate generic drugs. Each proposal uses individualized metrics, but all are based currently on the HEDIS (Health Plan Employer Data and Information Set) metrics developed by the National Committee for Quality Assurance. In January 2005, a 3-year Medicare research project called the Physician Group Practice Demonstration offered extra financial rewards from Medicare to 11 large multispecialty practices for better coordination of part A and part B services. Medicare will allocate bonus payments between efficiency improvements and measurable improvements in quality-of-patient-care processes and outcomes as shown by national claims databases. The criteria on which the assessment will be judged appear at http://www.ncqa.org. With the “800-pound gorilla,” Medicare, driving this process, it is inevitable that it will expand throughout the medical field, including production and pay-for-performance for surgical specialties as well (Table 3). Currently, the ACS, on behalf of all surgeons, is pushing to put quality measures in the hands of surgeons and their institutions by applying the National Surgical Quality Improvement Program, developed by the Veterans Administration Health System, to the private sector. Currently, the National Surgical Quality Improvement Program is the only prospective, peer-controlled, risk-adjusted database for 30-day surgical outcomes. Implementation of the National Surgical Quality Improvement Program in the Veterans Administration system has been associated with significant improvement in both surgical morbidity and mortality throughout the system (http://www.nsqip.org).53

In conclusion, professionalism requires a surgeon to develop a lifetime commitment to continuing medical education. A competent surgeon reads the literature, but not all of it. Be discerning. For decisions about therapy for an individual patient, focus on RCTs; evidence-based, systematic reviews; and EBM guidelines. Using professional wisdom, apply what you read, understanding the limitations and applicability to your unique patient. Surgeons need to understand the process of EBM and incorporate it into their daily practices to optimize patient outcomes.

Correspondence: Ronald V. Maier, MD, Department of Surgery, University of Washington, Harborview Medical Center, 325 9th Ave, Box 35-9796, Seattle, WA 98104-2499 (ronmaier@u.washington.edu).

Accepted for Publication: August 1, 2005.

References

1. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-705.
2. Simpson E, Beck C, Richard H, Eisenberg MJ, Pilote L. Drug prescriptions after acute myocardial infarction: dosage, compliance, and persistence. Am Heart J. 2003;145:438-444.
3. Antman EM, Lau J, Kupelnick B, Mosteller F, Chalmers TC. A comparison of results of meta-analyses of randomized control trials and recommendations of clinical experts: treatments for myocardial infarction. JAMA. 1992;268:240-248.
4. Auerbach AD, Goldman L. Beta-blockers and reduction of cardiac events in noncardiac surgery: clinical applications. JAMA. 2002;287:1445-1447.
5. Hebert PC, Wells G, Blajchman MA, et al. A multicenter, randomized, controlled clinical trial of transfusion requirements in critical care: Transfusion Requirements in Critical Care Investigators, Canadian Critical Care Trials Group. N Engl J Med. 1999;340:409-417.
6. Corwin HL, Gettinger A, Pearl RG, et al. The CRIT Study: anemia and blood transfusion in the critically ill—current clinical practice in the United States. Crit Care Med. 2004;32:39-52.
7. Evidence-Based Medicine Working Group. Evidence-based medicine: a new approach to teaching the practice of medicine. JAMA. 1992;268:2420-2425.
8. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ. 1996;312:71-72.
9. Ubbink DT, Legemate DA. Evidence-based surgery. Br J Surg. 2004;91:1091-1092.
10. Naylor CD. Grey zones of clinical practice: some limits to evidence-based medicine. Lancet. 1995;345:840-842.
11. Hancock HC, Easen PR. Evidence-based practice: an incomplete model of the relationship between theory and professional work. J Eval Clin Pract. 2004;10:187-196.
12. Guller U, DeLong ER. Interpreting statistics in medical literature: a vade mecum for surgeons. J Am Coll Surg. 2004;198:441-458.
13. Toedter LJ, Thompson LL, Rohatgi C. Training surgeons to do evidence-based surgery: a collaborative approach. J Am Coll Surg. 2004;199:293-299.
14. Bhandari M, Devereaux PJ, Montori V, Cina C, Tandan V, Guyatt GH. Users' guide to the surgical literature: how to use a systematic literature review and meta-analysis. Can J Surg. 2004;47:60-67.
15. Cook DJ, Sibbald WJ, Vincent JL, Cerra FB. Evidence based critical care medicine: what is it and what can it do for us? Evidence Based Medicine in Critical Care Group. Crit Care Med. 1996;24:334-337.
16. Cook DJ, Sackett DL, Spitzer WO. Methodologic guidelines for systematic reviews of randomized control trials in health care from the Potsdam Consultation on Meta-Analysis. J Clin Epidemiol. 1995;48:167-171.
17. Chalmers I. The Cochrane Collaboration: preparing, maintaining, and disseminating systematic reviews of the effects of health care. Ann N Y Acad Sci. 1993;703:156-165.
18. Diehl LF, Perry DJ. A comparison of randomized concurrent control groups with matched historical control groups: are historical controls valid? J Clin Oncol. 1986;4:1114-1120.
19. Phillips B, Ball C, Sackett D, et al. Oxford Centre for Evidence-based Medicine Levels of Evidence. Oxford, England: Center for Evidence-Based Medicine; 2001.
20. Smith R. Where is the wisdom...? BMJ. 1991;303:798-799.
21. Hardin WD Jr, Stylianos S, Lally KP. Evidence-based practice in pediatric surgery. J Pediatr Surg. 1999;34:908-913.
22. Half of what doctors know is wrong. New York Times Magazine. March 15, 2003:1.
23. Kral JG, Dixon JB, Horber FF, et al. Flaws in methods of evidence-based medicine may adversely affect public health directives. Surgery. 2005;137:279-284.
24. Christakis DA, Davis R, Rivara FP. Pediatric evidence-based medicine: past, present, and future. J Pediatr. 2000;136:383-389.
25. Cook DJ, Greengold NL, Ellrodt AG, Weingarten SR. The relation between systematic reviews and practice guidelines. Ann Intern Med. 1997;127:210-216.
26. Feinstein AR, Horwitz RI. Problems in the "evidence" of "evidence-based medicine". Am J Med. 1997;103:529-535.
27. Jaeschke R, Guyatt G, Sackett DL. Users' guides to the medical literature. III. How to use an article about a diagnostic test. A. Are the results of the study valid? Evidence-Based Medicine Working Group. JAMA. 1994;271:389-391.
28. Jaeschke R, Guyatt GH, Sackett DL. Users' guides to the medical literature. III. How to use an article about a diagnostic test. B. What are the results and will they help me in caring for my patients? The Evidence-Based Medicine Working Group. JAMA. 1994;271:703-707.
29. Hayward RS, Wilson MC, Tunis SR, Bass EB, Guyatt G. Users' guides to the medical literature. VIII. How to use clinical practice guidelines. A. Are the recommendations valid? The Evidence-Based Medicine Working Group. JAMA. 1995;274:570-574.
30. Shaneyfelt TM, Mayo-Smith MF, Rothwangl J. Are guidelines following guidelines? The methodological quality of clinical practice guidelines in the peer-reviewed medical literature. JAMA. 1999;281:1900-1905.
31. Wilson MC, Hayward RS, Tunis SR, Bass EB, Guyatt G. Users' guides to the medical literature. VIII. How to use clinical practice guidelines. B. What are the recommendations and will they help you in caring for your patients? The Evidence-Based Medicine Working Group. JAMA. 1995;274:1630-1632.
32. Balk EM, Bonis PA, Moskowitz H, et al. Correlation of quality measures with estimates of treatment effect in meta-analyses of randomized controlled trials. JAMA. 2002;287:2973-2982.
33. Bero L, Rennie D. The Cochrane Collaboration: preparing, maintaining, and disseminating systematic reviews of the effects of health care. JAMA. 1995;274:1935-1938.
34. Haynes RB, Wilczynski N, McKibbon KA, Walker CJ, Sinclair JC. Developing optimal search strategies for detecting clinically sound studies in MEDLINE. J Am Med Inform Assoc. 1994;1:447-458.
35. LeLorier J, Gregoire G, Benhaddad A, Lapierre J, Derderian F. Discrepancies between meta-analyses and subsequent large randomized, controlled trials. N Engl J Med. 1997;337:536-542.
36. Cappelleri JC, Ioannidis JP, Schmid CH, et al. Large trials vs meta-analysis of smaller trials: how do their results compare? JAMA. 1996;276:1332-1338.
37. DerSimonian R, Levine RJ. Resolving discrepancies between a meta-analysis and a subsequent large controlled trial. JAMA. 1999;282:664-670.
38. Guyatt GH, Rennie D. Users' guides to the medical literature. JAMA. 1993;270:2096-2097.
39. Bergman DA. Evidence-based guidelines and critical pathways for quality improvement. Pediatrics. 1999;103(suppl E):225-232.
40. Condon RE. Cochrane review and meta-analysis. J Am Coll Surg. 2004;198:498-500.
41. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. The Cochrane Effective Practice and Organization of Care Review Group. BMJ. 1998;317:465-468.
42. Wennberg JE, Fisher ES, Skinner JS. Geography and the debate over Medicare reform. Health Aff (Millwood). 2004;(suppl Web exclusive):W96-W114.
43. Noble J, Bithoney W, MacDonald P, et al. The core content of a generalist curriculum for general internal medicine, family practice, and pediatrics. J Gen Intern Med. 1994;9(suppl 1):S31-S42.
44. Sackett DL, Richardson WS, Rosenberg WM, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. New York, NY: Churchill Livingstone; 1997:250.
45. Ebell MH, Messimer SR, Barry HC. Putting computer-based evidence in the hands of clinicians. JAMA. 1999;281:1171-1172.
46. Sackett DL, Straus SE. Finding and applying evidence during clinical rounds: the "evidence cart". JAMA. 1998;280:1336-1338.
47. Donabedian A. Twenty years of research on the quality of medical care: 1964-1984. Eval Health Prof. 1985;8:243-265.
48. O'Connor GT, Plume SK, Olmstead EM, et al. A regional intervention to improve the hospital mortality associated with coronary artery bypass graft surgery: the Northern New England Cardiovascular Disease Study Group. JAMA. 1996;275:841-846.
49. Birkmeyer JD, Dimick JB. Potential benefits of the new Leapfrog standards: effect of process and outcomes measures. Surgery. 2004;135:569-575.
50. Cleary PD, Edgman-Levitan S. Health care quality: incorporating consumer perspectives. JAMA. 1997;278:1608-1612.
51. Douglas CH, Douglas MR. Patient-friendly hospital environments: exploring the patients' perspective. Health Expect. 2004;7:61-73.
52. Kohn LT, Corrigan JM, Donaldson MS; Institute of Medicine Committee on Quality of Health Care in America, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000:312.
53. Khuri SF, Daley J, Henderson WG. The comparative assessment and improvement of quality of surgical care in the Department of Veterans Affairs. Arch Surg. 2002;137:20-27.
