October 2004 Bulletin

Evidence-based practice: Bridging the gap between aspirations and reality

By Michael Goldberg, MD

On June 11, the AAOS Board of Directors held a strategic discussion on the issue of evidence-based practice. The discussion encompassed the aspirations, the reality and the gap between them in orthopaedic practice. This article presents both background information and a summary of that discussion.

Evidence-based practice (EBP) is the integration of best research evidence with clinical expertise and patient values.

In this definition, “best research evidence” refers to clinically relevant research, especially patient-centered clinical research into the efficacy and safety of therapeutic, rehabilitative and preventive regimens. However, the orthopaedic reality is that there are few randomized, controlled clinical trials; much of the literature is of lower quality; bias exists in technology reporting; and orthopaedic surgeons have limited skills in grading the literature.

“Clinical expertise” is defined as the ability to use clinical skills and past experience to identify each patient’s unique health state and diagnosis; the individual risks and benefits of potential interventions for that patient; and the patient’s personal values and expectations. Again, there is a gap between this definition and the orthopaedic reality: technical skills drive the concept of expertise, many incentives exist to use the newest technologies, learning curves are a reality and surgical skill levels do differ.

“Patient values” refers to the unique preferences, concerns and expectations that each patient brings to a clinical encounter, which must be integrated into clinical decisions. The orthopaedic reality is that the wishes of patients must be given due consideration (for example, preferences on use of blood products, enhancing sports performance and cultural determinants of care).

The clinical quality improvement cycle

The clinical quality improvement cycle begins with evidence analysis. From this analysis, evidence-based guidelines and performance measures are developed. Then, outcomes data are collected and used to shape education and policy, as shown in the above graphic.

The AAOS has begun this process by analyzing the evidence and developing evidence-based guidelines and performance measures. The current AAOS guidelines, although evidence-based, were developed under methodology implemented in 1999, which is now outdated and needs revision.

In 2002, the AAOS took the first step into the performance measurement arena. At the request of the Centers for Medicare and Medicaid Services (CMS) and for the Doctors Office Quality project, the AAOS, in collaboration with the Physician Consortium for Performance Improvement and other relevant specialty groups, developed performance measures for osteoarthritis of the knee. These measures were finalized and approved by the AAOS Board of Directors last year. Performance measures are becoming increasingly important not only to quality improvement initiatives, but also to regulators and payors such as CMS in their pay-for-performance initiatives.

Additionally, the AAOS has successfully developed outcomes instruments that are widely used in research and in some clinical settings. This role will continue through our participation in the American Joint Replacement Registry (AJRR), which will collect outcomes data for use in shaping practice and policy, and in educating fellows to provide the highest quality care to patients.

Why bother?

Despite the best efforts of the AAOS to develop evidence-based tools for orthopaedic practices, there remains a disconnect between the Academy’s leadership and the general membership. For example, the materials developed for the “Improving Musculoskeletal Conditions in America” initiative, although excellent and evidence-based, are not being used.

Implementing evidence-based practice may be the right thing to do, but many orthopaedists don’t consider that alone to be a good enough reason to do it. It is becoming increasingly apparent that the demands of external agencies (such as CMS, health insurers and certification boards) will provide the impetus for implementing evidence-based practices. This suggests that the AAOS ought to rethink the relationships among fellows, the Academy and external agencies.

Currently the AAOS develops products for use by the fellowship while external agencies make demands on orthopaedic surgeons. But what if the AAOS encouraged members to help develop tools that external agencies would use? This would shift the balance and put orthopaedists in a position to collaborate in the development of the measures upon which they would be judged.

Consider the workers’ compensation treatment guidelines developed by the American College of Occupational and Environmental Medicine (ACOEM) for managed care companies in California. Most of the ACOEM guidelines are not evidence-based; instead, they are based on expert opinion. The AAOS needs to develop evidence-based guidelines to support appropriate care for patients with musculoskeletal conditions. The department of socioeconomic and state society affairs is working with research staff to prepare a business plan for developing workers’ compensation treatment guidelines. The AAOS is developing a statement supporting the use of evidence-based guidelines only, and opposing guidelines that are not evidence-based.

If not this, what?

If AAOS members rarely use evidence-based practice to make decisions, what do they base their clinical decision-making on? Most members make decisions based on their training, their mentors, a desire to implement new technologies or increased payment. Rarely does a surgeon say, “I do it this way because the evidence shows…” or “If I modify the recommended procedure, I do so because of my own skill set or my patient’s preferences.”

The emergence of pay-for-performance initiatives will also affect decision-making. Pay-for-performance criteria will soon be applied to orthopaedics and will have a powerful impact on the practice of medicine in the future. Other factors that can influence the adoption of evidence-based practice decisions include maintenance of certification requirements, research assistance, meetings and symposia, and ongoing educational articles.

In the long run, standards of care, and who determines them, are the foundation on which evidence-based guidelines are established. Using such guidelines, corporate benefit managers can evaluate their health care provider organizations. Organizations such as the Leapfrog Group are also interested in evidence-based guideline development as part of their patient safety efforts. And the AAOS can tie evidence-based guidelines into the efforts of its patient safety and expert witness programs.

Next steps

Evidence-based medicine is an important component of the “AAOS in 2010” plan. After a year of considering technology assessment and holding discussions with vendors, the Health Technology Assessment Project Team submitted its report on a technology assessment program. The project team made the following recommendations:

  1. Take it outside. Any assessment and/or rating program must be conducted outside the Academy. The legal arena is rife with antitrust issues, so the program should be structured so that it can be applied consistently and with expertise. An impact analysis will be needed.

  2. Training requirements are important and should be noted.

  3. Keep the rating system simple, as simplicity will encourage use by the fellowship.

  4. Education is key. The Academy should provide education on technology assessment to the fellowship, whether or not a formal assessment program is established.

  5. The AAOS has come a long way in 10 years. It is an opinion leader and will better serve patients and members if evidence-based medicine is included in the fabric of the organization.

Michael Goldberg, MD, is orthopaedist-in-chief at the Tufts-New England Medical Center and chair of the AAOS Evidence-Based Practice Committee. He can be reached at mgoldberg@tufts-nemc.org


Suggestions for future steps

AAOS Public Position

  1. Make an unambiguous commitment to EBP.

  2. Create a policy statement.

  3. Support evidence studies even if results are negative or unpopular.

  4. Value evidence over opinion.

  5. Support data collection for quality improvement and accountability.

  6. Be outspoken about the use of non-evidence-based guidelines by payors or regulators.

Evidence-Based Products

  1. Retrain evidence analysis work groups.

  2. Produce evidence-based guidelines with evidence-based performance measures.

  3. Outsource technology assessment.

  4. Support American Joint Replacement Registry data collection.

  5. Repackage the Functional Outcomes Questionnaires and post on Web site.

Education

  1. Conduct courses on levels of evidence and other EBP methods.

  2. Include levels of evidence on all AAOS statements and policies.

  3. Include levels of evidence on all free papers at AAOS annual meeting.

  4. Include levels of evidence in OKU.

  5. Include technology assessment at AAOS Learning Center courses.

  6. Include questions about AAOS Evidence-Based Guidelines on OITE.

  7. Offer CME credit for participating in evidence-based data collection programs.

  8. Develop educational modules for members and residents.

  9. Teach evidence-based decision-making.

  10. Use “Consent for Surgery” forms to assess patient values.

  11. Integrate AAOS guidelines and products into resident and fellowship training.

Seek Collaboration

  1. Establish a consistent methodology for levels of evidence with JBJS.

  2. Take the lead in developing evidence-based guidelines and performance measures with others (e.g., DVT, workers’ compensation).

  3. Develop evidence-based products for use by external regulatory agencies and payors.

  4. Have participation in national data collection efforts count toward ABOS Maintenance of Certification (e.g., Joint Registry, CMS/Premier, DOQ).
