October 2002 Bulletin

What makes a competent surgeon?

Survey examines orthopaedists’ view of recertification

By Diane Thome

The topic of recertification is second only to reimbursement in getting an orthopaedic surgeon’s attention. Recertification has been a fixture on the radar screen of orthopaedists since 1986. Although a relatively recent development in the history of orthopaedics, the concept of recertification has become one of its most controversial issues.

Are cognitive abilities a true barometer of surgical skills? Is the exemption of certain orthopaedists from the recertification process equitable? Beyond the individual orthopaedist’s motivation, is there a need for a system to assess competency?

The theory of measuring competency via examination is evolving. In March 2000, the American Board of Medical Specialties Assembly adopted Maintenance of Certification, which holds that measuring competency solely through examination is insufficient. True documentation of competency requires:

The American Board of Orthopaedic Surgery (ABOS) currently provides six options for the examination portion of the recertification process:

In response to members’ concerns about the recertification process, the AAOS Board of Directors, in conjunction with the ABOS, formed a joint task force to review the current examination system. Working with the AAOS marketing department, the task force developed and conducted the "Maintenance of Competence Survey." The goal of the survey was to provide a baseline understanding of orthopaedists’ perceptions of the recertification process.

Methodology

In July 2001 the AAOS marketing research staff conducted a focus group of participants in a skills course at the Orthopaedic Learning Center in Rosemont, Ill. In September 2001 they also conducted two teleconferences with Board of Councilor members. The purpose of this "qualitative" research was to uncover the issues and perceptions of orthopaedists regarding the current recertification process and possible alternative methods.

This information provided the foundation for a survey that would measure the extent to which these attitudes and perceptions existed among AAOS fellows. On April 19, 2002, the survey was mailed to a random sample of 3,000 active fellows in the following categories:

The sample was proportionate to the total population of AAOS fellows in these categories at the time. A total of 973 fellows responded by May 3, 2002, for a 32 percent response rate. Generalizing their responses to the total of 8,144 fellows in these categories, the maximum margin of error is plus or minus 2.9 percentage points at the 95 percent confidence level.
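The reported figures can be checked with a short calculation. This sketch assumes the standard approach for survey sampling: the "maximum" margin of error uses the worst-case proportion p = 0.5, and because the 973 respondents are a sizable fraction of the 8,144 fellows, a finite population correction is applied.

```python
import math

# Survey figures from the article.
mailed = 3000      # surveys mailed to active fellows
n = 973            # respondents
N = 8144           # total fellows in the sampled categories

# Worst-case proportion gives the *maximum* margin of error.
p = 0.5

# Standard error with the finite population correction (fpc),
# appropriate when the sample is a large share of the population.
se = math.sqrt(p * (1 - p) / n) * math.sqrt((N - n) / (N - 1))

# 95 percent confidence level uses z = 1.96.
margin = 1.96 * se

print(f"response rate: {n / mailed:.0%}")                    # 32%
print(f"margin of error: {margin * 100:.1f} pct. points")    # 2.9
```

Both results match the article: a 32 percent response rate and a maximum margin of error of 2.9 percentage points.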

Demographics

Sample selection ensured that fellows with varying degrees of exposure to the recertification process were represented. Over half (52 percent) of respondents had experienced the recertification process at least once, but 23 percent had not yet recertified. The ABOS granted permanent certificates to 25 percent of the respondents, but approximately 3 percent of this group recertified anyway.

Fellows between the ages of 44 and 54 comprised the predominant group of respondents (72 percent). Seventeen percent of respondents were under the age of 44, and 11 percent were age 55 and older.

A house divided

Overall, fellows were not averse to the concept of assuring continued orthopaedist competency. Over three-fourths (76 percent) agreed it is necessary to have some type of system in place.2 However, surgeons not required to recertify were less inclined to perceive a need for a system to ensure competency (63 percent agreement).

What fellows questioned was the fairness and effectiveness of the current recertification system in measuring orthopaedist competency. Fellows took no clear position on these issues: roughly three in ten agreed that the current system is effective or fair, and roughly three in ten disagreed.

Experience influenced respondents’ perspectives. Among fellows familiar with the recertification process, 41 percent agreed the current system is fair and 38 percent agreed it is effective. In contrast, only 21 percent of those with no recertification experience perceived the recertification process as a fair means of measuring orthopaedic competency, and 24 percent thought it was an effective one.

Among the 23 percent of fellows who had not yet recertified, differing perceptions of the effectiveness of recertification were driven by a fundamental difference in philosophy. Of the 56 respondents who agreed the current system is effective, 52 also accepted recertification as a necessary part of being an orthopaedist. However, of the 71 respondents who disagreed the system is effective, only 23 accepted recertification as necessary.

Current Recertification System

Rating        Effective Measure (%)   Fair Measure (%)
Agree                  32                    32
Neutral                26                    26
Disagree               30                    28
Don’t Know             10                    11
No Answer               2                     3
Total                 100                   100
                    N = 973               N = 973

Fear factor

Apprehensive is perhaps the best way to describe orthopaedists who have not yet sat for a recertification exam. Eight in 10 anticipated the recertification process would consume a significant amount of their time (82 percent) and would also be a distraction from their practices (79 percent) as well as a distraction from their personal lives (78 percent). Fellows awaiting recertification also were concerned about the monetary costs. Only 20 percent believed these costs were appropriate and reasonable.

Recertification veterans

Not all fellows had experiences that lived up to these expectations. Recertification veterans who agreed that the current system is effective, on average, spent less time and money in the process of recertification than those who disagreed with the statement.

Fellows reported that they lost, on average, 3.7 days from their practices for recertification exam preparation and 1.7 days sitting for the exam itself. Total direct costs for recertification, including all fees, transportation, lodging and meals, averaged $2,679.

Exam options

Of the six examination options available, the two most "popular" options among fellows who had recertified at least once were the general clinical written exam at the AAOS Annual Meeting (the choice of 37 percent) and the computerized general orthopaedic exam (31 percent). Unfortunately, respondents holding time-limited certificates (whether recertified or not) did not consider these two options particularly relevant, fair or valid means of evaluating orthopaedic knowledge.

On a 5-point agreement scale, where "1" means "disagree completely" and "5" means "agree completely," mean ratings of the written exam ranged from 2.42 for relevance to orthopaedic practice to 3.06 for fairness. The computerized general exam garnered mean ratings of 2.57 for relevance and 3.15 for fairness.

Only 10 percent of recertified fellows utilized the practice-based oral exam and only one respondent utilized the practice audit exam for recertification. Yet, relative to their other options, time-limited respondents judged these two exams to be the fairest, most valid and most relevant.

Why did time-limited fellows opt for exam methods they considered less fair, valid or relevant to their practices? Convenience emerged as a key attribute of the general written exam (45 percent agreed), the computerized general exam (60 percent) and the computerized subspecialty exam (41 percent).

Recertified Fellows’ Mean Ratings of Agreement (5-point scale)

             Written   Computerized   Computerized    Practice-Based   CAQ in Hand   Practice
             Exam      General Exam   Subspecialty    Oral Exam        Surgery*      Audit Exam
                                      Exam
Convenient   3.37      4.08           3.70            2.17             2.95          2.60
Fair         3.06      3.15           3.20            3.64             3.50          3.61
Valid        2.94      3.01           3.02            3.52             3.40          3.53
Relevant     2.42      2.57           3.01            3.84             3.57          3.74
N            445-532   423-511        301-397         345-435          62            263-370

*Evaluated only by fellows holding a CAQ in Hand Surgery

Another way?

The AAOS and the ABOS are considering a joint effort to develop alternative means of evaluating maintenance of competence in addition to the current pathways. Respondents reported their reactions to potential recertification "through a combination of practice evaluation, continuing medical education, professional standing and an examination process that would be phased over ten years rather than all at one time."

Nearly two-thirds of fellows (65 percent) agreed they were interested in new pathway development and would like to see it proceed. Only 14 percent disagreed.

Fellows viewed AAOS involvement as an important aspect of any alternative pathway. They reported that AAOS development of educational programs based on an ABOS-developed curriculum (78 percent agreement) and AAOS involvement in self-assessment as part of recertification (76 percent agreement) would be imperative. Application of CME course work, whether self-study (73 percent), lecture-based (65 percent) or "hands-on" (63 percent), would also be an important facet of a new pathway.

We asked for it

An open-ended question was included in the survey soliciting feedback on the issue of maintenance of competence. The pages of verbatim comments that resulted attest to the importance of this issue to AAOS fellows. These comments were summarized to the extent possible with the following results.

Fellows’ frustrations with the expense of the recertification process, in terms of both time and money, comprised 22 percent of comments. Another 20 percent consisted of suggested alternative forms of evaluation, such as CME credits and outcomes assessment.

For 13 percent of respondents the validity of the written test, or of measuring competency in general, was in question. Relevancy was an issue for the 8 percent of fellows requesting more subspecialty focused exams. Although 4 percent of respondents reiterated the fairness of the current system, 11 percent were clearly frustrated by the perceived lack of fairness in "grandfathering." Finally, 8 percent of those who made comments felt the whole recertification process was unnecessary.

Diane Thome is AAOS manager, marketing research.

  1. Nahrwold, David L. "The competence movement." Bulletin of the American College of Surgeons, Vol. 85, No. 11, pp. 14-18.
  2. Throughout this article "agreement" refers to ratings of "5" and "4" on a 5-point scale where "5" means "agree completely" and "1" means "disagree completely." "Disagreement" refers to ratings of "1" and "2."
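The "top-two-box" convention defined in footnote 2 can be sketched as a small helper function. The list of ratings below is hypothetical, for illustration only; it is not survey data.

```python
def agreement_pct(ratings):
    """Percent of ratings counted as 'agreement': a '4' or '5' on a
    5-point scale where 5 = 'agree completely' and 1 = 'disagree
    completely' (the convention used throughout the article)."""
    agree = sum(1 for r in ratings if r >= 4)
    return 100 * agree / len(ratings)

# Hypothetical responses for illustration only:
sample = [5, 4, 3, 2, 5, 1, 4, 4, 3, 5]
print(f"{agreement_pct(sample):.0f} percent agreement")  # 6 of 10 -> 60
```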

