The Problem with Certifying Many Medical Specialties

September 9, 2016

For many medical specialties, like those in surgery, it seems foolish to require providers to take a paper certification test.

I wrote a blog some time back about the certification of PAs and what it means. Our certifying agency, the National Commission on Certification of Physician Assistants (NCCPA), had recently proposed a dramatic change in the way certification worked, with much added testing and questionable metrics, which it said was "requested" mainly by specialty PAs. The NCCPA recently announced that, effective immediately, self-assessment and performance improvement continuing medical education (CME) is no longer required. It added that it is committed to looking at a “broader range of potential changes” to the recertification process, based on the tsunami of feedback received from rank-and-file PAs.

We are not alone in this conundrum. Maintenance of Certification (MOC) is a very hot topic in the physician world as well. Specialty boards seem to be in a race over who can expand the requirements for maintaining board certification faster. This has created an onerous burden for us all, forcing us to spend more time, money, and effort jumping through hoops that don’t necessarily assure hospitals, other healthcare facilities, or the public that "boarded" and certified providers are "better" or “safer” at delivering care in the modern healthcare system.

It is the date that we all brought to the dance, however, and sometimes you have to dance even when you don’t want to. PAs are a unique element of the healthcare delivery team in that we specialize after graduation and certification. There are residencies for PAs in specialty practice, but they are not required in order to practice in a specialty. This has been one of the strengths of the profession, as well as one of the major components of the high job satisfaction among PAs.

I am the poster boy for the mobility of the profession. Since I graduated in 1981, I have served in six specialties. I’m currently practicing in plastic and reconstructive surgery, and plan to retire from this in two years. When I think about what the NCCPA was trying to do, I really wondered about the value of a "test" in determining if someone in the operating room (OR) is "competent." I get the self-assessment and performance improvement elements that went with our recent change to a ten-year recertification cycle.

A good clinician and provider is always compelled to look for deficits in their knowledge of medicine, as well as their physical skills, and seek ways in which to improve them and fill the gaps. Yet, testing is another matter. We all know people who can ace tests, but should never have a sharp instrument in their hand.

I have been thinking about this a lot as I practice in my surgical world, as (what I consider to be) a competent first assist and surgical PA on a team performing more than 700 cases per year at our community hospital. I don’t know how you test people on paper for surgical skill. You can certainly test people for surgical knowledge in dealing with the diagnosis of surgical conditions, complications, and other medical conditions. But, there is an intangible in the skill of applying that knowledge to the care of patients on the surgical ward, and in the OR.

I was joking with a general surgeon this past week that surgical skill is like pornography. I know it when I see it, but it is hard to define in words (to paraphrase Supreme Court Justice Potter Stewart). Surgery is a complex world of ever expanding specialties and technologies, in which no paper test could ever hope to keep up. When I think about what makes a good surgeon (and I have worked with and observed many), these are the elements of competence:

Knowledge. Surgeons of any type are the product of years of training, both didactic and "hands on." This method of training is tried and true, and in nearly every instance (not always!) produces surgeons ready to competently care for their patients.

Experience. This goes hand in hand with the above. Knowledge has to be applied to the real world, under real-world conditions. Surgical diagnosis is a complex skill, with many nuances, and you have to be able to recognize the subtle patterns of surgical disease.

Judgement. This is a combination of the two above. After operating with my surgeon for seven years over many thousands of cases, I can competently perform the majority of surgeries in our repertoire. I can competently decide when to operate on many cases. However, I will never approach the skill and training of my surgeon, and I gladly leave the heavy lifting of judgement to the superior skill and training of my surgeon partner.

Manual dexterity. Either you have it, or you don’t. This is a critical skill that speaks for itself, and it can’t be applied expertly without the three above.

How do you fully measure the above except by direct observation and experience with another provider? You can develop metrics for some of these elements, but you can’t, in my opinion, measure the totality of surgical skill and ability through the eyes of a testing agency.