The Complexities of Competency
Questions have arisen as to when a clinician is “fully competent” to see patients—but what does that mean, and how do we measure it?

Rarely do I post online as a knee-jerk reaction! But recently, a topic hit me right in the middle of the forehead. I received an email from a colleague who asked:

“Is there any study looking at how long a PA or NP needs after completing his/her training to be fully competent? I’m at a hospital board meeting and one member is suggesting ‘midlevels’ need three more years of training, at the expense of the institution hiring them.”

I must admit that I was at a loss as to how to respond! (Not least because I dislike the term midlevel.) Lately, competency has been a hot topic as hospitals and large health care organizations hire more new graduates and want to know how long it will take for them to get up to speed within the institution. In that framing, competence amounts to how quickly these PAs/NPs become fully functional in a particular setting. It's a narrow, specific question rather than a broad, philosophical one, but it raises the larger competency question, does it not?

Let’s start with the definition of competency. I had to laugh when I consulted Merriam-Webster, which says competency is “the quality or state of being functionally adequate.” Now, that is what I strive to be … “adequate”!  

I prefer Norman’s definition of professional competence: “The habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served. Competence builds on a foundation of basic clinical skills, scientific knowledge, and moral development.”1

He goes on to say that competence has multiple functions: cognitive (using acquired knowledge to solve real-life problems); integrative (using biomedical and psychosocial data in clinical reasoning); relational (communicating effectively with patients and colleagues); and affective/moral (the willingness, patience, and emotional awareness to use these skills judiciously and humanely). I was particularly struck by a final comment that competence is “developmental, impermanent, and context-dependent.”1 Competence is certainly developmental in the context of lifelong learning. If it is indeed impermanent (temporary, transient, transitory, passing, fleeting), then it must be evaluated frequently. There is no argument that it is context-dependent, whether by level of care, specialty knowledge required, or institution.

Clearly, competence is complex. While the PA and NP professions have developed and published clinical competencies in the past decade (which mirror and parallel those of our physician colleagues), how do we actually demonstrate them?

Patricia Benner developed one of the best-known competency definitions in 1982 with her Novice to Expert model, which applied the Dreyfus Model of Skill Acquisition to nursing. It has been widely used as a tool to determine “expertise.”2,3 Her model describes five levels of expertise:
Novice: A beginner with little to no experience. Novices are unable to exercise discretionary judgment and require significant supervision.
Advanced beginner: Able to demonstrate marginally acceptable performance based on some real-life experience.
Competent: Has usually been on the job for two to three years. At this level, the clinician has a sense of mastery and the ability to cope with and manage many aspects of patient care.
Proficient: Able to grasp clinical situations more quickly and to home in on the relevant aspects of a problem faster.
Expert: No longer relies on analytic reasoning to understand the problem but has an intuitive grasp, zeroing in on the problem at hand without wasting time on unfruitful possibilities.3

Benner maintains that knowledge accrues over time in clinical practice and is developed through dialogue in relationship and situational contexts.4 Of note, clinical experience is not the mere passage of time or longevity in a clinical role but rather the actual level of clinical interaction. The clinician, therefore, may move forward or backward, depending on the situation.

In 2011, Chuck defined six levels of competency, postulating that for each we find ways to scale the learning curve. It is where we are on the curve that determines our competence in a skill set. His six levels are:
Naïve/Newcomer: Exhibits little observable knowledge, skill, or sincere interest
Intermediate: Has received minimal but not sufficient training to exhibit a core set of knowledge, skills, or interest
Proficient: Has completed sufficient training (usually through a set of required classes) to reliably reproduce a core set of knowledge and skills, but requires further training when confronted with new situations in which that knowledge must be applied
Confident: Has above-average knowledge and skills and demonstrates appropriate confidence in adapting to new situations that challenge those skills
Master: Demonstrates consistent excellence in knowledge and skills and can appropriately seek affirmation and criticism to independently develop additional skills
Expert: Has received external validation of superior quality knowledge and skills and is considered an innovator, leader, or authority in a specific area.5

In the Chuck model, levels 1 and 2 would be prematriculants and students. You can see variations of this learning curve in different situations, whether it is a new clinician in the emergency department (ED) or an experienced clinician moving to a new practice.

So when is a clinician (specifically, a PA or NP) fully competent to see patients? This question is undoubtedly being asked more than we realize, and both professions should develop a serious answer to it. Are we doing enough research to make an objective argument in response? No matter how we answer, I think it is important to note that our respective professions have excellent patient care outcomes, even when taking into account the particular clinician level (novice through expert).

This is a challenging topic because what we do requires factual knowledge and the consistent, appropriate application of that knowledge. We know how to measure factual knowledge, more or less, but assuredly we don’t know how to measure the latter (possibly the more important part). In my opinion, we need a pragmatic approach to determine whether a clinician is competent and continues to be so.

One method is to conduct what is known as a 360-degree survey. Here’s how it might work: All coworkers of a particular clinician would be surveyed on the perceived elements of clinical competence, including knowledge, application of knowledge, efficiency, ability to make decisions, and attitude toward patients. Every person in the department—say, the ED—could anonymously complete the survey. (This would include nurses, techs, other PAs/NPs, housekeeping, on-call members of the medical staff—literally everybody, although not all of them will be capable of making some of these determinations.) Then the ED director would review and discuss the feedback with the clinician. Everyone in the department would know he or she would be similarly evaluated.6

This is the most brutal, yet fair and efficient, way to assess competency in its broadest sense. Will all opinions be factually substantiated? No! But what better technique do we have, at least for now?

But wait! Perhaps competence is not the end game. Perhaps competence is really a minimum standard. Competence (albeit at the novice level) is measured by completion of the PA or NP curriculum (meeting the course objectives) and passage of board/licensure exams, just as physician competence essentially is.

Most, if not all, would agree that mastery is achieved by the acquisition of knowledge coupled with sound practice and experience. Mastery or expertise, some say, is what we should focus on, the achievement of which is quite individual. All clinicians can move toward mastery, but not all will actually achieve it. Therefore, how can we mandate a minimum standard, beyond competence, for PAs and NPs but not for other providers?

So, after all the rhetorical ranting about when a PA or NP becomes fully competent, the answer is … It depends! There are too many moving parts. I would suggest that competency is the starting point and mastery (expertise) is a journey.

What do you think? Share your thoughts with me via PAEditor@frontlinemedcom.com.

REFERENCES
1. Norman GR. Defining competence: a methodological review. In: Neufeld VR, Norman GR, eds. Assessing Clinical Competence. New York, NY: Springer; 1985:15-35.

2. Gentile DL. Applying the novice-to-expert model to infusion nursing. J Infus Nurs. 2012;35(2):101-107.

3. Benner P. From novice to expert. Am J Nurs. 1982;82(3):402-407.

4. Brykczynski KA. Patricia Benner: caring, clinical wisdom, and ethics in nursing practice. In: Alligood MR, ed. Nursing Theorists and Their Work. 8th ed. St Louis, MO: Elsevier; 2014:120-146.

5. Chuck E. The competency manifesto: part 3. The Student Doctor Network. www.studentdoctor.net/2011/04/the-competency-manifesto-part-3. Accessed November 11, 2014.

6. Lepsinger R, Luca AD. The Art and Science of 360-Degree Feedback. 2nd ed. San Francisco, CA: Jossey-Bass; 2009. 

Author and Disclosure Information

Randy D. Danielsen PhD, PA-C, DFAAPA

Issue
Clinician Reviews - 24(12)
Page Number
6-8