
Ward clinical assessments are used worldwide for evaluations of both undergraduate and postgraduate students. In New Zealand, after completing a clinical attachment, junior doctors are rated by a senior clinician, with or without input from other team members, on 19 distinct domains of practice using an assessment form devised by the Medical Council of New Zealand (MCNZ), on a scale from 1 (poor) to 5 (excellent). These evaluations are primarily formative; however, a low rating (two scores of '2' on an evaluation form, or any '1') can have serious consequences, such as discontinuation of employment or non-accreditation of training.

New Zealand currently has little coordinated training of assessors (Senior Medical Officers, SMOs) in assessment itself. Although some receive ad hoc training through university or college affiliation, literature on the reliability and inter-rater variability of SMOs in the assessment of junior doctors in New Zealand is scarce. This study aimed to assess the reliability and intra-professional variation of senior and junior doctors in the assessment of a junior doctor's clinical skills via video simulation.

Method

A clinical simulation video was created showing a junior doctor interacting with other staff members in four different clinical scenarios: interacting with a nurse (scene 1), presenting a patient work-up to a consultant (scene 2), performing a physical examination (scene 3) and handing over care at the conclusion of a shift (scene 4). The video was then presented to groups of doctors in various teaching fora throughout Auckland City Hospital (ACH). Participants included SMOs, registrars, junior doctors (postgraduate years one and two) and medical students. Participants were given a modified version of the MCNZ junior doctor evaluation form tailored to the video presented (Table 1).
Each of the 10 domains of clinical practice was scored on a five-point scale: 1, unsatisfactory (performs significantly below that generally observed for this level of experience); 2, below expectation (requires further development); 3, meets expectation (performs at a satisfactory level); 4, above expectation (performs at a level better than would be expected for the level of experience); and 5, exceptional (performs at a level beyond that which would be expected for the level of experience). The video was designed to simulate a range in the quality of a junior doctor's clinical performance: the junior doctor-nurse interaction was performed towards a '3', the patient work-up presentation and physical examination towards a '5', and the handover of care towards a '1'. In the authors' experience, such variability within an individual trainee's skills and abilities is commonly seen by SMOs on a day-to-day basis. To test for differences in the rated items by medical position, analysis was performed using the Kruskal-Wallis test in SAS version 9.2.

Results

103 respondents completed the evaluation form: 22 SMOs, 17 registrars, 43 junior doctors and 21 medical students. Respondent data were collated from a PGY1/2 teaching session (42 respondents), a Medical Grand Round (48 respondents) and a General Surgery Grand Round (13 respondents). The background training of respondents included general medicine, gastroenterology, pathology, haematology, cardiology, endocrinology, pulmonology, infectious diseases and general surgery. A response rate could not be calculated, as the video was shown to three group audiences at pre-arranged teaching sessions within ACH, with surveys collected on exit from each session.

Variation in rating, regardless of participants' medical position, was noted across all questions (Table 2). The range of responses for SMOs was greatest for question 1 (clinical knowledge) and question 2 (presentation of history).
Statistical significance between assessor groups was noted only for Question 6 (p=0.005), the ability to communicate with patients and families (respect for patients). Question 6 received the highest rating from junior doctors (median=3.0) and the lowest rating from SMOs (median=1.0). It should also be noted that this item had the fewest respondents (n=70).

Table 1. Modified MCNZ junior doctor evaluation form (each item rated 1-5 or N/O, not observed)

Clinical Knowledge and Skills
1. Clinical knowledge (e.g. knowledge of common symptoms, drug doses and side effects, drug interactions, etc.)
2. Presentation of history skills

Clinical Judgement
3. Diagnostic skills (identifies and prioritises patient problems, appropriate physical examination)
4. Patient management (synthesises data, makes appropriate management decisions)
5. Time management (plans and organises work, sets goals and meets them, prioritises calls, seeks advice on priorities if needed)

Patient Communication
6. Ability to communicate with patients and families (respect for patients)

Communication and Teamwork
7. Ability to communicate with other healthcare professionals (ability to work in a multidisciplinary team and with all team members irrespective of gender, contributes effectively to teamwork)
8. Initiative and enthusiasm (able to identify needs of the job, follows up without being prompted, thinks and plans ahead)

Professional Attitudes and Behaviour
9. Reliability and dependability (punctual, carries out instructions, fulfils obligations, complies with hospital policies, keeps up to date with work)
10. Personal manner (approachability, warmth, openness, rapport, etc.)

Table 2. Median ratings per question by respondent group

Question | Respondents | SMO | Registrar | Junior Dr | Student | p-value*
Q1 Clinical knowledge | 102 | 3.0 | 3.0 | 3.0 | 3.0 | 0.198
Q2 History presentation | 102 | 3.0 | 3.0 | 3.0 | 2.0 | 0.078
Q3 Diagnostic skills | 103 | 3.0 | 3.0 | 3.0 | 3.0 | 0.552
Q4 Patient management | 97 | 3.0 | 3.0 | 3.0 | 2.0 | 0.631
Q5 Time management | 95 | 2.0 | 2.0 | 2.0 | 2.0 | 0.163
Q6 Patient communication | 70 | 1.0 | 2.5 | 3.0 | 2.0 | 0.005
Q7 Peer communication | 103 | 2.0 | 2.0 | 2.0 | 2.0 | 0.807
Q8 Initiative | 94 | 2.0 | 2.0 | 2.0 | 2.0 | 0.187
Q9 Reliability | 99 | 2.0 | 2.0 | 2.0 | 2.0 | 0.938
Q10 Personal manner | 102 | 2.0 | 2.0 | 2.0 | 2.0 | 0.845

* p-values test for significant differences (p<0.05) in item ratings by respondents' medical position, obtained using the Kruskal-Wallis test.

Table 3. Distribution of SMO responses per question

Question | Respondents | Median | Lowest score | Highest score | 25th percentile | 75th percentile
Q1 Clinical knowledge | 22 | 3.0 | 2.0 | 5.0 | 2.0 | 4.0
Q2 History presentation | 22 | 3.0 | 1.0 | 5.0 | 2.0 | 4.0
Q3 Diagnostic skills | 22 | 3.0 | 2.0 | 5.0 | 2.0 | 4.0
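The between-group comparison reported in Table 2 can be illustrated in code. The study itself used the Kruskal-Wallis test in SAS 9.2; the sketch below shows the same test applied to one questionnaire item in Python, using invented example ratings (not the study's data) grouped by respondent position.

```python
# Minimal sketch of a Kruskal-Wallis test on one item's 1-5 ratings,
# grouped by respondent position. All scores below are hypothetical,
# invented for illustration only.
from scipy.stats import kruskal

ratings = {
    "SMO":        [1, 1, 2, 1, 2, 1],
    "Registrar":  [2, 3, 2, 3, 3],
    "Junior Dr":  [3, 3, 4, 3, 2, 3],
    "Student":    [2, 2, 3, 2],
}

# Kruskal-Wallis compares the rank distributions of the groups;
# it is appropriate here because the ratings are ordinal.
h_stat, p_value = kruskal(*ratings.values())
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")

# p < 0.05 would indicate that at least one group's rating
# distribution differs from the others, as for Question 6.
```

The test makes no normality assumption, which suits ordinal five-point ratings better than a one-way ANOVA would.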


Abstract

Aim

To investigate the reliability and intra-professional variation of senior and junior doctors in the assessment of a junior doctor's clinical skills via video simulation.

Method

A simulation video was created showing four clinical scenarios. The video was shown to consultants, registrars and junior doctors in various fora at Auckland City Hospital. Participants evaluated each scenario against a modified version of the current assessment form used by the Medical Council of New Zealand.

Results

103 respondents completed the survey: 22 Senior Medical Officers, 17 registrars (PGY3+), 43 junior doctors (PGY1-2) and 21 undergraduates (medical students). Statistical significance between groups was reached only for Question 6, in which Senior Medical Officers rated communication skills and respect for patients lower than junior doctors did (p=0.005). Large variability was noted in ratings for presentation of history and clinical knowledge.

Conclusion

There is marked variation between Senior Medical Officers in the assessment of a junior doctor's clinical practice, as demonstrated by the use of a simulation video. This variation is potentially of major concern. Quality training methods for assessors may need to be implemented to standardise assessment where a summative component exists.

Author Information

Maneesh Deva, Advanced Paediatric Trainee (General Paediatrics); Tonya Kara, Project Supervisor and Paediatric Nephrologist; Starship Children's Hospital, Auckland

Acknowledgements

I thank the Children's Research Centre at Starship Children's Hospital for their additional support, as well as Dr Emily Chang.

Correspondence

Dr Maneesh Deva, Starship Children's Health, Private Bag 92024, Auckland 1142, New Zealand.

Correspondence Email

ManeeshD@adhb.govt.nz

Competing Interests

None known.


For the PDF of this article,
contact nzmj@nzma.org.nz

