
Within New Zealand, urgent care clinics exist as an effective community-based alternative to the emergency department in certain situations. Operating on a walk-in basis and containing full resuscitation facilities, the urgent care centre is a demanding and high-pressure work environment. A business case (2014) produced by the Auckland After Hours Health Care Network demonstrated that out-of-hours attendances to urgent care clinics were more than double those to the emergency department.[[1]] The need to rapidly diagnose a large variety of conditions in this setting may contribute to diagnostic errors. Alongside implications for patient care, missed diagnoses have significant implications for patient satisfaction: they are identified as the most frequent cause of medical malpractice claims in the United States,[[2]] and most diagnostic errors within the emergency department are linked to cases of minor trauma.[[3,4]] However, no studies to date have examined the frequency of discrepancies that occur during X-ray interpretation within the context of urgent care, or whether such diagnostic errors have a clinically significant impact on patient management. The goal of this study is to establish the X-ray misinterpretation rate in cases of minor trauma presenting to urgent care, where these errors occur, which factors may contribute to misdiagnosis and whether such errors affect patient care.

Methods

Study design

This retrospective study was approved by the network’s clinical governance board. As a retrospective audit of clinical care, no ethical approval was needed. Data were collected from six urban North Island practices in New Zealand serving a population of 1.8 million. Two of these facilities are open 24 hours a day, seven days a week, with the other four open 0730–2000 or 0800–2000 seven days a week. Four centres are dedicated urgent care clinics and two are hybrid GP and urgent care facilities. Clinics are typically staffed with between two and four doctors, with overnight cover between 2300 and 0700 provided by one doctor at the two 24-hour clinics. Doctors working at these clinics come from a variety of backgrounds; for the purposes of this study, they have been classified as urgent care trainees (UCT), urgent care fellows (UCF), general practitioners (GP), provisionally registered doctors (PROV) and others (GEN). See Table 1 for an explanation of these classifications. Typically, PROV or UCT doctors will be paired on a shift with a GP or UCF doctor, allowing junior doctors to discuss films with a more experienced colleague.

View Table 1.

Plain film radiography is available in private facilities adjacent to every urgent care clinic. These are staffed during daytime and/or evening hours depending on the clinic, with one 24-hour clinic also having an on-call radiographer available overnight. X-ray reporting is provided by radiologists working offsite, with a formal report available within 72 hours of image acquisition. As a result, urgent care doctors are expected to provide their own interpretation of any X-rays ordered, and to use this interpretation to determine best management. Once reports are generated, they are passed on electronically to the doctor who requested the X-ray, with all results being actioned either by the responsible doctor or another duty doctor within 48 hours.

All clinical record-keeping is done electronically using software that integrates an inbox of patient results, and data collection was performed using this software. The 5,000 plain film interpretations included in this study were selected from trauma consultations occurring between March and August 2021 across the six clinics. The X-ray sites were classified as face, mandible, finger, thumb, hand, wrist, forearm, humerus, elbow, shoulder, clavicle, chest, cervical spine, thoracic spine, lumbar spine, sacrococcygeal spine, pelvis, hip, femur, knee, tibia and fibula, ankle, foot or toe. Data were also collected on the following: date and time of consultation; the age of the patient; the type of doctor; the doctor’s initial interpretation of the X-ray; the radiologist’s formal report; and whether any subsequent change in management occurred after viewing of the radiology report.

A discrepancy was recorded if the treating doctor’s initial interpretation did not agree with that of the reporting radiologist, and this was judged to be a clinically significant discrepancy (CSD) if it resulted in a change of management. Any subsequent follow up or further imaging was also noted.
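To make these definitions concrete, the sketch below shows one way the classification could be expressed in code. This is an illustrative reconstruction only, not the software used in the audit; the record fields and the classify helper are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Interpretation:
    """One X-ray interpretation drawn from the results inbox (hypothetical fields)."""
    doctor_abnormal: bool        # treating doctor's initial read flagged an abnormality
    radiologist_abnormal: bool   # formal radiology report flagged an abnormality
    same_finding: bool           # both parties describe the same abnormality
    management_changed: bool     # management changed once the report was reviewed

def classify(rec: Interpretation) -> dict:
    """Label a record as concordant or as a discrepancy (false positive, false
    negative, or both), and flag clinically significant discrepancies (CSDs)."""
    if rec.doctor_abnormal and rec.radiologist_abnormal:
        # both saw an abnormality: concordant only if it is the same finding
        kind = "concordant" if rec.same_finding else "false_positive_and_negative"
    elif rec.doctor_abnormal:
        kind = "false_positive"   # abnormality called by the doctor, report normal
    elif rec.radiologist_abnormal:
        kind = "false_negative"   # doctor read as normal, report found an abnormality
    else:
        kind = "concordant"       # both read the film as normal
    discrepancy = kind != "concordant"
    return {"kind": kind,
            "discrepancy": discrepancy,
            "csd": discrepancy and rec.management_changed}
```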

Inclusion criteria

Records were included when a patient presented to a clinic following an injury, and received an X-ray with a report subsequently identifiable in the patient inbox. Where multiple X-rays were performed in the context of polytrauma, the interpretation of each X-ray was treated as a separate data point.

Exclusion criteria

Consultations were excluded if a formal radiology report was already present at the time of X-ray interpretation (and was therefore substituted for the treating doctor’s own interpretation), or if a radiology report was not linked to the patient record.

Statistical analysis

Results were analysed using Pearson’s chi-squared test and Pearson’s product moment correlation coefficient. Both were calculated using the built-in functions available in Microsoft Excel.
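As an illustration of the group comparisons performed, the sketch below applies Pearson’s chi-squared test to a 2 × 2 table of discrepant versus concordant interpretations for two doctor groups, formulated here as a test of independence on a contingency table. The counts are placeholders rather than figures from this study, and SciPy is used purely for demonstration; the analysis itself was carried out in Microsoft Excel.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are doctor groups, columns are
# [discrepant interpretations, concordant interpretations].
observed = [
    [60, 940],    # e.g., UCF group (placeholder numbers)
    [110, 890],   # e.g., GP group (placeholder numbers)
]

# Pearson's chi-squared test on the 2 x 2 table (no continuity correction).
chi2, p_value, dof, _expected = chi2_contingency(observed, correction=False)
print(f"chi-squared = {chi2:.2f}, dof = {dof}, P = {p_value:.4f}")
```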

Results

Of the 5,000 interpretations analysed, 1,522 (30.4%) involved a paediatric patient, defined as a patient under the age of 15. Interpretations were evenly divided by gender, with 2,465 involving a female patient and 2,535 a male patient.

One thousand three hundred and fifty-four interpretations (27.1%) had an X-ray that was reported as having a positive finding. Six hundred and seventy-three discrepancies occurred (13.5%), with 171 of these judged to be CSDs (3.4%). Out of these discrepancies, there were 390 false positives (57.9%) where an abnormality was identified by the treating doctor but the X-ray was reported as normal, and 225 false negatives (33.4%) where the treating doctor interpreted the X-ray as normal but an abnormality was reported. Additionally, there were 58 cases (8.6%) involving both a false positive and false negative. This occurred where an abnormality was misidentified by the treating doctor at one site, and the radiologist reported a different abnormality at a second site.

Two hundred and thirty-four of the discrepancies and 90 of the CSDs had some form of follow-up or investigation that allowed the accuracy of the original radiology report to be assessed, as indicated in Table 2. For discrepancies, subsequent follow-up or investigation supported the conclusion of the radiologist in 77.8% of cases and that of the urgent care doctor in 16.2% of cases, with the remaining 6.0% suggesting a diagnosis different to that of either party. Figures were approximately the same for CSDs, with subsequent follow up or investigation agreeing with the radiologist 78.9% of the time, the urgent care doctor 18.9% of the time, and neither 2.2% of the time.

View Table 2.

The upper limb and lower limb each accounted for roughly half of all X-rays taken (2,386 and 2,245 interpretations, respectively), with the ankle being the most frequently imaged single site (946 interpretations). The wrist and finger were the sites where the most abnormalities were detected (239 and 230 interpretations, respectively), while the clavicle and toe had the highest rates of abnormalities per X-ray performed (55.6% and 48.6%). For sites commonly X-rayed (more than 100 interpretations), the chest and elbow had the highest rates of discrepancy in interpretation (24.0% and 23.8%) and false negatives (11.9% and 8.2%), while the elbow and hand had the highest rates of CSDs (8.2% and 6.5%). These results are tabulated in Appendix 1.

Of the abnormalities identified, finger, distal radius and toe fractures occurred most frequently (255, 200 and 148 consultations), while finger, clavicle and toe fractures were the most likely to be found when an X-ray was performed at their respective sites (66.7%, 50.3% and 42.4%). For abnormalities with five or more occurrences in this dataset, lipohaemarthrosis, bone tumours and glenoid fractures had the highest miss rates (85.7%, 80% and 80%), while bone tumours, glenoid fractures and patella fractures had the highest miss rates requiring a change in management (80%, 40% and 30.8%). For abnormalities with 20 or more occurrences, elbow effusions, distal tibia fractures and rib fractures had the highest miss rates (31.0%, 30.3% and 28.6%), while scaphoid fractures, proximal humerus fractures and elbow effusions had the highest miss rates requiring a change in management (15%, 11.5% and 10.3%). See Appendix 2 for a more comprehensive illustration of these figures.

With regard to the grade of the interpreting doctor, a discrepancy was most likely to be recorded if a GP made the initial interpretation, while the GEN and PROV categories accounted for the most CSDs. UCF interpretations had both the fewest discrepancies overall and the fewest CSDs, as demonstrated in Table 3 and Table 4. The largest variation in error rates between these groups was in the false positive rate for discrepancies, with a difference of over 4% between the GP and UCF groups. Differences of less than 2% were seen between groups when comparing false negative rates for discrepancies, or false positive and false negative rates for CSDs.

Assuming a significance level of alpha = .05 and using Pearson’s chi-squared test to assess for goodness of fit, UCF interpretations resulted in significantly fewer discrepancies than PROV (P = .00135) or GP (P < .001) interpretations as shown in Appendix 3, and significantly fewer CSDs than PROV (P = .015) or GEN (P = .015) interpretations as shown in Appendix 4. Significantly fewer false positive errors were also made by the UCF group compared to the GP (P < .001), PROV (P < .001) and UCT (P = .00846) groups as demonstrated in Appendix 5.

The age of the patient also influenced error rates, with significantly more CSDs (P < .001) present in the paediatric sample. This can be explained by a significantly larger number of false positives within this group, with no significant difference in the number of false negatives.

The gender of the patient had no significant impact on the frequency of discrepancies or CSDs. The time of consultation also had no significant impact on discrepancy or CSD rate when comparing out of hours (1700–0800) consultations to those occurring during normal working hours.

Of the 171 CSDs, the main impacts on management were removal of casts applied at the initial consult (n = 51), recall for review of the injury in the urgent care clinic (n = 37), referral to orthopaedic clinic (n = 29) and application of a new cast (n = 18). A full breakdown of the impact on management is depicted in Table 5.

View Tables 3–5.

In only two cases—both involving missed fractures of the hip (see Figure 1)—was a patient referred to hospital for operative management. No complaints were identified that originated from either misdiagnosis or mismanagement.

View Figures 1–5.

Discussion

No data currently exist on error rates among urgent care physicians when interpreting X-rays. However, equivalent studies based in the emergency department estimate the discrepancy rate between emergency physicians’ and radiologists’ readings at between 1% and 28%,[[5–15]] with errors that result in a change in management occurring at a rate of between 0% and 9%.[[5–15]] The findings of this study therefore suggest that X-ray interpretation error rates among urgent care doctors at our facilities are comparable to those of emergency physicians.

In our study, we found that factors contributing to mistakes included whether the patient was younger than 15 years old and the type of doctor interpreting the X-ray, both of which were significantly associated with errors that affected patient management.

For CSDs amongst paediatric consults, the higher error rate compared to the adult cohort is accounted for by a larger proportion of false positives and an equivalent proportion of false negatives. This suggests that urgent care doctors are more conservative in their interpretation of paediatric X-rays, and do not miss more fractures than in the adult population—a reassuring result given that a false positive will at worst result in unnecessary immobilisation pending the formal radiology report.

For discrepancies that affected patient management, it is evident that the UCF category made significantly fewer errors compared to the GEN and PROV groups. Within the emergency department, it has been shown that physicians in training are more likely to make interpretive errors compared to those who have completed training.[[10]] This suggests that enrolment in or completion of a vocational training programme offers benefits in reducing interpretive errors that result in a change in patient management. As noted, while the GP group made mistakes more frequently, this did not translate into a proportionately higher rate of changes to patient management for this group, with GPs in fact having the lowest false negative (or missed abnormality) rate for CSDs.

Previous studies based in the emergency department have identified the foot[[15,19]] as the most common site of missed fractures, which differs from the results of this study where the chest and elbow accounted for the most frequent discrepancies in interpretation. However, as noted by Wei et al,[[19]] missed foot fractures in the emergency department typically occur in the context of major trauma, and it would be expected that these patients bypass urgent care clinics.

Our results suggest that the urgent care doctor’s initial interpretation may have been correct for a proportion of discrepancies and CSDs. One study on the interpretation of plain film X-rays originating in the emergency department demonstrated that, on double reading of an X-ray by experienced radiologists, disagreement occurred in 10–12% of skeletal and chest radiographs.[[20]] Given such rates of interobserver variability amongst radiologists, it is entirely possible that the true discrepancy and CSD rates for this dataset are somewhat lower.

Some of the important missed abnormalities are illustrated in Figures 2–5.

Conclusion

In this study we found that 13.5% of X-ray interpretations were discordant with the subsequent report from the radiologist, and 3.4% of interpretations involved a discrepancy judged to be clinically significant. This is similar to the rates reported in the emergency department literature. In only two cases did patients have acute operative management delayed. In both these cases, the patients involved were safely discharged following an operation, with no complications identified due to the delay in their treatment. Furthermore, when a patient was recalled for immobilisation, no cases of subsequent malunion or non-union were identified. While every mistake represents an opportunity for improvement, this suggests an excellent standard of care in the urgent care setting.

Potential avenues for reducing error include a protocol whereby X-rays ordered by less experienced doctors are checked by the senior doctor on shift, and urgent care-specific education. With regard to the latter, the findings of this study are being incorporated into an online teaching resource, with ongoing discussions regarding how best to integrate this into the RNZCUC training programme.

View Appendices.

Summary

Abstract

Aim

To assess the error rate in plain film interpretation amongst urgent care doctors in the context of minor trauma, to determine where such errors occur and whether they affect patient care, and to identify possible causative factors.

Method

Five thousand X-ray interpretations occurring between March and August 2021 across six urgent care clinics were included in this retrospective study. Data analysis focused on demographic data, site of injury, the experience of the doctor interpreting the X-ray, and whether any change in management occurred following an error.

Results

Six hundred and seventy-three X-ray interpretation errors occurred (13.5%), with 171 of these (3.4% of all interpretations) resulting in a change in patient management. Chest and elbow X-rays were misinterpreted most often. Both the age of the patient and the training of the urgent care doctor had a significant effect on this error rate. The main impacts on patient management were cast removal and recall for review in the urgent care centre or an orthopaedic clinic.

Conclusion

X-ray misinterpretation occurs at equivalent rates in urgent care when compared to the emergency department. Errors occur more commonly with paediatric patients and for doctors with less urgent care-specific training. These errors rarely result in any serious impact on patient management.

Author Information

Crispian Wilson: Urgent care doctor, ℅ Accident & Healthcare, 19 Second Avenue, Tauranga.

Acknowledgements

Correspondence

Crispian Wilson: Urgent care doctor, ℅ Accident & Healthcare, 19 Second Avenue, Tauranga 3110. Ph: 07-577 0010.

Correspondence Email

crispian.e.wilson@gmail.com

Competing Interests

Nil.

1) After-hours Health Care Network. After-hours business case. 2014. Unpublished manuscript. Retrieved from https://www.adhb.health.nz/assets/Documents/OIA/2020/11-November-2020/Urgent-care-services.pdf

2) Studdert DM, Mello MM, Gawande AA, et al. Claims, errors, and compensation payments in medical malpractice litigation. N Engl J Med. 2006;354:2024-33.

3) Hallas P, Ellingsen T. Errors in fracture diagnoses in the emergency department--characteristics of patients and diurnal variation. BMC Emerg Med. 2006;6:4.

4) Mounts J, Clingenpeel J, McGuire E, et al. Most frequently missed fractures in the emergency department. Clin Pediatr (Phila). 2011;50:183-6.

5) Gratton MC, Salomone JA 3rd, Watson WA. Clinically significant radiograph misinterpretations at an emergency medicine residency program. Ann Emerg Med. 1990;19:497-502.

6) Higginson I, Vogel S, Thompson J, Aickin R. Do radiographs requested from a paediatric emergency department in New Zealand need reporting? Emerg Med Australas. 2004;16:288-94.

7) Klein EJ, Koenig M, Diekema DS, Winters W. Discordant radiograph interpretation between emergency physicians and radiologists in a pediatric emergency department. Pediatr Emerg Care. 1999;15:245-8.

8) Nitowski LA, O'Connor RE, Reese CL 4th. The rate of clinically significant plain radiograph misinterpretation by faculty in an emergency medicine residency program. Acad Emerg Med. 1996;3:782-9.

9) Petinaux B, Bhat R, Boniface K, Aristizabal J. Accuracy of radiographic readings in the emergency department. Am J Emerg Med. 2011;29:18-25.

10) Walsh-Kelly CM, Melzer-Lange MD, Hennes HM, et al. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. Am J Emerg Med. 1995;13:262-4.

11) Shirm SW, Graham CJ, Seibert JJ, et al. Clinical effect of a quality assurance system for radiographs in a pediatric emergency department. Pediatr Emerg Care. 1995;11:351-4.

12) Simon HK, Khan NS, Nordenberg DF, Wright JA. Pediatric emergency physician interpretation of plain radiographs: Is routine review by a radiologist necessary and cost-effective? Ann Emerg Med. 1996;27:295-8.

13) Benger JR, Lyburn ID. What is the effect of reporting all emergency department radiographs? Emerg Med J. 2003;20:40-3.

14) Kim SJ, Lee SW, Hong YS, Kim DH. Radiological misinterpretations by emergency physicians in discharged minor trauma patients. Emerg Med J. 2012;29:635-9.

15) Catapano M, Albano D, Pozzi G, et al. Differences between orthopaedic evaluation and radiological reports of conventional radiographs in patients with minor trauma admitted to the emergency department. Injury. 2017;48:2451-2456.

16) Frich LH, Larsen MS. How to deal with a glenoid fracture. EFORT Open Rev. 2017;2:151-157.

17) Major NM, Crawford ST. Elbow effusions in trauma in adults and children: is there an occult fracture? AJR Am J Roentgenol. 2002;178:413-8.

18) Waite S, Scott J, Gale B, et al. Interpretive Error in Radiology. AJR Am J Roentgenol. 2017;208:739-749.

19) Wei CJ, Tsai WC, Tiu CM, et al. Systematic analysis of missed extremity fractures in emergency radiology. Acta Radiol. 2006;47:710-7.

20) Robinson PJ, Wilson D, Coral A, et al. Variation between experienced observers in the interpretation of accident and emergency radiographs. Br J Radiol. 1999;72:323-30.
