X-ray Interpretation in Emergency Department, Do We Need the Radiologist?

Article Information

Fatimah Alherz1, Fayez Alharthi1, Fahad Almutari1, Abdulrahman Alahmari1, Amani Alsolami1, Maha Nojoom2, Abdullah Al-Shamrani3

1Department of Emergency, Prince Sultan Military Medical City (PSMMC).

2Department of Radiology, PSMMC.

3Department of Pediatrics, PSMMC, Riyadh 11159, Saudi Arabia; Al Faisal University, Riyadh 11159, Saudi Arabia

*Corresponding Author: Fatimah Alherz, Department of Emergency, Prince Sultan Military Medical City (PSMMC).

Received: 20 January 2023; Accepted: 27 January 2023; Published: xx January 2023


Fatimah Alherz, Fayez Alharthi, Fahad Almutari, Abdulrahman Alahmari, Amani Alsolami, Maha Nojoom, Abdullah Al-Shamrani. X-ray Interpretation in Emergency Department, Do We Need the Radiologist? Journal of Radiology and Clinical Imaging 6 (2023): 24-30.



Background: Diagnostic imaging plays an integral role in the evaluation of patients in the emergency department (ED), and its utilization has increased significantly in the last two decades. Chest x-ray (CXR) is an important diagnostic tool for diagnosing and monitoring a spectrum of diseases in the pediatric field, and decisions based on x-rays can have serious consequences for patients.

Method: This was a cross-sectional study among 110 health care providers working in the emergency department. Providers were asked to complete an online questionnaire consisting of 10 different cases revised and approved by the research committee. Each question had at least two important observations, related either to the chest x-ray findings or to the subsequent diagnosis.

Result: The health care providers were as follows: specialist 39 (27.3%), consultant 29 (26.4%), senior resident 22 (20%), fellow 9 (8.2%), junior resident 15 (13.6%), general practitioner 3 (2.7%) and medical intern 2 (1.8%). The highest score was 100% and the lowest was 30%, with a mean score of 64.3% (SD 13%). The consultants had the highest scores, with a mean of 72.2% (SD 11.1%), followed by specialists (66.37%, SD 12.8), medical interns (62.5%), senior residents (62.45%, SD 9.8) and fellows (60.2%, SD 15.6), while the lowest scoring groups were the junior residents and general practitioners (53.7% and 47.7%, respectively). Atypical pneumonia was correctly identified by 73% of providers, and 80% identified the inspiratory film; in the boot-shaped heart case, 74.5% and 72.7% of providers recognized the finding and reached the correct diagnosis of tetralogy of Fallot (TOF), respectively. A total of 64.5% of the health care providers correctly identified foreign body aspiration. In the chest mass case, only 35% of the providers identified the type of film (AP); furthermore, 26.4% missed the potential diagnosis. In the tension pneumothorax case, hyperlucency was still missed by a substantial proportion of providers.


Chest X-ray; Radiology; Emergency department


Article Details

1. Introduction

Chest x-ray (CXR) is an important diagnostic tool in managing and monitoring a spectrum of diseases encountered in pediatric patients, including different types of pneumonia, complicated pneumonia, bronchiolitis, lung malformation, and suspected foreign body disease [1]. Every working day, clinicians make independent decisions based on paraclinical test results, and these decisions can have serious consequences for patients; for example, a missed diagnosis of pneumothorax may be life threatening. Because we are not investigating a single disease, e.g., chest radiology in patients infected with COVID or pneumonia, we cannot apply certain scoring systems, such as the RALE scoring system, which divides the lungs into 2 regions, the left and right lung. Each lung is scored from 0 to 4: 0 for no involvement, 1 for less than 25% involvement, 2 for 25-50% involvement, 3 for 50-75% involvement, and 4 for more than 75% involvement. The maximum score in the RALE scoring system is 8 [2].
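As an arithmetic sketch only, the RALE-style scoring described above can be written as a short function. The function name and input format are illustrative assumptions, and the handling of boundary values is a choice on our part, since the text assigns 50% involvement ambiguously to both the 2-point and 3-point bins:

```python
def rale_score(involvement_fractions):
    """RALE-style score from per-lung involvement fractions.

    involvement_fractions: two floats in [0, 1], one per lung (left, right),
    giving the fraction of that lung showing involvement.
    Returns the total score (0-8): each lung contributes 0-4 points.
    """
    def lung_points(frac):
        if frac <= 0:
            return 0      # no involvement
        elif frac < 0.25:
            return 1      # less than 25% involvement
        elif frac <= 0.50:
            return 2      # 25-50% involvement
        elif frac <= 0.75:
            return 3      # 50-75% involvement
        else:
            return 4      # more than 75% involvement
    return sum(lung_points(f) for f in involvement_fractions)

# Example: left lung 30% involved (2 points), right lung 80% involved (4 points)
print(rale_score([0.30, 0.80]))  # -> 6
```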

Both the Brixia score and the modified CXR scoring system created by Rosy Setiawati et al. calculate the score or severity from the posteroanterior or anteroposterior projection of the CXR by dividing the lungs into 6 regions [3,4]. Two horizontal lines divide the lungs, resulting in 3 regions per lung. Each region is scored from 0 to 2 based on the lesions, as follows: 0 if there is no involvement, 1 if infiltrates or consolidations involve less than 50% of the region, and 2 if they involve more than 50%. The maximum score in the modified CXR scoring system is 12. The final scores are then classified further into mild (scores 1-4), moderate (scores 5-8) and severe (scores 9-12) [4].
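Similarly, the modified 6-region scoring and its severity bands can be sketched as follows. The function name and input format are assumptions; a total of 0 is labeled "normal" here, although the source's bands only start at mild (scores 1-4):

```python
def modified_cxr_score(region_scores):
    """Total the modified CXR score over 6 lung regions (3 per lung).

    region_scores: six ints in {0, 1, 2}:
      0 = no involvement,
      1 = infiltrates/consolidations in less than 50% of the region,
      2 = infiltrates/consolidations in more than 50% of the region.
    Returns (total, severity), with total in 0-12.
    """
    scores = list(region_scores)
    assert len(scores) == 6 and all(s in (0, 1, 2) for s in scores)
    total = sum(scores)
    if total == 0:
        severity = "normal"      # not classified in the source bands
    elif total <= 4:
        severity = "mild"        # scores 1-4
    elif total <= 8:
        severity = "moderate"    # scores 5-8
    else:
        severity = "severe"      # scores 9-12
    return total, severity

print(modified_cxr_score([1, 0, 2, 1, 1, 0]))  # -> (5, 'moderate')
```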

X-rays were introduced to the medical field in 1895 [1]. Emergency radiology (ER) plays an essential role in the initial diagnosis of patients arriving at the hospital, both in an emergency and at the beginning of the monitoring and care process. Therefore, the location of the ER within the emergency department (ED), the technological equipment, the organizational process, and the sizing and training of staff must be determined carefully [5]. A radiograph was performed in 34.4% of ED visits, and computed tomography (CT) was performed in 15.8% of visits [6]. Factors associated with successful CXR interpretation have been identified, including the level of training, field of training, interest in a pulmonary career, and overall certainty. Although interpretation improved with training, important diagnoses were still missed [7].

Approximately 3% of radiographs interpreted by emergency physicians (EPs) are incorrectly interpreted. The most commonly missed findings include fractures, dislocations, air-space disease, and pulmonary nodules [8]. There is a low discrepancy in the interpretation of pediatric emergency radiographs between emergency department physicians and radiologists; most errors occur on radiographs of the chest and upper extremities, and the low rate of clinically significant discrepancies allows for safe management based on EP interpretation [9]. However, emergency department physicians frequently miss specific radiographic abnormalities, and there is considerable discrepancy between their interpretations and those of trained radiologists. These findings highlight the importance of the routine evaluation of chest radiographs by a well-trained radiologist and emphasize the need to improve interpretive skills among emergency department physicians [10]. Specialty registrars and consultants have scored the highest, with the highest average certainty levels, whereas junior trainees felt the least certain about making their diagnoses and were less likely to be correct; however, most misinterpreted radiographs are of no clinical significance [11,12]. Interpreting CXRs with a radiologist's assistance might help reduce overdiagnosis and minimize antibiotic overprescription, thus improving the diagnostic accuracy of pneumonia in the ED [13]. Serrano CO et al. demonstrated that most symptomatic children with COVID-19 infection show abnormalities on CXR; the findings are nonspecific, so a chest x-ray cannot be used as a first-line test to screen for COVID-19 [14]. Moreover, the absence of CXR findings does not exclude the presence of a foreign body; a routine CXR had a sensitivity of 68-76% and a specificity of 45-67% [15]. A 45° oblique view on expiration is recommended for radiographic imaging of patients with clinical signs of rib fracture [16]. Twenty-five percent of CXR images had errors in radiographic exposure (underexposure and overexposure), and 13% had inadequate respiratory maneuvers [17].

The objectives of this study were to assess the accuracy of chest x-ray interpretation by nonradiologists in the emergency department, to compare the accuracy of interpretation among trainees of different levels, and to determine whether radiologist input is still needed for x-ray interpretation in the emergency department.

2. Method

In this study, clinicians reviewed cases with chest x-rays in the emergency department at Prince Sultan Military Medical City (PSMMC). The study was conducted through a self-administered online questionnaire consisting of 10 questions that were revised and approved by three consultants (a certified pediatric radiologist, a pediatric emergency physician and a pediatric respirologist). Each question was divided into parts A and B, except for question 4, which was composed of parts A, B and C. Each correctly answered item was worth 1 point, and the answer choices were closed. The questions were chosen carefully to cover different x-ray abnormalities that can be encountered in pediatric emergencies, and additional items assessed basic knowledge of x-ray interpretation, e.g., inspiratory vs. expiratory and AP vs. PA films.

The study population consisted of pediatric emergency consultants, pediatric emergency and respiratory fellows, pediatric emergency specialists, pediatric junior residents, pediatric senior residents, medical interns and primary health care providers. The data were collected in an Excel sheet. The statistical analysis was conducted using SPSS for Windows, version 21.0 (SPSS Inc., Chicago, IL, USA). A nonparametric test was employed for variables outside the normal distribution. Independent-samples t tests were used to compare data between groups. A value of P < 0.05 with a 95% confidence interval was considered statistically significant. Institutional Review Board (IRB) approval was obtained from PSMMC in 2022.
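As a minimal sketch of the descriptive statistics reported below (mean score and sample standard deviation per provider group), the computation can be expressed with the standard library. The group names and scores here are hypothetical placeholders, not the study's actual responses:

```python
import statistics

def summarize_scores(scores_by_group):
    """Return {group: (mean, sample SD)} of percent-correct scores, rounded to 1 dp."""
    summary = {}
    for group, scores in scores_by_group.items():
        mean = statistics.mean(scores)
        sd = statistics.stdev(scores) if len(scores) > 1 else 0.0
        summary[group] = (round(mean, 1), round(sd, 1))
    return summary

# Hypothetical scores (percent correct) for two illustrative groups.
scores = {
    "consultant": [70, 80, 65, 75],
    "junior resident": [50, 55, 60],
}
print(summarize_scores(scores))
```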

3. Results

A total of 110 health care providers completed the questionnaire: 39 specialists (27.3%), 29 consultants (26.4%), 22 senior residents (20%), 9 fellows (8.2%), 15 junior residents (13.6%), 3 general practitioners (2.7%) and 2 medical interns (1.8%). The highest score was 100%, and the lowest score was 30%, with a mean of 64.3% (SD 13%). The highest scoring group was the consultants, with a mean of 72.2% (SD 11.1%), followed by specialists (66.37%, SD 12.8), medical interns (62.5%), senior residents (62.45%, SD 9.8) and fellows (60.2%, SD 15.6), while the lowest scoring groups were the junior residents and general practitioners (53.7% and 47.7%, respectively). The variability between the groups was significant (P value 0.00) (Table 1).













Consultant: 29 (26.4%), mean score 72.2% (SD 11.1)
Specialist: 39 (27.3%), mean score 66.37% (SD 12.8)
Fellow: 9 (8.2%), mean score 60.2% (SD 15.6)
Senior resident: 22 (20%), mean score 62.45% (SD 9.8)
Junior resident: 15 (13.6%), mean score 53.7%
Medical intern: 2 (1.8%), mean score 62.5%
Other (general practitioner): 3 (2.7%), mean score 47.7%

Table 1: Characteristics of health care providers



Question 1 (Atypical pneumonia)
a- Technique (inspiratory film): 80%
b- Picking lung infiltration: 85%

Question 2 (Tetralogy of Fallot)
a- Boot-shaped heart: 74.5%
b- Tetralogy of Fallot: 72.7%

Question 3 (Foreign body)
a- Hyperinflation and radiolucency of the right side: 76.4%
b- Right-sided foreign body inhalation/aspiration: 64.5%

Question 4 (Chest mass)
a- Type of the film (AP/PA): 35%
b- Homogenous opacity in the right upper and medial zone: 81.8%
c- Right chest mass, most probably neuroblastoma or …: 73.6%

Question 5 (Tension pneumothorax)
a- Hyperlucent right hemithorax with markedly collapsed right lung
b- Right-sided tension pneumothorax: 80%

Question 6 (Lung abscess)
a- Rounded cavitation in the right middle zone with a fluid level: 77.3%
b- Right-sided lung abscess: 63.6%

Question 7 (Bronchiolitis)
a- Underpenetrated, because the vertebral bodies behind the cardiac shadow cannot be visualized: 44%
b- Hyperinflation: 44%

Question 8 (Normal thymus)
a- Homogenous (soft tissue) opacity of the left upper zone: 28.2%
b- Normal thymus shadow: 44%

Question 9 (Soft tissue)
a- AP film and inspiratory: 34.5%
b- A left lateral chest and abdominal wall soft tissue swelling that could be an infection (cellulitis versus fasciitis): 60%

Question 10 (Pleural effusion)
a- Right-sided lung opacity with air bronchograms: 34.5%
b- Right pleural effusion/parapneumonic effusion: 87.3%

Table 2: Appropriate radiological answers, with the percentage of providers answering correctly as reported in the Results


Table 3: Characteristics of radiological interpretations

Atypical pneumonia was correctly identified by 73% of clinicians; however, it was still missed by 33.3% of general practitioners and fellows and by 50% of interns. Additionally, 80% of the clinicians were able to identify the inspiratory film. In the boot-shaped heart CXR, 74.5% and 72.7% of clinicians recognized the finding and reached the right diagnosis (TOF), respectively; the boot-shaped heart was most often missed by interns and specialists (43.3% and 50%, respectively), with similar results in making the correct diagnosis (40% and 50%, respectively). Asymmetrical hyperinflation was identified by 76.4% of clinicians but was missed by 36.4%, 33.3% and 40% of senior residents, general practitioners and junior residents, respectively, while only 64.5% identified potential foreign body aspiration; more than 50% of general practitioners and junior and senior residents missed the diagnosis (P = 0.04). In the chest mass case, although 81.8% of clinicians were able to describe the opacity location, only 35% correctly identified the type of film (AP); furthermore, 26.4% of clinicians missed the identification of the chest mass. Eighty percent of clinicians correctly identified tension pneumothorax, but hyperlucency was still missed by 43.4%, 40% and 31.8% of fellows, junior residents and senior residents, respectively, while 42.7% and 50% of junior residents and interns, respectively, missed the diagnosis (P = 0.012). Lung cavitation was detected by 77.3% of clinicians (33.3% of general practitioners, junior residents and fellows missed the finding); furthermore, only 63.6% reached the appropriate diagnosis of lung abscess (72.7%, 55.6% and 53.3% of general practitioners, fellows and junior residents, respectively, missed the diagnosis). For the bronchiolitis images, only 44% of clinicians were able to identify the underpenetration of the film and hyperinflation (P = 0.038).

A very alarming result was observed in describing a normal film with a large thymus: only 28.2% of clinicians correctly identified the description of the lung opacity, and 44% identified the normal thymus (this was missed by more than 50% of all clinician groups except the consultants; P = 0.04). For the soft tissue film, only 34.5% of clinicians were able to identify the appropriate description of the film (AP and inspiratory), and 60% identified the soft tissue swelling; the consultant group scored 86.2% and the specialist group scored 60%, while the other clinician groups scored below 50%. In the pleural effusion film, 87.3% identified the diagnosis, but only 34.5% of clinicians were able to adequately describe the x-ray and select the correct answer of lung opacity.

4. Discussion

The ED is a major user of medical imaging, which makes awareness of trends in its interpretation of medical imaging crucial [18]. The chest x-ray interpretation results in this study are very alarming with respect to common radiological findings in the emergency department [19,20]. The reliability of physicians' radiographic interpretation in the emergency room has been the subject of numerous studies; however, standardization has been challenging because each study had a unique design. In our study, missing important findings was common, and the rate of missed crucial findings was higher than what has been reported in the literature [8]. Several studies have reported that plain radiographs were correctly interpreted by EPs with a very low incidence of clinically significant discrepancies compared with radiologist interpretation, and incorrect readings did not lead to any negative outcomes [8,20,21]. Although the majority of incorrectly evaluated radiographs are not clinically significant, the most frequently overlooked findings were lung nodules, air-space disease, fractures, and dislocations [22]. Missing these diagnoses has a significant clinical impact; however, the most frequently requested radiographs were correctly interpreted [23].

Our study showed that consultants were the most likely group to reach an accurate x-ray interpretation, while the groups with the lowest accuracy were the junior residents and general practitioners (72.2% vs. 53.7% and 47.7%). Similar findings have been reported in other studies, which found that specialist registrars and consultants attained the best accuracy and had the greatest average certainty levels, whereas junior trainees were less confident and more likely to be inaccurate when making diagnoses [11]. Another study found that junior doctors failed to reach the minimum requirements for radiological diagnostic skills in chest x-ray interpretation [24]. We believe that a lack of clinical experience plays a large role in incorrect interpretations. Although chest radiography is commonly performed, chest radiographs have reportedly been the most frequently misread radiographs, particularly in emergency room settings [7-10]. In our study, the variability was very clear: certainty was high for certain diagnoses, such as tension pneumothorax, which was identified by 80% of all participants; however, the accuracy of interpretation was very low in the cases of fasciitis and lung abscess (60% and 63.6%, respectively).

However, this percentage has been found to be variable, depending on the type and severity of the pneumothorax as well as the level of the trainee: the greater the severity of the pneumothorax, the simpler the chest radiograph diagnosis. Correct pneumothorax diagnosis has been correlated with residency year, particularly in cases of minor pneumothorax [25].

Atypical pneumonia was detected by 73% of clinicians in our study, but a study based on an adult ED found that pneumonia was misdiagnosed (false positives) in approximately 29% of cases [10,26]. A normal chest x-ray does not automatically rule out the diagnosis of foreign body aspiration in patients with a suggestive history and a positive physical examination [15]. However, the presence of hyperlucency is highly indicative and raises suspicion of FBA, especially in toddlers [20]. This finding was missed by 43.4% of fellows, 40% of junior residents and 31.8% of senior residents (P = 0.012).

The latest recommendation is against performing chest x-rays as a routine investigation in patients with bronchiolitis. The findings of bronchiolitis were not easily identified by the participants in our study, with 56% failing to catch the finding (P = 0.038) [27].

The correct identification of the normal thymus shadow was missed by more than 50% of all participants, except in the consultant group (P = 0.04). Although the typical thymus has a variety of structural variances, awareness of these variations is essential for correct identification of the normal thymus [28]. Our study showed that despite its overwhelming use, the accuracy of CXR interpretation is below the level of expectation, with substantial variation among all clinicians. Failure to recognize normal x-rays is still common, as demonstrated by the cases with the inspiratory film and the thymus shadow [22].

Nevertheless, physicians need more practice to correctly identify basic x-ray features such as AP vs. PA and inspiratory vs. expiratory films. There is also a critical need to improve the recognition of common findings, such as hyperinflation, lung opacity, boot-shaped heart, lung cavitation, and pleural effusion. The fact that at least 25% of participants missed important findings on chest x-rays, such as the boot-shaped heart, should not be overlooked. This study emphasizes the need for enhanced supervision of residents (junior and senior), and primary health care providers should be monitored and followed by ED consultants. Radiologist input is highly recommended in cases in which the diagnosis is unclear or the images are confusing [29].

5. Conclusion

The study results regarding the accuracy of x-ray interpretation in the pediatric emergency department are very alarming. Missing important findings is common, and this study emphasizes the importance of radiologist input for chest x-rays.

However, algorithms could be used to minimize missing findings on x-rays. Training programs should focus more on radiology and promote greater collaboration with radiologists. Further studies are urgently needed to confirm our observations.

Author Contributions:

F. Alherz and F. Almutari wrote the manuscript; A. Alahmari and A. Alsolami collected the literature; M. Nojoom revised the x-ray interpretation questionnaire; F. Alharthi and A. Al-Shamrani supervised and contributed to manuscript drafting. All authors have read and agreed to the published version of the manuscript.


Funding: This research received no external funding.

Institutional Review Board Statement: IRB approval was obtained from PSMMC in 2022.

Informed Consent Statement:

This work was conducted at PSMMC, Riyadh, Saudi Arabia, with IRB approval obtained for the manuscript.


Acknowledgements: The authors would like to thank Sumayyah Kobeisy, Dr. Soliman Fakeeh Hospital, P.O. Box 2537, Jeddah 21461, Saudi Arabia (skobeisy@fakeeh.care), for her great support in the data analysis.

Conflicts of Interest:

The authors declare no conflict of interest.



Abbreviations:

CXR: Chest x-ray
AP: Anteroposterior
PA: Posteroanterior
CT: Computed tomography
ED: Emergency department
ER: Emergency radiology
EPs: Emergency physicians
COVID: Coronavirus disease
RALE: Radiographic assessment of lung edema
TOF: Tetralogy of Fallot
PSMMC: Prince Sultan Military Medical City

6. References

  1. Thomas AM, Banerjee AK. The history of radiology. OUP Oxford; 2013 May 9
  2. Warren MA, Zhao Z, Koyama T, et al. Severity scoring of lung oedema on the chest radiograph is associated with clinical outcomes in ARDS. Thorax 73(9) (2018): 840–846.
  3. Rosy Setiawati, Anita Widyoningroem, Triwulan Handarini, Fierly Hayati et al. Modified Chest X-Ray Scoring System in Evaluating Severity of COVID-19 Patient in Dr. Soetomo General Hospital Surabaya, Indonesia. International Journal of General Medicine 14 (2021): 2407–2412.
  4. Agrawal, N, Chougale, S. D, Jedge, P, Iyer, S, Dsouza, J. Brixia chest X-ray scoring system in critically Ill patients with COVID-19 pneumonia for determining outcomes. Journal of Clinical and Diagnostic Research 15(8) (2021): 15-17.
  5. Miele V, Di Giampietro I. Diagnostic imaging in an emergency. Salute e Società 13(2) (2014): 128-141.
  6. Fatihoglu E, Aydin S, Gokharman FD, Ece B, Kosar PN. X-ray use in chest imaging in the emergency department on the basis of cost and effectiveness. Academic Radiology 23(10) (2016): 1239-1245.
  7. Chester: A Web Delivered Locally Computed Chest X-ray Disease Prediction System. Proceedings of Machine Learning Research – Under Review 7(2020): 1–12.
  8. Petinaux B, Bhat R, Boniface K, Aristizabal J. Accuracy of radiographic readings in the emergency department. The American journal of emergency medicine 29(1) (2011): 18-25.
  9. Taves J, Skitch S, Valani R. Determining the clinical significance of errors in pediatric radiograph interpretation between emergency physicians and radiologists. CJEM 20(3) (2018): 420-424.
  10. Gatt ME, Spectre G, Paltiel O, Hiller N, Stalnikowicz R. Chest radiographs in the emergency department: is the radiologist necessary? Postgraduate medical journal 79(930) (2003): 214
  11. Satia I, Bashagha S, Bibi A, Ahmed R, et al. Assessing the accuracy and certainty in interpreting chest X-rays in the medical division. Clin Med (Lond) 13(4) (2013): 349–352
  12. Walsh-Kelly CM, Melzer-Lange MD, Hennes HM, Lye P, Hegenbarth M, Sty J, Starshak R. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. The American journal of emergency medicine 13(3) (1995): 262-264.
  13. Atamna A, Shiber S, Yassin M, Drescher MJ, Bishara J. The accuracy of a diagnosis of pneumonia in the emergency department. International Journal of Infectious Diseases 89 (2019): 62-65.
  14. Serrano CO, Alonso E, Andrés M, Buitrago NM, Vigara AP, Pajares MP, López EC, Moll GG, Espin IM, Barriocanal MB, la Calle MD. Pediatric chest x-ray in covid-19 infection. European Journal of Radiology 131 (2020): 109236.
  15. Sattar A, Ahmad I, Javed AM, Anjum S. Diagnostic accuracy of chest x-ray in tracheobronchial foreign body aspiration in paediatric patients. J Ayub Med Coll Abbottabad 23(4) (2011): 103-105.
  16. Nazal Y. Rib fracture: Different radiographic projections. Polish Journal of Radiology 77 (4) (2012): 13-15.
  17. Gumieri DDF, de Souza Marques I. Evaluation of chest X-ray quality parameters. Int J Radiol Imaging Technol 7 (2021): 082.
  18. Gunnar Juliusson, Birna Thorvaldsdottir, Jon Magnus Kristjansson, and Petur Hannesson. Diagnostic imaging trends in the emergency department: an extensive single- center experience. Acta Radiol Open 8 (7) (2019).
  19. Walsh-Kelly CM, Melzer-Lange MD, Hennes HM, Lye P, et al. Clinical impact of radiograph misinterpretation in a pediatric ED and the effect of physician training level. Am J Emerg Med 13(3) (1995): 262-264.
  20. Svedström E, Puhakka H, Kero P. How accurate is chest radiography in diagnosing tracheobronchial foreign bodies in children? Pediatric radiology 19 (8) (1989): 520-522.
  21. Michael J. Tranovich, Christopher M. Gooch, Joseph M. Dougherty. Radiograph Interpretation Discrepancies in a Community Hospital Emergency Department. West J Emerg Med 20(4) (2019): 626-632.
  22. Parisa Kaviani, Mannudeep K. Kalra, Subba R. Digumarthy, Reya V. Gupta. et al. Frequency of Missed Findings on Chest Radiographs (CXRs) in an International, Multicenter Study: Application of AI to Reduce Missed Findings. Diagnostics 12(10) (2022): 2382.
  23. Amin Tafti; Doug W. Byerly. X-ray Radiographic Patient Positioning. Stat Pearls [Internet].2021.
  24. Christiansen JM, Gerke O, Karstoft J, Andersen PE. Poor interpretation of chest X-rays by junior doctors. Dan Med J 61(7) (2014): A4875.
  25. Burcu Azapoglu Kaymak, Vehbi Ozaydin, Halil Tozum, Didem Ay. Interpretation of Pneumothorax on Emergency Department Chest Radiographs by Emergency Physicians and Residents. Eurasian J Emerg Med 17 (2018): 14-7.
  26. Dustin R. Stamm; Holly A. Stankewicz.Atypical Bacterial Pneumonia. Stat Pearls [Internet] 6(2022).
  27. J N Friedman. Avoid doing chest x rays in infants with typical bronchiolitis. BMJ 375 (2021).
  28. Smita Manchanda, Ashu S Bhalla, Manisha Jana, and Arun K Gupta. Imaging of the pediatric thymus: Clinicoradiologic approach. World J Clin Pediatr 6 (1)( 2017): 10– 23
  29. Simon HK, Khan NS, Nordenberg DF, Wright JA. Pediatric emergency physician interpretation of plain radiographs: is routine review by a radiologist necessary and cost-effective? Ann Emerg Med 27(3) (1996): 295-298.
