AI-based analysis of radiologist's eye movements for fatigue estimation: A pilot study on chest X-rays

Publication: Contribution to book/anthology/report › Conference article in proceedings › Research › Peer-reviewed

  • Ilya Pershin
  • Maksim Kholiavchenko
  • Bulat Maksudov
  • Tamerlan Mustafaev
  • Bulat Ibragimov

Radiologist-AI interaction is a novel area of research with potentially great impact. It has been observed in the literature that radiologists' performance deteriorates towards the end of a shift and that their gaze patterns visibly change. However, quantitative features of these patterns that would be predictive of fatigue have not yet been identified. A radiologist was recruited to read chest X-rays while his eye movements were recorded. His fatigue was measured using the target concentration test and the Stroop test, with the number of analyzed X-rays serving as the reference fatigue metric. A framework with two convolutional neural networks based on the UNet and ResNeXt50 architectures was developed for the segmentation of lung fields, and this segmentation was used to analyze the radiologist's gaze patterns. With a correlation coefficient of 0.82, the eye-gaze features extracted from the lung segmentation exhibited the strongest fatigue-predictive power among the alternative features considered.
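
The abstract describes the segmentation-and-correlation pipeline only at a high level. The following is a minimal illustrative sketch, not the authors' code: it assumes PyTorch with the third-party segmentation-models-pytorch package to build a UNet with a ResNeXt50 encoder for lung-field segmentation, a toy gaze feature (the fraction of fixations falling inside the predicted lung fields), and SciPy's Pearson correlation against the number of X-rays read as the reference fatigue metric. All function names, thresholds, and data values below are hypothetical.

import numpy as np
import torch
import segmentation_models_pytorch as smp
from scipy.stats import pearsonr

# Hypothetical lung-field segmenter; the paper combines UNet- and ResNeXt50-based
# networks, but the exact wiring is not specified in the abstract.
model = smp.Unet(
    encoder_name="resnext50_32x4d",  # ResNeXt50 backbone
    encoder_weights="imagenet",
    in_channels=1,                   # grayscale chest X-ray
    classes=1,                       # binary lung-field mask
)
model.eval()

def lung_mask(xray: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Binary lung-field mask for one chest X-ray (H x W float array, H and W divisible by 32)."""
    with torch.no_grad():
        logits = model(torch.from_numpy(xray).float()[None, None])
    return torch.sigmoid(logits)[0, 0].numpy() > threshold

def in_lung_fraction(fixations: np.ndarray, mask: np.ndarray) -> float:
    """Fraction of gaze fixations (N x 2 array of row, col pixel coordinates) inside the lungs."""
    rows, cols = fixations[:, 0].astype(int), fixations[:, 1].astype(int)
    return float(mask[rows, cols].mean())

# Toy correlation of a per-image gaze feature with the reference fatigue metric
# (the number of X-rays analysed so far); the feature values here are placeholders.
n_images_read = np.arange(1, 11)
gaze_feature = np.random.default_rng(0).random(10)
r, p = pearsonr(gaze_feature, n_images_read)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")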
Original language: English
Title: Medical Imaging 2022: Image Perception, Observer Performance, and Technology Assessment
Editors: Claudia R. Mello-Thoms, Sian Taylor-Phillips
Number of pages: 4
Publisher: SPIE
Publication date: 2022
Article number: 120350Y
ISBN (electronic): 9781510649453
DOI
Status: Published - 2022
Event: Medical Imaging 2022: Image Perception, Observer Performance, and Technology Assessment - Virtual, Online
Duration: 21 Mar 2022 - 27 Mar 2022

Conference

Conference: Medical Imaging 2022: Image Perception, Observer Performance, and Technology Assessment
City: Virtual, Online
Period: 21/03/2022 - 27/03/2022
Sponsor: The Society of Photo-Optical Instrumentation Engineers (SPIE)
Series name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 12035
ISSN: 1605-7422

Bibliographical note

Funding Information:
This research was supported by the Russian Science Foundation under Grant No. 18-71-10072.

Publisher Copyright:
© 2022 SPIE. All rights reserved.
