
Development of an Eye-Tracking Image Manipulation System for Angiography: A Comparative Study

Published: November 03, 2020 · DOI: https://doi.org/10.1016/j.acra.2020.09.027

      Rationale and Objectives

      During interventional radiology, image manipulation on the angiographic image display system is performed by radiological technologists and/or nurses acting on verbal instructions from radiologists. However, the intended images may not be displayed because of communication errors. We therefore developed a manipulation system that uses an eye tracker. This study aimed to determine whether an angiographic image display system can be manipulated as effectively with an eye tracker as with a mouse.

      Materials and Methods

      We developed an angiographic image display system that uses an eye tracker to calculate the gaze position on the screen and the state of fixation. Fourteen radiological technologists participated in an observer study, manipulating 10 images for each of 5 typical cases frequently encountered in angiography: renal tumor, cerebral aneurysm, liver tumor, uterine bleeding, and hypersplenism. For both the eye tracker and a conventional mouse, we measured the time from the start to the end of manipulating each series of images. Statistical processing was performed with Excel, R, and RStudio.
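
      The abstract does not specify how the fixation state is computed. As an illustration only, the sketch below implements a common dispersion-threshold (I-DT) scheme; the gaze-sample format, thresholds, and function name are assumptions, not the authors' implementation.

```python
# Minimal I-DT-style fixation check (illustrative; thresholds are assumed,
# not taken from the paper).

def is_fixation(samples, max_dispersion_px=30.0, min_duration_s=0.2):
    """Return True if a window of gaze samples qualifies as a fixation.

    samples: list of (t, x, y) tuples; t in seconds, x/y in screen pixels.
    """
    if not samples:
        return False
    duration = samples[-1][0] - samples[0][0]
    if duration < min_duration_s:
        return False
    xs = [x for _, x, _ in samples]
    ys = [y for _, _, y in samples]
    # I-DT dispersion measure: horizontal spread plus vertical spread.
    dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
    return dispersion <= max_dispersion_px


# 0.25 s of samples hovering near one screen position at ~100 Hz.
gaze = [(i * 0.01, 960 + (i % 3), 540 - (i % 2)) for i in range(26)]
print(is_fixation(gaze))  # True
```

      A fixation detected over an on-screen control could then stand in for a mouse click, which is one plausible way to map gaze to manipulation commands without a contact device.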

      Results

      The average time required for all observers to complete all cases was significantly shorter with the eye tracker than with the mouse (10.4 ± 2.1 s vs. 16.9 ± 2.6 s; p < 0.001, paired t test).
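
      For illustration, the snippet below shows how such a paired comparison can be computed. The per-observer times are placeholders, not the study's data, and the authors used Excel, R, and RStudio rather than this Python sketch.

```python
# Paired t test on per-observer completion times (placeholder data only).
from scipy import stats

# Hypothetical completion times (seconds) for 14 observers.
eye_tracker_times = [9.1, 11.8, 8.7, 12.3, 10.0, 9.6, 13.2,
                     10.9, 8.4, 11.1, 12.7, 9.9, 10.5, 7.8]
mouse_times = [15.2, 18.9, 14.6, 19.8, 16.3, 15.7, 20.1,
               17.4, 14.1, 17.9, 19.3, 16.0, 17.2, 13.9]

# Paired (dependent-samples) test: each observer serves as their own control.
t_stat, p_value = stats.ttest_rel(eye_tracker_times, mouse_times)
print(f"t = {t_stat:.2f}, p = {p_value:.1e}")
```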

      Conclusion

      Radiologists were able to manipulate the angiographic image display system directly with the newly developed eye-tracker system, without touching contact devices such as a mouse or the angiography console. Communication errors could therefore be avoided.

      Abbreviations:

      IVR (interventional radiology), RIS (radiology information system), HIS (hospital information system), PACS (picture archiving and communication system), NUI (natural user interface)
