Can radiologists’ eye-tracking data enhance AI?

Integrating eye-tracking data that map how radiologists interpret x-rays into deep-learning AI algorithms could be key to developing more “human-centered” AI, according to a recent study.

A group led by researchers in Lisbon, Portugal, conducted a systematic literature review of the use of eye gaze-driven data in chest x-ray deep learning (DL) models and ultimately proposed improvements that could enhance the approach. “There is a clear demand for more human-centric technologies to bridge the gap between DL systems and human understanding,” noted lead author José Neves, of the University of Lisbon, and colleagues. The study was published in the March issue of the European Journal of Radiology.

DL models have demonstrated remarkable proficiency in various radiology tasks, yet their internal decision-making processes remain largely opaque, the authors explained. This presents the so-called “black-box” problem: the logic leading to a model’s decisions remains inaccessible to human scrutiny, they wrote.

A promising way to address this problem is to integrate data from studies that use eye-tracking hardware and software to characterize how radiologists read normal and abnormal chest x-rays, the authors added. These data capture saccades (rapid eye movements that occur when the viewer shifts gaze from one point of interest to another) and fixations (periods when the eyes remain relatively still), for instance, and can be presented as attent...
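To illustrate the kind of preprocessing such work involves, here is a minimal sketch of how recorded fixations might be turned into a gaze heatmap that a DL model could use as a soft attention target. This is a hypothetical example, not the study's method: the function name, the `(x, y, duration)` fixation format, and the Gaussian-spread parameter `sigma` are all illustrative assumptions.

```python
import numpy as np

def fixations_to_heatmap(fixations, shape, sigma=20.0):
    """Convert (x, y, duration) fixations into a normalized gaze heatmap.

    Hypothetical sketch: each fixation contributes a 2D Gaussian weighted
    by its dwell time, so regions a reader looked at longer appear hotter.
    """
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]  # pixel coordinate grids
    heat = np.zeros(shape, dtype=float)
    for x, y, dur in fixations:
        # Gaussian bump centered on the fixation point, scaled by duration
        heat += dur * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    if heat.max() > 0:
        heat /= heat.max()  # normalize to [0, 1]
    return heat

# Example: two fixations on a 128x128 image; the second is held longer,
# so the heatmap peaks near (x=90, y=80).
hm = fixations_to_heatmap([(40, 40, 0.3), (90, 80, 0.9)], (128, 128))
print(hm.shape)
```

In practice, a map like this could be downsampled to match a model's feature-map resolution and used as a supervision signal or comparison target for the model's own attention, which is one common way eye-gaze data are paired with DL models.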
Source: AuntMinnie.com Headlines. Category: Radiology. Tags: Subspecialties, Chest Radiology. Source Type: news