Study sheds light on AI 'black box' problem

A group in Japan has provided clear evidence that so-called intrasource balance in COVID-19 chest x-ray datasets is vital to minimizing the risk of poor performance in AI deep-learning models, according to a study published November 3 in Scientific Reports. The group, led by Zhang Zhang, PhD, of Tohoku University in Sendai, found that training on an intrasource imbalanced dataset of x-rays caused a serious training bias, even though the dataset had good intercategory balance.

"Our study reveals that the [intrasource imbalance] of training data can lead to an unreliable performance of deep-learning models," the group wrote.

When developing deep-learning AI models to detect COVID-19, researchers collect as much data as possible from different medical facilities to avoid the impact of intercategory imbalance (ICI), that is, a difference in data quantity among categories.

However, medical data are often isolated within each facility and acquired under different settings from one facility to the next; this characteristic is known as intrasource imbalance (ISI), the authors explained. Moreover, this imbalance can also degrade the performance of deep-learning models, yet it has received negligible attention, they added.

Thus, the group aimed to explore the impact of ISI on deep-learning models by training a model separately on an intrasource imbalanced chest x-ray dataset and an intrasource balanced dataset for COVID-19 diagnosis. One d...
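To make the distinction concrete, here is a minimal sketch (with entirely hypothetical sample metadata, not the study's actual data) of how one might audit a dataset for the two kinds of imbalance the article describes. A dataset can be perfectly balanced across diagnostic categories while each category still comes from a single facility, which is the intrasource imbalance that can let a model learn facility-specific cues instead of disease features:

```python
from collections import Counter

# Hypothetical image metadata: (source facility, diagnostic label) per image.
samples = [
    ("hospital_A", "covid"), ("hospital_A", "covid"), ("hospital_A", "covid"),
    ("hospital_B", "normal"), ("hospital_B", "normal"), ("hospital_B", "normal"),
]

# Intercategory balance: counts per diagnostic label over the whole dataset.
category_counts = Counter(label for _, label in samples)

# Intrasource balance: counts per (facility, label) pair. Here the labels are
# balanced (3 covid vs. 3 normal), but every "covid" image comes from
# hospital_A and every "normal" image from hospital_B, so the dataset is
# intrasource imbalanced: facility identity is a perfect proxy for the label.
source_counts = Counter(samples)

print(category_counts)  # Counter({'covid': 3, 'normal': 3})
print(source_counts)
```

Checking both tallies before training, rather than category counts alone, is the kind of audit the study's findings suggest is necessary.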
Source: AuntMinnie.com Headlines - Category: Radiology Authors: Tags: Subspecialties Artificial Intelligence Chest Radiology Source Type: news