We Need to Open Up the AI Black Box

To convince physicians and nurses that deep learning algorithms are worth using in everyday practice, developers need to explain how they work in plain clinical English.

This article was written by Paul Cerrato, senior research analyst and communications specialist, Mayo Clinic Platform, and John Halamka, M.D., president, Mayo Clinic Platform.

AI's so-called black box refers to the fact that much of the underlying technology behind machine learning-enhanced algorithms is probability and statistics without a human-readable explanation. Often that's the case because the advanced mathematics or data science behind the algorithms is too complex for the average user to understand without additional training. Several stakeholders in digital health maintain, however, that this lack of understanding isn't that important. They argue that as long as an algorithm generates actionable insights, most clinicians don't really care about what's "under the hood." Is that reasoning sound?

Some thought leaders point out that many advanced, computer-enhanced diagnostic and therapeutic tools currently in use are not fully understood by physicians, yet are accepted nonetheless. The CHA2DS2-VASc score, for instance, is used to estimate the likelihood that a patient with non-valvular atrial fibrillation will have a stroke. Few clinicians are familiar with the original research or the detailed reasoning on which the calculator is based, but they use the tool anyway. Similarly...
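There is a relevant contrast here: unlike a deep learning model, the CHA2DS2-VASc rubric is fully transparent and can be written out in a few lines. A minimal sketch in Python of the published point system (the function name and parameter names are illustrative, not from any clinical software library):

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """Compute the CHA2DS2-VASc stroke-risk score (range 0-9).

    Points follow the published rubric: congestive heart failure (1),
    hypertension (1), age >= 75 (2) or 65-74 (1), diabetes (1),
    prior stroke/TIA (2), vascular disease (1), female sex (1).
    """
    score = 0
    if chf:
        score += 1
    if hypertension:
        score += 1
    if age >= 75:
        score += 2
    elif age >= 65:
        score += 1
    if diabetes:
        score += 1
    if stroke_or_tia:
        score += 2
    if vascular_disease:
        score += 1
    if female:
        score += 1
    return score
```

For example, a 76-year-old woman with hypertension and no other risk factors scores 4 (2 for age, 1 for hypertension, 1 for sex). That every step can be audited this way is precisely what a black-box model lacks.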
Source: Life as a Healthcare CIO (blog)