The FDA Needs to Set Standards for Using Artificial Intelligence in Drug Development

By CHARLES K. FISHER, PhD

Artificial intelligence has become a crucial part of our technological infrastructure and the brain underlying many consumer devices. In less than a decade, machine learning algorithms based on deep neural networks evolved from recognizing cats in videos to enabling your smartphone to perform real-time translation between 27 different languages. This progress has sparked the use of AI in drug discovery and development.

Artificial intelligence can improve efficiency and outcomes in drug development across therapeutic areas. For example, companies are developing AI technologies that hold the promise of preventing serious adverse events in clinical trials by identifying high-risk individuals before they enroll. Clinical trials could be made more efficient by using artificial intelligence to incorporate other data sources, such as historical control arms or real-world data. AI technologies could also be used to magnify therapeutic responses by identifying biomarkers that enable precise targeting of patient subpopulations in complex indications. Innovation in each of these areas would provide substantial benefits to those who volunteer to take part in trials, not to mention downstream benefits to the ultimate users of new medicines.

Misapplication of these technologies, however, can have unintended harmful consequences. To see how a good idea can turn bad, just look at what’s happened with social media since the rise...
Source: The Health Care Blog