Scientists use AI to decipher words and sentences from brain scans

A technique based on artificial intelligence (AI) can translate brain scans into words and sentences, a team of computational neuroscientists reports. Although still in its early stages and far from perfect, the new technology might eventually help individuals with brain injuries or paralysis regain the ability to communicate, the researchers say. The study "shows that, using the right methods and better models, we can actually decode what the subject is thinking," says Martin Schrimpf, a computational neuroscientist at the Massachusetts Institute of Technology who was not involved in the work.

Other research teams have created brain-computer interfaces (BCIs) that, for example, translate a paralyzed patient's brain activity into words. However, most of these approaches rely on electrodes implanted in the patient's brain. Noninvasive techniques such as electroencephalography (EEG), which measures brain activity via electrodes attached to the scalp, have fared less well: EEG-based BCIs have so far been able to decipher only short phrases and cannot reconstruct coherent language, Schrimpf says. Previous BCIs have also typically focused on individuals attempting to speak or thinking about speaking, so they relied on brain areas involved in producing speech-related movements and worked only when a person was moving or attempting to move.

Now, Alexander Huth, a computational neuroscientist at the University of Texas at Austin, and colleagues have developed a noninvasive approach that decodes words and sentences from brain scans.