Augmented tongue ultrasound for speech therapy

(CNRS) Researchers have developed a system that can display the movements of our own tongues in real time. Captured by ultrasound, these movements are processed by a machine learning algorithm that controls an 'articulatory talking head.' This avatar shows the tongue, palate and teeth, which are usually hidden inside the vocal tract. This "visual biofeedback" system, which should be easier for a speaker to interpret and should therefore lead to better correction of pronunciation, could be used for speech therapy.
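The article describes a real-time pipeline: ultrasound frames of the tongue are fed to a trained model that predicts articulatory parameters, which in turn drive the avatar. The sketch below illustrates that loop only in outline; it is not the CNRS implementation, and all names (capture_ultrasound_frame, ArticulatoryModel, render_avatar) and the dummy model are hypothetical stand-ins.

```python
# Minimal sketch of a visual-biofeedback loop, assuming an ultrasound source,
# a trained ultrasound-to-articulation model, and an avatar renderer.
import numpy as np


class ArticulatoryModel:
    """Placeholder for a trained regressor mapping ultrasound frames to articulatory parameters."""

    def __init__(self, n_params: int = 6):
        self.n_params = n_params  # e.g. tongue tip/body/dorsum coordinates

    def predict(self, frame: np.ndarray) -> np.ndarray:
        # A real system would run a learned mapping (e.g. a neural network);
        # here we return dummy parameters derived from frame statistics.
        return np.full(self.n_params, frame.mean())


def capture_ultrasound_frame(rng: np.random.Generator) -> np.ndarray:
    """Stand-in for grabbing one mid-sagittal ultrasound image of the tongue."""
    return rng.random((64, 128))  # grayscale frame, rows x columns


def render_avatar(params: np.ndarray) -> None:
    """Stand-in for updating the articulatory talking head with new parameters."""
    print("avatar update:", np.round(params, 3))


def biofeedback_loop(n_frames: int = 5) -> None:
    rng = np.random.default_rng(0)
    model = ArticulatoryModel()
    for _ in range(n_frames):          # real-time loop: one iteration per ultrasound frame
        frame = capture_ultrasound_frame(rng)
        params = model.predict(frame)  # ultrasound -> articulatory parameters
        render_avatar(params)          # speaker sees their own tongue movements on the avatar


if __name__ == "__main__":
    biofeedback_loop()
```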
Source: EurekAlert! - Medicine and Health