NYTimes Bits Blog: Brain-computer interfaces moving closer to the mainstream

The ACM TechNews email pointed me to a new entry in the New York Times Bits blog on the subject of brain-computer interfaces. Human-computer interface (HCI) technologies are the means by which we tell computers what we want them to do, and how they respond to our commands, requests, and in some cases plaintive beggings.

The computer-to-human direction is working best so far. Computers generally do fairly well at communicating with healthcare providers, including but not limited to physicians and nurses. Providers are generally asking computers for information, and computers can easily display text and graphical information. There are numerous caveats around that generalization, including but not limited to screen clutter, alert fatigue, and the proliferation of what Edward Tufte calls "chartjunk". But overall, humans are pretty forgiving when it comes to processing audiovisual information, and HCI has unarguably made order-of-magnitude improvements in the delivery of information over the past couple of decades.

The human-to-computer direction is in worse shape. We mostly still use keyboards and mouse-like pointing devices on the desktop, and crude touch-screen interfaces on mobile devices. All of these HCI technologies were introduced in the late 1960s and early '70s, before many members of the current healthcare workforce were even born. For decades we have been promised voice input, gesture recognition via haptic interfaces, and handwriting recognition, and the promises remain largely unfulfilled.
Source: FutureHIT - Speculations on the Future of Health IT (blog)