Listen up: new tool to help people who are locked in




People who are paralysed and unable to speak may soon be able to communicate simply by focusing on voices saying "yes" or "no" while their brain is monitored.

Locked-in syndrome can be the result of motor neurone (Lou Gehrig's) disease, multiple sclerosis or a devastating brain injury. People who are locked in often communicate via tiny eye movements or facial twitches. But sometimes even this is impossible. Soon they may be able to communicate just by listening.

Neuroscientist Jeremy Hill aims to use hearing to open up lines of communication for even these most isolated patients. He and his team at the New York State Department of Health have developed a new brain-computer interface that can detect whether someone is paying attention to one spoken word or another by measuring the pattern of electrical activity in the brain.

In the new system, users wear headphones and listen to alternating voices: a male saying "no" in the left ear and a female saying "yes" in the right. The act of paying attention to the "yes" voice over the "no" produces a distinct electrical brainwave pattern, which can be picked up by electrodes on the scalp and translated by Hill's algorithms into a computerised "yes" output.

"Listening is a very private mental act that doesn't necessarily have an outward sign," says Hill, who presented the work at the 2013 Society for Neuroscience conference in San Diego. "But with this brain-computer interface we're finding that listening can become an act that influences the world in a very direct way."

Ears never get tired

In previous studies, Hill and his team used two different beeps, instead of voices, as stimuli. But subjects complained the beeps were unpleasant and sometimes difficult to match to the response they wished to convey. Hill hopes that this latest approach takes the interface closer to becoming an everyday device.

In the latest work, the team tested the new system on 14 healthy volunteers.
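The article does not describe Hill's algorithms, but the core idea — deciding from scalp EEG which of two interleaved spoken streams a listener is attending to — can be sketched on simulated data. Everything below (sampling rate, epoch length, the shape and size of the attention-boosted evoked response, the late-window averaging rule) is a hypothetical stand-in for illustration, not the team's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

FS = 256            # assumed sampling rate (Hz)
EPOCH = FS // 2     # 500 ms analysis window after each word onset
N_WORDS = 40        # word onsets per stream

def simulate_eeg(attended="yes"):
    """Simulate one EEG channel while "yes"/"no" words alternate.

    Word onsets in the attended stream evoke a larger late positive
    deflection -- a stand-in for the attention-modulated brainwave
    pattern the article describes.
    """
    n = N_WORDS * 2 * EPOCH
    eeg = rng.normal(0.0, 1.0, n)                 # background noise
    # interleaved onsets: "yes" words in even slots, "no" in odd slots
    onsets = {"yes": np.arange(0, 2 * N_WORDS, 2) * EPOCH,
              "no":  np.arange(1, 2 * N_WORDS, 2) * EPOCH}
    bump = np.hanning(EPOCH // 2)                 # evoked-response shape
    for stream, idx in onsets.items():
        gain = 1.5 if stream == attended else 0.3  # attention boost
        for t in idx:
            start = t + EPOCH // 4                 # ~250 ms latency
            eeg[start:start + bump.size] += gain * bump
    return eeg, onsets

def classify(eeg, onsets):
    """Guess the attended stream: average the epochs time-locked to
    each stream's word onsets and compare mean amplitude in a late
    (250-500 ms) window, where the simulated response falls."""
    score = {}
    for stream, idx in onsets.items():
        epochs = np.stack([eeg[t:t + EPOCH] for t in idx])
        score[stream] = epochs.mean(axis=0)[EPOCH // 2:].mean()
    return max(score, key=score.get)

eeg, onsets = simulate_eeg(attended="yes")
print(classify(eeg, onsets))
```

Averaging over many word onsets is what makes this workable: the background noise shrinks with the number of epochs while the attention-locked response does not, so even a crude amplitude comparison separates the two streams on this toy data.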
They found that, on average, the algorithms had an accuracy of about 76 per cent. Responses from two people with advanced Lou Gehrig's disease were processed just as well.

Though assistive technologies using visual responses such as eye movement are more versatile than auditory ones at the moment, this could add one more tool to ease communication for people who are locked in, Hill says. He cites one user with Lou Gehrig's disease in the study, who normally communicates via eyebrow movements, welcoming the approach and telling researchers: "My eyes get tired, but never my ears."

Hill's team is now developing an app to allow a smartphone to sync with the system.

Locked-in syndrome expert Steven Laureys of the University of Liege in Belgium is supportive of the new approach. "I think it's very important to offer alternative tools that do not depend on eye movements. We need to adapt to the specific sensory impairments of each individual patient," he says.
Posted on: Sat, 09 Nov 2013 15:40:33 +0000
