
New Brain Device Is First To Read Out Inner Speech

4 months 1 week ago
An anonymous reader quotes a report from Scientific American: After a brain stem stroke left him almost entirely paralyzed in the 1990s, French journalist Jean-Dominique Bauby wrote a book about his experiences -- letter by letter, blinking his left eye in response to a helper who repeatedly recited the alphabet. Today people with similar conditions often have far more communication options. Some devices, for example, track eye movements or other small muscle twitches to let users select words from a screen. And on the cutting edge of this field, neuroscientists have more recently developed brain implants that can turn neural signals directly into whole words. These brain-computer interfaces (BCIs) largely require users to physically attempt to speak, however -- and that can be a slow and tiring process. But now a new development in neural prosthetics changes that, allowing users to communicate by simply thinking what they want to say. The new system relies on much of the same technology as the more common "attempted speech" devices. Both use sensors implanted in a part of the brain called the motor cortex, which sends motion commands to the vocal tract. The brain activation detected by these sensors is fed into a machine-learning model that learns which brain signals correspond to which sounds for an individual user; the model then uses those data to predict which word the user is attempting to say. But the motor cortex doesn't light up only when we attempt to speak; it's also involved, to a lesser extent, in imagined speech. The researchers took advantage of this to develop their "inner speech" decoding device and published the results on Thursday in Cell. The team studied three people with amyotrophic lateral sclerosis (ALS) and one with a brain stem stroke, all of whom had previously had the sensors implanted. Using this new "inner speech" system, the participants needed only to think a sentence they wanted to say, and it would appear on a screen in real time. While previous inner speech decoders were limited to only a handful of words, the new device allowed participants to draw from a dictionary of 125,000 words. To help keep private thoughts private, the researchers implemented a code phrase, "chitty chitty bang bang," that participants could use to prompt the BCI to start or stop transcribing.
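As a rough illustration of the pipeline the article describes -- implanted motor-cortex sensors feeding a per-user machine-learning decoder, with the "chitty chitty bang bang" phrase gating transcription on and off -- here is a minimal Python sketch. The toy classifier, the feature shapes, and the names decode_window and transcribe are hypothetical placeholders, not the published system.

```python
# Minimal illustrative sketch of an "inner speech" decoding loop with a
# code-phrase gate. The decoder below is a random placeholder; the real
# system uses a neural-network model trained on each user's brain signals.

import numpy as np

CODE_PHRASE = "chitty chitty bang bang"   # toggles transcription on/off

# Toy vocabulary standing in for the 125,000-word dictionary.
VOCAB = ["hello", "water", "help", "yes", "no", CODE_PHRASE]

def decode_window(features: np.ndarray) -> str:
    """Map one window of neural features to the most likely vocabulary item.

    Stand-in for the machine-learning model described in the article:
    here we just score the vocabulary with a feature-seeded random draw.
    """
    rng = np.random.default_rng(int(features.sum()) % (2**32))
    scores = rng.random(len(VOCAB))
    return VOCAB[int(np.argmax(scores))]

def transcribe(feature_stream):
    """Emit decoded words only while transcription is enabled."""
    transcribing = False
    for features in feature_stream:
        word = decode_window(features)
        if word == CODE_PHRASE:
            transcribing = not transcribing      # start/stop gate
            continue
        if transcribing:
            yield word

if __name__ == "__main__":
    # Simulated stream of per-window neural feature vectors.
    stream = (np.random.rand(192) for _ in range(20))
    print(list(transcribe(stream)))
```

The gate simply treats the code phrase as a reserved vocabulary item: whenever the decoder outputs it, transcription flips on or off instead of printing the phrase, which is one plausible way to keep unintended inner speech off the screen.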

Read more of this story at Slashdot.

BeauHD

Sam Altman's Brain Chip Venture Is Mulling Gene Therapy Approach

4 months 1 week ago
Sam Altman's brain-chip venture is exploring the idea of genetically altering brain cells to make better implants. "The company, which has been referred to as Merge Labs, is looking at an approach involving gene therapy that would modify brain cells," reports Bloomberg. "In addition, an ultrasound device would be implanted in the head that could detect and modulate activity in the modified cells." From the report: It's one of a handful of ideas and technologies the company has been exploring, they said. The venture is still in early stages and could evolve significantly. "We have not done that deal yet," Altman told journalists at a dinner Thursday in San Francisco, referring to a question about a brain-computer interface venture. "I would like us to." Altman said he wants to be able to think something and have ChatGPT respond to it. [...] For years, researchers have been studying how to genetically change cells to make them respond to ultrasound, a field called sonogenetics. The idea Merge is considering -- combining ultrasound with gene therapy -- could take years, some of the people said. Ultrasound has attracted significant attention recently as a possible brain therapy. Other companies are exploring the idea of using ultrasound transmitters outside the brain to massage brain tissue, with the goal of treating psychiatric conditions. That kind of technology has shown promise in research studies.
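For intuition only, here is a toy Python sketch of the sonogenetic closed loop outlined above: cells made ultrasound-responsive by gene therapy, read and modulated by an implanted transducer. Every class, method, and threshold here (ModifiedCellPopulation, UltrasoundTransducer, the 0.7 cutoff) is invented for illustration and reflects nothing about Merge Labs' actual design.

```python
# Toy closed-loop simulation of the sonogenetics concept: an implanted
# ultrasound transducer reads aggregate activity of genetically modified
# cells and delivers a modulating pulse when activity runs high.

import numpy as np

class ModifiedCellPopulation:
    """Simulated activity of ultrasound-sensitive (gene-modified) neurons."""
    def __init__(self, n_cells: int = 100, seed: int = 0):
        self.rng = np.random.default_rng(seed)
        self.activity = self.rng.random(n_cells)

    def step(self):
        # Spontaneous drift in activity between readouts.
        noise = self.rng.normal(0.0, 0.05, self.activity.size)
        self.activity = np.clip(self.activity + noise, 0.0, 1.0)

class UltrasoundTransducer:
    """Hypothetical implant that both reads and modulates the modified cells."""
    def read(self, cells: ModifiedCellPopulation) -> float:
        return float(cells.activity.mean())              # idealized, noise-free readout

    def modulate(self, cells: ModifiedCellPopulation, gain: float):
        cells.activity = np.clip(cells.activity * gain, 0.0, 1.0)   # damp or boost

if __name__ == "__main__":
    cells, probe = ModifiedCellPopulation(), UltrasoundTransducer()
    for _ in range(50):
        cells.step()
        if probe.read(cells) > 0.7:      # closed loop: suppress runaway activity
            probe.modulate(cells, gain=0.8)
    print(f"final mean activity: {probe.read(cells):.2f}")
```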

Read more of this story at Slashdot.

BeauHD