People who are paralyzed use brain-reading devices to talk

Improving the efficiency of brain implants that translate neural activity into words: two studies of speech brain-computer interfaces

A brain-computer interface (BCI) developed by Francis Willett, a neuroscientist at Stanford University, and his colleagues interprets neural activity at the level of individual cells and translates it into text. The study's participant, Pat Bennett, has amyotrophic lateral sclerosis (ALS), also known as motor neuron disease, which causes a loss of muscle control and can make it difficult to speak and move.

While the studies used slightly different approaches, the results were similar in terms of accuracy and speed. The Stanford study had an error rate of 9.1 percent when limited to a 50-word vocabulary and 23.8 percent when expanded to a 125,000-word vocabulary. After about four months of training, the brain signals could be converted into words on a screen at 62 words per minute. The UC San Francisco and Berkeley algorithm decoded speech at a median rate of 78 words per minute, with an error rate of 25 percent for a 1,024-word vocabulary and 8.2 percent for a 118-word vocabulary.

Although a 23 to 25 percent error rate isn’t good enough for everyday use, it’s a significant improvement over existing tech. In a press briefing, Edward Chang, chair of neurological surgery at UCSF and co-author of the UCSF study, noted that the effective rate of communication for existing technology is “laborious” at five to 15 words per minute, compared with 150 to 250 words per minute for natural speech.

Chang said at the briefing that 60 to 70 words per minute is a real milestone for the field, because it is coming from two different centers using different approaches.

The studies are more a proof of concept than a technology ready for prime time. One potential issue is that these systems require long sessions to train the algorithm. However, researchers from both teams told reporters at the briefing that they were hopeful that algorithm training would become less intensive in the future.

The devices must also be tested on many more people to prove their reliability. Even though these data are elegant and technically sophisticated, Judy Illes, a neuroethics researcher at the University of British Columbia, says we have to interpret them in a measured way. “We have to be careful with overpromising wide generalizability to large populations,” she adds. “I do not know if we are there yet.”

It is also difficult to make the technology easy for people to use at home without requiring caregivers to go through extensive training. In these studies, the brain implants had to be connected by wires to a computer outside the skull, a setup that is unlikely to be a permanent solution. Before the tech can work in people’s homes, it will have to be thoroughly tested.

Brain-reading devices allow paralysed people to talk using their thoughts: a study of a woman who lost her speech after a brainstem stroke

Chang said that the potential benefit of this tech would be tremendous if it can be safely and widely implemented. “We are looking at that quite seriously, and what the next steps are,” he said.

These devices “could be products in the very near future”, says Christian Herff, a computational neuroscientist at Maastricht University, the Netherlands.

“For those who are nonverbal, this means they can stay connected to the bigger world, perhaps continue to work, maintain friends and family relationships,” said Bennett in a statement to reporters.

In a separate study, Chang and his colleagues at the University of California, San Francisco, worked with a 47-year-old woman named Ann, who lost her ability to speak after a brainstem stroke 18 years ago. Rather than recording from individual neurons, their device rests on the surface of the brain and measures activity using a technique called electrocorticography (ECoG).

Although the implants used by Willett’s team, which capture neural activity more precisely, outperformed this approach on larger vocabularies, it is “nice to see that with ECoG, it’s possible to achieve low word-error rate”, says Blaise Yvert, a neurotechnology researcher at the Grenoble Institute of Neuroscience in France.

Chang and his team also created customized algorithms to convert Ann’s brain signals into a synthetic voice and into an animated avatar that mimics her facial expressions. They personalized the voice to sound like Ann’s before her injury by training it on recordings from her wedding video.

Ann told the researchers in a feedback session that hearing a voice like her own was emotional, and that being able to talk for herself again meant a great deal.

However, the participants of both studies still have the ability to engage their facial muscles when thinking about speaking, and their speech-related brain regions are intact, says Herff. “This will not be the case for every patient.”

“We see this as a proof of concept and just providing motivation for industry people in this space to translate it into a product somebody can actually use,” says Willett.

Source: Brain-reading devices allow paralysed people to talk using their thoughts

The impact of a warming atmosphere on photosynthesis in tropical forests

As the climate warms, tropical forests around the world are facing increasing temperatures, but it’s unknown how much heat the trees can endure before their leaves start to die. A team has combined multiple data sources to try to answer this question, and suggests that warming of 3.9 °C would push many leaves past a tipping point at which photosynthesis breaks down. Such a scenario would likely do significant damage to these ecosystems, which play a vital role in carbon storage and are home to rich biodiversity.
