Facebook pulls the plug on mind-reading neural interface that restored a user's speech




Update Facebook has abandoned a project to develop a brain-computer interface (BCI), even as researchers it funded unveiled a device that helps a person with a severe speech disability communicate using nothing more than thought.

In a paper published in The New England Journal of Medicine, the Facebook-funded researchers showed how a “neuroprosthesis” could be used to restore speech to a subject who lost the ability more than 16 years ago following a stroke – simply by sensing his intention to speak and transcribing it into words.

“To our knowledge, this is the first successful demonstration of direct decoding of complete words from the brain activity of a person who is paralyzed and unable to speak,” said Edward Chang, chair of neurological surgery at the University of California, San Francisco, and senior author of the study. “This has great promise for restoring communication by harnessing the brain’s natural speech machinery.

“My research team at UCSF has been working toward this goal for over a decade. We’ve learned so much about how speech is processed in the brain during this time, but it’s only in the last five years that advances in machine learning have brought us to this milestone. That, combined with Facebook’s machine learning advice and funding, has really accelerated our progress.”

Youtube video

Facebook has been working with Chang on melding man and machine for some time, revealing its funding of research into a system to turn thoughts into text back in 2019 – much to the dismay of some. That work followed a 2017 proclamation that the company was going to build just such a device by 2019, so it can't exactly be faulted for sticking to its roadmap.

“To see this work come to fruition has been a dream for the field for so long, and for me personally,” said Emily Mugler, head of neural engineering research at Facebook Reality Labs. “As a BCI scientist, one of my primary pursuits throughout my career has been to demonstrate that the neural signals that control the articulation of speech can be decoded into a more efficient BCI for communication.

“These findings have opened up many possibilities for assistive technologies that could dramatically improve the quality of life of people with speech impairments.”

Despite this success, Facebook has apparently lost interest. “While we still believe in the long-term potential of head-mounted optical BCI technologies,” the company announced, “we’ve decided to focus our immediate efforts on a different neural interface approach that has a nearer-term path to market: wrist-based devices powered by electromyography.”

Under that project, a wrist-mounted sensor will track signals from the wearer’s motor neurons and translate them into input for a machine. “In the near term, these signals will let you communicate with your device with a degree of control that is highly reliable, subtle, customizable, and adaptable to many situations,” the company boasted.

“As this area of research evolves, EMG-based neural interfaces have the potential to dramatically expand the bandwidth with which we can communicate with our devices, opening up the possibility of things like high-speed typing.”

The problem doesn’t appear to be a lack of success, but simply one of a path to market: Facebook is looking to ship an actual product, and seems to believe that proven, inexpensive, non-invasive wearables built around EMG sensors are the way forward.

“We can confidently say that, as a consumer interface, a head-mounted optical silent speech device is still a long way off,” Mark Chevillet, the Facebook research director who was in charge of the BCI project, told MIT Technology Review. “Maybe longer than we expected.”

Facebook was far from alone in trying to crack the secret of adding a USB port to the side of someone’s head. DARPA threw money at five companies working on brain-computer interfaces in 2017, and Neuralink, the outfit of Silicon Valley bad boy turned cryptocurrency market manipulator Elon Musk, unveiled its neural lace tech in 2019 before sticking it into a pig a year later.

Facebook’s full write-up of the work, its upcoming EMG wristband, and its other research projects can be found on the Tech @ Facebook blog.

Nor are its efforts completely in vain: the company has released its BCI software, LabGraph, under the permissive MIT license on GitHub, and is committed to sharing the hardware designs “with key researchers and other peers.”

We have asked Facebook for further comment. ®

Updated at 14:22 UTC on July 15, 2021, to add:

A Facebook spokesperson told The Reg: “Speech was the focus of our BCI research because it is inherently high-bandwidth – you can speak faster than you can type. But speech isn’t the only way to apply this research – we can build on the BCI team’s foundational work to enable intuitive wrist-based controls, too.

“Given this, we are no longer pursuing a research path to develop a silent, non-invasive speech interface that would allow people to type just by imagining the words they want to say. Rather than a speech-based neural interface, we’re exploring new forms of intuitive control with EMG.”
