California scientists have found a way to translate brain activity into speech




"For the first time, this study demonstrates that we can generate complete spoken sentences based on the brain activity of an individual," said Edward Chang, professor of neurological surgery and lead author of the journal. study, in a press release. "It is an exalting proof of principle that with technology already within our reach, we should be able to build a clinically viable device for speech-impaired patients."

Gopala Anumanchipalli, a speech specialist who led the research, said the breakthrough was achieved by linking brain activity to mouth and throat movements during speech, rather than by associating the brain signals directly with acoustics and sounds.

"We felt that if these centers of speech in the brain encoded movements rather than sounds, we should try to do the same to decode these signals," he said in a press release.

People asked to transcribe the computer-generated speech accurately identified up to 69% of the words. The researchers said this rate was significantly higher than in previous studies.

"We still have some way to go to imitate spoken language," said Josh Chartier, a graduate student in bioengineering who had worked on research. "We are good enough to synthesize slower vocal sounds such as" sh "and" z ", as well as to preserve the rhythms and intonations of the speech, as well as the speaker's gender and identity, but some the most abrupt sounds such as "b and" p However, the levels of precision that we have produced here would constitute an incredible improvement in real-time communication compared to what is currently available. "
