One of the most tragic aspects of anarthria, the loss of the ability to speak, is that many people who have it can still think clearly. They simply lack the means to put those thoughts into words, especially if they are also immobilized and unable to type their thoughts on a tablet.
Years of research have gone into developing devices that help anarthric people, especially those who are paralyzed, communicate. Recent work has focused on implanting devices in or near the brain that read the electrical impulses underlying a person's intended speech and transmit the decoded text to a device that displays it or reads it aloud.
The technology behind brain-computer interfaces, or BCIs, has been improving rapidly in recent years, but precision remains a serious problem. In a large experiment conducted three years ago, a leading BCI prototype mistranslated participants' intended words about a quarter of the time.
The Chang Lab at the University of California, San Francisco, which ran the 2019 study, is now attempting a new strategy. Under the direction of the eminent neuroscientist Edward Chang, the group has created a novel BCI that interprets single letters rather than full words or phrases. People using this approach express themselves by spelling out their thoughts letter by letter.
The study's early findings are promising: the BCI translated and displayed participants' intended letters with roughly 94 percent accuracy. The Chang Lab's novel spelling BCI may accelerate the development of brain implant technology toward widespread use, bringing the silenced into the conversation.
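To make the letter-by-letter idea concrete, here is a toy sketch, not the Chang Lab's actual pipeline: each window of neural activity is reduced to a feature vector and classified to the nearest letter template, and the predicted letters are concatenated into a spelled word. The feature values and centroids below are entirely made up for illustration.

```python
import math

# Hypothetical per-letter feature centroids, as might be learned
# during a calibration session (values are invented).
CENTROIDS = {
    "h": (0.9, 0.1, 0.0),
    "i": (0.1, 0.9, 0.2),
}

def classify_window(features):
    """Return the letter whose centroid is closest to this feature vector."""
    return min(CENTROIDS, key=lambda letter: math.dist(features, CENTROIDS[letter]))

def decode_spelling(windows):
    """Decode a sequence of neural-activity windows into a spelled-out word."""
    return "".join(classify_window(w) for w in windows)

# Two noisy windows roughly matching "h" then "i".
print(decode_spelling([(0.8, 0.2, 0.1), (0.2, 0.85, 0.15)]))  # -> hi
```

A real system would add a language model on top of the per-letter classifier, so that improbable letter sequences are corrected, which is one reason spelling BCIs can reach high accuracy.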
It has been three years since the Chang Lab's BrainNet BCI demo made headlines. The experiment's two participants wore scalp electrodes similar to those neurologists use to diagnose epilepsy with electroencephalograms (EEGs). Unlike earlier, less sophisticated BCIs, BrainNet does not require sensors to be surgically implanted directly into the brain.
During the experiment, the volunteers were asked to think quietly and focus on a few brief, simple tasks. While wearing the EEG headsets, they were asked to utter sentences, and the resulting neural activity was compared against a vocabulary of terms compiled by the team.