
This Program Turns Mental ‘Handwriting’ Into On-Screen Words

It’s amazing how technology and artificial intelligence are being used to help disabled people overcome their disabilities, much as Stephen Hawking used a machine to communicate his thoughts to the world. Researchers from Stanford University have developed a technique called mindwriting. It uses a device implanted in the brain to read neural activity and convert it into text on a computer screen.

The device is a BCI, or brain-computer interface, that has to be implanted in the brain. Coupled with artificial intelligence, it enabled the system to read a person’s attempted handwriting and display it as text on screen. The research was published online in the journal Nature, and the device was tested on a man paralyzed from the neck down.

The aim of the research is to help the many Americans, and thousands more people across the world, who have lost the use of their upper limbs or the ability to speak due to spinal cord injuries, strokes, or Lou Gehrig’s disease, also known as amyotrophic lateral sclerosis. According to Jaimie Henderson, MD, a professor of neurosurgery, “This approach allowed a person with paralysis to compose sentences at speeds nearly comparable to those of able-bodied adults of the same age typing on a smartphone. The goal is to restore the ability to communicate by text.”

The results showed that the study participant, referred to only as T5, could produce text at a rate of about 18 words per minute, while an able-bodied adult of the same age can type about 23 words per minute on a smartphone. The results show promise, especially considering that T5 had lost all movement below his neck to a spinal cord injury back in 2007.

This was all made possible by two small BCI chips implanted in the side of T5’s brain. Each chip was about the size of an aspirin and contained roughly 100 electrodes. The electrodes picked up signals from neurons firing in the motor cortex, a region of the brain’s outermost surface that governs hand movement.
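To give a sense of what those electrode signals look like to the software, here is a minimal sketch of one common preprocessing step: counting spiking events per electrode in short time bins to build the features a decoder consumes. The channel count, bin width, and all names below are illustrative assumptions, not details from the study.

```python
import numpy as np

def bin_spike_counts(spike_times_per_channel, duration_s, bin_ms=20):
    """Count spiking events per electrode in fixed time bins.

    spike_times_per_channel: list of 1-D arrays of event times (seconds),
    one per electrode. Returns an array of shape (n_bins, n_channels).
    """
    edges = np.arange(0.0, duration_s + 1e-9, bin_ms / 1000.0)
    counts = [np.histogram(t, bins=edges)[0] for t in spike_times_per_channel]
    return np.stack(counts, axis=1)

# Toy usage: 192 channels (two ~100-electrode arrays) over a 4-second trial
rng = np.random.default_rng(0)
channels = [np.sort(rng.uniform(0, 4.0, rng.integers(20, 200)))
            for _ in range(192)]
features = bin_spike_counts(channels, duration_s=4.0)
print(features.shape)  # (200, 192): 200 bins of 20 ms, 192 electrodes
```

Each row of that matrix is a snapshot of activity across all electrodes during one 20-millisecond window, which is the kind of input an AI decoder can learn to map onto letters.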

The AI algorithms used to decode the brain signals were designed in Stanford’s Neural Prosthetics Translational Lab, co-directed by Henderson and Krishna Shenoy, PhD, professor of electrical engineering and the Hong Seh and Vivian W. M. Lim Professor of Engineering. Shenoy and Henderson are long-time collaborators of Frank Willett, a research scientist in the lab.

Willett talked about the research, saying, “We’ve learned that the brain retains its ability to prescribe fine movements a full decade after the body has lost its ability to execute those movements. And we’ve learned that complicated intended motions involving changing speeds and curved trajectories, like handwriting, can be interpreted more easily and more rapidly by the artificial-intelligence algorithms we’re using than can simpler intended motions like moving a cursor in a straight path at a steady speed. Alphabetical letters are different from one another, so they’re easier to tell apart.”
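The published study decoded those attempted handwriting movements with a recurrent neural network. Below is a minimal PyTorch sketch of that general idea; the channel count, bin length, character-set size, and all names are illustrative assumptions, not the study’s actual architecture.

```python
import torch
import torch.nn as nn

class HandwritingDecoder(nn.Module):
    """Illustrative RNN: binned neural features in, per-bin character scores out."""
    def __init__(self, n_channels=192, hidden_size=256, n_classes=31):
        super().__init__()
        # n_classes=31 loosely stands for 26 letters plus a few punctuation marks
        self.rnn = nn.GRU(n_channels, hidden_size, num_layers=2, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_bins, channels) of binned firing rates
        h, _ = self.rnn(x)       # hidden state tracks the evolving pen stroke
        return self.readout(h)   # logits per time bin; argmax gives a character stream

# Toy usage: one 4-second writing attempt at 20 ms bins -> 200 time bins
features = torch.randn(1, 200, 192)
logits = HandwritingDecoder()(features)
print(logits.shape)  # torch.Size([1, 200, 31])
```

The recurrence is what lets the model exploit Willett’s point: curved, time-varying strokes leave a richer, more distinctive signature across time bins than a steady straight-line cursor motion does.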

The chips used in the study are restricted by law to research use; nothing like them is available commercially. Still, the researchers are hopeful that the system could offer patients even more options to communicate effectively.
