How would a brain machine interface work? A brain machine interface works by reading the brain’s electrical signals and converting them into signals that a machine can use. The interface can either be worn on the head or implanted in the brain.
At its simplest level, a brain machine interface just reads the electricity in the brain. Everything our brain does, from thinking to moving muscles, comes down to electrical signals. For example, if you want to pick up a coffee cup, the decision is made in the cerebrum. The brain then analyzes other signals, such as sight, to work out where the cup is and what picking it up will involve. The signals to pick up the cup are then transmitted along neurons through the nervous system to the muscles, where they make the muscles contract. The idea behind the brain machine interface is that something on or in the brain picks up these electrical signals and broadcasts them to a device. That is much easier to say than it is to do.
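To make that concrete, here is a toy sketch in Python (purely illustrative, not real neuroscience or any particular product): treat the brain signal as nothing more than a stream of voltage samples from a sensor, and flag the moment where activity rises above the background noise. Real interfaces do far more filtering and decoding than this, but the raw input really is just electricity sampled over time.

```python
import numpy as np

# Toy "brain signal": background noise plus one burst of activity,
# standing in for the electrical signal a sensor might record.
rng = np.random.default_rng(0)
sample_rate_hz = 250                  # assumed sampling rate for this sketch
signal = rng.normal(0.0, 1.0, 1000)   # 4 seconds of baseline noise
signal[500:550] += 5.0                # a burst of activity, e.g. deciding to move

# The simplest possible detector: flag samples well above the baseline.
threshold = signal.mean() + 3 * signal.std()
active = np.flatnonzero(signal > threshold)

print(f"Activity detected from {active.min() / sample_rate_hz:.2f} s "
      f"to {active.max() / sample_rate_hz:.2f} s")
```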
There are two kinds of brain machine interface devices: invasive ones, which are implanted into the brain, and non-invasive ones, which are worn on top of the head. Currently, the main use of brain machine interfaces is to allow people with limited mobility to use devices, to communicate, or to control a bionic limb. Chips have been surgically implanted into patients with paralysis to allow them to move a computer mouse just by thinking about it. Invasive interfaces give a clearer signal because they sit right on or in the brain, while non-invasive ones have to read the signal through the skull, so it is weaker. However, implanting a chip into the brain carries a lot of risk: the surgery can cause scar tissue, and there is a chance the body will mount an immune response and reject the chip.
Brain machine interfaces use sensors that pick up the electrical activity in the brain and then analyze it. The sensors can sit near the brain and work out which part is firing, or they can be implanted into the brain itself. Implanted sensors give the best signal, but the damage they cause builds up scar tissue around the neurons until the sensors can no longer pick up the signals. The signal is sent to a microchip, where neural decoding software uses machine learning and feedback to work out what the signal means. These devices can help people move exoskeletons and robotic limbs, but they can also help people who have had strokes or other debilitating brain injuries to speak. A brain chip developed at Stanford University can read the signals from a patient’s brain well enough to translate them into 62 words a minute, approaching conversational speed. That means people who have lost the power of speech, such as people with locked-in syndrome, can communicate again.
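At its core, that decoding step is a pattern-matching problem: given a snapshot of activity across many electrode channels, guess which intention produced it. The sketch below is a heavily simplified, hypothetical version of that idea using made-up data and an off-the-shelf classifier; real decoders, such as the Stanford speech system, use far richer models and real electrode recordings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical setup: 64 electrode channels and three intended commands.
# Each command is assumed to produce its own (noisy) pattern of activity.
commands = ["left", "right", "grasp"]
n_channels, n_trials = 64, 300
signatures = rng.normal(0.0, 1.0, (len(commands), n_channels))  # one pattern per command

labels = rng.integers(0, len(commands), n_trials)
features = signatures[labels] + rng.normal(0.0, 1.5, (n_trials, n_channels))

# Neural decoding reduced to its simplest form: a classifier that learns
# which pattern of channel activity corresponds to which intention.
X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("Decoding accuracy:", decoder.score(X_test, y_test))
print("First test trial decoded as:", commands[decoder.predict(X_test[:1])[0]])
```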
These devices sound amazing, but they take a lot of practice to use and there is a steep learning curve. The microchip that translates what someone is trying to say into words is not hearing that person’s thoughts. It is watching the electrical signals and trying to match them against a database of words. At the beginning, the person has to teach the system which words they are trying to say, and the microchip slowly learns until it can translate much faster.
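That teaching phase can be pictured as a calibration loop: the user is prompted to attempt each word, every attempt gives the decoder another labelled example, and the decoder updates itself as it goes. The sketch below imitates that loop with invented "neural signatures" for a tiny vocabulary; the words, channel count, and noise levels are all assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(2)

# Hypothetical calibration: the user is prompted to attempt each word in turn,
# and every attempt gives the decoder one more labelled example to learn from.
vocab = ["yes", "no", "water", "help"]
n_channels = 64
signatures = rng.normal(0.0, 1.0, (len(vocab), n_channels))  # invented pattern per word

def attempt(word_id):
    """One attempt at a word: its invented neural pattern plus recording noise."""
    return signatures[word_id] + rng.normal(0.0, 3.0, n_channels)

decoder = SGDClassifier(random_state=0)   # a simple incremental linear classifier
classes = np.arange(len(vocab))

for session in range(1, 6):
    # Calibration block: prompt each word several times and update the decoder.
    for _ in range(10):
        for word_id in classes:
            decoder.partial_fit(attempt(word_id).reshape(1, -1), [word_id], classes=classes)

    # See how well the decoder now recognises fresh attempts.
    test_ids = rng.integers(0, len(vocab), 100)
    test_X = np.stack([attempt(w) for w in test_ids])
    accuracy = (decoder.predict(test_X) == test_ids).mean()
    print(f"After session {session}: {accuracy:.0%} of attempts decoded correctly")
```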
The first step is making devices that can help people with disabilities, and the technology works in both directions. Experiments show that signals can be taken from the brain and translated into an action, but signals can also be fed into the brain. If someone has lost their sight, perhaps because of an injury to their eyes, the part of the brain that analyzes sight may still be working. One device sent signals from glasses fitted with a camera into the brain, and the brain could interpret them as images. The images were not great, but it is still early days.
The next step might be brain implants for all of us that allow us to manipulate devices just by thinking. If that happens, will we have become telekinetic? And this is what I learned today.
Sources
https://cumming.ucalgary.ca/research/pediatric-bci/bci-program/what-bci
https://en.wikipedia.org/wiki/Brain%E2%80%93computer_interface