So you made your way to this article, but how did you do it? Did your motor cortex fire up the muscle fibers in your fingers to click on a particular area of the screen, prompting the CPU inside your device to load up this page? One day that could all seem decidedly archaic. That’s because some smart people are investing serious time and money into computers that can read your thoughts as they are conceived. The goal is to have machines that know what you want and give you the information you need before you can so much as lift a finger. But how far off might such a future be? Let’s take a look at the current state of these brain-computer interfaces, and the challenges that remain in getting them inside our heads.
Brain-computer interfaces (BCIs) have actually been in the works for decades, but sometimes it takes a billionaire who likes landing rockets on floating pads in the ocean to make an audacious technology seem possible. Elon Musk generated quite a buzz when he revealed that he was working on such a thing (more on that later), but in fact, the basis for these mind-reading machines has its roots in neuroscience research from almost a century ago.
In 1924, German psychiatrist Hans Berger made the first ever EEG (electroencephalogram) recordings during neurosurgery on a 17-year-old boy. What Berger later described as “alpha and beta” waves would soon be recognized as electrical activity that was, and still is, of huge assistance to physicians working to detect brain disorders.
By attaching electrodes to the scalp and having the measured brainwaves appear onscreen as a graph, physicians can look out for abnormalities and gain insights into the health of the brain. Rapid spikes might be indicative of epilepsy or seizures, for example, while slower waves may be the result of a tumor or stroke. Alzheimer’s, narcolepsy and brain damage are other examples of conditions that can be monitored with EEG.
Toward direct brain-computer communication
In the 1970s, a Belgian-born electrical engineer named Jacques Vidal started to wonder whether these electrical signals could be used for applications beyond the medical realm. His 1973 peer-reviewed paper “Toward direct brain-computer communication” was the first to describe a brain-computer interface (he is now credited with coining the term), and explored the feasibility of pulling electrical signals from the brain and converting them into commands for a computer.
“Can these observable electric brain signals be put to work as carriers of information in man-computer communication or for the purpose of controlling such external apparatus as prosthetic devices or spaceships?” wrote the retired air force lieutenant. “Even on the sole basis of the present states-of-the-art of computer science and neurophysiology, one may suggest that such a feat is potentially around the corner.”
That corner may have taken a little longer to round than Vidal guessed, but his ideas on how BCIs could be used are proving quite prescient.
At the 2014 FIFA World Cup in Brazil, an international collaboration of scientists making up The Walk Again Project demonstrated their latest advance in assisted mobility technology: a brain-controlled exoskeleton. Using a set of non-invasive electrodes to read brain signals and relay commands to the lightweight exoskeleton, a paraplegic man completed the symbolic kick-off for the tournament.
We have also seen scientists progress toward mobility solutions by drawing data from non-invasive EEG devices to reconstruct 3D hand and leg movements, enable a paraplegic to walk again using his own paralyzed limbs and allow a quadriplegic woman to eat chocolate with a mind-controlled robotic arm.
And spaceships? Alright, we’re not there yet, but NASA is exploring the possibilities. In 2013 the space agency teamed up with scientists from the University of Essex on a project in which two subjects controlled a virtual spaceship using BCIs. The study was designed to explore the potential of using BCIs to control planetary rovers, though that kind of thing remains a long way off.
In the meantime, drones aren’t a bad compromise, right? Unmanned aircraft have become quite a popular testbed for BCI technologies. We have seen mind-controlled quadcopters and fixed-wing drones, with some even adding a competitive flavor to the mix to really nudge things along.
In April last year, neuroscientists at the University of Florida held the first Brain Drone Race, an event in which pilots will their drones across the finish line using only their thoughts. The technology takes brain signals collected by EEG devices and converts them into control inputs for the drones. So rather than pushing left on a joystick, you only have to think about pushing left.
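That “think left instead of push left” pipeline can be sketched in a few lines: a window of EEG samples goes in, band power comes out, and a threshold turns it into a command. Everything below (the sampling rate, the 8-13 Hz band, the two-channel setup, the thresholds and function names) is an illustrative assumption on our part, not the race’s actual software.

```python
import math

SAMPLE_RATE = 256  # Hz; a common EEG sampling rate (assumed, not the race hardware's)

def band_power(samples, low_hz, high_hz, rate=SAMPLE_RATE):
    """Naive DFT: total power of the frequency bins inside a band."""
    n = len(samples)
    power = 0.0
    for k in range(n // 2):
        freq = k * rate / n
        if low_hz <= freq <= high_hz:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def to_command(left_channel, right_channel, threshold=1.0):
    """Turn imagined left/right movement into a drone control input by
    comparing 8-13 Hz band power on two (hypothetical) sensor channels."""
    left = band_power(left_channel, 8, 13)
    right = band_power(right_channel, 8, 13)
    if left - right > threshold:
        return "bank_left"
    if right - left > threshold:
        return "bank_right"
    return "hover"
```

A strong 10 Hz rhythm on the left channel would steer the drone left here; a real system adds filtering, artifact rejection and per-pilot calibration on top of this basic shape.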
But more than a purely competitive spectacle, Brain Drone Race was an attempt to inspire further developments in the BCI area, with a view to one day using the devices in everyday life. And the scientist behind the event, Juan Gilbert – chair of computer and information science and engineering at the University of Florida – tells us that they are making some good progress.
“We are planning Brain-Drone Race II in a couple of weeks and we have started some projects,” he tells New Atlas. “We have a project called Brainwords where we are trying to use the BCI as an authentication device; imagine using your thoughts as your passwords. We have a project sponsored by Lenovo to play the drums with your thoughts. We are also working on the design of a new BCI that is easier to use by the general population. We have a project on building tools that make the BCI easier to use for app development and we are doing research on the BCI for monitoring your brain activity, or what’s called quantified-self.”
As it stands, non-invasive BCIs like EEG caps need to read the electrical signals through layers of skull and tissue, so there is a lot of noise to sort through, which does limit their use. For the clearest signals and truly game-changing potential, you need to get closer to the source.
Insane in the membrane
They require surgery and carry a risk of infection, but BCIs implanted inside the head, in direct contact with the surface of the brain, offer the best signal quality. And this approach has allowed scientists to do some truly remarkable things.
Back in 2014, Dr Ali Rezai, the director of Ohio State University’s Center for Neuromodulation, implanted a tiny 4 x 4 mm microchip on the surface of Ian Burkhart’s motor cortex. Burkhart, now 26, had suffered a diving injury at age 19 that left him quadriplegic. The hope was that this chip, paired with purpose-made algorithms and an electrical sleeve to stimulate muscles in the arm, would allow the team to bypass the damaged spinal cord and use Burkhart’s thoughts to control his fingers and hands.
“The results are excellent,” Rezai now tells New Atlas. “Ian is the first human who was able to move his own hand and arm using his thoughts. He initially achieved rough movements of the wrist and hand. Over the past two-and-a-half years, Ian has exceeded our expectations and is able to perform increasingly complex movements that he could not have imagined ever doing again, such as rapidly opening and closing his hands; moving fingers; grabbing and holding objects like a cup, toothbrush, phone, key and credit card; opening and closing a jar; stirring a cup of coffee; pouring from a bottle; holding a phone; feeding and grooming and even playing a video game.”
Another recent example involves a man paralyzed from the shoulders down who regained control of his own muscles, again by bypassing the injured spinal cord. To do this, scientists implanted two aspirin-sized 96-channel electrode arrays into his motor cortex and connected another set of electrodes to his arm. Then, after some training, simply thinking about moving his arm or hand produced brain signals that could be translated into electrical pulses, triggering the desired muscle movements in his arm.
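At its core, that system is a decode-and-stimulate loop: activity on the implanted arrays is decoded into an intended movement, and the intent is mapped to a stimulation pattern on the arm electrodes. The toy linear decoder below is our own illustration of the shape of that loop; the weights, movement classes and electrode map are made up, and the real system’s decoders are far more sophisticated.

```python
N_CHANNELS = 96  # one firing-rate estimate per channel of the implanted array

def decode_intent(firing_rates, weights):
    """Toy linear decoder: score each movement class as a weighted sum of
    channel firing rates, then pick the best-scoring class."""
    scores = {
        movement: sum(w * r for w, r in zip(ws, firing_rates))
        for movement, ws in weights.items()
    }
    return max(scores, key=scores.get)

def stimulation_pattern(movement):
    """Map a decoded intent to the arm electrodes to pulse (made-up map)."""
    patterns = {
        "open_hand": [1, 4, 7],
        "close_hand": [2, 5],
        "rest": [],
    }
    return patterns[movement]
```

In practice the decoder weights come from training sessions in which the user repeatedly imagines each movement while the arrays record; that is the “some training” the paragraph above refers to.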
So the BCIs of today are already impacting the lives of disabled people in a very real way. But Elon Musk imagines machines that go well beyond that.
What does this future look like?
So let’s fast forward 10, 20, 40 years down the track, whenever it might be that only total Luddites would dare walk around without BCIs inside their heads. What are we doing? What would communication look like? Do we even need to speak anymore?
“If I were to communicate a concept to you, you would essentially engage in consensual telepathy,” says Musk. “You wouldn’t need to verbalize unless you want to add a little flair to the conversation or something, but the conversation would be conceptual interaction on a level that’s difficult to conceive of right now.”
So what Musk is essentially describing is a completely different kind of communication, one that is impossible for us to wrap our stupid, non-computer-enhanced heads around. Such a platform wouldn’t just make typing by finger on a mobile phone old hat, it would do the same to speech, our primary means of communication for tens of thousands of years.
All of the thoughts rattling around in your head amount to much more information than can be instantly conveyed in English, French or Mandarin, including all the nuanced emotions, half-baked ideas, fleeting moments of inspiration, adrenaline, excitement and fear. Planting a brain-reading device inside your head could open up entirely new ways of expressing yourself.
What does it feel like to have a close shave with death? Score the winning touchdown in a Super Bowl? Experience true love and have your heart broken? How about other experiences that defy words? That Musk sees this future as not only possible, but essential for our survival, is a little unsettling, but hey, it beats becoming a house cat.