University of Washington researchers have hooked people's brains up to a computer and asked them to play a simple game with no monitor, speakers, or other sensory stimulus involved. And it worked. This is a vital first step toward showing that humans can interact with virtual realities through direct brain stimulation alone.
“The way virtual reality is done these days is through displays, headsets and goggles, but ultimately your brain is what creates your reality,” said UW professor of Computer Science & Engineering and senior author Rajesh Rao.
The paper describes the first case of humans playing a simple, 2D computer game using only input from direct brain stimulation. Five players were presented with 21 different mazes to navigate, choosing at each step whether to move forward or down. The game conveyed information about obstacles in the form of a phosphene, a perceived blob or bar of light generated through transcranial magnetic stimulation, a technique that uses a magnetic coil placed near the skull to stimulate a specific area of the brain.
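To make the setup concrete, here is a minimal Python sketch of that binary feedback loop. It is an illustration under assumptions made here (the function name, the maze layout, the decision rule, and the detection-rate parameter are all invented for this sketch), not the study's actual software.

```python
# Minimal, hypothetical sketch of the experiment's binary feedback loop
# (not the study's actual software). Each maze step is either clear or
# blocked; a phosphene is "delivered" only when an obstacle lies ahead,
# and the player moves down on a phosphene, forward otherwise.
import random

def run_maze(n_steps: int = 10, detection_rate: float = 0.92) -> float:
    """Simulate one maze and return the fraction of correct moves.

    `detection_rate` is an assumption made here: the chance the player
    actually notices a delivered phosphene.
    """
    maze = [random.choice(["clear", "obstacle"]) for _ in range(n_steps)]
    correct = 0
    for cell in maze:
        phosphene = (cell == "obstacle")              # binary TMS signal
        perceived = phosphene and random.random() < detection_rate
        move = "down" if perceived else "forward"     # player's decision rule
        right_move = "down" if cell == "obstacle" else "forward"
        correct += (move == right_move)
    return correct / n_steps

if __name__ == "__main__":
    print(f"correct moves: {run_maze():.0%}")
```

Running it a few times shows the core point of the experiment: a single reliable bit of information per step is enough to navigate the maze.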
“The fundamental question we wanted to answer was: Can the brain make use of artificial information that it’s never seen before that is delivered directly to the brain to navigate a virtual world or do useful tasks without other sensory input? And the answer is yes.”
The participants made the right move (avoided obstacles) 15% of the time when they received no input, but under direct brain stimulation they made the right move 92% of the time. They also got better at the game the more practice they had at detecting the artificial stimuli. This shows that new information, whether from artificial sensors or computers, can be successfully encoded and transmitted to the brain to solve tasks. The technology behind the experiment, transcranial magnetic stimulation, is usually employed to study how the brain works, but the team showed that it can also be used to convey information to the brain.
“We’re essentially trying to give humans a sixth sense,” said lead author Darby Losey.
“So much effort in this field of neural engineering has focused on decoding information from the brain. We’re interested in how you can encode information into the brain.”
This trial was intended as a proof of concept and as such used a very simple binary system, whether a phosphene was present or not, as feedback for the players. But the experiment shows that, in principle, the approach can be used to transmit information from any sensor, such as a camera or ultrasound, to the brain. Even a binary system like the one used in the game could be of real help to some people, for instance by helping the blind navigate.
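As a rough illustration of that idea, the sketch below (an assumption about how such a pipeline might look, not anything described in the paper) thresholds a distance reading into the same kind of single-bit "stimulate or not" signal:

```python
# Hypothetical sketch (not from the paper): mapping an arbitrary sensor
# reading onto the single-bit "phosphene / no phosphene" channel used in
# the maze game, e.g. to warn a blind user about a nearby obstacle.
def encode_obstacle(distance_m: float, threshold_m: float = 1.0) -> bool:
    """Return True (deliver a phosphene) when an obstacle is within the threshold."""
    return distance_m < threshold_m

if __name__ == "__main__":
    for reading in [2.4, 1.6, 0.8, 0.4]:  # stand-in ultrasound readings, in meters
        signal = encode_obstacle(reading)
        print(f"{reading:.1f} m -> {'stimulate' if signal else 'no stimulation'}")
```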
“The technology is not there yet — the tool we use to stimulate the brain is a bulky piece of equipment that you wouldn’t carry around with you,” said UW assistant professor of psychology and co-author Andrea Stocco.
“But eventually we might be able to replace the hardware with something that’s amenable to real world applications.”
The team is currently investigating how to create more complex perceptions across different senses by modulating the intensity and location of stimulation in the brain.
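Just to sketch what that direction might mean in practice, the toy function below packs a graded value into both the stimulation site and its intensity; the site names, ranges, and mapping are purely illustrative assumptions, not anything reported by the team.

```python
# Purely illustrative assumption: one way richer information could be packed
# into stimulation by varying both the target site and the intensity,
# instead of a single on/off phosphene. Sites and ranges are made up.
def encode_rich(value: float, sites=("site A", "site B")) -> tuple[str, float]:
    """Map a normalized sensor value in [0, 1] to a (site, intensity) pair."""
    value = min(max(value, 0.0), 1.0)
    site = sites[0] if value < 0.5 else sites[1]   # location carries one coarse bit
    intensity = 0.5 + value / 2                    # intensity carries a graded value
    return site, intensity

if __name__ == "__main__":
    for v in (0.1, 0.4, 0.7, 0.95):
        print(f"{v:.2f} -> {encode_rich(v)}")
```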
“Over the long term, this could have profound implications for assisting people with sensory deficits while also paving the way for more realistic virtual reality experiences,” Rao concluded.
The full paper, "Navigating a 2D Virtual World Using Direct Brain Stimulation," has been published in the journal Frontiers in Robotics and AI.