This article appeared this week in Scientific American: Head Games: Video Controller Taps into Brain Waves. Much like the SQUID headset used in Kathryn Bigelow’s film Strange Days (seminal viewing if you have never seen it), this headset gives gamers a measure of control previously unknown, though not, of course, unimagined. There’s a video of one of the inventors from Emotiv doing a demo here; it’s clearly early days, but interesting all the same. The full article is below.
No matter how hard you try, your mind can’t bend a spoon or channel the powers of a Jedi knight. Thanks to a new headset under development by neuroengineering company Emotiv Systems, however, you may soon be able to do this and more via the magic of video games.
By the end of this year, San Francisco–based Emotiv’s sensor-laden EPOC headset will enable gamers to use their own brain activity to interact with the virtual worlds where they play. The $299 headset’s 14 strategically placed head sensors are at the ends of what look like stretched, plastic fingers that detect patterns produced by the brain’s electrical activity. These neural signals are then narrowed down and interpreted in 30 possible ways as real-time intentions, emotions or facial expressions that are reflected in virtual world characters and actions in a way that a joystick or other type of controller could not hope to match.
The EPOC detects brain activity noninvasively using electroencephalography (EEG), a measure of brain waves, via external sensors along the scalp that pick up the electrical bustle in various parts of the furrowed surface of the brain’s cortex, a region that handles higher order thoughts. Added to the challenge of attaining clear signals was making sense of them. “The human cortex is unique like a fingerprint,” says Emotiv president Tan Le. “Even though a signal may emanate from [the same region] deep inside the brain, by the time it gets [projected] to the cortex and the surface of the scalp, the signal appears very random. We had to come up with a mathematical algorithm to unfold the cortex and match the signal to its source.”
Emotiv solved this brain–computer interface problem with the help of a multidisciplinary team that included neuroscientists, who understood the brain at a systems level (rather than at the level of individual cells), and computer engineers with a knack for machine learning and pattern recognition. Over the last four years, the company has conducted thousands of EEG recordings on hundreds of volunteers—not all gamers—as they experienced virtual scenarios that elicited various emotions, facial expressions and cognitive demands. The aim was to find revealing patterns of brain activity that many people shared—a needle in a haystack of frenzied signals. Now, the EPOC allows users to fine-tune settings so that it picks up on even the subtlest of smirks.
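Emotiv has not published its actual pipeline, but the approach the article describes, turning raw EEG into features and matching them against patterns shared across many recordings, can be illustrated with a toy sketch. Here, a window of samples is reduced to power in the classic EEG frequency bands and compared against stored per-state average feature vectors (a nearest-centroid classifier). The state names, sampling rate and band edges are illustrative assumptions, not Emotiv's:

```python
# Illustrative sketch (NOT Emotiv's actual algorithm): reduce an EEG window
# to frequency-band power features, then classify by nearest stored centroid.
import numpy as np

FS = 128  # assumed sampling rate in Hz; the EPOC's real rate is not given here

def band_power(window, lo, hi, fs=FS):
    """Mean spectral power of `window` between lo and hi Hz."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

def features(window):
    # Classic EEG bands: theta (4-8 Hz), alpha (8-13 Hz), beta (13-30 Hz)
    return np.array([band_power(window, 4, 8),
                     band_power(window, 8, 13),
                     band_power(window, 13, 30)])

def classify(window, centroids):
    """centroids: dict mapping state name -> average feature vector."""
    f = features(window)
    return min(centroids, key=lambda s: np.linalg.norm(f - centroids[s]))

# Toy demo: a strong 10 Hz (alpha) oscillation lands nearest "relaxed"
t = np.arange(FS) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)
beta_wave = np.sin(2 * np.pi * 20 * t)
centroids = {"relaxed": features(alpha_wave), "focused": features(beta_wave)}
print(classify(alpha_wave, centroids))  # -> relaxed
```

A real system would need far richer features and a trained classifier; the point is only that a "signature" per mental state, averaged over many subjects, is what gets matched at run time.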
When work on these algorithms began two years ago, it took a bank of powerful computers up to 72 hours to run through a mere 10 seconds of an individual’s brain data and extract important features. Sorting through a seemingly endless stream of recordings eventually led Emotiv to find consistent signal patterns that revealed specific mental experiences. “Through a large enough sample size,” Le says, “we were able to get some consistency around the population to attain a high degree of confidence that it accurately measures an emotional state.”
Homing in on these revealing brain waves allows the EPOC system to quickly deduce a player’s emotional state and react to it by, for example, changing a game’s music in real time to match the user’s tension, or throwing in more villains if a player seems to be getting bored of a certain world.
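The adaptation loop this implies is simple to picture: the game polls an emotion score each frame and tunes its parameters. The scores and thresholds below are invented for illustration, not EPOC output:

```python
# Hypothetical sketch of the adaptation the article describes. `boredom` and
# `tension` are made-up scores in [0, 1], not values from any real EPOC API.

def adapt(state, boredom, tension):
    """Return a new game state tuned to the player's measured mood."""
    state = dict(state)
    if boredom > 0.7:            # player disengaged: raise the stakes
        state["villains"] += 2
    state["music_tempo"] = 90 + int(60 * tension)  # tenser mood, faster music
    return state

state = {"villains": 3, "music_tempo": 100}
print(adapt(state, boredom=0.8, tension=0.5))
# -> {'villains': 5, 'music_tempo': 120}
```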
Streamlining the neuro-headset, which Le describes as a “high-fidelity EEG-acquisition device,” was another obstacle that involved “loads” of sensors collecting as much data as possible to pinpoint informative spots around the skull where brain activity best revealed a person’s thoughts and emotions. “The challenge has been to optimize the number of sensors to get resolution that is needed to allow robust and accurate detection but at the same time not have a ‘swim cap’ that is hot and uncomfortable,” she adds.
An EEG device that consists of only 14 sensors and can simply be placed snugly on the head, without signals becoming significantly dampened or muddled, is also novel. Clinicians and researchers often go to great pains to maximize EEG signals by abrading the top layer of skin and applying a conductive gel where the scalp contacts the sensors—something not even the most passionate gamer would endure. “They must have developed amplifiers that can amplify the signal better,” says Ethan Buch, a brain–computer interface researcher at the National Institute of Mental Health, who uses an EEG recorder with 64 sensors. Buch also suspects that the facial expressions the EPOC detects are based more on the electrical activity of facial and scalp muscles than on the brain per se. Although the electrical activity of muscles, he explains, is normally treated as artifact noise to be filtered out of the EEG signals of interest, it is still informative about how facial muscles move, such as during a wink. Le agrees, saying that in the company’s classification strategy some of the EPOC’s detections are based on muscle movements.
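The distinction Buch draws can be sketched simply: muscle (EMG) activity is much larger in amplitude than cortical EEG, so even a crude threshold on a frontal sensor can flag a wink. The microvolt figures and threshold below are illustrative assumptions, not EPOC specifications:

```python
# Sketch of wink detection from muscle "artifact" rather than brain activity.
# Amplitudes and the 100-microvolt threshold are illustrative, not from Emotiv.
import numpy as np

def detect_wink(samples_uv, threshold_uv=100.0):
    """Flag a wink when peak amplitude exceeds the EMG-scale threshold."""
    return np.max(np.abs(samples_uv)) > threshold_uv

quiet_eeg = 20 * np.sin(np.linspace(0, 2 * np.pi, 128))  # ~20 uV background
wink_burst = quiet_eeg.copy()
wink_burst[60:70] += 300                                 # muscle spike, ~300 uV
print(detect_wink(quiet_eeg), detect_wink(wink_burst))   # -> False True
```

A clinical EEG pipeline would filter such bursts out as noise; here the "noise" is exactly the signal being classified.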
At this stage in development and patenting, Emotiv still has to be secretive about the advances that made the EPOC possible. So far, Emotiv’s intention is not to replace gaming hand controllers but rather to complement them: the headset includes software called EmoKey that associates a game’s unique keystroke commands with patterns of brain activity. Emotiv also offers a development kit for other game makers, such as Nintendo, to potentially integrate the brain–computer interface technology into their products.
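Conceptually, EmoKey-style mapping is just a table from detected events to a game's existing keystrokes, which is what lets the headset work with titles that know nothing about brain input. The event names and keys in this sketch are invented for illustration and do not reflect Emotiv's actual software:

```python
# Minimal sketch of an EmoKey-style mapping: detected mental/facial events
# are translated into the keystrokes a game already understands. Event names
# and key bindings are hypothetical, not Emotiv's real configuration.

EMOKEY_MAP = {
    "smile": "F1",      # trigger the avatar's smile emote
    "wink": "F2",       # wink emote
    "push": "space",    # cognitive "push" mapped to the attack key
}

def to_keystrokes(events, mapping=EMOKEY_MAP):
    """Translate a stream of detections into keystrokes, skipping unknowns."""
    return [mapping[e] for e in events if e in mapping]

print(to_keystrokes(["smile", "yawn", "push"]))  # -> ['F1', 'space']
```

Unmapped detections (like "yawn" above) are simply dropped, so the game never sees input it cannot handle.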
Yet the EPOC’s expressive qualities will most likely be a hit in games that create hyper-realistic alternative realities, such as virtual worlds like Linden Lab’s Second Life, where social encounters play a big part in the experience. Rather than tap out a “wink” or “frown” command on a keyboard, players will be able to rely on their natural facial expressions to intuitively reveal how they feel to other inhabitants.
Le is not counting out the idea that the EPOC may be developed further to one day allow players to mentally articulate fighting moves that still require the speed of their fingers. But for now, the device is meant more for enhancing the human and interactive component of gaming and virtual world habitation. After all, this is as close as we’re going to get to emulating Luke Skywalker’s ability to pluck his X-wing from the swamps of the planet Dagobah.