Keeping Our Thoughts Private in the Age of Mind-Reading

Manahel Thabet, PhD

Brain expert

With brain-computer interfaces (BCIs) now commercially available after extensive use in the medical sector, recent research has found that they can be used to hack our brains for PINs or mine our minds for data.

What is an EEG, and what have studies concerning its security found?

Two new studies, from the University of Alabama and the University of Washington, have revealed the malicious possibilities lurking in the shadows of BCI developers’ impressive promises: the ability to access PINs and other private information.

An electroencephalogram (EEG) is a test that detects electrical activity in your brain using a skullcap studded with electrodes. The technology has been used in the medical sector for years; it was used to diagnose schizophrenia as far back as 1998, for example. However, it is now being put to far more commercial uses. Rudimentary versions, such as Emotiv’s Epoc+, have been released, with the promise of far more sophisticated versions just around the corner, including examples being developed by Elon Musk and Facebook.
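To make that concrete, here is a minimal sketch of how software typically summarizes an EEG signal: estimate its power spectrum, then total the power within the conventional frequency bands. The sampling rate, the synthetic signal, and the exact band edges below are illustrative assumptions, not the specification of any particular headset.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)

# Illustrative assumptions: 256 Hz sampling, 4 s of one channel. Real
# headsets such as Emotiv's Epoc+ have their own rates and channel counts.
fs = 256
t = np.arange(0, 4, 1 / fs)

# Synthetic stand-in for an EEG channel: a 10 Hz alpha rhythm plus noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

# Estimate the power spectrum with Welch's method.
freqs, psd = welch(eeg, fs=fs, nperseg=fs)

# Conventional EEG bands (exact boundaries vary across the literature).
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
df = freqs[1] - freqs[0]
for name, (lo, hi) in bands.items():
    power = psd[(freqs >= lo) & (freqs < hi)].sum() * df
    print(f"{name}: {power:.2e} V^2")   # alpha dominates, as constructed
```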

The University of Alabama’s study discovered that hacking into a BCI could improve the odds of guessing a four-digit PIN from 1 in 10,000 to 1 in 20, and shorten the odds of guessing a six-letter password by roughly 500,000 times, to around 1 in 500. Emotiv has dismissed the criticisms, stating that all software using its headsets is vetted and that users would find the activity of inputting codes suspicious; but Alejandro Hernández, a security researcher with IOActive, claimed that the Alabama case is “100 percent feasible.”
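Those figures are easier to grasp as search-space arithmetic. The short calculation below takes the four-digit and six-letter search spaces (simple counting) and the study’s reported 1-in-20 and 1-in-500 odds, and recovers the improvement factors quoted above:

```python
# Baseline search spaces for blind guessing.
pin_space = 10 ** 4          # four-digit PIN: 10,000 possibilities
pwd_space = 26 ** 6          # six-letter password: ~309 million

# Odds reported in the University of Alabama study.
pin_with_eeg = 20            # 1 in 20 after observing EEG data
pwd_with_eeg = 500           # 1 in 500

print(pin_space / pin_with_eeg)   # 500.0 -> a 500x head start
print(pwd_space / pwd_with_eeg)   # ~617,832 -> the "roughly 500,000 times"
```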

[Illustration: EEG graph]
The test involved people entering random PINs and passwords while wearing the headset, allowing software to establish a link between what was typed and the accompanying brain activity. After data from 200 typed characters had been gathered, the algorithms could make educated guesses about which characters a user would enter next. Nitesh Saxena, Research Director of the Department of Computer and Information Sciences at the University of Alabama, described a scenario in which someone still logged in to a gaming session could be at risk while checking their bank details.
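The researchers’ code is not public, but the attack as described amounts to supervised learning: EEG feature vectors captured around each keystroke, labeled with the key that was pressed. Here is a minimal sketch under those assumptions, with random numbers standing in for real EEG features and scikit-learn’s logistic regression standing in for whatever model the study actually used:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training set: one EEG feature vector per observed keystroke.
# In the study, ~200 typed characters were enough to calibrate the model.
n_keystrokes, n_features = 200, 32
X_train = rng.normal(size=(n_keystrokes, n_features))   # stand-in features
y_train = rng.integers(0, 10, size=n_keystrokes)        # digits 0-9 pressed

# The attack model: learn which brain-activity patterns accompany which key.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# At attack time, rank digits by probability for each unlabeled keystroke;
# even a weak ranking collapses 10,000 possible PINs to a short candidate list.
X_attack = rng.normal(size=(4, n_features))             # a 4-digit PIN entry
probs = model.predict_proba(X_attack)
top_guesses = np.argsort(probs, axis=1)[:, ::-1][:, :3] # top-3 digits per press
print(top_guesses)
```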

The University of Washington test focused on gathering data. In the study, subliminal messages flashed up in the corner of a gaming screen while an EEG gauged the participant’s response. Tamara Bonaci, a University of Washington electrical engineer, said that if participants have a strong emotional reaction to a stimulus, “300 milliseconds after they saw a stimulus there is going to be a positive peak hidden within their EEG signal.” Howard Chizeck, Bonaci’s fellow electrical engineer on the project, said, “This is kind of like a remote lie detector; a thought detector.” Potential uses of such data range from more precisely targeted advertising than ever before to inferring sexual orientation or other personal information that could be used to coerce users.
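The peak Bonaci describes is the well-known P300 event-related potential, and the textbook way to expose it is to cut the recording into short epochs time-locked to each stimulus and average them, so that stimulus-locked activity survives while background noise cancels. Below is a minimal sketch of that procedure, using a synthetic signal and made-up stimulus times rather than anything from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 256                          # assumed sampling rate (Hz)
n = fs * 60                       # one minute of single-channel EEG
eeg = 10e-6 * rng.standard_normal(n)

# Made-up stimulus onsets (sample indices): one flashed image per second.
onsets = np.arange(fs, n - fs, fs)

# Inject a synthetic P300: a positive bump centered ~300 ms post-stimulus.
bump = 5e-6 * np.hanning(int(0.2 * fs))        # 200 ms wide
for s in onsets:
    c = s + int(0.3 * fs)
    eeg[c - bump.size // 2 : c + (bump.size + 1) // 2] += bump

# Epoch the signal 0-600 ms after each onset and average across epochs;
# stimulus-locked activity survives while background noise averages out.
win = int(0.6 * fs)
epochs = np.stack([eeg[s:s + win] for s in onsets])
erp = epochs.mean(axis=0)

print(f"average peaks ~{1000 * np.argmax(erp) / fs:.0f} ms post-stimulus")
```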

How serious is the threat?

While some BCIs are being used in extremely positive ways, such as diagnosing concussions or giving people with severe motor disabilities control over robotic aids, the threat to security and privacy posed by commercially available, mainstream BCIs is substantial.

Experts have advised that we start thinking about means of protection now, rather than after the technology has become more widespread. Howard Chizeck told Motherboard over Skype that “There’s actually very little time. If we don’t address this quickly, it’ll be too late,” while scientists from the University of Basel and the University of Zurich have called for a “right to mental privacy” in response to these developments.

As with many technologies, the novelty and potential of BCIs are headily seductive, but we must beware of the practical consequences their use may give rise to. Worryingly, there has been very little work toward protecting against such attacks. The BCI Anonymizer, still at the proposal stage, aims to “extract information corresponding to a user’s intended BCI commands, while filtering out any potentially private information.” Beyond that, there is very little else.
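The BCI Anonymizer’s internals are not published, but the quoted goal suggests a pass-through stage between headset and application: forward only the signal components an application needs for its commands, and discard everything else. The sketch below illustrates that idea with a simple band-pass filter; the choice of filter and of the 8-30 Hz band is an illustrative stand-in, not the project’s actual method.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def anonymize(eeg, fs, keep_band=(8.0, 30.0)):
    """Illustrative stand-in for an anonymizer stage: pass only the
    frequency band an application's commands are assumed to live in
    (here, the alpha/beta rhythms used by motor-imagery BCIs), so that
    broadband, stimulus-evoked components are attenuated before the
    signal ever reaches the application."""
    sos = butter(4, keep_band, btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg)

# Example: raw headset samples in, filtered samples out to the game/app.
fs = 256
raw = np.random.default_rng(0).standard_normal(fs * 10)  # 10 s of stand-in EEG
clean = anonymize(raw, fs)
```

A real anonymizer would have to be far more selective than a fixed band-pass filter, since command-relevant and private signals can overlap in both time and frequency; the sketch only shows where such a stage would sit in the pipeline.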

References: MIT Technology Review, Motherboard, Venture Radar, BCI Anonymizer, Futurism