Warning: this article involves brain-to-brain interfacing
Imagine if your brain could be seamlessly connected to another person's so that you think their thoughts, feel their feelings and even see what they see. The rapidly developing field of brain-machine interfacing – or more recently brain-brain interfacing – may be bringing this science-fiction vision of "mind melding" closer to reality. But there are good reasons to be sceptical of the more brazen claims emanating from neuroscience laboratories, or from their interpreters in the media. As things stand we are very far from the sort of "mind control" that could mobilise a zombie army.
Right now, a study by researchers at the University of Washington is capturing imaginations. They used electroencephalography (EEG) to detect when one person (the "encoder") imagined pressing a button while playing a video game. A computer algorithm then translated this detection event into a signal – transmitted via the internet – to fire a magnetic pulse into the motor cortex of a second person (the "decoder") causing the finger of his right hand to contract, as if pulling a trigger. While this is a fun demonstration, detecting electrical brain signals of intentions-to-move has been possible for decades and always requires the co-operation of the participant. Using this information to evoke a movement in another person is rather trivial – the same signal could have been used to turn on the Blackpool Illuminations. And the fact that the internet carries the signal is completely incidental – carrying signals is what the internet does and the fact that brain 1 might be in the US and brain 2 might be in Europe adds nothing scientifically.
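For readers curious about what the encoder's side of such a set-up involves, here is a minimal sketch in Python. It is not the Washington team's code: the single channel, the mu-band desynchronisation rule and the thresholds are assumptions chosen purely for illustration, and the "transmission" step is reduced to a stub, since the same trigger could just as well drive a stimulation coil or the Blackpool Illuminations.

```python
# Illustrative sketch (not the study's actual pipeline): watch one EEG channel over
# motor cortex, flag an imagined button press, and turn that flag into a trigger.
# Sampling rate, band limits and the 50% threshold are all assumed values.
import numpy as np

FS = 256               # sampling rate in Hz (assumed)
MU_BAND = (8.0, 12.0)  # mu rhythm: its power drops during imagined movement

def mu_band_power(window: np.ndarray) -> float:
    """Average spectral power in the mu band for a one-second EEG window."""
    freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
    power = np.abs(np.fft.rfft(window)) ** 2
    band = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return float(power[band].mean())

def detect_imagined_press(window: np.ndarray, baseline_power: float) -> bool:
    """Crude detector: flag a press when mu power falls well below a resting
    baseline (event-related desynchronisation). Real systems use calibrated
    classifiers, not a fixed threshold."""
    return mu_band_power(window) < 0.5 * baseline_power

def send_trigger() -> None:
    """Stand-in for the 'send it over the internet' step; in the study this
    message ends up firing a magnetic pulse at the decoder's motor cortex."""
    print("TRIGGER: fire stimulation (or switch on the Illuminations)")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(FS) / FS
    rest = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(FS)            # strong mu rhythm
    imagery = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(FS)   # suppressed mu
    baseline = mu_band_power(rest)
    for label, window in [("rest", rest), ("imagined press", imagery)]:
        if detect_imagined_press(window, baseline):
            send_trigger()
        print(label, "-> mu power", round(mu_band_power(window), 1))
```

The sketch makes the article's point in miniature: detecting an intention-to-move is the long-established part, and whatever sits on the receiving end of the trigger is interchangeable.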
The research nonetheless highlights some important opportunities in neuroscience. In a related study, researchers at Duke University claimed to have demonstrated direct brain-to-brain interfacing in rats. This research – published in the journal Scientific Reports – used electrodes implanted into rats' brains to decipher when an "encoder rat" made a decision to press one of two levers. Signals were then sent (again across the internet) into the corresponding brain area of the "decoder rat", leading to this second rat being more likely to press the same lever. Here, unlike the Washington study, the transmitted signals appear to be influencing a decision rather than eliciting an automatic reflex – but even so the decision is very simple and decoding was far from perfect.
What all this means is that we need to better understand how the brain speaks to itself – the so-called "neural code". This is an incredibly active area of research. Neuroscientists are making important strides in figuring out, for example, whether it is the precise timing of neuronal "spikes" or just the overall level of activity of neurons that matters in transmitting information between different parts of the nervous system. And this research is supporting advances in so-called brain reading, where complex machine-learning algorithms are applied to brain data in order to detect what a person may be perceiving, imagining, or intending, but without trying to impose these phenomena on to a separate brain.
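As a rough illustration of what "brain reading" means in practice, the sketch below trains a standard classifier to tell two imagined tasks apart from noisy, simulated activity patterns. The data, feature counts and task labels are invented for the example; real studies work from recorded fMRI or EEG data with far more careful validation, but the logic is the same: fit a model on labelled trials, then decode held-out ones.

```python
# Toy "brain reading" demo on simulated data (assumed setup, not any lab's pipeline):
# each imagined task produces a characteristic activity pattern across recording sites,
# and a classifier learns to recover the task from noisy single trials.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_trials, n_features = 200, 20   # e.g. 200 trials, 20 recording sites (assumed sizes)

# Each imagined task ("tennis" vs "navigation") has its own underlying pattern.
pattern_tennis = rng.normal(0.0, 1.0, n_features)
pattern_navigation = rng.normal(0.0, 1.0, n_features)
labels = rng.integers(0, 2, n_trials)                           # 0 = tennis, 1 = navigation
signal = np.where(labels[:, None] == 0, pattern_tennis, pattern_navigation)
data = signal + rng.normal(0.0, 2.0, (n_trials, n_features))    # noisy single-trial data

X_train, X_test, y_train, y_test = train_test_split(data, labels, random_state=0)
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Accuracy on held-out trials: well above chance, but a long way from mind reading.
print("decoding accuracy:", decoder.score(X_test, y_test))
```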
Again there are limits – the current state of the art is well represented by the ability to partially reconstruct movie scenes from brain data alone – but there are also important implications, most crucially in the clinic. Brain-reading technologies could allow paralysed people to regain control of their limbs by using their decoded intentions-to-move to directly control their muscles. And they have already been successfully deployed in allowing severely brain-damaged patients to regain a form of non-behavioural communication (for example by imagining playing tennis to signal "yes"). In the near future these developments could extend to treating a range of other disabling neurological conditions.
But what about the future of brain-to-brain interfacing – the transmission of thoughts, ideas, intentions, and percepts from one person to another? As things stand, by far the most effective technology for this purpose has been around for a long time. It's called language.