Revealing secret intentions in the brain
Scientists decode concealed intentions from human brain activity
Every day we plan numerous actions, such as returning a book to a friend or making an appointment. How and where the brain stores these intentions has now been revealed by John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences, in cooperation with researchers from London and Tokyo. For the first time, they were able to "read" participants’ intentions from their brain activity. This was made possible by a new combination of functional magnetic resonance imaging and sophisticated computer algorithms (Current Biology, 20th February 2007, online: 8th February).
Our secret intentions remain concealed until we put them into action - or so we believe. Now researchers have been able to decode such covert intentions from patterns of brain activity. They let subjects freely and covertly choose between two possible tasks: to either add or subtract two numbers. The subjects were then asked to hold their intention in mind until the relevant numbers were presented on a screen. The researchers were able to recognize the subjects’ intentions with 70% accuracy from their brain activity alone - even before the participants had seen the numbers and started to perform the calculation.
Participants made their choice covertly and initially did not know the two numbers they were supposed to add or subtract. Only a few seconds later the numbers appeared on a screen and the participants could perform the calculation. This ensured that the intention itself was being read out, rather than brain activity related to performing the calculation or pressing the buttons to indicate the response. "It has been previously assumed that freely selected plans might be stored in the middle regions of the prefrontal cortex, whereas plans following external instructions could be stored on the surface of the brain. We were able to confirm this theory in our experiments", Haynes explained.
The work of Haynes and his colleagues goes far beyond simply confirming previous theories. It has never before been possible to read from brain activity how a person has decided to act in the future. The trick by which the invisible is made visible lies in a new method called "multivariate pattern recognition". A computer is programmed to recognize the characteristic activation patterns in the brain that typically occur in association with specific thoughts. Once this computer has been "trained", it can be used to predict the decisions of subjects from their brain activity alone. An important technical innovation also lies in combining information across extended regions of the brain, which greatly increases sensitivity.
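To illustrate the decoding logic described above, here is a minimal sketch of multivariate pattern classification in Python using scikit-learn. It uses synthetic data in place of real fMRI recordings, and the trial counts, voxel counts, and choice of a linear classifier are illustrative assumptions rather than details reported in the article.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# One activity pattern per trial: rows = trials, columns = voxels.
n_trials, n_voxels = 120, 500
labels = rng.integers(0, 2, n_trials)        # 0 = "add", 1 = "subtract"

# Simulated delay-period activity: noise plus a weak, spatially distributed
# signal that differs between the two intentions (no single voxel decides).
signal = rng.normal(0.0, 0.3, n_voxels)
patterns = rng.normal(0.0, 1.0, (n_trials, n_voxels))
patterns[labels == 1] += signal

# "Train" a linear classifier on the spatial patterns, then estimate how well
# it predicts the covert intention on held-out trials via cross-validation.
decoder = make_pipeline(StandardScaler(), LinearSVC())
accuracy = cross_val_score(decoder, patterns, labels, cv=10).mean()
print(f"Cross-validated decoding accuracy: {accuracy:.0%} (chance = 50%)")

Because the informative signal is spread weakly across many voxels, the classifier only succeeds by pooling evidence over the whole pattern, which is the point of the multivariate approach.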
The study also reveals fundamental principles about the way the brain stores intentions. "The experiments show that intentions are not encoded in single neurons but in a whole spatial pattern of brain activity", says Haynes. They furthermore reveal that different regions of the prefrontal cortex perform different operations. Regions towards the front of the brain store the intention until it is executed, whereas regions further back take over when subjects become active and start doing the calculation. "Intentions for future actions that are encoded in one part of the brain need to be copied to a different region to be executed", says Haynes.
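To make the region-by-phase logic concrete, the following hypothetical sketch fits the same kind of decoder separately for each simulated region and task phase and compares where the intention can be read out. The region names (e.g. "anterior PFC"), phase labels, and all numbers are invented for illustration and do not come from the study itself.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_trials, n_voxels = 120, 300
labels = rng.integers(0, 2, n_trials)        # 0 = "add", 1 = "subtract"

def simulate_region(informative):
    """One pattern per trial; add a weak distributed signal if informative."""
    x = rng.normal(0.0, 1.0, (n_trials, n_voxels))
    if informative:
        x[labels == 1] += rng.normal(0.0, 0.3, n_voxels)
    return x

# Toy premise mirroring the text: anterior prefrontal patterns carry the
# intention during the delay, posterior patterns only once execution begins.
data = {
    ("anterior PFC", "delay"): simulate_region(True),
    ("anterior PFC", "execution"): simulate_region(False),
    ("posterior PFC", "delay"): simulate_region(False),
    ("posterior PFC", "execution"): simulate_region(True),
}

decoder = make_pipeline(StandardScaler(), LinearSVC())
for (region, phase), patterns in data.items():
    acc = cross_val_score(decoder, patterns, labels, cv=10).mean()
    print(f"{region:13s} {phase:9s} accuracy = {acc:.0%}")

Comparing decoding accuracy across regions and phases in this way is one standard means of testing where and when information about an intention is present in the brain.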
These findings also raise hopes for improved clinical and technical applications. Already today, the first steps are being taken to ease the lives of paralyzed patients with computer-assisted prosthetic devices and so-called brain-computer interfaces. These devices focus on reading out the movement a patient intends to - but is unable to - perform. Previous research has shown that patients can move artificial limbs or computer cursors purely by the power of their minds. The current research by Haynes and colleagues now opens up a completely new perspective: in the future, it may become possible to read even abstract thoughts and intentions out of patients’ brains. One day, even the intention to "open the blue folder" or "reply to the email" could be picked up by brain scanners and turned into the appropriate action.