How the US military tries to read minds

Under a new research program, DARPA is developing a brain-computer interface that could make it possible to "control swarms of drones, operating at the speed of thought." What if it works?







In August, three graduate students from Carnegie Mellon University crammed into a tiny, windowless basement lab to stimulate a slice of mouse brain with an improvised rig built partly from 3D-printed parts.



The piece of brain, carved from the hippocampus, looked like a thin slice of garlic. It rested on a platform near the center of the device. A thin tube bathed the slice in a solution of salt, glucose, and amino acids, keeping it in a state somewhat resembling life: the neurons inside continued to fire, allowing the experimenters to collect data. An array of electrodes below delivered electrical pulses, while a syringe-like metal probe measured the neurons' responses. Bright LED lamps illuminated the dish. The whole thing looked jury-rigged.



On the monitor next to the device, the stimuli and responses were visible: a few milliseconds after each electrical pulse, the neurons fired. Later, the experimenters placed a material with conductivity and transparency similar to skull bone between the tissue sample and the electrodes, to find out whether they could stimulate the mouse hippocampus through this imitation skull.



They did this because they want to read and influence signals in the human brain without cutting open the skull or touching delicate brain tissue. Their goal is to develop accurate, sensitive brain-computer interfaces that can be taken off and put on like a helmet or headband, with no surgery at all.



The bones of the human skull are less than a centimeter thick, though the exact figure varies from person to person and from one spot to another. They blur waveforms of every kind, whether electric current, light, or sound. Brain neurons can be as small as a few thousandths of a millimeter and generate electrical impulses of roughly a tenth of a volt.
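The blurring the skull imposes on a sharp neural impulse can be pictured as a smoothing operation. The sketch below is purely illustrative, not a biophysical model: it treats the skull as a Gaussian smoothing kernel convolved with an idealized spike, showing how the signal's peak collapses and its energy spreads out.

```python
import numpy as np

# Illustrative sketch only (not a biophysical model): treat the skull's
# blurring of a sharp neural impulse as convolution with a Gaussian kernel.
n = 1001
spike = np.zeros(n)
spike[n // 2] = 1.0               # an idealized, needle-sharp impulse

def gaussian_kernel(sigma, radius):
    """Normalized Gaussian kernel; a wider sigma models a blurrier 'skull'."""
    k = np.exp(-0.5 * (np.arange(-radius, radius + 1) / sigma) ** 2)
    return k / k.sum()

blurred = np.convolve(spike, gaussian_kernel(sigma=20.0, radius=100), mode="same")

# The impulse's total energy is preserved, but the peak is far lower and
# far wider: this is the signal a sensor outside the skull has to work with.
print(spike.max(), blurred.max())
```

The takeaway is that nothing is strictly lost, but the sharp features that identify individual neural events are smeared into a low, broad bump, which is why better signal processing matters as much as better sensors.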



The graduate students' experiments were meant to collect baseline data against which to compare the results of the new technology that Pulkit Grover, the team's chief scientific adviser, hopes to develop.



“So far it’s impossible to do this, and it’s a very difficult task,” says Grover. He leads one of six teams participating in the Next-generation Nonsurgical Neurotechnology program, or N3, a $104 million project launched this year by DARPA. Grover's team works with electricity and ultrasound; other teams use magnets or optics. If any of them succeeds, the results will be revolutionary.



Surgery is expensive, and operating on healthy people to create super-soldiers is ethically fraught as well. A thought-reading device that requires no surgery would open up enormous possibilities. Brain-computer interfaces (BCIs) have been used to give paralyzed people partial control over their bodies and to let veterans of the wars in Iraq and Afghanistan who lost limbs control artificial ones. N3 is the US military's first serious attempt to develop BCIs for more aggressive uses. “Working with individual drones and swarms of them at the speed of thought, rather than at the speed of mechanical devices: that's what these devices are really for,” says Al Emondi, the N3 program's director.



Jacques J. Vidal, a computer scientist at the University of California, Los Angeles, first used the term "brain-computer interface" back in the early 1970s; it turned out to be one of those phrases, like "artificial intelligence," whose definition evolves along with the capabilities it describes. Electroencephalography (EEG), which records brain activity through electrodes placed on the scalp, can be considered the first interface between brain and computer. By the end of the 1990s, researchers at Case Western Reserve University had used EEG to interpret the brain waves of a paralyzed man, allowing him to move a computer cursor via electrodes attached to his scalp.



Since then, both kinds of technology for reading signals from the brain, invasive and non-invasive, have advanced rapidly. Devices that stimulate the brain with electrical signals to treat diseases such as epilepsy are also in development. The most powerful instrument to date is a microelectrode array known as the Utah array. It looks like a tiny bed of nails, about half the size of a little fingernail, whose spikes can penetrate a targeted part of the brain.



In 2010, while on vacation at the Outer Banks, a strip of narrow sandy barrier islands off the coast of North Carolina, Ian Burkhart dove into the ocean and hit his head on a sandbar. He damaged his spinal cord and lost function from the sixth cervical nerve down. He could move his shoulders and elbows, but not his hands or feet. Physical therapy did not help much. He asked doctors at Ohio State University's Wexner Medical Center whether they could do anything else. It turned out that Wexner wanted to run a study with the nonprofit research company Battelle to find out whether a Utah array could restore mobility to a paralyzed person's limbs.



Where an EEG shows the aggregate activity of countless neurons, a Utah array can record the impulses of a small number of them, or even of a single neuron. In 2014, doctors implanted a Utah array in Burkhart's head. It sampled the electric field at 96 points in his motor cortex 30,000 times per second. Burkhart visited the laboratory several times a week for more than a year while researchers at Battelle trained signal-processing algorithms to recognize his intentions as he systematically and intensely thought about how he wanted to move his hand.
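Battelle has not published its decoder in detail, so the following is only a minimal sketch of the general idea behind such systems: bin the spike counts from the array's 96 channels into firing-rate features, learn the characteristic pattern for each imagined movement, and classify new trials against those patterns. All numbers and the nearest-centroid classifier are illustrative assumptions, not the team's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch only; the real decoder is far more sophisticated.
# Simulate spike counts from 96 electrodes (as in a Utah array) for two
# imagined movements, then decode trials with a nearest-centroid classifier.
n_channels, n_trials = 96, 200

# Each intended movement gets a made-up characteristic firing-rate pattern.
pattern_open = rng.uniform(5, 30, n_channels)   # mean rates, spikes/s
pattern_grip = rng.uniform(5, 30, n_channels)

def simulate_trial(pattern):
    """Poisson spike counts over a 100 ms window on each channel."""
    return rng.poisson(pattern * 0.1)

X = np.array([simulate_trial(pattern_open) for _ in range(n_trials)] +
             [simulate_trial(pattern_grip) for _ in range(n_trials)])
y = np.array([0] * n_trials + [1] * n_trials)

# "Training": average the observed counts for each intended movement.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def decode(counts):
    """Assign a trial to the nearest learned firing-rate pattern."""
    return int(np.argmin(np.linalg.norm(centroids - counts, axis=1)))

accuracy = np.mean([decode(x) == label for x, label in zip(X, y)])
print(accuracy)
```

Even this toy version shows why many electrodes help: with 96 channels, the two movement patterns are far apart in feature space, so even a crude classifier separates them reliably. The hard part in practice is noise, drift, and the fact that real intentions are not two tidy classes.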



A thick cable connected to a port on Burkhart's skull carried the pulses recorded by the Utah array to a computer. The computer decoded them and sent signals to an electrode-filled sleeve covering his right forearm. The sleeve activated his muscles to carry out the intended movement: grasping, lifting a weight, pouring out a bottle, or pulling a card from a wallet.



This made Burkhart one of the first people to regain muscle control through a "neural bypass." Now Battelle, which is also part of the N3 program, is working with him to find out whether the same results can be achieved without implants in the skull.



The challenge lies not only in building new devices but also in developing better signal-processing techniques that can pick out the weak, muffled signals detectable outside the skull. That is why the Carnegie Mellon team is led by Grover, an electrical engineer, rather than a neuroscientist.



Soon after Grover arrived at Carnegie Mellon, a friend at the University of Pittsburgh's medical school invited him to sit in on clinical meetings with epilepsy patients. He began to suspect that far more information could be extracted from an EEG than anyone had previously thought, and therefore that clever manipulation of external signals might reach the deeper layers of the brain. A few years later, a team led by Edward Boyden of MIT's Center for Neurobiological Engineering published a noteworthy paper that went far beyond Grover's initial hunch.



Boyden's group applied two electrical signals of high but slightly different frequencies to the outside of the skull. These affected the firing of neurons, not at the surface of the brain but deeper inside it. Through a phenomenon known as constructive interference, the two signals combined to produce a lower-frequency signal that stimulated neurons to fire.
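The underlying mathematics is the familiar beat phenomenon: summing two sinusoids at nearby frequencies yields a carrier at their average frequency whose amplitude oscillates at their difference frequency. The sketch below demonstrates this with illustrative frequencies (not the ones used in the study): two kilohertz-range signals, which neurons cannot follow, combine into an envelope slow enough to drive neural activity.

```python
import numpy as np

# Two high-frequency carriers; the specific values are illustrative.
fs = 100_000                      # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)       # one second of signal
f1, f2 = 2000.0, 2010.0           # two slightly different frequencies, Hz

s1 = np.sin(2 * np.pi * f1 * t)
s2 = np.sin(2 * np.pi * f2 * t)
total = s1 + s2

# Trig identity: sin(a) + sin(b) = 2 * sin((a+b)/2) * cos((a-b)/2),
# so the sum is a carrier at (f1+f2)/2 modulated at |f2-f1| = 10 Hz.
envelope = np.abs(2 * np.cos(2 * np.pi * (f2 - f1) / 2 * t))

# Neurons cannot follow the 2 kHz carriers, but they can respond to the
# slow 10 Hz modulation, which only appears where the two fields overlap.
print("beat frequency:", f2 - f1, "Hz")
```

Because the beat only exists where both fields overlap, steering the two electrode pairs steers the region of low-frequency stimulation, which is what lets the technique reach deep targets without affecting the surface.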



Grover and his team are now extending Boyden's results, using hundreds of electrodes placed on the surface of the skull to precisely target small regions deep inside the brain and to steer the signal from one part of the brain to another without moving the electrodes. Grover says such an idea would hardly have occurred to neuroscientists.



Meanwhile, at the Johns Hopkins University Applied Physics Laboratory (APL), another N3 team is taking a completely different approach: near-infrared light.



According to current understanding, nerve tissue swells and contracts as neurons emit the electrical signals that scientists record with EEG, Utah arrays, and other technologies. Dave Blodgett of APL argues that this swelling and contraction may itself be a signal of no worse quality, and he wants to build an optical system that measures those changes.



Earlier technologies could not capture such tiny physical movements. But Blodgett and his team have already shown that they can register the neural activity of a mouse as it moves one of its whiskers. Ten milliseconds after the whisker moved, Blodgett recorded the activation of the corresponding neurons with his optical technique. And in exposed nerve tissue, his team recorded neuronal activity within 10 microseconds, as fast as a Utah array or other electrical methods.



The next problem to solve is recording through the bones of the skull. That sounds impossible: the skull is opaque to visible light. Near-infrared light, however, can pass through bone. Blodgett's team scans the skull with low-energy infrared lasers and measures how their light scatters, hoping to extract information about neural activity from it. This approach is less well established than electrical methods, but it is precisely for such risks that DARPA programs are designed.



At Battelle, Gaurav Sharma is developing a new type of nanoparticle capable of crossing the blood-brain barrier, a technology DARPA classifies as minimally invasive. Each nanoparticle has a magnetically sensitive core surrounded by a shell of piezoelectric material, which generates electricity under pressure. Placed in a magnetic field, the core presses on the shell, producing a small current. A magnetic field passes through the skull far more easily than light does, Sharma says. Different magnetic coils let scientists target particular parts of the brain, and the process can be run in reverse, converting electric currents into magnetic fields to read signals out.



It is not yet known which of these approaches will succeed, if any. Other N3 teams use various combinations of light, electricity, magnetism, and ultrasound to send signals into the brain and read them from outside. All of this is undeniably exciting. But for all the enthusiasm, it is worth remembering how poorly the Pentagon, and corporations such as Facebook (which is also developing BCIs), are handling the enormous number of ethical, legal, and social questions that non-invasive BCIs raise. How would swarms of drones controlled by human brains change the nature of war? Emondi, the head of N3, says BCIs will be used as necessity dictates. But military necessity is a loose concept.



In August, I visited the Battelle laboratory where Burkhart spent several hours working with a new sleeve fitted with 150 electrodes that stimulate the muscles of his arm. He and the researchers hoped to get the sleeve working without relying on brain signals from the Utah array.





Ian Burkhart and a researcher





Utah Array



With a spinal cord injury, it is very hard to think about moving an arm, and Burkhart was tired. “It's all graded: the more actively I think, the stronger the movement,” he told me. “Before, I didn't have to think 'open my hand.' I just picked the bottle up. But I'm more motivated by the results than anyone else here.” Watching him, it is easy to see this technology's potential.



He said that since beginning work with the Utah array he has become stronger and more dexterous, even during periods when he is not using it. He can now live almost independently, needing help for only a few hours a day. “I can gesture more with my hands. I can hold a phone,” he says. “If this project produces something that can be used daily, I will wear it as long as I can.”


