Press "Enter" to skip to content

DARPA developing bi-directional wireless brain-to-machine interface to control weapons and enable other applications, like making you see and feel things

DARPA wants to wirelessly connect human brains to machines, allowing soldiers to fully control weapon systems, view distant scenes like a Three-Eyed Raven, or feel remote sensations.

Imagine the potential applications, not only for military control of airplanes and weapons, but also for civilian uses, from controlling artificial vision systems to completely immersive virtual reality systems that will transport your consciousness to alternate realities.

It’s the stuff of science fiction, but scientists have already made some advances in the field. The first human neuroprosthetic devices appeared in the mid-1990s, allowing patients to crudely control artificial limbs, but they required surgical implantation. Experiments in the early 2010s showed that scientists could reconstruct people’s vision as digital video, though far from in real time. Most recently, scientists at Carnegie Mellon University have figured out how to connect two brains, using a machine to transfer information wirelessly.

Now DARPA wants to take this nascent technology to its inevitable conclusion. According to the agency, the N3 project (Next-Generation Non-Surgical Neurotechnology) aims to “create reliable neural interfaces without the need for surgery or implanted electrodes.”

As DARPA says in its presentation, the technology has to be “read and write,” meaning it will be bi-directional. It will not only let soldiers control a drone swarm (one of the actual examples cited by the Defense Advanced Research Projects Agency) but also feed sensory information into people’s brains, making them feel pressure or actually see things.

The latter scenario is actually something Rice University, one of the recipients of DARPA’s multi-million-dollar funding for N3, is working on: a system that would let a blind person, or anyone else connected to it, see what another person is seeing. From there, the next step would be to emulate that brain activity to reproduce images taken with a digital camera.
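
Neither DARPA nor Rice has published how such a reproduction step would work; as a purely illustrative toy sketch, one common way researchers describe the front end of a visual prosthesis is reducing a camera frame to a coarse grid of stimulation intensities. The grid size, value ranges, and function names below are invented for illustration only.

# Toy sketch (not Rice's or DARPA's actual system): map a camera frame onto a
# coarse grid of stimulation intensities, the kind of reduction a visual
# prosthesis might perform before driving a neural interface.
import numpy as np

def frame_to_stimulation_grid(frame: np.ndarray, grid_size: int = 8) -> np.ndarray:
    """Downsample a grayscale frame (H x W, values 0-255) into a
    grid_size x grid_size array of stimulation intensities in 0.0-1.0."""
    grid = np.empty((grid_size, grid_size))
    # Split the frame into grid_size x grid_size blocks and average each block.
    for i, row_block in enumerate(np.array_split(frame, grid_size, axis=0)):
        for j, block in enumerate(np.array_split(row_block, grid_size, axis=1)):
            grid[i, j] = block.mean() / 255.0
    return grid

if __name__ == "__main__":
    # Synthetic 480x640 frame standing in for a camera image.
    fake_frame = np.random.randint(0, 256, size=(480, 640)).astype(float)
    print(frame_to_stimulation_grid(fake_frame))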

In addition to Rice, the project will give millions of dollars to laboratories at Carnegie Mellon University, Johns Hopkins University, Palo Alto Research Center (where the graphical computing revolution began, among many other things), Teledyne Scientific, and the Battelle Memorial Institute.

DARPA envisions two ways to make this happen. One is completely non-invasive: something similar to a helmet, a headband, or another wearable apparatus that carries information into and out of the brain. The agency mentions ultrasound, light, RF, and magnetic fields as possible channels. This system will include algorithms to decode and encode the brain’s motor and cognitive signals, targeting specific areas of the brain.
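
DARPA has not published these decode and encode algorithms; the following is a minimal, hypothetical sketch of what a bi-directional “read and write” loop of this kind could look like, with channel counts, weights, and signal names invented purely for illustration.

# Hypothetical sketch of a bi-directional "read and write" loop: decode a motor
# command from a recorded neural feature vector, and encode a sensory value back
# into a stimulation pattern. All numbers and names here are made up for
# illustration; this is not DARPA's or any grantee's actual algorithm.
import numpy as np

MOTOR_COMMANDS = ["hover", "advance", "retreat"]

def decode_motor_intent(features: np.ndarray) -> str:
    """Map a vector of per-channel band-power features to a discrete command
    using a toy linear classifier (weights are arbitrary stand-ins)."""
    weights = np.array([[0.2, -0.1, 0.4],
                        [0.5,  0.3, -0.2],
                        [-0.3, 0.6, 0.1]])  # 3 commands x 3 channels
    scores = weights @ features
    return MOTOR_COMMANDS[int(np.argmax(scores))]

def encode_sensory_feedback(pressure: float, n_sites: int = 4) -> np.ndarray:
    """Turn a normalized pressure reading (0.0-1.0) into per-site stimulation
    amplitudes, spreading intensity across stimulation sites."""
    return np.clip(pressure, 0.0, 1.0) * np.linspace(0.5, 1.0, n_sites)

if __name__ == "__main__":
    recorded = np.array([0.8, 0.1, 0.3])   # stand-in for decoded band powers
    print(decode_motor_intent(recorded))   # the "read" direction
    print(encode_sensory_feedback(0.6))    # the "write" direction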

CONTINUE @ TOMS GUIDE