This helicopter is being controlled by a human brain

What’s cooler than a quadrotor? How about one that can be controlled by human thought alone?

On the face of it, this achievement might not sound particularly special. It seems that every day we hear about a new EEG advance in which some kind of gadget is controlled by brainwaves. And in fact, it was just yesterday that we reported on the world’s first mind-controlled exoskeleton.

But this accomplishment is unique for a couple of reasons.

First, there’s the order of complexity to consider. This quadcopter has to be navigated in three dimensions. To make it work, engineers at the University of Minnesota College of Science and Engineering fitted participants with a cap containing 64 EEG electrodes. Participants were then asked to imagine using their right hand, left hand, or both hands together to instruct the quadrotor to turn right, turn left, or lift; relaxing and imagining neither hand let it descend. The flying device was given a pre-set forward velocity to keep navigation manageable.

The users sat in front of a screen that showed the quadrotor’s flight in real time, as captured by its onboard camera. The brain signals recorded by the cap were decoded into control commands and transmitted to the copter over WiFi.
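The control scheme described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the study’s actual implementation: the class labels, command values, and the rest-state mapping are assumptions layered on top of the article’s description (imagined hand movements steer and lift, with a fixed forward speed).

```python
# Hypothetical labels a motor-imagery classifier might emit.
# These names are assumptions for illustration, not from the study.
LEFT_HAND = "left_hand"
RIGHT_HAND = "right_hand"
BOTH_HANDS = "both_hands"
REST = "rest"

# Pre-set forward speed (arbitrary units), per the article's description
# of a fixed forward velocity; the actual value is an assumption.
FORWARD_VELOCITY = 0.5

def imagery_to_command(imagery_class):
    """Map a decoded motor-imagery class to (forward, yaw, vertical) velocities."""
    yaw, vertical = 0.0, 0.0
    if imagery_class == LEFT_HAND:
        yaw = -1.0          # imagined left hand: turn left
    elif imagery_class == RIGHT_HAND:
        yaw = 1.0           # imagined right hand: turn right
    elif imagery_class == BOTH_HANDS:
        vertical = 1.0      # both hands imagined together: lift
    elif imagery_class == REST:
        vertical = -1.0     # relaxing (assumed mapping): descend
    return (FORWARD_VELOCITY, yaw, vertical)
```

In a real pipeline, the classifier output would be produced continuously from the 64-channel EEG stream and the resulting velocity command sent to the copter over WiFi on each update.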

Incredibly, the copter can be seen zipping around the room as it flies through a series of rings. It’s wild to think that it’s being navigated by an external human mind.

Second, the achievement offers yet another example of the potential for remote presence. Thought-controlled interfaces will allow people to move not only objects on a computer screen, or devices attached to their own bodies, but also external devices with capacities that significantly exceed our own. In this case, a flying toy. In the future, we can expect to see remote presence technologies applied to even more powerful robotic devices, further blurring the boundary that separates our bodies from the environment.

The research team, led by Bin He, says the technology could also be used to assist, augment, or repair human cognitive or sensorimotor functions, including hearing, sight, and motor control. Looking ahead, the researchers hope to apply the same approach to controlling robotic arms.

Check out the entire study at IOP Science, “Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface.”

Image: University of Minnesota College of Science and Engineering