Meta Wristband Interface: AI Translates Brain Signals

Imagine being able to control machines with your mind instead of typing on a keyboard or clicking a mouse. Now Facebook’s parent company Meta is aiming for the next best thing—a new wristband that can, with the help of AI, infer electrical commands sent from the brain to muscles and convert them into computer signals, all in a noninvasive way. Although experts doubt it will replace keyboards and mice for traditional computing, it could find uses in a wide range of applications, such as wearable interfaces for mobile devices or thought-controlled assistive technologies for people with disabilities.

The bracelet from Reality Labs at Meta uses metal contacts placed against the skin to detect electrical signals from muscles—a technique known as surface electromyography (sEMG)—which are generated in response to commands from the brain. The highly sensitive new system transmits this data over Bluetooth to a computer, which recognizes gestures such as pointing and pinching in real time, according to findings detailed in Nature on 23 July.
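
The article describes the hardware and its behavior but not Meta’s software, so the following is only a minimal sketch of how a real-time sEMG gesture pipeline typically works: sample the electrode channels, slice the stream into short windows, extract amplitude features, and classify each window as a gesture. The channel count, window length, gesture labels, and placeholder classifier below are illustrative assumptions, not Meta’s implementation.

```python
# Hypothetical sketch of a real-time sEMG gesture pipeline (not Meta's code).
# Assumptions: 16 electrode channels, 200 ms windows at 2 kHz, a small
# gesture vocabulary, and a placeholder classifier standing in for a
# trained deep-learning model.
import numpy as np

GESTURES = ["rest", "pinch", "point", "swipe"]   # illustrative label set
N_CHANNELS = 16                                  # assumed electrode count
WINDOW_SAMPLES = 400                             # 200 ms at 2 kHz (assumed)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel amplitude features commonly used with sEMG:
    mean absolute value and root-mean-square amplitude."""
    mav = np.mean(np.abs(window), axis=1)
    rms = np.sqrt(np.mean(window ** 2, axis=1))
    return np.concatenate([mav, rms])

class PlaceholderModel:
    """Stand-in for a trained gesture classifier."""
    def __init__(self, n_features: int, n_classes: int):
        rng = np.random.default_rng(0)
        self.weights = rng.normal(size=(n_features, n_classes))

    def predict(self, features: np.ndarray) -> int:
        # Linear scoring followed by argmax over gesture classes.
        return int(np.argmax(features @ self.weights))

def recognize(window: np.ndarray, model: PlaceholderModel) -> str:
    """Map one window of raw sEMG (channels x samples) to a gesture label."""
    return GESTURES[model.predict(extract_features(window))]

if __name__ == "__main__":
    model = PlaceholderModel(n_features=2 * N_CHANNELS, n_classes=len(GESTURES))
    fake_window = np.random.default_rng(1).normal(size=(N_CHANNELS, WINDOW_SAMPLES))
    print(recognize(fake_window, model))
```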

The bracelet is not a direct interface with the brain. “It is not a mind-reading system. It cannot make you act in a different way as imposed by your will, it does not connect you to other people neurally, it does not predict your intentions,” says Dario Farina, chair in neurorehabilitation engineering at Imperial College, London, who did not take part in Meta’s research but has tested the technology. (Meta was unable to make anyone available for comment as of press time.)

How AI Enables Meta’s Wristband

Previous “neuromotor” devices, such as the discontinued Myo armband, also sought to use sEMG for computer interfaces. A key challenge these earlier devices faced was that they required time-consuming, personalized calibration for each user to account for differences in human anatomy and behavior.

See how the device detects thumb swipes, finger taps, and handwriting gestures. Reality Labs at Meta

In contrast, Meta says its bracelet can essentially work off the shelf. The key was training deep learning artificial intelligence systems on data from more than 6,000 paid volunteers who wore the device. This generated models that could accurately interpret user input across different people without requiring individual calibration, says Joe Paradiso, head of the Responsive Environments Research Group at the MIT Media Lab, who did not participate in this study.
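
To make the calibration-free claim concrete, here is a toy sketch of the evaluation idea behind it: train a model on one group of users and score it on users it has never seen, so nothing is tuned per individual. The synthetic data, the nearest-class-mean classifier, and the 15/5 user split below are stand-ins chosen for brevity, not Meta’s dataset or architecture.

```python
# Toy illustration of cross-user generalization: no test user contributes
# any training or calibration data. Everything here (data, split, model)
# is synthetic and chosen for brevity; it is not Meta's dataset or model.
import numpy as np

rng = np.random.default_rng(0)
N_USERS, SAMPLES_PER_USER, N_FEATURES, N_CLASSES = 20, 50, 8, 3

# Synthetic sEMG-like features: each user gets a small anatomical "offset".
class_means = rng.normal(size=(N_CLASSES, N_FEATURES))
X, y, user_ids = [], [], []
for user in range(N_USERS):
    offset = rng.normal(scale=0.3, size=N_FEATURES)   # user-specific variation
    for _ in range(SAMPLES_PER_USER):
        label = int(rng.integers(N_CLASSES))
        X.append(class_means[label] + offset + rng.normal(scale=0.5, size=N_FEATURES))
        y.append(label)
        user_ids.append(user)
X, y, user_ids = np.array(X), np.array(y), np.array(user_ids)

# Leave-users-out split: train on users 0-14, evaluate on users 15-19.
train_mask = user_ids < 15
test_mask = ~train_mask

# Nearest-class-mean classifier fit only on the training users.
means = np.stack([X[train_mask & (y == c)].mean(axis=0) for c in range(N_CLASSES)])
dists = np.linalg.norm(X[test_mask, None, :] - means[None, :, :], axis=2)
pred = dists.argmin(axis=1)
print("accuracy on never-seen users:", round(float((pred == y[test_mask]).mean()), 3))
```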

“The amount of information decoded with this device is very large, far larger than any previous attempt,” Farina says. “The device can recognize handwriting, for example, which would have not been conceivable before.”

The wristband uses surface electromyography to noninvasively measure electrical activity of the muscles that control hand gestures. Reality Labs at Meta

A possible concern with using this wristband is that users might not want every hand motion interpreted as…

The post “Meta Wristband Interface: AI Translates Brain Signals” by Charles Q. Choi was published on 08/13/2025 by spectrum.ieee.org