Farm Robots Navigate With SonicBoom’s Sound-Based Sensing

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Agricultural robots could help farmers harvest food under tough environmental conditions, especially as temperatures continue to rise. However, building affordable robotic arms that can gracefully and accurately navigate the dense tangle of plant branches and trunks is challenging.

In a recent study, researchers developed a sensing system, called SonicBoom, that allows autonomous robots to use sound to sense the objects they touch. The approach, which can localize, or “feel,” the objects a robot encounters with centimeter-level precision, is described in a study published 2 June in IEEE Robotics and Automation Letters.

Moonyoung (Mark) Lee is a fifth-year Ph.D. student at Carnegie Mellon University’s Robotics Institute who was involved in developing SonicBoom. He notes that many autonomous robots currently rely on collections of tiny camera-based tactile sensors, which place miniature cameras under a protective gel layer lining the surface and visually estimate the gel’s deformation to gain tactile information. However, this approach isn’t ideal in agricultural settings, where branches are likely to occlude the visual sensors. What’s more, camera-based sensors can be expensive and could be easily damaged in this context.

Another option is pressure sensors, Lee notes, but these would need to cover much of the surface area of the robot in order to effectively sense when it comes into contact with branches. “Imagine covering the entire robot arm surface with that kind of [sensor]. It would be expensive,” he says.

Instead, Lee and his colleagues are proposing a completely different approach that relies on sound for sensing. The system involves an array of contact microphones, which detect physical touch as sound signals that propagate through solid materials.

How Does SonicBoom Work?

When a robotic arm touches a branch, the resulting sound waves travel down the robotic arm until they encounter the array of contact microphones. Tiny differences in sound wave properties (such as signal intensity and phase) across the array of microphones are used to localize where the sound originated, and thus the point of contact.
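
The article doesn’t detail the signal processing, but the textbook way to turn arrival-time differences between two sensors into a location is cross-correlation. The sketch below illustrates that classic time-difference-of-arrival idea, not SonicBoom’s actual pipeline (which relies on a learned model, as described later); the sampling rate, microphone spacing, and wave speed are all assumed inputs.

```python
import numpy as np

def estimate_contact_offset(sig_a, sig_b, fs, mic_spacing_m, wave_speed_mps):
    """Locate a tap between two contact microphones via cross-correlation.

    sig_a, sig_b: equal-length recordings from microphones A and B.
    fs: sampling rate in Hz. Returns the contact offset in meters from
    the midpoint between the mics, positive toward microphone B.
    """
    # The peak of the cross-correlation gives the delay of A relative to B.
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)   # delay in samples
    tdoa = lag / fs                            # seconds; > 0 means B heard it first

    # If the contact sits x meters toward B, the wave travels 2x meters
    # farther to reach A, so x = wave_speed * tdoa / 2.
    offset = 0.5 * wave_speed_mps * tdoa
    return float(np.clip(offset, -mic_spacing_m / 2, mic_spacing_m / 2))
```

In practice, sound propagating through a metal arm is dispersive and reflects off joints and mounts, which is one likely reason a learned model outperforms a purely analytic estimate like this one.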

In this video, see SonicBoom in action during laboratory testing.

Lee notes that this approach allows microphones to be embedded deeper in the robotic arm. This means they are less prone to damage compared to traditional visual sensors on the exterior of a robotic arm. “The contact microphones can be easily protected from very harsh, abrasive contacts,” he explains.

In addition, the approach requires only a small handful of microphones dispersed along the robotic arm, rather than many visual or pressure sensors densely coating its surface.

To help SonicBoom better localize points of contact, the researchers used an AI model, trained on data collected by tapping the robotic…
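
The general recipe for such a model is supervised regression: record taps at known positions, extract features from the microphone signals, and fit a mapping from features to contact location. Below is a minimal, hypothetical sketch of that recipe using log-magnitude spectra and ridge regression; the actual SonicBoom model, its features, and its architecture are not specified here, so every name and design choice in this code is an assumption.

```python
import numpy as np

def spectral_features(clips):
    """Turn raw tap recordings into feature vectors.

    clips: array of shape (n_taps, n_mics, n_samples), one audio clip
    per microphone for each recorded tap.
    Returns log-magnitude FFT features, concatenated across microphones.
    """
    spectra = np.abs(np.fft.rfft(clips, axis=-1))      # (taps, mics, bins)
    return np.log1p(spectra).reshape(len(clips), -1)   # one row per tap

def fit_localizer(features, positions, reg=1e-3):
    """Fit ridge regression from audio features to measured tap positions.

    positions: (n_taps, 2) hand-measured contact coordinates that serve
    as training labels.
    """
    X = np.hstack([features, np.ones((len(features), 1))])  # add bias column
    A = X.T @ X + reg * np.eye(X.shape[1])                  # regularized normal eqns
    return np.linalg.solve(A, X.T @ positions)              # weight matrix

def predict_contact(features, weights):
    """Estimate (x, y) contact points for new recordings."""
    X = np.hstack([features, np.ones((len(features), 1))])
    return X @ weights
```

A deep network trained on the same tap data would follow the identical pattern, just swapping the linear model for something more expressive.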

The post “Farm Robots Navigate With SonicBoom’s Sound-Based Sensing” by Michelle Hampson was published on 06/28/2025 by spectrum.ieee.org