LIDAR gives laser vision to robotic bees

Oct. 24, 2015
SUNY Buffalo leads a RoboBee initiative project funded by a $1.1 million NSF grant to create robotic bees that can see.

The State University of New York (SUNY) University at Buffalo (UB) is leading a research project funded by a $1.1 million National Science Foundation grant that includes researchers from Harvard University and the University of Florida. It is an offshoot of the RoboBee initiative, led by Harvard and Northeastern University, that aims to create insect-inspired robots that someday may be used in agriculture and disaster relief.


Equipped with tiny laser-powered sensors that act as eyes, the miniature robotic bees and other insect-like robots can sense the size, shape, and distance of approaching objects.

"Essentially, it's the same technology that automakers are using to ensure that driverless cars don't crash into things," says UB computer scientist Karthik Dantu. "Only we need to shrink that technology so it works on robot bees that are no bigger than a penny."

Researchers have shown that robot bees are capable of tethered flight and of moving while submerged in water. One of their limitations, however, is a lack of depth perception: a robot bee cannot sense what is in front of it.

This is problematic if you want the bee to avoid flying into a wall or have it land in a flower, says Dantu, who worked on the RoboBee project as a postdoctoral researcher at Harvard before joining UB’s School of Engineering and Applied Sciences in 2013 as an assistant professor.

The UB-led research team will address the limitation by outfitting the robot bee with remote sensing technology called LIDAR, the same laser-based sensor system that is making driverless cars possible.

The sensors measure the time it takes for emitted laser light to return from an object; computer algorithms then analyze this information to form a coherent image of the car's path. This enables the car to "see" its environment and follow traffic signs, avoid obstacles, and make other adjustments. These systems, which are typically mounted on the car roof, are about the size of a traditional camping lantern. The team Dantu leads wants to make them much smaller, a version called "micro-LIDAR."
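The time-of-flight principle behind this ranging can be illustrated with a short sketch. Since the laser pulse travels to the object and back, the distance is half the round-trip time multiplied by the speed of light. The function name and example timing below are illustrative, not from the research described here:

```python
# Minimal sketch of LIDAR time-of-flight ranging (illustrative, not the
# team's actual algorithms): distance = (speed of light x round-trip time) / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting object, given the pulse's round-trip time in seconds."""
    # Divide by 2 because the pulse covers the range twice: out and back.
    return C * round_trip_s / 2.0

# A pulse that returns after about 66.7 nanoseconds corresponds to roughly 10 m.
print(tof_distance_m(66.7e-9))
```

Micro-LIDAR faces the same arithmetic at much shorter ranges: centimeter-scale distances imply sub-nanosecond round trips, which is part of why shrinking the sensor is hard.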

University of Florida researchers will develop the tiny sensor that measures the light's reflection, while Dantu will create novel perception and navigation algorithms that enable the bee to process and map the world around it. Harvard researchers will then incorporate the technology into the bees.

The technology the team develops likely won’t be limited to robot insects. The sensors could be used, among other things, in wearable technology; endoscopic tools; and smartphones, tablets and other mobile devices.

SOURCE: SUNY University at Buffalo; http://www.buffalo.edu/news/releases/2015/10/042.html

About the Author

Gail Overton | Senior Editor (2004-2020)

Gail has more than 30 years of engineering, marketing, product management, and editorial experience in the photonics and optical communications industry. Before joining the staff at Laser Focus World in 2004, she held many product management and product marketing roles in the fiber-optics industry, most notably at Hughes (El Segundo, CA), GTE Labs (Waltham, MA), Corning (Corning, NY), Photon Kinetics (Beaverton, OR), and Newport Corporation (Irvine, CA). During her marketing career, Gail published articles in WDM Solutions and Sensors magazine and traveled internationally to conduct product and sales training. Gail received her BS degree in physics, with an emphasis in optics, from San Diego State University in San Diego, CA in May 1986.
