Is Vision the Future for Mobile Robotic Navigation?

2Q 2021 | IN-6124


Vision as the Future of Perception for Mobile Robotics

NEWS


In ABI Research’s recent competitive assessment of autonomous material handling robots, the most successful Autonomous Mobile Robot (AMR) vendors were found to typically pair SICK AG LiDAR sensors with a 3D camera. Sensor fusion is thus an increasingly common reality for navigation and perception in the mobile robotics space. That said, a small number of companies offer mapping, positioning, and perception (in effect, the full SLAM capability) purely through cameras. These include Seegrid, 634 AI, Sevensense, and Gideon Brothers, although others are entering the market.

The Vision vs. LiDAR Debate

IMPACT


The debate over visual SLAM (vSLAM) versus LiDAR-enabled SLAM as a way to help robots navigate has been waged since iRobot challenged Neato’s LiDAR-equipped robots with its own ceiling-based visual navigation. Just as in the consumer space, LiDAR continues to play a critical role in localization, mapping, and navigation for autonomous mobile robots. The reasons…
