The Role of LiDAR Scanner and Time of Flight (ToF) Cameras in Mobile AR Experiences
Simultaneous Localization and Mapping (SLAM) is widely recognized as one of the most essential technologies for enabling Augmented Reality (AR) experiences. SLAM lets a device orient itself and map an unknown environment without any external location reference or tracking technology (such as GPS). As a result, devices can understand a user’s environment contextually, and sometimes semantically (identifying specific objects or scenes), and overlay digital content precisely. To date, SLAM in AR has relied on the device’s camera combined with Inertial Measurement Unit (IMU) data (accelerometer, gyroscope), an approach known as Visual-Inertial Odometry (VIO). The main weakness of visual SLAM, however, is imprecise distance measurement and depth perception: virtual content can appear to float unrealistically in space, and applications that demand accuracy suffer as a result.
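To make the VIO idea concrete, the following is a minimal, purely illustrative sketch (not the internals of ARKit, ARCore, or any real VIO pipeline) of the fusion principle in one dimension: high-rate IMU acceleration readings are integrated into a pose estimate, and lower-rate position fixes from the camera pipeline periodically pull that estimate back, limiting the drift that integration of a biased accelerometer accumulates. All numbers, names, and the complementary-filter gain are hypothetical.

```python
def fuse_vio(accels, visual_fixes, dt=0.01, alpha=0.8):
    """Toy 1-D visual-inertial fusion.

    accels: per-step accelerometer readings (possibly biased), in m/s^2.
    visual_fixes: dict mapping step index -> position from the camera
    pipeline (hypothetical; a real system estimates full 6-DoF pose).
    """
    pos, vel = 0.0, 0.0
    trajectory = []
    for k, a in enumerate(accels):
        vel += a * dt                 # integrate acceleration -> velocity
        pos += vel * dt               # integrate velocity -> position
        if k in visual_fixes:         # visual update: pull the estimate
            pos += alpha * (visual_fixes[k] - pos)  # toward the camera fix
        trajectory.append(pos)
    return trajectory

# Ground truth: constant 1.0 m/s^2 acceleration for 2 s, so x = 0.5 * t^2.
steps, dt = 200, 0.01
true_pos = [0.5 * 1.0 * (k * dt) ** 2 for k in range(1, steps + 1)]

# The IMU reads a constant 0.5 m/s^2 bias on top of the true acceleration.
biased_accels = [1.5] * steps

# "Camera" fixes arrive at a much lower rate (every 20th IMU sample).
fixes = {k: true_pos[k] for k in range(0, steps, 20)}

fused = fuse_vio(biased_accels, fixes)
imu_only = fuse_vio(biased_accels, {})

# Fusing the visual fixes keeps the final position error well below
# what IMU integration alone accumulates from the bias.
print(abs(fused[-1] - true_pos[-1]) < abs(imu_only[-1] - true_pos[-1]))
```

The sketch shows why camera data alone still falls short, as discussed above: the visual fixes correct drift but say nothing about absolute scale or per-pixel depth, which is where dedicated depth sensors come in.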
Another sensor used for SLAM, familiar from autonomous cars and robots that r…