SLAM
SLAM (Simultaneous Localization and Mapping) is a technique used in robotics and autonomous systems to build a map of an unknown environment while simultaneously tracking the device's position within it. It is essential where GNSS signals are unavailable or unreliable, such as indoors or underground, and relies on data from sensors like cameras, LiDAR, radar, and IMUs.
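To make the two halves of the problem concrete, here is a minimal, purely illustrative sketch of the localization and mapping steps: dead-reckoning a 2D pose from odometry, then placing an observed landmark into the map frame. All function names are hypothetical; a real SLAM system jointly estimates poses and landmarks (e.g. with an EKF or a pose-graph optimizer) rather than treating them independently.

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Propagate a 2D pose (x, y, theta) with a unicycle motion model."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

def observe_landmark(pose, rng, bearing):
    """Convert a range/bearing measurement into a map-frame landmark."""
    x, y, theta = pose
    lx = x + rng * math.cos(theta + bearing)
    ly = y + rng * math.sin(theta + bearing)
    return (lx, ly)

pose = (0.0, 0.0, 0.0)
landmarks = []
for _ in range(10):                          # drive straight for 1 s at 1 m/s
    pose = integrate_odometry(pose, v=1.0, omega=0.0, dt=0.1)
landmarks.append(observe_landmark(pose, rng=2.0, bearing=0.0))
print(pose)        # ≈ (1.0, 0.0, 0.0)
print(landmarks)   # landmark ≈ (3.0, 0.0)
```

Because pure dead reckoning accumulates error without bound, real systems close the loop: re-observing a known landmark is used to correct the pose estimate, which is the "simultaneous" part of SLAM.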
Visual Inertial Odometry
Cameras are small, lightweight sensors that capture rich visual information about the environment. Visual Inertial Odometry (VIO) combines visual data from cameras with high-rate inertial data from an IMU to estimate the position and orientation (pose) of a device: the IMU bridges the gaps between camera frames, while the visual measurements bound the drift that pure inertial integration would otherwise accumulate.
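The complementary roles of the two sensors can be sketched as below: a high-rate IMU integration loop (here simplified to one dimension) periodically corrected by a lower-rate camera position estimate through a simple blend gain. This is an assumption-laden toy, not a real VIO algorithm; production systems use an EKF/MSCKF or a sliding-window optimizer, and all names here are hypothetical.

```python
def propagate_imu(state, accel, dt):
    """Integrate acceleration (1-D for brevity) into velocity and position."""
    pos, vel = state
    vel += accel * dt
    pos += vel * dt
    return (pos, vel)

def fuse_visual(state, visual_pos, gain=0.5):
    """Blend the IMU-predicted position with a camera position estimate."""
    pos, vel = state
    pos = (1 - gain) * pos + gain * visual_pos
    return (pos, vel)

state = (0.0, 0.0)
for step in range(100):                  # 100 Hz IMU for 1 s, accel = 1 m/s^2
    state = propagate_imu(state, accel=1.0, dt=0.01)
    if (step + 1) % 10 == 0:             # 10 Hz camera correction
        # Simulated "visual" fix equal to the true position 0.5*a*t^2
        true_pos = 0.5 * 1.0 * ((step + 1) * 0.01) ** 2
        state = fuse_visual(state, true_pos)

print(state[0])   # close to the true 0.5 m
```

The gain plays the role a Kalman filter would fill in practice: it weights the camera measurement against the IMU prediction instead of trusting either alone.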
Radar Inertial Odometry
Radar sensors can detect objects and measure distances with high precision, even in challenging conditions where cameras and LiDARs might fail. They are unaffected by changes in illumination, robust against repetitive structures, and can operate effectively through smoke or fog thanks to their long wavelengths. Radar Inertial Odometry (RIO) integrates radar data with inertial data from an IMU to estimate the position and orientation of a device, ensuring reliable operation even when cameras and LiDARs are compromised.
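A common building block in RIO is estimating the sensor's ego-velocity from the per-target Doppler (radial speed) measurements that automotive radars report. The sketch below solves the 2D case by least squares, assuming all targets are static; the function name and setup are hypothetical, and real pipelines add outlier rejection (e.g. RANSAC) for moving targets before fusing the velocity with IMU data in a filter.

```python
import math

def ego_velocity_from_doppler(detections):
    """detections: list of (azimuth_rad, radial_speed_m_s) for static targets.
    For a static target, radial_speed = -(vx*cos(az) + vy*sin(az)).
    Solve the 2x2 normal equations for (vx, vy)."""
    sxx = sxy = syy = bx = by = 0.0
    for az, vr in detections:
        cx, cy = math.cos(az), math.sin(az)
        sxx += cx * cx; sxy += cx * cy; syy += cy * cy
        bx += -vr * cx; by += -vr * cy
    det = sxx * syy - sxy * sxy
    vx = (syy * bx - sxy * by) / det
    vy = (sxx * by - sxy * bx) / det
    return vx, vy

# Simulated static targets seen while the sensor moves at vx=2 m/s, vy=0:
truth = (2.0, 0.0)
dets = []
for az in (-0.5, 0.0, 0.4, 1.0):
    vr = -(truth[0] * math.cos(az) + truth[1] * math.sin(az))
    dets.append((az, vr))
vx, vy = ego_velocity_from_doppler(dets)
print(vx, vy)   # ≈ 2.0, 0.0
```

Because this velocity comes directly from a single radar scan, it stays available in smoke, fog, or darkness, which is exactly the degraded-visibility case where RIO is preferred over camera- or LiDAR-based odometry.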