Calculate position to within 1 inch at 65 fps. SLAM for Monocular + IMU / Stereo + IMU / Fisheye + Stereo + IMU.
This SLAM SDK allows developers to fuse data from multiple sensor inputs to precisely calculate position.
- Simultaneous Localization and Mapping (SLAM) is a computer vision technology that allows an entity to track its location while simultaneously mapping its surroundings
- Cross-platform development: iOS, Android, Linux
- Dynamic Sensor Input (Sensor Fusion): cameras (mono or stereo), lasers, sonar, IMU (accelerometer, gyroscope), etc.; a minimal fusion sketch follows the benchmarks below.
- Minimal hardware platform*:
  Raspberry Pi 3 with the standard Raspberry Pi camera sensor for real-time SLAM processing.
- Implemented functionality (a hypothetical usage sketch follows this list):
  SLAM Core, Tracking, Global Pose Optimization, Deep IMU Fusion, Recovery, Mapping, Surface Reconstruction, etc.
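
The module list above suggests a feed-and-query style pipeline: push IMU samples and camera frames in, read fused poses out. The sketch below shows roughly how such a visual-inertial tracker is typically driven; every `slam::` type, method name, and the two `read_*` helpers are hypothetical placeholders, not the actual SDK interface.

```cpp
// Hypothetical usage sketch: all slam:: names and the two read_* helpers are
// illustrative placeholders, not the actual SDK API.
#include <cstdio>

#include "slam_sdk.h"  // assumed SDK header (placeholder)

int main() {
    slam::Config cfg;
    cfg.camera  = slam::Camera::Mono;  // Mono / Stereo / Fisheye+Stereo
    cfg.use_imu = true;                // fuse accelerometer + gyroscope data

    slam::Tracker tracker(cfg);        // SLAM Core + Tracking + Mapping

    while (true) {
        // High-rate IMU samples first, so tracking can use them for prediction.
        for (const slam::ImuSample& s : read_imu_batch())
            tracker.feedImu(s);

        // Then the next camera frame; pose optimization and recovery run inside.
        tracker.feedFrame(read_camera_frame());

        // Query the fused 6-DoF pose (position + orientation).
        slam::Pose pose = tracker.currentPose();
        std::printf("x=%.3f  y=%.3f  z=%.3f\n", pose.x, pose.y, pose.z);
    }
}
```

Feeding IMU samples before each frame is one common way visual-inertial trackers predict camera motion between frames and stay stable at high frame rates.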
- Benchmarks:
  Raspberry Pi 3: Visual SLAM + Sensor Fusion – 47+ fps (depends on camera limitations);
  Odroid-C2: Visual SLAM + Sensor Fusion – 65+ fps (depends on camera limitations);
  iPhone 7s: Visual SLAM + Sensor Fusion – 120 fps.
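
For context on what the sensor fusion part of those numbers involves, below is a deliberately simplified sketch of the textbook complementary-filter idea behind visual-inertial fusion: integrate the gyroscope at a high rate for smooth short-term motion, and let each slower camera-derived estimate correct the accumulated drift. This is a generic teaching example, not the SDK's Deep IMU Fusion code; the 0.98 blend factor and the sample rates are assumptions.

```cpp
// Generic complementary-filter example (yaw only), a standard textbook technique;
// NOT the SDK's "Deep IMU Fusion" implementation.
#include <cstdio>

struct FusedYaw {
    double yaw   = 0.0;   // fused yaw estimate, radians
    double alpha = 0.98;  // assumed blend factor: trust gyro short-term, vision long-term

    // Called at IMU rate (e.g. 200 Hz): integrate the gyroscope rate.
    void onGyro(double yaw_rate, double dt) { yaw += yaw_rate * dt; }

    // Called at camera rate (e.g. ~50 Hz): pull the estimate toward the
    // vision-derived yaw to cancel accumulated gyro drift.
    void onVisionYaw(double yaw_from_vision) {
        yaw = alpha * yaw + (1.0 - alpha) * yaw_from_vision;
    }
};

int main() {
    FusedYaw fused;
    // Toy data: constant 0.1 rad/s rotation, 200 Hz gyro, vision every 4th sample.
    for (int i = 0; i < 200; ++i) {
        fused.onGyro(0.1, 0.005);
        if (i % 4 == 0) fused.onVisionYaw(0.1 * i * 0.005);
    }
    std::printf("fused yaw after 1 s: %.3f rad (true value 0.100 rad)\n", fused.yaw);
}
```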
Demos (Visual SLAM, June 2016):