Calculate position to within 1 inch using just a monocular camera, or a variety of other sensors
This SLAM SDK allows developers to fuse data from multiple sensor inputs to precisely calculate position.
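As a rough illustration of what multi-sensor fusion buys, the following minimal, self-contained C++ sketch shows a complementary filter blending high-rate IMU dead reckoning with low-rate absolute camera fixes. All names, rates, and constants here are hypothetical stand-ins, not SDK code; the SDK's actual fusion (see "Deep IMU Fusion" below) is more sophisticated.

```cpp
// Illustration only: a complementary filter is one simple way to fuse a
// high-rate IMU-derived velocity with low-rate absolute position fixes
// (e.g., from a camera). Everything here is hypothetical, not SDK API.
#include <cstdio>

struct Vec2 { double x, y; };

int main() {
    Vec2 pos{0.0, 0.0};          // fused position estimate (meters)
    const double dt = 0.01;      // 100 Hz IMU rate (assumed)
    const double alpha = 0.98;   // weight on IMU dead reckoning vs. camera fix

    for (int step = 0; step < 500; ++step) {
        // 1. Predict: integrate the (noisy, drifting) IMU velocity at high rate.
        Vec2 vel{1.0, 0.5};      // stand-in for an IMU-derived velocity
        pos.x += vel.x * dt;
        pos.y += vel.y * dt;

        // 2. Correct: every 10th step (10 Hz), blend in a camera position fix,
        //    pulling the drifting estimate toward the absolute measurement.
        if (step % 10 == 9) {
            Vec2 cam{(step + 1) * dt * 1.0, (step + 1) * dt * 0.5}; // stand-in fix
            pos.x = alpha * pos.x + (1.0 - alpha) * cam.x;
            pos.y = alpha * pos.y + (1.0 - alpha) * cam.y;
        }
    }
    std::printf("fused position: (%.3f, %.3f) m\n", pos.x, pos.y);
    return 0;
}
```

The design point the sketch makes: the IMU supplies smooth, high-frequency motion that drifts over time, while the camera supplies infrequent but absolute corrections; fusion gets the best of both.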
- Simultaneous Localization and Mapping (SLAM) is a computer vision technology
- SLAM allows a device to track its own location while simultaneously mapping its surroundings (see the toy sketch after this list)
- Cross-platform development: iOS, Android, PC, Linux
- Dynamic Sensor Input: cameras, lasers, sonar, IMUs, and more
- Minimal hardware platform:
Raspberry Pi 3 with the standard Raspberry Pi camera module for real-time SLAM processing.
- Implemented functionality:
SLAM Core, Global Pose Optimization, Deep IMU Fusion, Recovery.
- Performance (Raspberry Pi 3):
Visual SLAM – 60 fps (subject to camera limitations)
Visual SLAM with Sensor Fusion – 60 fps
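To make the "track while mapping" idea above concrete, here is a toy, self-contained C++ sketch: each simulated frame dead-reckons the pose from odometry, then either adds a newly seen landmark to the map or uses a re-observed landmark to correct accumulated drift. The data and the one-shot correction are illustrative only; production SLAM matches image features and runs nonlinear optimization, such as the Global Pose Optimization listed above.

```cpp
// A toy sketch of the SLAM idea: each sensor frame both updates the pose
// estimate (localization) and grows the map (mapping). Illustrative only,
// not SDK code.
#include <cstdio>
#include <unordered_map>

struct Pose { double x, y; };

int main() {
    Pose pose{0.0, 0.0};
    std::unordered_map<int, Pose> map;  // landmark id -> estimated world position

    // Simulated frames: an odometry step plus one observed landmark
    // (id, offset relative to the device). All values are made up.
    struct Frame { double dx, dy; int id; double ox, oy; };
    const Frame frames[] = {
        {1.0, 0.0, 1, 2.0,  1.0},
        {1.0, 0.0, 2, 2.0, -1.0},
        {1.0, 0.0, 1, 0.0,  1.0},   // landmark 1 seen again
    };

    for (const Frame& f : frames) {
        // Localization: dead-reckon with odometry.
        pose.x += f.dx;
        pose.y += f.dy;

        auto it = map.find(f.id);
        if (it == map.end()) {
            // Mapping: first sighting, so place the landmark in world coordinates
            // relative to the current pose estimate.
            map[f.id] = Pose{pose.x + f.ox, pose.y + f.oy};
        } else {
            // Loop closure (crudely): a re-observed landmark pins the pose,
            // snapping away the drift accumulated by dead reckoning.
            pose.x = it->second.x - f.ox;
            pose.y = it->second.y - f.oy;
        }
    }
    std::printf("pose (%.1f, %.1f), %zu landmarks mapped\n",
                pose.x, pose.y, map.size());
    return 0;
}
```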