Calculate position to within 1 inch at 65 fps. SLAM for Monocular + IMU / Stereo + IMU / Fisheye + Stereo + IMU.

This SLAM SDK allows developers to fuse data from multiple sensor inputs to calculate position precisely.

The Basics:

  1. Simultaneous Localization and Mapping (SLAM) is a computer vision technology
  2. SLAM allows an entity to track its location while simultaneously mapping its surroundings
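The "track location while mapping" idea can be sketched in one dimension. This is a toy illustration only, not the SDK's algorithm: all function names, blend factors, and numbers below are ours.

```python
# Toy 1-D illustration of the SLAM idea: an entity dead-reckons its
# position from noisy odometry while refining a landmark map from
# noisy range measurements. Purely illustrative, not a real SLAM system.

def slam_step(pose, landmark, odom, measured_range, alpha=0.5):
    """One localize-and-map update.

    pose           -- current position estimate
    landmark       -- current landmark position estimate
    odom           -- reported displacement this step
    measured_range -- observed distance to the landmark
    alpha          -- blend factor between prediction and measurement
    """
    # Predict: advance the pose by the odometry reading.
    predicted_pose = pose + odom
    # Localize: the range measurement implies pose = landmark - range.
    implied_pose = landmark - measured_range
    new_pose = (1 - alpha) * predicted_pose + alpha * implied_pose
    # Map: the same measurement also implies where the landmark sits.
    implied_landmark = new_pose + measured_range
    new_landmark = (1 - alpha) * landmark + alpha * implied_landmark
    return new_pose, new_landmark

pose, landmark = 0.0, 10.0
for odom, rng in [(1.0, 9.1), (1.0, 8.0), (1.0, 6.9)]:
    pose, landmark = slam_step(pose, landmark, odom, rng)
```

After three steps the pose estimate converges near 3.0 and the landmark estimate stays near 10.0; real SLAM does the same blending in 6-DoF with many landmarks.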

SDK Specifications:

  1. Cross-platform development: iOS, Android, Linux
  2. Dynamic sensor input (sensor fusion): cameras (mono or stereo), lasers, sonar, IMU (accelerometer, gyro), etc.
  3. Minimal hardware platform*:
    Raspberry Pi 3 with the standard Raspberry Pi camera sensor for real-time SLAM processing.
  4. Implemented functionality:
    SLAM Core, Tracking, Global Pose Optimization, Deep IMU Fusion, Recovery, Mapping, Surface Reconstruction etc.
  5. Performance:
    Raspberry Pi 3: Visual SLAM + sensor fusion – 47+ fps (depends on camera limitations);
    Odroid-C2: Visual SLAM + sensor fusion – 65+ fps (depends on camera limitations);
    iPhone 7s: Visual SLAM + sensor fusion – 120 fps.
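To make the IMU-fusion item above concrete, here is a minimal complementary filter, one common way to blend a fast-but-drifting gyroscope with a noisy-but-absolute accelerometer tilt reading. This is a hedged sketch of the general technique, not the SDK's "Deep IMU Fusion"; the function name, gain, and sample values are ours.

```python
import math

# Complementary filter for pitch estimation (illustrative sketch).
# Gyro integration responds quickly but drifts over time; the
# accelerometer's gravity vector gives an absolute but noisy tilt.
# Blending the two with gain k keeps the best of both.

def fuse_pitch(pitch, gyro_rate, accel_x, accel_z, dt, k=0.98):
    gyro_pitch = pitch + gyro_rate * dt            # integrate gyro rate (drifts)
    accel_pitch = math.atan2(accel_x, accel_z)     # gravity-based tilt (noisy)
    return k * gyro_pitch + (1 - k) * accel_pitch  # complementary blend

pitch = 0.0
# Static sensor: zero rotation rate, gravity mostly on z with a slight x tilt,
# sampled at 100 Hz (dt = 0.01 s).
for _ in range(200):
    pitch = fuse_pitch(pitch, gyro_rate=0.0, accel_x=0.17, accel_z=9.8, dt=0.01)
```

With these inputs the estimate converges toward the accelerometer-implied tilt of about 0.017 rad while remaining responsive to gyro motion between samples.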

Demos (Visual SLAM, June 2016):
