
SLAM SDK

Calculate position to within one inch using just a monocular camera, or a variety of other sensors.


General


This SLAM SDK allows developers to fuse data from multiple sensor inputs to precisely calculate position.

The Basics:

  1. Simultaneous Localization and Mapping (SLAM) is a computer vision technology
  2. SLAM allows an entity to track its location while simultaneously mapping its surroundings
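The two points above can be illustrated with a toy one-dimensional sketch: the estimator dead-reckons its own position from odometry while also building a map of landmark positions, and each re-sighted landmark corrects the pose. All names here are illustrative; this is not the SDK's algorithm.

```python
# Toy 1-D illustration of the SLAM idea: track position while
# simultaneously estimating landmark positions from range readings.

def slam_step(pose, landmark_map, motion, observations, alpha=0.5):
    """Advance one SLAM step.

    pose         -- current position estimate (float)
    landmark_map -- dict: landmark id -> estimated position
    motion       -- odometry reading (distance moved this step)
    observations -- dict: landmark id -> measured range to landmark
    alpha        -- blend factor between prediction and observation
    """
    # 1. Predict: dead-reckon the new pose from odometry.
    pose = pose + motion

    # 2. Update: each observed landmark both extends the map and
    #    corrects the pose -- the "simultaneous" part of SLAM.
    for lid, rng in observations.items():
        if lid not in landmark_map:
            # First sighting: initialize the landmark from the current pose.
            landmark_map[lid] = pose + rng
        else:
            # Re-sighting: the stored landmark position implies a pose;
            # blend it with the dead-reckoned prediction.
            implied_pose = landmark_map[lid] - rng
            pose = (1 - alpha) * pose + alpha * implied_pose
            # Refine the landmark estimate as well.
            landmark_map[lid] = (landmark_map[lid] + (pose + rng)) / 2
    return pose, landmark_map
```

For example, starting from `pose = 0.0` with an empty map, one step of motion initializes a landmark, and a second step uses that landmark to correct drift in the odometry.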

SDK Specifications:

  1. Cross-platform development: iOS, Android, PC, Linux
  2. Dynamic sensor input: cameras, lasers, sonar, IMU, and more
  3. Minimal Hardware platform*:
    Raspberry Pi 3 with the standard Raspberry Pi camera sensor for real-time SLAM processing.
  4. Implemented functionality:
    SLAM Core, Global Pose Optimization, Deep IMU Fusion, Recovery.
  5. Performance (Raspberry Pi 3):
    Visual SLAM – 60 fps (subject to the camera's limitations)
    Visual SLAM with sensor fusion – 60 fps.
  6. Performance (ODROID):
    Visual SLAM – 80 fps (subject to the camera's limitations)
    Visual SLAM with sensor fusion – 80 fps.
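The "IMU Fusion" item above can be sketched in its simplest form as a complementary filter: the IMU is integrated at a high rate, and each (slower) visual SLAM fix pulls the estimate back, correcting accumulated IMU drift. This is a minimal sketch of the general idea only; the SDK's actual "Deep IMU Fusion" method is not published, and all function names here are hypothetical.

```python
# Minimal complementary-filter sketch of visual/IMU fusion (1-D).

def integrate_imu(pose, velocity, accel, dt):
    """Dead-reckon one IMU sample with constant-acceleration integration.

    Fast but drifts: small acceleration errors accumulate over time.
    """
    velocity = velocity + accel * dt
    pose = pose + velocity * dt
    return pose, velocity

def fuse(visual_pose, imu_pose, gain=0.98):
    """Blend the drifting IMU estimate with an absolute visual fix.

    A high gain trusts the smooth, high-rate IMU for short-term motion;
    the small (1 - gain) share of the visual pose bounds long-term drift.
    """
    return gain * imu_pose + (1 - gain) * visual_pose
```

In practice the IMU integration would run at hundreds of hertz between camera frames, with `fuse` applied whenever a new visual pose arrives.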

Demos:



Advantages:

