Neuronav SLAM SDK

The SLAM Abstraction Layer for Robotics Platforms

Write once, run on any sensor. A simple abstraction layer over ROS2 SLAM pipelines.


Quick Start

RealSense Camera:

from neuronav import RealSenseSensor, RTABMapSLAM, run_slam

sensor = RealSenseSensor()
run_slam(sensor, RTABMapSLAM())

OAK-D Camera:

from neuronav import OAKDSensor, RTABMapSLAM, run_slam

sensor = OAKDSensor()
run_slam(sensor, RTABMapSLAM())

With Visualization:

run_slam(sensor, RTABMapSLAM(), visualize=True)
# Exposes a Foxglove WebSocket server at ws://localhost:8765; connect with the Foxglove app

Features

  • 2-Line API - Start SLAM instantly
  • Sensor Agnostic - RealSense, OAK-D, or add your own
  • RTAB-Map Integration - Production-ready visual SLAM
  • Foxglove Visualization - Web-based 3D viewing
  • Extensible - Add sensors in 20 min, SLAM algorithms in 1 hour (see the sketches below)
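
The Extensible point is easiest to see with a sketch. The example below adds a hypothetical RGB-D sensor; the class shape (topics(), launch_driver()) is illustrative only and not the SDK's actual interface. Check the source for the real sensor base class.

from neuronav import RTABMapSLAM, run_slam

# Hypothetical custom sensor. In the real SDK this would subclass the
# sensor base class; the method names below are assumptions for illustration.
class MyDepthSensor:
    def topics(self):
        # ROS2 topics the SLAM pipeline should consume.
        return {
            "rgb": "/my_camera/color/image_raw",
            "depth": "/my_camera/depth/image_rect_raw",
            "camera_info": "/my_camera/color/camera_info",
        }

    def launch_driver(self):
        # Start the vendor's ROS2 driver node here (e.g. via a launch file).
        pass

run_slam(MyDepthSensor(), RTABMapSLAM())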

Installation

Docker (Recommended):

./docker_build.sh
./docker_run.sh

Local:

pip install -e .

Requires: ROS2 Humble, Python 3.8+, and camera drivers (see the setup guide)
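
To sanity-check a local environment before installing, a short script like the one below can help. It is not part of the SDK; it only inspects standard Python and ROS2 signals (a sourced ROS2 Humble setup exports ROS_DISTRO=humble, and rclpy is the ROS2 Python client library).

import os
import sys

# Python 3.8+ is required.
assert sys.version_info >= (3, 8), "Python 3.8+ required"

# A sourced ROS2 Humble environment exports ROS_DISTRO=humble.
assert os.environ.get("ROS_DISTRO") == "humble", "Source your ROS2 Humble setup first"

# rclpy should import once ROS2 is sourced; failure means the ROS2
# Python environment is incomplete.
import rclpy  # noqa: F401

print("Environment looks OK for a local install.")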

Supported Hardware

Sensors:

  • Intel RealSense D435i, D455, D415
  • Luxonis OAK-D Pro/W

SLAM:

  • RTAB-Map RGB-D SLAM
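
RTAB-Map is the only backend today. As a rough sketch of how another SLAM algorithm might be wired in (the class and method names here are hypothetical, not part of the current API):

from neuronav import RealSenseSensor, run_slam

# Hypothetical SLAM backend wrapper. In the real SDK this would subclass
# the SLAM base class; launch() is an assumed method name for illustration.
class MySLAM:
    def launch(self, topics):
        # Start the SLAM ROS2 node here, remapping its inputs to the
        # RGB, depth, and camera_info topics supplied by the sensor layer.
        pass

run_slam(RealSenseSensor(), MySLAM())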

License

Licensed under Apache 2.0.

Credits

Built on RTAB-Map (Copyright (c) 2010-2025, Mathieu Labbé - IntRoLab - Université de Sherbrooke, BSD 3-Clause License).

See ACKNOWLEDGMENTS.md for complete credits and licenses.


Issues? Report them via GitHub Issues.
