# The SLAM Abstraction Layer for Robotics Platforms
Write once, run on any sensor. Simple abstraction layer over ROS2 SLAM pipelines.
**RealSense Camera:**

```python
from neuronav import RealSenseSensor, RTABMapSLAM, run_slam

sensor = RealSenseSensor()
run_slam(sensor, RTABMapSLAM())
```

**OAK-D Camera:**

```python
from neuronav import OAKDSensor, RTABMapSLAM, run_slam

sensor = OAKDSensor()
run_slam(sensor, RTABMapSLAM())
```

**With Visualization:**

```python
run_slam(sensor, RTABMapSLAM(), visualize=True)
# Opens Foxglove at ws://localhost:8765
```

- 2-Line API - Start SLAM instantly
- Sensor Agnostic - RealSense, OAK-D, or add your own
- RTAB-Map Integration - Production-ready visual SLAM
- Foxglove Visualization - Web-based 3D viewing
- Extensible - Add sensors in 20 min, SLAM algorithms in 1 hour
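Adding a sensor typically means mapping its ROS2 topics into a common interface. The sketch below illustrates that plugin pattern only; `BaseSensor`, `get_topics`, and the topic names are assumptions for illustration, not neuronav's actual API (see the project's sensor docs for the real base class):

```python
from abc import ABC, abstractmethod

class BaseSensor(ABC):
    """Illustrative sensor interface (assumed shape, not neuronav's actual API)."""

    @abstractmethod
    def get_topics(self) -> dict:
        """Map logical streams (rgb, depth, camera_info) to ROS2 topic names."""

class MyDepthSensor(BaseSensor):
    """Hypothetical custom RGB-D sensor wrapping an existing ROS2 driver."""

    def get_topics(self) -> dict:
        # Topic names are examples; use whatever your driver publishes.
        return {
            "rgb": "/my_cam/color/image_raw",
            "depth": "/my_cam/depth/image_rect_raw",
            "camera_info": "/my_cam/color/camera_info",
        }

sensor = MyDepthSensor()
print(sensor.get_topics()["rgb"])  # /my_cam/color/image_raw
```

A custom sensor like this would then be passed to `run_slam` exactly like the built-in `RealSenseSensor` or `OAKDSensor`.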
**Docker (Recommended):**

```shell
./docker_build.sh
./docker_run.sh
```

**Local:**

```shell
pip install -e .
```

Requires: ROS2 Humble, Python 3.8+, camera drivers (setup guide)
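For a local install, a common stumbling block is an unsourced ROS2 environment. A quick sanity check is to look at the `ROS_DISTRO` variable, which the ROS2 setup scripts export (the helper function here is ours, not part of neuronav):

```python
import os

def ros2_ready(env=os.environ, required_distro="humble"):
    """Return True if a ROS2 environment matching the required distro is sourced."""
    return env.get("ROS_DISTRO", "").lower() == required_distro

if __name__ == "__main__":
    if not ros2_ready():
        print("ROS2 Humble not sourced - run: source /opt/ros/humble/setup.bash")
```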
Sensors:
- Intel RealSense D435i, D455, D415
- Luxonis OAK-D Pro/W
SLAM:
- RTAB-Map RGB-D SLAM
Licensed under Apache 2.0.
Built on RTAB-Map (Copyright (c) 2010-2025, Mathieu Labbé - IntRoLab - Université de Sherbrooke, BSD 3-Clause License).
See ACKNOWLEDGMENTS.md for complete credits and licenses.
Issues? → GitHub Issues