Benefits
Reliable in Any Environment
Maintains accurate localization in challenging conditions like night, smoke, or fog.
Enhanced Mission Resilience
Operates independently of GPS, ideal for contested or denied areas.
Improved Situational Awareness
Fuses EO and IR data for a more complete understanding of the environment.


visionairy®
SLAM - EO | IR
SLAM for EO | IR creates real-time maps and tracks position using visual and thermal imagery, enabling navigation and situational awareness even without GPS.
Features
Our advanced sensor fusion algorithms combine EO, IR, and optional inertial inputs to create a comprehensive understanding of the environment that surpasses the capabilities of any single sensor type.
Multi-Spectral Sensor Fusion - Combines Electro-Optical (EO) and Infrared (IR) data for robust mapping and localization (see the synchronization sketch after this list).
GPS-Denied Navigation - Operates reliably in environments where GPS is unavailable or jammed.
Real-Time Processing - Delivers low-latency pose estimation and map generation for time-critical operations.
Visual-Inertial Integration (Optional) - Supports fusion with IMU data for enhanced accuracy during rapid motion or occlusion.
Dynamic Environment Adaptation - Handles changes in lighting, temperature, and terrain with high stability.
Edge-Optimized Performance - Runs efficiently on embedded hardware for airborne or mobile platforms.
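As referenced in the multi-spectral fusion item above, the sketch below illustrates one preprocessing step that any EO + IR pipeline needs before fusion: pairing frames from the two cameras by timestamp. The frame type, field names, and tolerance used here are hypothetical placeholders, not part of the visionairy® API; treat this as a minimal sketch of the general technique, not Spleenlab's implementation.

```python
from dataclasses import dataclass
from bisect import bisect_left
from typing import List, Tuple

@dataclass
class Frame:
    """A single camera frame; 'data' would hold the image buffer (hypothetical type)."""
    timestamp: float  # seconds, on a clock shared by both sensors
    data: object

def pair_eo_ir(eo_frames: List[Frame],
               ir_frames: List[Frame],
               tolerance: float = 0.01) -> List[Tuple[Frame, Frame]]:
    """Pair each EO frame with the nearest-in-time IR frame.

    Frames whose closest counterpart is further away than `tolerance`
    seconds are dropped, so only well-synchronized pairs reach the
    fusion stage. Both input lists must be sorted by timestamp.
    """
    ir_times = [f.timestamp for f in ir_frames]
    pairs: List[Tuple[Frame, Frame]] = []
    for eo in eo_frames:
        i = bisect_left(ir_times, eo.timestamp)
        # Candidates are the IR frames immediately before and after the EO timestamp.
        candidates = [c for c in (i - 1, i) if 0 <= c < len(ir_frames)]
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(ir_times[c] - eo.timestamp))
        if abs(ir_times[best] - eo.timestamp) <= tolerance:
            pairs.append((eo, ir_frames[best]))
    return pairs
```

In a real deployment, hardware triggering or a shared clock source keeps the EO and IR streams aligned, so software alignment like this only has to absorb small residual offsets.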
Why SLAM - EO | IR?
Traditional SLAM systems struggle in low-light, smoke, fog, or visually complex environments, limiting their use in real-world defense and autonomous scenarios. Spleenlab’s SLAM EO | IR overcomes this by fusing electro-optical (EO) and infrared (IR) sensor data, enabling robust, GPS-independent localization and mapping in all visibility conditions.
This multi-spectral approach ensures reliable navigation both day and night, across thermally active or obscured terrain. It delivers superior situational awareness and operational resilience—making it a powerful advantage over conventional visual-only SLAM solutions.
GPS-Free Localization
All-Condition Navigation
EO + IR Fusion
Performance Metrics
These performance metrics are for demonstrative purposes only, based on configurations with proven results. Actual performance may vary by setup. Our algorithms are optimized for use with any chip, platform, or sensor. Contact us for details.
Position Accuracy
±2.5 cm in typical environments
Update Rate
Up to 200 Hz
Initialization Time
<1 second
Maximum Velocity
20 m/s with full accuracy
Operating Range
Unlimited (environment-dependent)
Drift
<0.1% of distance traveled
Supported companion hardware
NVIDIA Jetson, ModalAI VOXL 2 / VOXL 2 Mini, Qualcomm RB5, IMX7, IMX8, Raspberry Pi
Supported flight controllers
PX4, APX4, ArduPilot
Base software / OS
Linux (Docker required)
Interfaces
ROS 2 or MAVLink (see the integration sketch below)
Input - Sensors
• Any type of camera (sensor agnostic)
• Any source of global orientation (optional)
Input - Data
• Any type of camera (sensor agnostic)
• Any type of IMU
• GNSS position data for initial position referencing
• For aerial vehicles: current flight altitude
• Intrinsic & extrinsic sensor calibration
Output - Data
• 3D map of the environment
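As an illustration of the ROS 2 interface noted in the Interfaces line above, the following minimal rclpy sketch subscribes to a pose stream and a 3D map stream produced by a SLAM module. The topic names (/slam/odometry, /slam/map) and message types are assumptions made for this example; the actual topics and types depend on the specific integration.

```python
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry
from sensor_msgs.msg import PointCloud2

class SlamOutputListener(Node):
    """Subscribes to hypothetical SLAM output topics and logs what arrives."""

    def __init__(self):
        super().__init__('slam_output_listener')
        # Topic names below are placeholders, not a documented visionairy® API.
        self.create_subscription(Odometry, '/slam/odometry', self.on_odometry, 10)
        self.create_subscription(PointCloud2, '/slam/map', self.on_map, 10)

    def on_odometry(self, msg: Odometry) -> None:
        p = msg.pose.pose.position
        self.get_logger().info(f'pose: x={p.x:.2f} y={p.y:.2f} z={p.z:.2f}')

    def on_map(self, msg: PointCloud2) -> None:
        self.get_logger().info(f'map update: {msg.width * msg.height} points')

def main() -> None:
    rclpy.init()
    node = SlamOutputListener()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()
```

For MAVLink-based setups, SLAM pose output is typically forwarded to the flight controller's external-vision input instead; the exact message set depends on the configuration.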
The information provided reflects recommended hardware specifications based on insights gained from successful customer projects and integrations. These recommendations are not limitations, and actual requirements may vary depending on the specific configuration.
Our algorithms are compatible with any chip, platform, sensor, and individual configuration. Please contact us for further information.
RAM
Minimum 2 GB, recommended 4 GB
Storage
Minimum 20 GB, recommended 50 GB
Camera
Minimum 640 x 480 px at 10 FPS, recommended 1920 x 1080 px at 30 FPS
IMU
Minimum 100 Hz, recommended 300 Hz






