
The Master360 Project
The Master360 project was initiated to develop a modular and scalable sensor and visualization platform for next-generation pilot assistance systems in helicopters. Its primary goal was to enhance situational awareness by merging data from various sources into a unified, real-time 3D representation of the aircraft’s surroundings. By combining innovative sensor fusion with intuitive visualization, the project aimed to improve operational safety and support future autonomous flight capabilities, particularly for platforms like Airbus Helicopters’ PioneerLab.

Challenges

1. Accurate Obstacle Detection Using Only Camera Input
Developing a robust system that could reliably detect and interpret obstacles in 3D space using monocular vision in highly dynamic flight environments was a significant technical hurdle.

2. Real-Time Processing for Pilot Assistance
Ensuring that the SLAM algorithms and perception models could operate in real time with low latency was critical to avoid delays in pilot feedback and maintain flight safety.

3. Seamless Integration into Cockpit Systems
Adapting the AI-powered perception output into an intuitive and non-distracting pilot display required careful human-machine interface design and extensive testing.
Our AI Solution
As part of the Master360 initiative, Spleenlab engineers developed a vision-based collision avoidance prototype relying solely on camera input. The core of this AI solution used advanced SLAM (Simultaneous Localization and Mapping) techniques to estimate the aircraft's ego-motion via visual odometry and to extract sparse 3D points from the environment in real time.
These data points were interpreted as potential obstacles, enabling dynamic scene understanding without requiring additional external sensors. This allowed the helicopter to continuously build and update a 3D model of its surroundings during flight.
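The mapping step described above, recovering sparse 3D points from camera observations taken at different poses, can be illustrated with a minimal two-view triangulation sketch. This is a generic direct linear transform (DLT) triangulation, not Master360's actual pipeline, and the camera intrinsics, poses, and obstacle point below are invented for illustration:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (intrinsics @ [R|t]).
    x1, x2: pixel coordinates (u, v) of the same feature in each view.
    Returns the estimated 3D point in world coordinates.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous point X; stack them and solve A @ X = 0.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest
    # singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D world point through a 3x4 camera matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two synthetic camera poses: identity, and a 1 m baseline along x
# (as if the helicopter had moved between frames).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# A hypothetical obstacle point 10 m ahead: project it into both
# views, then recover its 3D position from the pixel observations.
X_true = np.array([2.0, 0.5, 10.0])
X_est = triangulate_point(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.round(X_est, 3))  # ≈ [2.0, 0.5, 10.0]
```

In a full SLAM system, the camera poses are themselves estimated by visual odometry and the triangulated points are refined jointly with them, but the per-point geometry is essentially this calculation repeated over thousands of tracked features per frame.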
The processed environmental data was integrated into a pilot assistance interface, providing intuitive visual feedback directly in the cockpit. This solution not only improved pilot awareness but also supported Airbus Helicopters' broader goals of enabling fully or semi-autonomous features, such as automated take-off and landing (ATOL) and Detect and Avoid (DAA) functionalities. By demonstrating how AI-powered perception can be embedded into helicopter platforms, Spleenlab helped pave the way for smarter, safer, and more sustainable flight operations.

Impact
Enhanced Pilot Situational Awareness
By fusing data from multiple sensors into a unified 3D view, the project significantly improved a pilot’s ability to perceive and react to environmental threats in real time.
Advancement of AI-Powered Collision Avoidance
The integration of SLAM-based vision systems demonstrated the feasibility of using AI for real-time obstacle detection and avoidance in aerial environments.
Foundation for Autonomous Flight Functions
Technologies developed in Master360 laid the groundwork for autonomous capabilities such as automated take-off and landing (ATOL), pushing helicopter autonomy forward.
Contribution to Sustainable Flight Innovation
Through its research and development, the project supported Airbus Helicopters' broader goals of creating safer and more efficient flight systems, aligning with emissions reduction and operational sustainability efforts.