Presentation + Paper
13 November 2024
MUSAL: towards multisource 4D scene modeling by autonomous robot systems for the surveillance of critical infrastructure
Boitumelo Ruf, Max Hermann, Antonio Araujo, Robert Zimmermann, Janko Petereit, Christian Frey
Abstract
This paper presents the MUSAL ecosystem, which enables temporal scene modeling from multi-source data and provides a continuous scene representation at different abstraction layers. It consists of individual building blocks, each of which implements different functionalities; the blocks are orchestrated by a ROS2-based middleware. The building blocks comprise (i) data stores, which implement the actual persistence and management of the sensor data; (ii) data recorders, which connect the actual sensors to the data stores; (iii) data processors, which advertise services to process the sensor data at different abstraction levels; and (iv) a universal search engine, which provides a unified and transparent interface to the user. The functionality of the system is demonstrated through four use cases: the generation of obstacle maps and their distribution between multiple autonomous mobile robots, collaborative online 3D mapping and point cloud merging, multi-source image-based 3D reconstruction, and the monitoring of industrial facilities.
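To illustrate the kind of building block the abstract describes, the following is a minimal, hypothetical sketch of a "data recorder" as a ROS 2 (rclpy) node that subscribes to a sensor topic and hands each message to a persistence layer. The node name, topic name, and logging-only "store" are assumptions for illustration and are not taken from the MUSAL implementation.

```python
# Hypothetical sketch of a MUSAL-style data recorder: a ROS 2 node that
# bridges a raw sensor stream to a data store. Names are illustrative only.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2


class PointCloudRecorder(Node):
    """Forwards incoming point clouds to a (placeholder) data store."""

    def __init__(self):
        super().__init__('pointcloud_recorder')
        # Topic name is an assumption; a real recorder would be configured
        # per sensor source.
        self.subscription = self.create_subscription(
            PointCloud2, '/lidar/points', self.on_cloud, 10)

    def on_cloud(self, msg: PointCloud2):
        # A real data recorder would pass the message to a persistence
        # backend (e.g. a database or bag-based store); here we only log it.
        self.get_logger().info(
            f'Recorded cloud with {msg.width * msg.height} points '
            f'at t={msg.header.stamp.sec}')


def main():
    rclpy.init()
    node = PointCloudRecorder()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```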
Boitumelo Ruf, Max Hermann, Antonio Araujo, Robert Zimmermann, Janko Petereit, and Christian Frey "MUSAL: towards multisource 4D scene modeling by autonomous robot systems for the surveillance of critical infrastructure", Proc. SPIE 13207, Autonomous Systems for Security and Defence, 1320704 (13 November 2024); https://doi.org/10.1117/12.3030904
KEYWORDS
Data storage, Ecosystems, Point clouds, Data processing, Sensors, 3D image processing, 3D modeling