This repository contains datasets acquired using three different LiDAR sensors mounted on a robotic mobile platform. The data was collected in various environments to enable comparison of the sensors' performance in 3D SLAM applications. This README provides an overview of the data acquisition setup, the environments, and instructions for using the dataset.
The dataset includes LiDAR point cloud data acquired using the following sensors:
- Velodyne Puck (mounted at the bottom)
- Ouster OS1 (mounted in the middle)
- Livox Mid-360 (mounted at the top)
The specifications of the sensors used are listed below:
The sensors were vertically stacked on a Clearpath Robotics Jackal robot, ensuring minimal occlusion of the field of view (FOV) and enabling direct comparison of sensor data captured from the same trajectory.
The dataset contains point clouds captured in three different environments:
- Outdoor Garden:
  - Trees, benches, lampposts, surrounding buildings.
  - Best suited for outdoor SLAM or object detection tasks.
- Road Segment:
  - Road with construction signs, vehicles, and surrounding buildings.
  - Suitable for urban navigation and obstacle detection comparisons.
- Indoor Office Space:
  - Narrow corridors, desks, large windows.
  - Ideal for indoor navigation and mapping experiments.
The sensors were mounted vertically on the Jackal robot so that all three observed the same trajectory under the same environmental conditions. A two-meter proximity filter was applied to discard close-range returns and reduce inter-sensor variation for nearby objects.
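The snippet below is a minimal sketch of such a proximity filter, assuming each cloud is available as an N x 3 array of XYZ coordinates in the sensor frame; the function name and the 2 m threshold default are illustrative, not part of the dataset tooling.

```python
# Illustrative sketch of the two-meter proximity filter described above,
# applied to an N x 3 array of XYZ points in the sensor frame.
import numpy as np

def remove_near_points(points: np.ndarray, min_range: float = 2.0) -> np.ndarray:
    """Drop points closer than `min_range` metres to the sensor origin."""
    distances = np.linalg.norm(points, axis=1)
    return points[distances >= min_range]

# Example: a random cloud; only points at >= 2 m from the origin survive.
cloud = np.random.uniform(-10.0, 10.0, size=(100_000, 3))
filtered = remove_near_points(cloud)
print(f"kept {len(filtered)} of {len(cloud)} points")
```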
- Power: All sensors were powered using the Jackal’s 12V power supply.
- Data Transmission: Sensors were connected to a TP-Link Archer MR600 router via LAN, and an external laptop was used to access the sensors and acquire data.
- Control and Data Capture: The ROS (Robot Operating System) framework was used to control the robot and manage data acquisition.
The setup allowed the robot to be teleoperated in various environments to collect comprehensive datasets for comparison between the three sensors.
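For reference, the sketch below shows one way to pull per-sensor point clouds out of a recorded ROS 1 bag. The bag file name and topic names are assumptions based on the default Velodyne, Ouster, and Livox drivers and may differ from the actual recordings.

```python
# Minimal sketch for extracting PointCloud2 messages from a ROS 1 bag
# recorded with a setup like this one. Requires the ROS Python packages
# `rosbag` and `sensor_msgs`.
import rosbag
import sensor_msgs.point_cloud2 as pc2

# Hypothetical bag file and topic names; adjust to the actual recording.
BAG_FILE = "garden_run1.bag"
TOPICS = {
    "velodyne": "/velodyne_points",
    "ouster": "/os_cloud_node/points",
    "livox": "/livox/lidar",
}

with rosbag.Bag(BAG_FILE) as bag:
    for topic, msg, stamp in bag.read_messages(topics=list(TOPICS.values())):
        # Each message is a sensor_msgs/PointCloud2; iterate its XYZ points.
        points = list(pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True))
        print(f"{stamp.to_sec():.3f}s  {topic}: {len(points)} points")
```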
- CloudCompare: Used for point cloud visualization and sensor data comparison.
- 3D SLAM: The collected data can be used for 3D SLAM experiments, with the same SLAM parameters applied across all sensors.
To use this dataset, download the relevant environment datasets from the provided links. You can visualize the data using CloudCompare or any other point cloud processing software. The dataset is suitable for:
- Comparing sensor performance in 3D mapping.
- Benchmarking SLAM algorithms using different LiDAR sensor data.
- Point cloud processing experiments.
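As an alternative to CloudCompare, a downloaded cloud can also be inspected programmatically. The sketch below assumes the Open3D Python package and uses a placeholder file name.

```python
# Hedged sketch for previewing a downloaded cloud outside CloudCompare,
# using the Open3D Python package. The file name is a placeholder.
import open3d as o3d

pcd = o3d.io.read_point_cloud("outdoor_garden_ouster.pcd")  # also reads .ply, .xyz
print(pcd)  # prints the number of points loaded
o3d.visualization.draw_geometries([pcd], window_name="LiDAR cloud preview")
```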
The datasets are publicly available for download:
If you use this dataset in your research, please cite the following paper:
[Include citation to the relevant paper here when available]
For any questions or contributions, please feel free to submit an issue or a pull request.
This dataset is licensed under the Creative Commons Attribution 4.0 International (CC BY 4.0) License. Please provide appropriate credit if using this dataset.