Getting Started » Supported Datasets

The EuRoC MAV Dataset

The ETH ASL EuRoC MAV dataset [1] is one of the most widely used datasets in the visual-inertial / simultaneous localization and mapping (SLAM) research literature, due to its synchronized inertial and camera sensor data and its high quality groundtruth. The dataset contains sequences of varying difficulty of a Micro Aerial Vehicle (MAV) flying in an indoor room. Monochrome stereo images are collected by two Aptina MT9V034 global shutter cameras at 20 frames per second, while an ADIS16448 MEMS inertial unit provides linear accelerations and angular velocities at a rate of 200 samples per second. We recommend that most users start testing on this dataset before moving on to the other datasets that our system supports, or before trying your own collected data. In the machine hall sequences the MAV is picked up at the beginning and then set down; we normally skip this part, but the filter should be able to handle it if SLAM features are enabled. Please take a look at the run_ros_eth.sh script for some reasonable default values (they might still need to be tuned).
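As a concrete sketch of what a run script like run_ros_eth.sh does, one can loop over the sequences and launch the estimator on each bag. The sequence names, bag paths, and launch-file name below are assumptions, so check the actual script in the repository; the command is echoed as a dry run so it can be inspected before removing the echo:

```shell
#!/bin/sh
# Sketch of a run script in the style of run_ros_eth.sh: loop over several
# EuRoC sequences and launch the estimator on each bag. Launch-file name and
# bag paths are assumptions; the command is echoed (dry run) for inspection.
BAG_DIR="${HOME}/datasets/euroc_mav"
for seq in V1_01_easy V1_02_medium V1_03_difficult; do
    echo roslaunch ov_msckf pgeneva_ros_eth.launch bag:="${BAG_DIR}/${seq}.bag"
done
```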

| Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Example Launch |
| --- | --- | --- | --- | --- |
| Vicon Room 1 01 | 58 | rosbag | link | launch |
| Vicon Room 1 02 | 76 | rosbag | link | launch |
| Vicon Room 1 03 | 79 | rosbag | link | launch |
| Vicon Room 2 01 | 37 | rosbag | link | launch |
| Vicon Room 2 02 | 83 | rosbag | link | launch |
| Vicon Room 2 03 | 86 | rosbag | link | launch |

TUM Visual-Inertial Dataset

The TUM Visual-Inertial Dataset [17] is a more recent dataset that was presented to provide a way to evaluate state-of-the-art visual-inertial odometry approaches. Compared to the EuRoC MAV dataset, this dataset provides photometric calibration of the cameras, which at the time of release was not available in any other visual-inertial dataset. Monochrome stereo images are collected by two IDS uEye UI-3241LE-M-GL global shutter cameras at 20 frames per second, while a Bosch BMI160 inertial unit provides linear accelerations and angular velocities at a rate of 200 samples per second. Not all sequences have groundtruth available throughout the entire trajectory, as the motion capture system is limited to the starting and ending room. There are also quite a few outdoor handheld sequences, which remain a challenging direction for research. Note that we focus on the room datasets since full 6 dof pose groundtruth is available over the total trajectory.

| Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Example Launch |
| --- | --- | --- | --- | --- |
| room1 | 147 | rosbag | link | launch |
| room2 | 142 | rosbag | link | launch |
| room3 | 136 | rosbag | link | launch |
| room4 | 69 | rosbag | link | launch |
| room5 | 132 | rosbag | link | launch |
| room6 | 67 | rosbag | link | launch |

RPNG OpenVINS Dataset

In addition to the community-maintained datasets, we have also released a few datasets of our own. Please cite the OpenVINS paper if you use any of these datasets in your work. Here are the specifics of the sensors that each dataset uses:

  • ArUco Datasets:
    • Core visual-inertial sensor is the VI-Sensor
    • Stereo global shutter images at 20 Hz
    • ADIS16448 IMU at 200 Hz
    • Kalibr calibration file can be found here
  • Ironsides Datasets:
    • Core visual-inertial sensor is the ironsides
    • Has two Reach RTK units, one subscribed to a base station for corrections
    • Stereo global shutter fisheye images at 20 Hz
    • InvenSense IMU at 200 Hz
    • GPS fixes at 5 Hz (/reach01/tcpfix has corrections from NYSNet)
    • Kalibr calibration file can be found here

Most of these datasets do not have perfect calibration parameters, and some are not time synchronised, so please ensure that you have enabled online calibration of these parameters. Additionally, there is no groundtruth for these datasets, but some do include GPS messages if you wish to compare against something.
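For reference, online calibration in OpenVINS is enabled through the estimator configuration. The parameter names below are assumptions based on recent OpenVINS versions, so verify them against the configuration files shipped with your version:

```yaml
# assumed OpenVINS estimator parameters (verify names against your version)
calib_cam_extrinsics: true   # refine the camera-to-IMU extrinsics online
calib_cam_intrinsics: true   # refine the camera intrinsics online
calib_cam_timeoffset: true   # estimate the camera-IMU time offset online
```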

| Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Example Launch |
| --- | --- | --- | --- | --- |
| ArUco Room 01 | 27 | rosbag | none | launch aruco |
| ArUco Room 02 | 93 | rosbag | none | launch aruco |
| ArUco Hallway 01 | 190 | rosbag | none | launch aruco |
| ArUco Hallway 02 | 105 | rosbag | none | launch aruco |
| Neighborhood 01 | 2300 | rosbag | none | launch ironsides |
| Neighborhood 02 | 7400 | rosbag | none | launch ironsides |

UZH-FPV Drone Racing Dataset

The UZH-FPV Drone Racing Dataset [17] is a dataset focused on high-speed aggressive 6 dof motion with very high levels of optical flow compared to other datasets. An FPV drone racing quadrotor carries a Qualcomm Snapdragon Flight board which provides inertial measurements and has two 640x480 grayscale global shutter fisheye cameras attached. The groundtruth is collected with a Leica Nova MS60 laser tracker. There are four sensor configurations with calibrations provided: indoor forward facing stereo, indoor 45 degree stereo, outdoor forward facing, and outdoor 45 degree. A top speed of 12.8 m/s (28 mph) is reached in the indoor scenarios, and 23.4 m/s (52 mph) in the outdoor datasets. In each of these datasets the sensor rig is picked up at the beginning and then set down; we normally skip this part, but the filter should be able to handle it if SLAM features are enabled. Please take a look at the run_ros_uzhfpv.sh script for some reasonable default values (they might still need to be tuned).
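The unit conversion for the quoted top speeds is easy to check with a quick awk one-liner (nothing here is OpenVINS-specific, 1 m/s = 2.23694 mph):

```shell
# convert the reported top speeds from m/s to mph (1 m/s = 2.23694 mph)
awk 'BEGIN {
    printf "indoor:  %.1f mph\n", 12.8 * 2.23694;
    printf "outdoor: %.1f mph\n", 23.4 * 2.23694;
}'
# prints roughly 28.6 mph indoor and 52.3 mph outdoor
```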

| Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Example Launch |
| --- | --- | --- | --- | --- |
| Indoor 5 | 157 | rosbag | link | launch |
| Indoor 6 | 204 | rosbag | link | launch |
| Indoor 7 | 314 | rosbag | link | launch |
| Indoor 9 | 136 | rosbag | link | launch |
| Indoor 10 | 129 | rosbag | link | launch |
| Indoor 45deg 2 | 207 | rosbag | link | launch |
| Indoor 45deg 4 | 164 | rosbag | link | launch |
| Indoor 45deg 12 | 112 | rosbag | link | launch |
| Indoor 45deg 13 | 159 | rosbag | link | launch |
| Indoor 45deg 14 | 211 | rosbag | link | launch |

KAIST Urban Dataset

The KAIST urban dataset [7] is a dataset focused on autonomous driving and localization in challenging complex urban environments. The dataset was collected in Korea with a vehicle equipped with a stereo camera pair, 2d SICK LiDARs, a 3d Velodyne LiDAR, an Xsens IMU, a fiber optic gyro (FoG), wheel encoders, and RTK GPS. The cameras run at 10 Hz, while the Xsens IMU has a 100 Hz sensing rate. A groundtruth "baseline" trajectory is also provided, which is the output of fusing the FoG, RTK GPS, and wheel encoders.

# clone the dataset file player and its message definitions into the src/
# folder of your catkin workspace, then build and run the player GUI
git clone https://github.com/irapkaist/file_player.git
git clone https://github.com/irapkaist/irp_sen_msgs.git
catkin build
rosrun file_player file_player

To use the dataset, the dataset's file player can be used to publish the sensor information onto ROS; see the above commands for the packages you need to clone into your ROS workspace. One can either manually record a rosbag and use the serial OpenVINS processing node, or use the live node and manually play back the datasets. It is important to disable the "skip stop section" option to ensure that we have continuous sensor feeds. Typically we process the datasets at 2x rate, so we get a 20 Hz image feed and the datasets can be processed in a more efficient manner.
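For the record-then-process workflow, one would record the topics the file player publishes into a bag. The topic names below are assumptions, so check `rostopic list` while the player is running; the command is echoed as a dry run so it can be inspected before removing the echo:

```shell
#!/bin/sh
# Sketch: record the file player's output into a bag for the serial OpenVINS
# node. Topic names are assumptions -- verify with `rostopic list` while the
# player is running. The command is echoed (dry run) for inspection.
TOPICS="/imu/data_raw /stereo/left/image_raw /stereo/right/image_raw"
echo rosbag record -O urban38.bag ${TOPICS}
```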

| Dataset Name | Length (km) | Dataset Link | Groundtruth Traj. | Example Launch |
| --- | --- | --- | --- | --- |
| Urban 28 | 11.47 | download | link | launch |
| Urban 38 | 11.42 | download | link | launch |
| Urban 39 | 11.06 | download | link | launch |