Sensor Calibration

Visual-Inertial Sensors

One may ask: why use a visual-inertial sensor? The main reason is the complementary nature of the two sensing modalities. The camera provides high-density external measurements of the environment, while the IMU measures the internal ego-motion of the sensor platform. The IMU is crucial in providing robustness to the estimator, and it also provides system scale in the case of a monocular camera.

However, there are some challenges when leveraging an IMU in estimation. An IMU requires estimating additional bias terms and demands highly accurate calibration between the camera and IMU. Additionally, small errors in the relative timestamps between the sensors can degrade performance very quickly in dynamic trajectories. Within the OpenVINS project we address these challenges by advocating online estimation of the extrinsic and time offset parameters between the cameras and IMU.

Camera Intrinsic Calibration (Offline)

The first task is to calibrate the camera intrinsic values such as the focal length, camera center, and distortion coefficients. Our group often uses the Kalibr [5] calibration toolbox to perform both intrinsic and extrinsic offline calibrations, proceeding with the following steps:

  1. Clone and build the Kalibr toolbox
  2. Print out a calibration board to use (we normally use the Aprilgrid 6x6 0.8x0.8 m (A0 page))
  3. Ensure that your sensor driver is publishing onto ROS with correct timestamps.
  4. Sensor preparations
    • Limit motion blur by decreasing exposure time
    • Publish at a low framerate to allow for larger variance in the dataset (2-5 Hz)
    • Ensure that your calibration board can be viewed in all areas of the image
    • Ensure that your sensor is in focus (you can use their kalibr_camera_focus script, or just focus manually)
  5. Record a ROS bag and ensure that the calibration board can be seen from different orientations, distances, and in each part of the image plane. You can either move the calibration board and keep the camera still or move the camera and keep the calibration board stationary.
  6. Finally run the calibration
    • Use kalibr_calibrate_cameras with your specified topics (see the example invocation after this list)
    • Depending on the amount of distortion, use the pinhole-equi model for fisheye lenses, or the pinhole-radtan model if distortion is low
    • Depending on how many frames are in your dataset, this can take upwards of a few hours
  7. Inspect the final result. Pay close attention to the final reprojection error graphs; a good calibration has reprojection errors below 0.2-0.5 pixels.
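
A typical run might look like the following sketch. The bag path, image topics, and target filename are placeholders for your own setup; the flags follow Kalibr's standard command-line interface, with one --models entry per camera topic.

```
# Example camera calibration run (bag path, topics, and target file are
# placeholders for your own setup).
rosrun kalibr kalibr_calibrate_cameras \
    --bag /path/to/camera_calib.bag \
    --topics /cam0/image_raw /cam1/image_raw \
    --models pinhole-radtan pinhole-radtan \
    --target aprilgrid_6x6.yaml
```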

IMU Intrinsic Calibration (Offline)

The other important intrinsic calibration is computing the inertial sensor's intrinsic noise characteristics, which are needed both for the batch optimization that calibrates the camera-to-IMU transform and for any VINS estimator, so that we can properly fuse the images and inertial readings probabilistically. Unfortunately, there is no mature open-sourced toolbox to find these values, though one can try our kalibr_allan project, which is not yet optimized. Specifically, we are estimating the following noise parameters:

| Parameter | YAML element | Symbol | Units |
|---|---|---|---|
| Gyroscope "white noise" | gyroscope_noise_density | $\sigma_g$ | $\frac{rad}{s}\frac{1}{\sqrt{Hz}}$ |
| Accelerometer "white noise" | accelerometer_noise_density | $\sigma_a$ | $\frac{m}{s^2}\frac{1}{\sqrt{Hz}}$ |
| Gyroscope "random walk" | gyroscope_random_walk | $\sigma_{b_g}$ | $\frac{rad}{s^2}\frac{1}{\sqrt{Hz}}$ |
| Accelerometer "random walk" | accelerometer_random_walk | $\sigma_{b_a}$ | $\frac{m}{s^3}\frac{1}{\sqrt{Hz}}$ |

The standard approach, as explained in [IEEE Standard Specification Format Guide and Test Procedure for Single-Axis Interferometric Fiber Optic Gyros] (page 71, section C), is to compute an Allan variance plot of the sensor readings over different observation times (see below).

(Figure: example Allan deviation plot of inertial sensor readings over different observation times $\tau$)

As shown in the above figure, if we compute the Allan variance, we can look at the value at $\tau=1$ of a line with a -1/2 slope fitted to the left side of the plot to get the white noise of the sensor. Similarly, a line with a +1/2 slope fitted to the right side can be evaluated at $\tau=3$ to get the random walk noise. We have a package that can do this in MATLAB, but verification and conversion into a C++ codebase has yet to be done. Please refer to our [kalibr_allan] GitHub project for details on how to generate this plot for your sensor and calculate the values. Note that one may need to inflate the calculated values by 10-20 times to get usable sensor values.
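
To see why those specific $\tau$ values are used, note the standard single-term Allan deviation models (again written for the gyroscope; the accelerometer is analogous): the white-noise term falls with a -1/2 slope and equals $\sigma_g$ at $\tau=1$, while the random-walk term rises with a +1/2 slope and equals $\sigma_{b_g}$ at $\tau=3$:

```latex
% White-noise (angle random walk) term: -1/2 slope on a log-log plot
\sigma_{white}(\tau) = \frac{\sigma_g}{\sqrt{\tau}}
    \quad\Rightarrow\quad \sigma_{white}(1) = \sigma_g
% Rate random walk term: +1/2 slope on a log-log plot
\sigma_{walk}(\tau) = \sigma_{b_g}\sqrt{\frac{\tau}{3}}
    \quad\Rightarrow\quad \sigma_{walk}(3) = \sigma_{b_g}
```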

Dynamic IMU-Camera Calibration (Offline)

After obtaining the intrinsic calibration of both the camera and IMU, we can perform dynamic calibration of the transform between the two sensors. For this we again leverage the [Kalibr calibration toolbox]. For these collected datasets, it is important to minimize the motion blur in the camera while still ensuring that you excite all axes of the IMU. One needs at least one translational motion along with two degrees of orientation change for these calibration parameters to be observable (see our recent paper for the analysis of why: [Degenerate Motion Analysis for Aided INS With Online Spatial and Temporal Sensor Calibration]). We recommend having as much orientation change as possible to ensure convergence.

  1. Clone and build the Kalibr toolbox
  2. Print out a calibration board to use (we normally use the Aprilgrid 6x6 0.8x0.8 m (A0 page))
  3. Ensure that your sensor driver is publishing onto ROS with correct timestamps.
  4. Sensor preparations
    • Limit motion blur by decreasing exposure time
    • Publish at a moderately high framerate (20-30 Hz)
    • Publish your inertial readings at a reasonable rate (200-500 Hz)
  5. Record a ROS bag and ensure that the calibration board can be seen from different orientations, distances, and mostly in the center of the image. You should move in smooth, non-jerky motions with a trajectory that excites as many orientation and translational directions as possible at the same time. A 30-60 second dataset is normally enough to allow for calibration.
  6. Finally run the calibration
    • Use kalibr_calibrate_imu_camera (see the example invocation after this list)
    • Input your static calibration file which will have the camera topics in it
    • You will need to make an imu.yaml file with your noise parameters.
    • Depending on how many frames are in your dataset, this can take upwards of half a day
  7. Inspect the final result. You will want to make sure that the spline fitted to the inertial readings is a good fit. Ensure that your estimated biases do not leave your 3-sigma bounds; if they do, your trajectory was too dynamic or your noise values are not good. Sanity check your final rotation and translation against hand-measured values.
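
A run might look like the following sketch. The noise values in imu.yaml are placeholders that should be replaced with those from your own Allan variance analysis (inflated if needed, as noted above); the bag path, IMU topic, and camchain filename are likewise assumptions for your setup, while the flags and YAML fields follow Kalibr's standard interface.

```
# Create the IMU noise parameter file (all values below are placeholders;
# use the ones computed from your own Allan variance analysis).
cat > imu.yaml <<EOF
rostopic: /imu0
update_rate: 200.0
gyroscope_noise_density: 1.7e-04
gyroscope_random_walk: 2.0e-05
accelerometer_noise_density: 2.0e-03
accelerometer_random_walk: 3.0e-03
EOF

# Run the dynamic IMU-camera calibration, reusing the camchain file
# produced by the earlier camera intrinsic calibration.
rosrun kalibr kalibr_calibrate_imu_camera \
    --bag /path/to/dynamic_calib.bag \
    --cam camchain-camera_calib.yaml \
    --imu imu.yaml \
    --target aprilgrid_6x6.yaml
```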