Bibliography

[1] Sameer Agarwal, Keir Mierle, and others. Ceres Solver. http://ceres-solver.org.

[2] Yaakov Bar-Shalom, X Rong Li, and Thiagalingam Kirubarajan. Estimation with applications to tracking and navigation: theory algorithms and software. John Wiley & Sons, 2001.

[3] Timothy D Barfoot. State estimation for robotics. Cambridge University Press, 2017.

[4] Michael Burri, Janosch Nikolic, Pascal Gohl, Thomas Schneider, Joern Rehder, Sammy Omari, Markus W Achtelik, and Roland Siegwart. The EuRoC micro aerial vehicle datasets. The International Journal of Robotics Research, 35(10):1157–1163, 2016.

[5] Averil Burton Chatfield. Fundamentals of high accuracy inertial navigation, volume 174. AIAA, 1997.

[6] Chuchu Chen, Yulin Yang, Patrick Geneva, and Guoquan Huang. FEJ2: A consistent visual-inertial state estimator design. In 2022 International Conference on Robotics and Automation (ICRA), 2022.

[7] Chuchu Chen, Patrick Geneva, Yuxiang Peng, Woosik Lee, and Guoquan Huang. Monocular visual-inertial odometry with planar regularities. In 2023 International Conference on Robotics and Automation (ICRA), 2023.

[8] Gregory Chirikjian. Stochastic Models, Information Theory, and Lie Groups, Volume 2: Analytic Methods and Modern Applications, volume 2. Springer Science & Business Media, 2011.

[9] Pavel Davidson, Jani Hautamäki, Jussi Collin, and Jarmo Takala. Improved vehicle positioning in urban environment through integration of GPS and low-cost inertial sensors. In Proceedings of the European Navigation Conference (ENC), Naples, Italy, pages 3–6, 2009.

[10] Jeffrey Delmerico and Davide Scaramuzza. A benchmark comparison of monocular visual-inertial odometry algorithms for flying robots. In 2018 IEEE International Conference on Robotics and Automation (ICRA), pages 2502–2509. IEEE, 2018.

[11] Tue-Cuong Dong-Si and Anastasios I Mourikis. Estimator initialization in vision-aided inertial navigation with unknown camera-IMU calibration. In 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 1064–1071. IEEE, 2012.

[12] Kevin Eckenhoff, Patrick Geneva, and Guoquan Huang. Continuous preintegration theory for visual-inertial navigation. Technical Report RPNG-2018-CPI, University of Delaware, 2018. Available: http://udel.edu/~ghuang/papers/tr_cpi.pdf.

[13] Kevin Eckenhoff, Patrick Geneva, and Guoquan Huang. Closed-form preintegration methods for graph-based visual-inertial navigation. International Journal of Robotics Research, 38(5), 2019.

[14] Jay Farrell. Aided navigation: GPS with high rate sensors. McGraw-Hill, Inc., 2008.

[15] Paul Furgale, Joern Rehder, and Roland Siegwart. Unified temporal and spatial calibration for multi-sensor systems. In 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 1280–1286. IEEE, 2013.

[16] Patrick Geneva and Guoquan Huang. vicon2gt: Derivations and analysis. Technical Report RPNG-2020-VICON2GT, University of Delaware, 2020.

[17] Joel A. Hesch, Dimitrios G. Kottas, Sean L. Bowman, and Stergios I. Roumeliotis. Observability-constrained vision-aided inertial navigation. Technical report, Dept. of Computer Science & Engineering, University of Minnesota, 2012. Available: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.721.6118&rep=rep1&type=pdf.

[18] Joel A Hesch, Dimitrios G Kottas, Sean L Bowman, and Stergios I Roumeliotis. Consistency analysis and improvement of vision-aided inertial navigation. IEEE Transactions on Robotics, 30(1):158–176, 2013.

[19] Joel A Hesch, Dimitrios G Kottas, Sean L Bowman, and Stergios I Roumeliotis. Camera-IMU-based localization: Observability analysis and consistency improvement. The International Journal of Robotics Research, 33(1):182–201, 2014.

[20] Guoquan P Huang, Anastasios I Mourikis, and Stergios I Roumeliotis. A first-estimates Jacobian EKF for improving SLAM consistency. In Experimental Robotics: The Eleventh International Symposium, pages 373–382. Springer, 2009.

[21] Guoquan P Huang, Anastasios I Mourikis, and Stergios I Roumeliotis. Observability-based rules for designing consistent EKF SLAM estimators. The International Journal of Robotics Research, 29(5):502–528, 2010.

[22] Jinwoo Jeon, Sungwook Jung, Eungchang Lee, Duckyu Choi, and Hyun Myung. Run your visual-inertial odometry on NVIDIA Jetson: Benchmark tests on a micro aerial vehicle. IEEE Robotics and Automation Letters, 6(3):5332–5339, 2021.

[23] Jinyong Jeong, Younggun Cho, Young-Sik Shin, Hyunchul Roh, and Ayoung Kim. Complex urban dataset with multi-level sensors from highly diverse urban environments. The International Journal of Robotics Research, 38(6):642–657, 2019.

[24] Steven M Kay. Fundamentals of statistical signal processing. Prentice Hall PTR, 1993.

[25] Woosik Lee, Kevin Eckenhoff, Yulin Yang, Patrick Geneva, and Guoquan Huang. Visual-inertial-wheel odometry with online calibration. In 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 4559–4566. IEEE, 2020.

[26] Mingyang Li and Anastasios I. Mourikis. Consistency of EKF-based visual-inertial odometry. Technical report, Dept. of Electrical Engineering, University of California, Riverside, 2012. Available: https://intra.ece.ucr.edu/~mourikis/tech_reports/VIO.pdf.

[27] Mingyang Li and Anastasios I Mourikis. High-precision, consistent EKF-based visual-inertial odometry. The International Journal of Robotics Research, 32(6):690–711, 2013.

[28] Mingyang Li and Anastasios I Mourikis. Online temporal calibration for camera–IMU systems: Theory and algorithms. The International Journal of Robotics Research, 33(7):947–964, 2014.

[29] Mingyang Li, Hongsheng Yu, Xing Zheng, and Anastasios I. Mourikis. High-fidelity sensor modeling and self-calibration in vision-aided inertial navigation. In IEEE International Conference on Robotics and Automation (ICRA), pages 409–416, May 2014.

[30] Mingyang Li. Visual-inertial odometry on resource-constrained systems. PhD thesis, UC Riverside, 2014.

[31] Peter S Maybeck. Stochastic models, estimation, and control, volume 3. Academic Press, 1982.

[32] Anastasios I Mourikis and Stergios I Roumeliotis. A multi-state constraint Kalman filter for vision-aided inertial navigation. In Proceedings 2007 IEEE International Conference on Robotics and Automation, pages 3565–3572. IEEE, 2007.

[33] E. Mueggler, G. Gallego, H. Rebecq, and D. Scaramuzza. Continuous-time visual-inertial odometry for event cameras. IEEE Transactions on Robotics, pages 1–16, 2018.

[34] Alonso Patron-Perez, Steven Lovegrove, and Gabe Sibley. A spline-based trajectory representation for sensor fusion and rolling shutter cameras. International Journal of Computer Vision, 113(3):208–219, 2015.

[35] Tong Qin, Peiliang Li, and Shaojie Shen. VINS-Mono: A robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 34(4):1004–1020, 2018.

[36] Arvind Ramanandan, Anning Chen, and Jay A Farrell. Inertial navigation aiding by stationary updates. IEEE Transactions on Intelligent Transportation Systems, 13(1):235–248, 2011.

[37] Joern Rehder and Roland Siegwart. Camera/IMU calibration revisited. IEEE Sensors Journal, 17(11):3257–3268, 2017.

[38] Thomas Schneider, Mingyang Li, Cesar Cadena, Juan Nieto, and Roland Siegwart. Observability-aware self-calibration of visual and inertial sensors for ego-motion estimation. IEEE Sensors Journal, 19(10):3846–3860, 2019.

[39] David Schubert, Thore Goll, Nikolaus Demmel, Vladyslav Usenko, Jörg Stückler, and Daniel Cremers. The TUM VI benchmark for evaluating visual-inertial odometry. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 1680–1687. IEEE, 2018.

[40] Nikolas Trawny and Stergios I Roumeliotis. Indirect Kalman filter for 3D attitude estimation. Technical Report 2005-002, University of Minnesota, Dept. of Comp. Sci. & Eng., 2005.

[41] Brandon Wagstaff, Valentin Peretroukhin, and Jonathan Kelly. Improving foot-mounted inertial navigation through real-time motion classification. In 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), pages 1–8. IEEE, 2017.

[42] Kejian J Wu, Chao X Guo, Georgios Georgiou, and Stergios I Roumeliotis. VINS on wheels. In 2017 IEEE International Conference on Robotics and Automation (ICRA), pages 5155–5162. IEEE, 2017.

[43] Yulin Yang, Chuchu Chen, and Guoquan Huang. Supplementary material: Analytic combined IMU integrator (ACI2) for visual-inertial navigation. Technical report, RPNG, University of Delaware, 2019. Available: http://udel.edu/~yuyang/downloads/suplementary_2020ICRA.pdf.

[44] Yulin Yang, B. P. W. Babu, Chuchu Chen, Guoquan Huang, and Liu Ren. Analytic combined IMU integration (ACI2) for visual-inertial navigation. In Proc. of the IEEE International Conference on Robotics and Automation, Paris, France, 2020.

[45] Yulin Yang, Patrick Geneva, Xingxing Zuo, and Guoquan Huang. Online IMU intrinsic calibration: Is it necessary? In Proc. of Robotics: Science and Systems (RSS), Corvallis, OR, 2020.

[46] Yulin Yang, Patrick Geneva, Xingxing Zuo, and Guoquan Huang. Online self-calibration for visual-inertial navigation: Models, analysis and degeneracy. IEEE Transactions on Robotics, 2023.

[47] Zichao Zhang and Davide Scaramuzza. A tutorial on quantitative trajectory evaluation for visual(-inertial) odometry. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 7244–7251. IEEE, 2018.