3D Sensing and Sensor Fusion

Type of instruction
Part of degree program


Recommended in

Semester 3

Typically offered in

Autumn semester

Course description

Operating principles of 3D sensors. Active and passive 3D sensing for autonomous vehicles. Cameras, video cameras, depth cameras, LiDAR sensors, radars, and sonars. Comparison of the sensors: application areas, advantages, and limitations. Sensor fusion at the data and feature levels and in state space. Camera-LiDAR and camera-depth camera fusion. Sensor fusion and semantic segmentation.
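A basic building block of the camera-LiDAR fusion covered in the course is projecting LiDAR points into the camera image so that depth can be associated with pixels. The sketch below illustrates this with a standard pinhole model; the function name, the intrinsic matrix `K`, and the extrinsic rotation `R` and translation `t` are illustrative assumptions, not part of the course material.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project 3D LiDAR points into the camera image plane.

    points_lidar: (N, 3) points in the LiDAR frame.
    K: (3, 3) camera intrinsic matrix.
    R, t: rotation (3, 3) and translation (3,) taking LiDAR
          coordinates into the camera frame (assumed calibration).
    Returns (M, 2) pixel coordinates of the points with positive depth.
    """
    # Rigid transform from the LiDAR frame to the camera frame.
    points_cam = points_lidar @ R.T + t
    # Discard points behind the camera (non-positive depth).
    points_cam = points_cam[points_cam[:, 2] > 0]
    # Perspective projection: homogeneous pixel coords, then divide by depth.
    pixels_h = points_cam @ K.T
    return pixels_h[:, :2] / pixels_h[:, 2:3]
```

With an identity extrinsic calibration, a point on the optical axis lands on the principal point of `K`, which is a quick sanity check for the projection pipeline.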


Compulsory readings:

  • J. Janai et al., Computer Vision for Autonomous Vehicles: Problems, Datasets and State-of-the-Art, arXiv preprint arXiv:1704.05519, 2017.
  • I. Eichhardt, D. Chetverikov, Z. Jankó, Image-guided ToF depth upsampling: a survey, Machine Vision and Applications, vol. 28, pp. 267-282, 2017.
  • Grzegorzek, M., Theobalt, C., Koch, R., Kolb, A. (eds.), Time-of-Flight and Depth Imaging. Sensors, Algorithms, and Applications, Lecture Notes in Computer Science, vol. 8200, Springer, 2013. ISBN: 978-3-642-44963-5 (Print), 978-3-642-44964-2 (Online).

Recommended readings:

  • H. Fourati (ed.), Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, 2015. ISBN: 9781482263749.
  • M. Liggins II, D. Hall, J. Llinas, Handbook of Multisensor Data Fusion: Theory and Practice, CRC Press, 2008. ISBN: 9781420053081.