HOW TO CALIBRATE YOUR EVENT CAMERA

UNIVERSITY OF ZURICH, ETH ZURICH

 

Manasi Muglikar, Mathias Gehrig, Daniel Gehrig, Davide Scaramuzza

 

ABSTRACT

 

We propose a generic event-camera calibration framework based on image reconstruction. Instead of relying on blinking patterns or external screens, we show that neural-network-based image reconstruction is well suited to intrinsic and extrinsic calibration of event cameras. The advantage of our proposed approach is that it works with standard calibration patterns that do not require active illumination. Furthermore, it enables extrinsic calibration between frame-based and event-based sensors without additional complexity. Both simulation and real-world experiments indicate that calibration through image reconstruction is accurate under common distortion models and a wide variety of distortion parameters.
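To make the pipeline concrete, the following is a minimal sketch of the second stage: once grayscale frames have been reconstructed from the event stream by a neural network (for example, an E2VID-style model, here assumed to have already written its output to disk), a standard pattern-based calibration can be run on them with OpenCV. The directory name, checkerboard geometry, and square size below are illustrative assumptions, not values from the paper.

```python
# Sketch: intrinsic calibration from frames reconstructed out of an event stream.
# Assumes the reconstruction network has already saved grayscale images to
# "reconstructed_frames/" (hypothetical path) showing a standard checkerboard.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)      # inner-corner grid of the checkerboard (cols, rows), assumed
SQUARE_SIZE = 0.025   # square edge length in metres, assumed

# 3D corner coordinates of the checkerboard in the board frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points = [], []
image_size = None

for path in sorted(glob.glob("reconstructed_frames/*.png")):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        continue
    image_size = img.shape[::-1]  # (width, height)
    found, corners = cv2.findChessboardCorners(img, PATTERN)
    if not found:
        continue
    # Refine detected corners to sub-pixel accuracy.
    corners = cv2.cornerSubPix(
        img, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

# Standard pinhole model with radial-tangential distortion.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"RMS reprojection error: {rms:.3f} px")
print("Intrinsics K:\n", K)
print("Distortion coefficients:", dist.ravel())
```

Because the reconstructed frames look like ordinary camera images, the same detections can also feed a stereo or camera-to-camera extrinsic calibration against a frame-based sensor, which is what removes the need for blinking patterns or external screens.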


Source: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2021
