REAL-TIME OPTICAL FLOW FOR VEHICULAR PERCEPTION WITH LOW- AND HIGH-RESOLUTION EVENT CAMERAS

UNIVERSITÉ DE TECHNOLOGIE DE COMPIÈGNE (UTC)

 

Vincent Brebion, Julien Moreau, Franck Davoine

ABSTRACT

Event cameras capture changes of illumination in the observed scene rather than accumulating light to create images. Thus, they allow for applications under high-speed motion and complex lighting conditions, where traditional frame-based sensors show their limits with blur and over- or under-exposed pixels. Thanks to these unique properties, they nowadays represent a highly attractive sensor for ITS-related applications. Event-based optical flow (EBOF) has been studied following the rise in popularity of these neuromorphic cameras. The recent arrival of high-definition neuromorphic sensors, however, challenges existing approaches, because of the increased resolution of the event pixel array and the much higher throughput. As an answer to these points, we propose an optimized framework for computing optical flow in real time with both low- and high-resolution event cameras. We formulate a novel dense representation for the sparse flow of events, in the form of the “inverse exponential distance surface”. It serves as an interim frame, designed for the use of proven, state-of-the-art frame-based optical flow computation methods. We evaluate our approach on both low- and high-resolution driving sequences, and show that it often achieves better results than the current state of the art, while also reaching higher frame rates: 250 Hz at 346 × 260 pixels and 77 Hz at 1280 × 720 pixels.

Source: arXiv
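
The abstract only names the “inverse exponential distance surface” without detailing its construction. As a rough illustration of the general idea of such a dense interim frame (an image whose intensity decays exponentially with the distance to the nearest event), here is a minimal Python sketch; the function name, the `alpha` decay parameter, and the use of OpenCV's distance transform are assumptions made for illustration, not the authors' exact formulation.

```python
import numpy as np
import cv2


def inverse_exponential_distance_surface(events_xy, height, width, alpha=6.0):
    """Turn a short window of sparse events into a dense, frame-like image.

    events_xy : (N, 2) integer array of (x, y) event coordinates.
    alpha     : assumed decay parameter controlling how fast intensity
                falls off with distance to the nearest event.
    """
    # Mark every pixel that fired at least one event in the window.
    event_mask = np.zeros((height, width), dtype=np.uint8)
    event_mask[events_xy[:, 1], events_xy[:, 0]] = 1

    # cv2.distanceTransform computes, for each non-zero pixel, the distance
    # to the nearest zero pixel, so event pixels must be set to zero.
    dist = cv2.distanceTransform(1 - event_mask, cv2.DIST_L2, 3)

    # Inverse exponential mapping: bright near events, smoothly darker away
    # from them, yielding an image usable by frame-based optical flow methods.
    return (255.0 * np.exp(-dist / alpha)).astype(np.uint8)


# Example usage on synthetic events for a 346 x 260 sensor.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xs = rng.integers(0, 346, size=5000)
    ys = rng.integers(0, 260, size=5000)
    frame = inverse_exponential_distance_surface(np.stack([xs, ys], axis=1), 260, 346)
    print(frame.shape, frame.dtype)  # (260, 346) uint8
```

In practice, two such interim frames built from consecutive event windows can be fed to any frame-based optical flow estimator, which is the role the paper assigns to this representation.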
