ETRAM: EVENT-BASED TRAFFIC MONITORING DATASET

ARIZONA STATE UNIVERSITY


Aayush Atul Verma, Bharatesh Chakravarthi, Arpitsinh Vaghela, Hua Wei, Yezhou Yang

ABSTRACT

Event cameras, with their high temporal resolution, high dynamic range, and minimal memory usage, have found applications in various fields. However, their potential in static traffic monitoring remains largely unexplored. To facilitate this exploration, we present eTraM – a first-of-its-kind, fully event-based traffic monitoring dataset. eTraM offers 10 hours of data from different traffic scenarios in various lighting and weather conditions, providing a comprehensive overview of real-world situations. Providing 2M bounding box annotations, it covers eight distinct classes of traffic participants, ranging from vehicles to pedestrians and micro-mobility. eTraM’s utility has been assessed using state-of-the-art methods for traffic participant detection, including RVT, RED, and YOLOv8. We quantitatively evaluate the ability of event-based models to generalize to nighttime and unseen scenes. Our findings substantiate the compelling potential of leveraging event cameras for traffic monitoring, opening new avenues for research and application. eTraM is available at https://eventbasedvision.github.io/eTraM
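As background for readers new to event cameras: each event is a tuple of pixel coordinates, a timestamp, and a polarity, and detectors such as RVT typically consume events after they are binned into a dense tensor. The sketch below shows one common such conversion, a time-binned voxel grid; the `(x, y, t, polarity)` field layout and the function name are illustrative assumptions, not eTraM's actual on-disk format or loading API.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events into a (num_bins, height, width) voxel grid.

    `events` is assumed to be an (N, 4) array with columns
    (x, y, t, polarity); this layout is an illustrative assumption,
    not eTraM's published format.
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return grid
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    # Map polarity to +1 (brightness increase) / -1 (decrease).
    p = np.where(events[:, 3] > 0, 1.0, -1.0)
    t0, t1 = t.min(), t.max()
    # Assign each event to a temporal bin in [0, num_bins - 1].
    if t1 > t0:
        b = ((t - t0) / (t1 - t0) * (num_bins - 1)).astype(int)
    else:
        b = np.zeros(len(t), dtype=int)
    # Scatter-add polarities; np.add.at handles repeated indices.
    np.add.at(grid, (b, y, x), p)
    return grid
```

Frame-based baselines such as YOLOv8 are then run on these dense tensors (or on simple event-count images) rather than on the raw asynchronous stream.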
