Welcome to the Prophesee Research Library, where academic innovation meets the world’s most advanced event-based vision technologies.
We have brought together groundbreaking research from scholars who are pushing the boundaries with Prophesee Event-based Vision technologies, to inspire collaboration and drive new breakthroughs in the academic community.
Introducing the Prophesee Research Library, the largest curated collection of academic papers leveraging Prophesee event-based vision.
Together, let’s reveal the invisible and shape the future of Computer Vision.
Asynchronous Multi-Object Tracking with an Event Camera
In this paper, the Asynchronous Event Multi-Object Tracking (AEMOT) algorithm is presented for detecting and tracking multiple objects by processing individual raw events asynchronously. AEMOT detects salient event blob features by identifying regions of consistent optical flow using a novel Field of Active Flow Directions built from the Surface of Active Events. Detected features are tracked as candidate objects using the recently proposed Asynchronous Event Blob (AEB) tracker to construct small intensity patches of each candidate object.
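As a rough illustration of the Surface of Active Events idea that AEMOT builds on (a minimal sketch, not the authors' implementation), the snippet below keeps a per-pixel map of the most recent event timestamps and estimates a local flow direction for each incoming event by fitting a plane to recent timestamps in its neighbourhood; the sensor resolution, neighbourhood radius, and time window are assumed values.

```python
# Minimal sketch (not the authors' AEMOT code): maintaining a Surface of
# Active Events (SAE) and estimating a per-event flow direction, one way
# a "Field of Active Flow Directions" could be approximated.
import numpy as np

H, W, R = 480, 640, 3            # sensor size and neighbourhood radius (assumed)
sae = np.full((H, W), -np.inf)   # last-event timestamp per pixel

def process_event(x, y, t):
    """Update the SAE and return a unit flow direction, or None."""
    sae[y, x] = t
    ys, xs = np.mgrid[max(y - R, 0):min(y + R + 1, H),
                      max(x - R, 0):min(x + R + 1, W)]
    ts = sae[ys, xs]
    mask = ts > t - 0.05          # keep only recent events (50 ms window, assumed)
    if mask.sum() < 5:
        return None
    # Least-squares plane t = a*x + b*y + c; the spatial gradient (a, b)
    # points along the local direction of motion of the event edge.
    A = np.stack([xs[mask], ys[mask], np.ones(mask.sum())], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, ts[mask], rcond=None)
    g = coeffs[:2]
    n = np.linalg.norm(g)
    return g / n if n > 1e-9 else None
```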
FRED: The Florence RGB-Event Drone Dataset
The Florence RGB-Event Drone dataset (FRED) is a novel multimodal dataset specifically designed for drone detection, tracking, and trajectory forecasting, combining RGB video and event streams. FRED features more than 7 hours of densely annotated drone trajectories, using five different drone models and including challenging scenarios such as rain and adverse lighting conditions.
RGB-Event Fusion with Self-Attention for Collision Prediction
This paper proposes a neural network framework for predicting the time and position of a collision between an unmanned aerial vehicle and a dynamic object, using RGB and event-based vision sensors. The proposed architecture consists of two separate encoder branches, one per modality, whose features are fused via self-attention to improve prediction accuracy. To facilitate benchmarking, the ABCD dataset is leveraged, enabling detailed comparisons of single-modality and fusion-based approaches.
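To make the two-branch fusion pattern concrete, here is a minimal PyTorch sketch under assumed layer sizes and an assumed (time, x, y) collision output head; it illustrates the general idea of encoding each modality separately and fusing the tokens with self-attention, not the paper's exact architecture.

```python
# Illustrative two-branch RGB + event fusion with self-attention (assumed sizes).
import torch
import torch.nn as nn

class FusionCollisionNet(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        def branch(in_ch):  # small conv encoder producing a token grid
            return nn.Sequential(
                nn.Conv2d(in_ch, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, dim, 3, stride=2, padding=1), nn.ReLU(),
            )
        self.rgb_enc = branch(3)       # RGB frames
        self.evt_enc = branch(2)       # e.g. 2-channel event histogram (assumed)
        self.attn = nn.TransformerEncoderLayer(d_model=dim, nhead=4,
                                               batch_first=True)
        self.head = nn.Linear(dim, 3)  # predicted (time, x, y) of collision (assumed)

    def forward(self, rgb, evt):
        tok = lambda f: f.flatten(2).transpose(1, 2)   # B,C,H,W -> B,N,C
        tokens = torch.cat([tok(self.rgb_enc(rgb)), tok(self.evt_enc(evt))], dim=1)
        fused = self.attn(tokens)      # self-attention over tokens from both modalities
        return self.head(fused.mean(dim=1))

net = FusionCollisionNet()
pred = net(torch.randn(1, 3, 128, 128), torch.randn(1, 2, 128, 128))
```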
The NeuroBench Framework for Benchmarking Neuromorphic Computing Algorithms and Systems
This article presents NeuroBench, a benchmark framework for neuromorphic algorithms and systems. It introduces a common set of tools and systematic methodology for inclusive benchmark measurement, delivering an objective reference framework for quantifying neuromorphic approaches in both hardware-independent and hardware-dependent settings.
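As a concrete, hedged example of a hardware-independent measurement of the kind such benchmarks report, the snippet below computes activation sparsity by hooking a PyTorch model during inference; this is a generic illustration and does not use the NeuroBench API itself.

```python
# Generic illustration: activation sparsity as a hardware-independent metric.
import torch
import torch.nn as nn

def activation_sparsity(model, data):
    """Fraction of zero activations across all ReLU outputs during inference."""
    zeros, total = 0, 0
    def hook(_module, _inputs, out):
        nonlocal zeros, total
        zeros += (out == 0).sum().item()
        total += out.numel()
    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        for x in data:
            model(x)
    for h in handles:
        h.remove()
    return zeros / max(total, 1)

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
print(activation_sparsity(model, [torch.randn(8, 16)]))
```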
Looking into the Shadow: Recording a Total Solar Eclipse with High-resolution Event Cameras
This paper presents the first recording of a total solar eclipse with a pair of high-resolution event cameras, together with the accompanying methodology. A stabilization method is proposed to counteract the manual tripod adjustments required to keep the celestial bodies in frame. A high-dynamic-range image of the sun is also generated during the eclipse, showing how event cameras excel in this respect compared to conventional frame-based cameras.
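For intuition on how an image can be reconstructed from an event stream, the following minimal sketch (not the paper's pipeline) integrates signed event polarities into a per-pixel log-intensity change map; the event format and contrast-threshold value are assumptions.

```python
# Minimal sketch: accumulating signed polarities into a log-intensity change map.
import numpy as np

def integrate_events(events, height, width, contrast_threshold=0.2):
    """events: iterable of (x, y, polarity) with polarity in {-1, +1} (assumed)."""
    log_intensity = np.zeros((height, width), dtype=np.float32)
    for x, y, p in events:
        # Each event signals a fixed log-brightness change of +/- threshold.
        log_intensity[y, x] += p * contrast_threshold
    return log_intensity

# Usage on synthetic events:
rng = np.random.default_rng(0)
evts = [(rng.integers(0, 640), rng.integers(0, 480), rng.choice([-1, 1]))
        for _ in range(10000)]
img = integrate_events(evts, 480, 640)
```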
Graph Neural Network Combining Event Stream and Periodic Aggregation for Low-Latency Event-based Vision
This paper proposes a novel architecture combining an asynchronous, accumulation-free event branch with a periodic aggregation branch to break the accuracy-latency trade-off. The solution enables ultra-low-latency, low-power optical flow prediction from event cameras, achieving per-event prediction with a latency of tens of microseconds.
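The schematic sketch below illustrates this two-branch control flow, pairing a cheap per-event update with a periodic aggregation pass over buffered events; the state contents, timing, and placeholder prediction are assumptions and do not reproduce the paper's graph neural network.

```python
# Schematic two-branch pattern: per-event updates plus periodic aggregation.
AGG_PERIOD_US = 1000              # aggregation period in microseconds (assumed)

class TwoBranchFlow:
    def __init__(self):
        self.state = {}            # features refreshed by the periodic branch
        self.buffer = []           # events awaiting the next aggregation
        self.last_agg_t = 0

    def per_event(self, event):
        """Asynchronous branch: O(1) work per event, low-latency output."""
        x, y, t, p = event
        self.buffer.append(event)
        feat = self.state.get((x, y), 0.0)
        flow = (feat + p, -feat + p)        # placeholder prediction
        if t - self.last_agg_t >= AGG_PERIOD_US:
            self.periodic_aggregate(t)
        return flow

    def periodic_aggregate(self, t):
        """Periodic branch: heavier pass over buffered events to refresh state."""
        for x, y, _, p in self.buffer:
            self.state[(x, y)] = 0.9 * self.state.get((x, y), 0.0) + 0.1 * p
        self.buffer.clear()
        self.last_agg_t = t

tb = TwoBranchFlow()
print(tb.per_event((10, 20, 500, +1)))
```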