Welcome to the Prophesee Research Library, where academic innovation meets the world’s most advanced event-based vision technologies.
We have brought together groundbreaking research from scholars who are pushing the boundaries of Prophesee Event-based Vision technologies, to inspire collaboration and drive new breakthroughs across the academic community.
Introducing the Prophesee Research Library: the largest curated collection of academic papers leveraging Prophesee event-based vision.
Together, let’s reveal the invisible and shape the future of Computer Vision.
Astrometric Calibration and Source Characterisation of the Latest Generation Neuromorphic Event-based Cameras for Space Imaging
In this paper, the traditional techniques of astronomy are reconsidered to properly utilise event-based cameras for space imaging and space situational awareness.
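To make the setting concrete, here is a minimal sketch of one building block such work rests on: integrating a raw event stream into a frame and extracting point-source centroids from it. This is an illustration under assumed conventions (a structured event array with x, y, t fields; scipy-based blob extraction), not the paper's method.

```python
import numpy as np
from scipy import ndimage

def accumulate_events(events, height, width, t_start, t_end):
    """Integrate an event stream (structured array with x, y, t fields)
    into a per-pixel count image over one time window."""
    sel = (events["t"] >= t_start) & (events["t"] < t_end)
    img = np.zeros((height, width), dtype=np.float32)
    np.add.at(img, (events["y"][sel], events["x"][sel]), 1.0)
    return img

def star_centroids(img, threshold):
    """Naive source extraction: threshold the count image, label connected
    blobs, and return intensity-weighted (sub-pixel) centroids."""
    labels, n = ndimage.label(img > threshold)
    return ndimage.center_of_mass(img, labels, range(1, n + 1))
```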
Motion Segmentation for Neuromorphic Aerial Surveillance
This paper addresses the challenges of aerial motion segmentation by introducing a novel method that leverages self-supervised vision transformers on both event data and optical flow information. The approach eliminates the need for human annotations and reduces dependency on scene-specific parameters.
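As a rough sketch of the general pattern described here (grouping self-supervised transformer patch features together with optical flow, with no human labels), the snippet below clusters per-patch descriptors with k-means. The shapes, names, and the k-means grouping are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_motion(patch_features, patch_flow, grid_hw, k=2):
    """Cluster per-patch descriptors augmented with mean optical flow
    into k motion groups; returns a coarse mask over the patch grid.

    patch_features: (P, D) self-supervised ViT patch embeddings.
    patch_flow:     (P, 2) mean optical flow per patch.
    grid_hw:        (H, W) patch-grid shape, with H * W == P.
    """
    feats = np.concatenate([patch_features, patch_flow], axis=1)
    feats = (feats - feats.mean(0)) / (feats.std(0) + 1e-8)  # per-dim standardization
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(feats)
    return labels.reshape(grid_hw)
```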
CoSEC: A Coaxial Stereo Event Camera Dataset for Autonomous Driving
This paper introduces hybrid coaxial event-frame devices to build a multimodal system, and proposes a coaxial stereo event camera (CoSEC) dataset for autonomous driving. The system first uses a microcontroller to achieve time synchronization across sensors, then spatially calibrates them by performing intra- and inter-calibration of the stereo coaxial devices.
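For readers unfamiliar with the inter-calibration step, here is a hedged sketch of what calibrating two time-synchronized devices can look like using OpenCV's standard checkerboard pipeline. The function and variable names are hypothetical, and the paper's actual coaxial calibration procedure may differ.

```python
import cv2

def inter_calibrate(obj_pts, img_pts_a, img_pts_b, image_size):
    """Estimate each device's intrinsics from checkerboard views, then the
    rotation/translation between the devices (intra- then inter-calibration).

    obj_pts: list of (N, 3) float32 board-point arrays, one per view;
    img_pts_a, img_pts_b: matching (N, 1, 2) corner arrays per device.
    """
    _, K_a, d_a, _, _ = cv2.calibrateCamera(obj_pts, img_pts_a, image_size, None, None)
    _, K_b, d_b, _, _ = cv2.calibrateCamera(obj_pts, img_pts_b, image_size, None, None)
    # Keep the intrinsics fixed and solve only for the extrinsics between devices.
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, img_pts_a, img_pts_b, K_a, d_a, K_b, d_b, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_a, d_a, K_b, d_b, R, T
```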
Ev-Layout: A Large-scale Event-based Multi-modal Dataset for Indoor Layout Estimation and Tracking
This paper presents Ev-Layout, a novel large-scale event-based multi-modal dataset designed for indoor layout estimation and tracking. Ev-Layout's key contribution to the community is a hybrid data collection platform (with a head-mounted display and VR interface) that integrates both RGB and bio-inspired event cameras to capture indoor layouts in motion.
RGBE-Gaze: A Large-Scale Event-Based Multimodal Dataset for High Frequency Remote Gaze Tracking
This paper presents the dataset's characteristics, such as head pose, gaze direction, and pupil size. It also introduces a hybrid frame-event gaze estimation method designed specifically for the collected dataset, and extensively evaluates different benchmark methods under various gaze-related factors.
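As one plausible fusion scheme for hybrid frame-event gaze estimation (an assumption for illustration, not the paper's algorithm): anchor on the most recent frame-based estimate, then apply high-rate event-derived increments between frames. All names below are hypothetical.

```python
import numpy as np

def fuse_gaze(frame_gaze, frame_ts, event_deltas, event_ts):
    """Upsample gaze estimates: re-anchor on each new low-rate frame-based
    estimate, then accumulate high-rate event-derived gaze increments.

    frame_gaze: (F, 2) gaze estimates from the RGB pipeline; frame_ts: (F,).
    event_deltas: (E, 2) incremental gaze changes from events; event_ts: (E,).
    Both timestamp arrays are assumed sorted.
    """
    out, i, gaze = [], 0, frame_gaze[0]
    for delta, t in zip(event_deltas, event_ts):
        while i + 1 < len(frame_ts) and frame_ts[i + 1] <= t:
            i += 1
            gaze = frame_gaze[i]          # new absolute anchor arrived
        gaze = gaze + delta               # high-rate event-based update
        out.append(gaze)
    return np.array(out)
```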
Synthetic Lunar Terrain: A Multimodal Open Dataset for Training and Evaluating Neuromorphic Vision Algorithms
Synthetic Lunar Terrain (SLT) is an open dataset collected from an analogue test site for lunar missions, featuring synthetic craters in a high-contrast lighting setup. It includes several side-by-side captures from event-based and conventional RGB cameras, supplemented with a high-resolution 3D laser scan for depth estimation.
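To illustrate how a laser scan can supply depth ground truth for such side-by-side captures, here is a minimal sketch that projects a point cloud into a calibrated camera view. It assumes a pinhole model and known extrinsics; the names are hypothetical and not part of the dataset's tooling.

```python
import numpy as np

def scan_to_depth(points_world, R, t, K, height, width):
    """Project laser-scan points into a camera to build a depth map;
    the nearest point wins per pixel, 0 marks pixels with no return.

    points_world: (N, 3); R, t: world-to-camera extrinsics; K: (3, 3) intrinsics.
    """
    pts_cam = points_world @ R.T + t
    z = pts_cam[:, 2]
    front = z > 0                                    # keep points in front of the camera
    uvw = pts_cam[front] @ K.T                       # pinhole projection
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
    depth = np.full((height, width), np.inf, dtype=np.float32)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    np.minimum.at(depth, (v[ok], u[ok]), z[front][ok])
    depth[np.isinf(depth)] = 0.0
    return depth
```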