RESEARCH PAPERS
MMDVS-LF: Multi-Modal Dynamic Vision Sensor and Eye-Tracking Dataset for Line Following
This paper presents a multi-modal dataset combining dynamic vision sensor (event) recordings with eye-tracking data for line-following tasks.
eTraM: Event-based Traffic Monitoring Dataset
Event cameras offer high temporal resolution and efficiency but remain underutilized in static traffic monitoring. We present eTraM, a first-of-its-kind event-based dataset with 10 hours of traffic data, 2M annotations, and eight participant classes. Evaluated with RVT, RED, and YOLOv8, eTraM highlights the potential of event cameras for real-world applications.
SEVD: Synthetic Event-based Vision Dataset for Ego and Fixed Traffic Perception
This paper evaluates a dataset using state-of-the-art event-based (RED, RVT) and frame-based (YOLOv8) methods for traffic participant detection and provides baseline benchmarks for assessment. Additionally, the authors conduct experiments to assess the synthetic event-based dataset's generalization capabilities.
Object Detection using Event Camera: A MoE Heat Conduction based Detector and A New Benchmark Dataset
This paper introduces a novel MoE (Mixture of Experts) heat conduction-based object detection algorithm that balances accuracy and computational efficiency. A stem network is first employed for event data embedding, followed by processing through the proposed MoE-HCO blocks.
x-RAGE: eXtended Reality – Action & Gesture Events Dataset
This paper presents the first event-camera based egocentric gesture dataset for enabling neuromorphic, low-power solutions for XR-centric gesture recognition.
Event Stream based Sign Language Translation: A High-Definition Benchmark Dataset and A New Algorithm
Unlike traditional SLT based on visible-light videos, which is easily affected by factors such as lighting, rapid hand movements, and privacy breaches, this paper proposes the use of high-definition event streams for SLT, effectively mitigating these issues.
Event Stream-based Visual Object Tracking: A High-Resolution Benchmark Dataset and A Novel Baseline
This paper proposes a novel hierarchical knowledge distillation framework that can fully utilize multi-modal / multi-view information during training to facilitate knowledge transfer, enabling us to achieve high-speed and low-latency visual tracking during testing by using only event signals.
A Benchmark Dataset for Event-Guided Human Pose Estimation and Tracking in Extreme Conditions
This paper introduces a new hybrid dataset encompassing both RGB and event data for human pose estimation and tracking in two extreme scenarios: low-light and motion blur environments.
EventSTR: A Benchmark Dataset and Baselines for Event Stream based Scene Text Recognition
This paper proposes to recognize the scene text using bio-inspired event cameras by collecting and annotating a large-scale benchmark dataset, termed EventSTR. It contains 9,928 high-definition (1280 × 720) event samples and involves both Chinese and English characters.
Falcon ODIN: an event-based camera payload
This paper describes the mission design and objectives for Falcon ODIN, along with ground-based testing of all four cameras. Falcon ODIN contains two event-based cameras (EBCs) and two traditional framing cameras, along with mirrors mounted on azimuth-elevation rotation stages that allow the field of regard of the EBCs to be moved.
Person Re-Identification without Identification via Event anonymization
This paper proposes an end-to-end network architecture jointly optimized for the twofold objective of preserving privacy and performing a downstream task such as person ReID.
Ultra-Efficient On-Device Object Detection on AI-Integrated Smart Glasses with TinyissimoYOLO
This paper illustrates the design and implementation of tiny machine-learning algorithms exploiting novel low-power processors to enable prolonged continuous operation in smart glasses.
Fast 3D reconstruction via event-based structured light with spatio-temporal coding
This work builds an event-based SL system that consists of a laser point projector and an event camera, and devises a spatial-temporal coding strategy that realizes depth encoding in dual domains through a single shot.
Asynchronous Blob Tracker for Event Cameras
This paper proposes an algorithm that tracks blob-like objects asynchronously from the raw event stream, updating the track estimate event by event to achieve low-latency tracking of fast-moving targets.
Heart Rate Detection Using an Event Camera
This paper proposes to harness the subtle motion of the skin surface caused by the pulsatile flow of blood in the wrist region. It investigates whether an event camera could be used for continuous, noninvasive monitoring of heart rate (HR). Event-camera video data from 25 participants, spanning varying age groups and skin colours, was collected and analysed.
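For intuition only (this is not the authors' pipeline), a heart-rate estimate can be sketched by counting events from a wrist region of interest over time and taking the dominant frequency in a plausible heart-rate band; the bin width and band limits below are illustrative assumptions.

```python
import numpy as np

def estimate_heart_rate(event_timestamps_us, duration_s, bin_ms=10.0,
                        hr_band_hz=(0.7, 3.0)):
    """Estimate heart rate (BPM) from event timestamps within a skin ROI.

    Minimal sketch: bin events into a rate signal, then pick the dominant
    FFT frequency inside a plausible heart-rate band.
    """
    # Build an event-rate signal by counting events per time bin.
    n_bins = int(duration_s * 1000.0 / bin_ms)
    bins = np.linspace(0, duration_s * 1e6, n_bins + 1)
    rate, _ = np.histogram(event_timestamps_us, bins=bins)
    rate = rate - rate.mean()                      # remove the DC component

    # Dominant frequency within the heart-rate band (0.7-3 Hz ~ 42-180 BPM).
    fs = 1000.0 / bin_ms                           # sampling rate of the rate signal
    spectrum = np.abs(np.fft.rfft(rate))
    freqs = np.fft.rfftfreq(len(rate), d=1.0 / fs)
    band = (freqs >= hr_band_hz[0]) & (freqs <= hr_band_hz[1])
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0                      # beats per minute

# Synthetic check: events whose rate is modulated at 1.2 Hz (~72 BPM).
t = np.sort(np.random.uniform(0, 30e6, 200_000))  # 30 s of timestamps (us)
keep = np.random.rand(t.size) < 0.5 * (1 + np.sin(2 * np.pi * 1.2 * t / 1e6))
print(round(estimate_heart_rate(t[keep], 30.0)))   # ~72
```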
SPADES: A Realistic Spacecraft Pose Estimation Dataset using Event Sensing
This paper proposes an effective data filtering method to improve the quality of training data, thus enhancing model performance. Additionally, it introduces an image-based event representation that outperforms existing representations.
Event Camera and LiDAR based Human Tracking for Adverse Lighting Conditions in Subterranean Environments
This paper proposes a novel LiDAR and event camera fusion modality for subterranean (SubT) environments, enabling fast and precise object and human detection in a wide variety of adverse lighting conditions, such as low or no light, high-contrast zones, and blinding light sources.
Learning Parallax for Stereo Event-based Motion Deblurring
This work proposes a novel coarse-to-fine framework, named NETwork of Event-based motion Deblurring with STereo event and intensity cameras (St-EDNet), to recover high-quality images directly from the misaligned inputs, consisting of a single blurry image and the concurrent event streams.
Neural Image Re-Exposure
This work aims at re-exposing the captured photo in post-processing, providing a more flexible way to address such issues within a unified framework. Specifically, it proposes a neural-network-based image re-exposure framework.
Neuromorphic Computing and Sensing in Space
The term “neuromorphic” refers to systems that closely resemble the architecture and dynamics of biological neural networks. From brain-inspired computer chips to sensory devices mimicking human vision and olfaction, neuromorphic computing aims to achieve efficiency levels comparable to biological organisms.
Real-time 6-DoF pose estimation by an event-based camera using active LED markers
This paper proposes a simple but effective event-based pose estimation system using active LED markers (ALM) for fast and accurate pose estimation. The proposed algorithm is able to operate in real time with a latency below 0.5 ms while maintaining output rates of 3 kHz.
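Active LED markers are usually distinguished by their blink frequencies. As an illustration of that idea only (not the paper's ALM algorithm), the sketch below assigns pixels to markers by estimating each pixel's blink frequency from the median interval between its ON events; the marker frequencies and tolerance are assumed inputs.

```python
import numpy as np

def classify_led_pixels(events, marker_freqs_hz, tol_hz=50.0):
    """Assign pixels to active-LED markers by their dominant blink frequency.

    `events` is an (N, 4) array of (x, y, t_us, polarity). Sketch only: for
    each pixel, estimate the blink frequency from the median interval between
    ON events and match it to the closest known marker frequency.
    """
    labels = {}
    on = events[events[:, 3] > 0]                      # ON events only
    for (x, y) in {(int(e[0]), int(e[1])) for e in on}:
        ts = np.sort(on[(on[:, 0] == x) & (on[:, 1] == y)][:, 2])
        if ts.size < 4:
            continue                                   # too few events to estimate
        period_us = np.median(np.diff(ts))
        if period_us <= 0:
            continue
        freq = 1e6 / period_us                         # blink frequency in Hz
        idx = int(np.argmin(np.abs(np.asarray(marker_freqs_hz) - freq)))
        if abs(marker_freqs_hz[idx] - freq) < tol_hz:
            labels[(x, y)] = idx                       # pixel belongs to marker idx
    return labels
```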
The development of a Hardware-in-the-Loop test setup for event-based vision near-space space objects
This paper develops a Hardware-in-the-Loop imaging setup that enables experimenting with event-based and frame-based cameras under simulated space conditions. The generated datasets were used to compare event-based and frame-based feature detection and tracking algorithms for visual navigation.
Collision detection for UAVs using Event Cameras
This paper explores the use of event cameras for collision detection in unmanned aerial vehicles (UAVs). Traditional cameras have been widely used in UAVs for obstacle avoidance and navigation, but they suffer from high latency and low dynamic range. Event cameras, on the other hand, capture only the changes in the scene and can operate at high speeds with low latency.
Demystifying event-based camera latency: sensor speed dependence on pixel biasing, light, and spatial activity
This report explores how various mechanisms affect the response time of event-based cameras (EBCs), unconventional electro-optical/IR vision sensors that are sensitive only to changing light. Because their operation is essentially “frameless,” their response time does not depend on a frame rate or readout time, but rather on the number of activated pixels, the magnitude of background light, local fabrication defects, and the analog configuration of the pixel.
On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing
This paper presents a methodology and a software pipeline for generating event-based vision datasets from optimal landing trajectories during the approach of a target body. It constructs sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility (PANGU) at different viewpoints along a set of optimal descent trajectories obtained by varying the boundary conditions.
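As a simplified stand-in for such a pipeline (the paper renders lunar imagery with PANGU and converts it to events), the toy sketch below emits events wherever the log-intensity difference between consecutive rendered frames exceeds a contrast threshold; the threshold value and the frame-pair approximation are assumptions.

```python
import numpy as np

def frames_to_events(frames, timestamps_s, contrast_threshold=0.2):
    """Toy event simulator: emit (x, y, t, polarity) whenever the log-intensity
    at a pixel changes by more than the contrast threshold between consecutive
    rendered frames. Real simulators interpolate between frames; this sketch
    only compares frame pairs.
    """
    events = []
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)  # per-pixel reference
    for frame, t in zip(frames[1:], timestamps_s[1:]):
        log_img = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_img - log_ref
        # Pixels whose log-intensity rose/fell past the threshold fire events.
        for polarity, mask in ((1, diff >= contrast_threshold),
                               (-1, diff <= -contrast_threshold)):
            ys, xs = np.nonzero(mask)
            events.extend((int(x), int(y), float(t), polarity)
                          for x, y in zip(xs, ys))
        # Update the reference only where events fired (per-pixel memory).
        fired = np.abs(diff) >= contrast_threshold
        log_ref[fired] = log_img[fired]
    return events
```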
Time-resolved velocity profile measurement using event-based imaging
This paper presents the implementation of time-resolved velocity profile measurement using event-based vision (EBV), employing an event camera in place of a high-speed camera.
To change or not to change: Exploring the potential of event-based detectors for wavefront sensing
This paper presents the modelling and preliminary experimental results of a Shack-Hartmann tip-tilt wavefront sensor equipped with an event-based detector, demonstrating its ability to estimate spot displacement with remarkable speed and sensitivity in low-light conditions.
Widefield Diamond Quantum Sensing with Neuromorphic Vision Sensors
This work applies neuromorphic vision sensors to widefield diamond quantum sensing, using an event camera in place of a conventional frame-based camera to record the optical response of the diamond sensor.
Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion
This work estimates depth by fusing refocused events from multiple event cameras and shows that the fusion step simultaneously rejects outlier events.
Are High-Resolution Event Cameras Really Needed?
This work questions whether high-resolution event cameras are actually needed, studying how sensor spatial resolution affects performance on downstream tasks.
Neuromorphic Imaging with Super-Resolution
This paper introduces the first self-supervised neuromorphic super-resolution prototype. It can self-adapt to each input source from any low-resolution camera and estimate an optimal high-resolution counterpart at any scale, without side knowledge or prior training.
Neuromorphic Drone Detection: an Event-RGB Multimodal Approach
This paper addresses drone detection with a multimodal approach that combines event-camera and RGB data.
Interpolation-Based Event Visual Data Filtering Algorithms
This paper proposes filtering methods for event data that are capable of removing approximately 99% of noise while preserving the majority of the valid signal. Four interpolation-based algorithms built on a matrix of infinite impulse response (IIR) filters are presented.
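A minimal sketch of the general idea follows, assuming a first-order IIR (exponential-decay) activity value per pixel and an 8-neighbourhood support test; this is not the paper's exact algorithm, and the time constant and support threshold are illustrative.

```python
import numpy as np

def iir_background_filter(events, width, height, tau_s=0.02, support=0.3):
    """Sketch of IIR-style event denoising (not the paper's exact algorithm).

    Each pixel holds an exponentially decaying activity value (a first-order
    IIR filter driven by incoming events). An event is kept only if the
    decayed activity of its 8-neighbourhood shows enough recent support,
    which suppresses isolated noise events.
    """
    activity = np.zeros((height, width))
    last_update = np.zeros((height, width))
    kept = []
    for x, y, t, p in events:                 # events sorted by timestamp t (s)
        y0, y1 = max(y - 1, 0), min(y + 2, height)
        x0, x1 = max(x - 1, 0), min(x + 2, width)
        # Decay neighbourhood activity to the current time.
        decay = np.exp(-(t - last_update[y0:y1, x0:x1]) / tau_s)
        activity[y0:y1, x0:x1] *= decay
        last_update[y0:y1, x0:x1] = t
        # Keep the event if neighbours (excluding the pixel itself) are active.
        neighbour_sum = activity[y0:y1, x0:x1].sum() - activity[y, x]
        if neighbour_sum > support:
            kept.append((x, y, t, p))
        activity[y, x] += 1.0                 # register this event in the filter
    return kept
```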
Event-Based Shape from Polarization with Spiking Neural Networks
This paper investigates event-based shape from polarization using Spiking Neural Networks (SNNs), introducing the Single-Timestep and Multi-Timestep Spiking UNets for effective and efficient surface normal estimation.
Low-Complexity Lossless Coding of Asynchronous Event Sequences for Low-Power Chip Integration
This paper introduces a groundbreaking low-complexity lossless compression method for encoding asynchronous event sequences, designed for efficient memory usage and low-power hardware integration.
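For context, a common family of low-complexity lossless schemes for event streams combines delta coding with variable-length integers. The sketch below illustrates that general technique and is not the codec proposed in the paper; the field layout (polarity folded into the timestamp delta) is an assumption made for the example.

```python
def encode_varint(value):
    """LEB128-style unsigned varint encoding (7 payload bits per byte)."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        out.append(byte | (0x80 if value else 0))
        if not value:
            return bytes(out)

def zigzag(value):
    """Map signed integers to unsigned so small deltas stay small."""
    return (value << 1) ^ (value >> 63)

def encode_events(events):
    """Delta + varint coding of time-sorted (x, y, t_us, polarity) events.

    Generic low-complexity lossless scheme for illustration only: timestamp
    deltas carry the polarity in their least-significant bit, and pixel
    coordinates are zigzag-encoded deltas so spatially coherent streams
    produce short codes.
    """
    out = bytearray()
    prev_t, prev_x, prev_y = 0, 0, 0
    for x, y, t, p in events:
        dt = t - prev_t                        # >= 0 because events are sorted
        out += encode_varint((dt << 1) | (1 if p > 0 else 0))
        out += encode_varint(zigzag(x - prev_x))
        out += encode_varint(zigzag(y - prev_y))
        prev_t, prev_x, prev_y = t, x, y
    return bytes(out)
```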
Event-Based Shape From Polarization
This paper tackles the speed-resolution trade-off of shape from polarization using event cameras. Event cameras are efficient high-speed vision sensors that asynchronously measure changes in brightness intensity with microsecond resolution.
Neuromorphic Seatbelt State Detection for In-Cabin Monitoring with Event Cameras
This paper investigates event-camera-based detection of seatbelt state (fastened or unfastened) as part of a neuromorphic in-cabin monitoring system.
Evaluating Image-Based Face and Eye Tracking with Event Cameras
This paper showcases the viability of integrating conventional algorithms with event-based data, transformed into a frame format while preserving the unique benefits of event cameras.
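One common way to bridge events and conventional image-based pipelines is to accumulate them into fixed-interval frames. The sketch below shows a simple signed-count accumulation; the window length and normalisation are illustrative choices, not the paper's exact representation.

```python
import numpy as np

def events_to_frames(events, width, height, window_ms=10.0):
    """Accumulate an event stream into fixed-interval frames.

    Illustrative scheme: each frame counts signed polarities per pixel over a
    fixed time window and is normalised to 8-bit grey levels so that standard
    image-based models (e.g. face/eye trackers) can consume it.
    """
    t0 = events[0][2]                                  # timestamps in microseconds
    t_end = events[-1][2]
    n_frames = int(np.ceil((t_end - t0) / (window_ms * 1e3))) or 1
    frames = np.zeros((n_frames, height, width), dtype=np.float32)
    for x, y, t, p in events:
        k = min(int((t - t0) / (window_ms * 1e3)), n_frames - 1)
        frames[k, y, x] += 1.0 if p > 0 else -1.0      # signed event count
    # Normalise each frame to an 8-bit image centred at mid-grey.
    peak = np.maximum(np.abs(frames).max(axis=(1, 2), keepdims=True), 1.0)
    return ((frames / peak) * 127.5 + 127.5).astype(np.uint8)
```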
A Fast Geometric Regularizer to Mitigate Event Collapse in the Contrast Maximization Framework
This paper proposes a novel, computationally efficient regularizer to mitigate event collapse in the CMax framework. From a theoretical point of view, the regularizer is designed based on geometric principles of motion field deformation (measuring area rate of change along point trajectories).
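To make the geometric idea concrete, the toy sketch below evaluates a contrast-maximisation objective for an affine motion field and penalises the field's divergence, which governs the area rate of change along point trajectories. This only illustrates the principle under simplifying assumptions (affine flow, nearest-neighbour accumulation) and is not the paper's formulation or implementation.

```python
import numpy as np

def cmax_with_area_regularizer(events, A, b, img_size, lam=1.0, t_ref=0.0):
    """Toy contrast-maximisation objective with an area-based regularizer.

    Events (x, y, t) are warped to a reference time with an affine motion
    field v(x) = A @ x + b. The data term is the variance (contrast) of the
    image of warped events (IWE); the regularizer penalises the area rate of
    change of the motion field, which for an affine field is |trace(A)| (its
    divergence), discouraging warps that collapse events onto few pixels.
    """
    H, W = img_size
    iwe = np.zeros((H, W))
    for x, y, t in events:
        vx, vy = A @ np.array([x, y]) + b           # motion field at the event
        xw = int(round(x - (t - t_ref) * vx))       # warp event to t_ref
        yw = int(round(y - (t - t_ref) * vy))
        if 0 <= xw < W and 0 <= yw < H:
            iwe[yw, xw] += 1.0                      # accumulate warped events
    contrast = iwe.var()                            # sharper IWE => higher variance
    area_rate = abs(np.trace(A))                    # divergence ~ rate of area change
    return -contrast + lam * area_rate              # minimise: maximise contrast, limit collapse
```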
Pedestrian Detection with High-Resolution Event Camera
This paper compares two methods of processing event data with deep learning for the task of pedestrian detection: a representation in the form of video frames processed by convolutional neural networks, and asynchronous sparse convolutional neural networks.