RESEARCH PAPERS

Falcon ODIN: an event-based camera payload

This paper describes the mission design and objectives of Falcon ODIN, along with ground-based testing of all four cameras. Falcon ODIN contains two event-based cameras (EBCs) and two traditional framing cameras, along with mirrors mounted on azimuth-elevation rotation stages that allow the field of regard of the EBCs to be steered.

Asynchronous Blob Tracker for Event Cameras

This work builds an event-based structured-light (SL) system that consists of a laser point projector and an event camera, and devises a spatial-temporal coding strategy that realizes depth encoding in dual domains through a single shot.

Heart Rate Detection Using an Event Camera

This paper proposes to harness the subtle motion of the skin surface caused by the pulsatile flow of blood in the wrist region. It investigates whether an event camera could be used for continuous, noninvasive monitoring of heart rate (HR). Event-camera video data from 25 participants, spanning varying age groups and skin colours, was collected and analysed.
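The core signal-processing idea can be sketched without the paper's full pipeline: bin event counts from a wrist region of interest into a time series, then pick the dominant frequency in the physiological band. Everything below (bin width, band limits, the synthetic event stream) is illustrative, not taken from the paper.

```python
import numpy as np

def estimate_hr_bpm(event_timestamps, duration_s, bin_ms=10.0):
    """Estimate heart rate from event timestamps (seconds) in a skin ROI.

    Bins event counts into a 1-D signal, then picks the dominant
    frequency in a plausible heart-rate band (0.7-3 Hz ~ 42-180 bpm).
    """
    n_bins = int(duration_s * 1000 / bin_ms)
    counts, _ = np.histogram(event_timestamps, bins=n_bins, range=(0, duration_s))
    counts = counts - counts.mean()               # remove DC component
    spectrum = np.abs(np.fft.rfft(counts))
    freqs = np.fft.rfftfreq(n_bins, d=bin_ms / 1000.0)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_freq                        # Hz -> beats per minute

# Synthetic check: an event rate modulated at 1.2 Hz (72 bpm)
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 30, 60000))
keep = rng.uniform(size=t.size) < 0.5 * (1 + np.sin(2 * np.pi * 1.2 * t))
print(round(estimate_hr_bpm(t[keep], 30.0)))  # ≈ 72 bpm
```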

Learning Parallax for Stereo Event-based Motion Deblurring

This work proposes a novel coarse-to-fine framework, named NETwork of Event-based motion Deblurring with STereo event and intensity cameras (St-EDNet), to recover high-quality images directly from the misaligned inputs, consisting of a single blurry image and the concurrent event streams.

Neural Image Re-Exposure

This work aims at re-exposing the captured photo in post-processing, providing a more flexible way to address exposure issues within a unified framework. Specifically, it proposes a neural-network-based image re-exposure framework.

Neuromorphic Computing and Sensing in Space

The term “neuromorphic” refers to systems that closely resemble the architecture and dynamics of biological neural networks. From brain-inspired computer chips to sensory devices mimicking human vision and olfaction, neuromorphic computing aims to achieve efficiency levels comparable to biological organisms.

Collision detection for UAVs using Event Cameras

This paper explores the use of event cameras for collision detection in unmanned aerial vehicles (UAVs). Traditional cameras have been widely used in UAVs for obstacle avoidance and navigation, but they suffer from high latency and low dynamic range. Event cameras, on the other hand, capture only the changes in the scene and can operate at high speeds with low latency.
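The "captures only the changes in the scene" behaviour follows from the standard event-camera pixel model: an event fires whenever log intensity deviates from a reference level by more than a contrast threshold. A minimal single-pixel simulation of that model (the threshold value is chosen for illustration):

```python
import math

def events_from_intensity(samples, threshold=0.2):
    """Toy event-camera pixel: emit (index, polarity) whenever log-intensity
    moves more than `threshold` away from the level at the last event."""
    events = []
    ref = math.log(samples[0])
    for i, intensity in enumerate(samples[1:], start=1):
        logi = math.log(intensity)
        while logi - ref >= threshold:      # brightness increased -> ON event
            ref += threshold
            events.append((i, +1))
        while ref - logi >= threshold:      # brightness decreased -> OFF event
            ref -= threshold
            events.append((i, -1))
    return events

# A brightening then darkening pixel produces ON then OFF events;
# constant stretches produce no events at all.
sig = [1.0, 1.0, 1.5, 2.2, 2.2, 1.0]
print(events_from_intensity(sig))
```

Because static parts of the scene never cross the threshold, data volume and latency scale with scene activity rather than with a fixed frame rate, which is exactly the property the paper exploits for fast obstacle detection.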

Demystifying event-based camera latency: sensor speed dependence on pixel biasing, light, and spatial activity

This report explores how various mechanisms affect the response time of event-based cameras (EBCs). EBCs are based on unconventional electro-optical IR vision sensors that are sensitive only to changing light. Because their operation is essentially “frameless,” their response time does not depend on a frame rate or readout time, but rather on the number of activated pixels, the magnitude of background light, local fabrication defects, and the analog configuration of the pixel.
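Those dependencies can be illustrated with a deliberately simple toy model; the constants and functional forms below are illustrative assumptions, not the report's measurements. It combines a front-end response time that shrinks with light level with a readout delay that grows with the number of simultaneously active pixels.

```python
def pixel_latency_us(illuminance_lux, active_pixels,
                     tau_us=1000.0, bus_rate_eps=100e6):
    """Toy EBC latency model (illustrative numbers only): the photoreceptor
    front-end responds faster at higher photocurrent, while the shared
    event bus adds queueing delay proportional to how many pixels fire
    at once."""
    front_end = tau_us / max(illuminance_lux, 1e-6)   # brighter -> faster
    readout = active_pixels / bus_rate_eps * 1e6      # events / (events/s) -> µs
    return front_end + readout

# Bright, sparse scene vs. dim, busy scene
print(pixel_latency_us(1000.0, 1_000))   # ~ 11 µs
print(pixel_latency_us(10.0, 500_000))   # ~ 5100 µs
```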

On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing

This paper presents a methodology and a software pipeline for generating event-based vision datasets from optimal landing trajectories during the approach of a target body. It constructs sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility (PANGU) at different viewpoints along a set of optimal descent trajectories obtained by varying the boundary conditions.

Widefield Diamond Quantum Sensing with Neuromorphic Vision Sensors

This paper proposes a novel, computationally efficient regularizer to mitigate event collapse in the CMax framework. From a theoretical point of view, the regularizer is designed based on geometric principles of motion field deformation (measuring area rate of change along point trajectories).

Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion

The MSMO algorithm uses the velocities of each event to create an average of the scene and filter out dissimilar events. This work presents a study of the velocity values of the events and explains why an average-based velocity filter is ultimately insufficient for lightweight MSMO detection and tracking of objects using an EBS camera.
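An average-based velocity filter of the kind described can be sketched as follows (a hypothetical reimplementation for illustration, not the authors' code): compute the mean event velocity of the scene and discard events whose velocity strays more than `k` standard deviations from it.

```python
import numpy as np

def velocity_filter(velocities, k=1.0):
    """Keep events whose velocity lies within k standard deviations of the
    scene-average velocity; drop 'dissimilar' events as clutter."""
    v = np.asarray(velocities, dtype=float)          # shape (N, 2): vx, vy
    mean = v.mean(axis=0)
    dist = np.linalg.norm(v - mean, axis=1)
    return dist <= k * dist.std()

# Most events drift right at ~10 px/s; two outliers move differently.
v = [[10, 0], [11, 1], [9, -1], [10, 1], [-40, 30], [35, -25]]
mask = velocity_filter(v)
print(mask)
```

The example also hints at the fragility the study reports: a few strong outliers inflate both the mean and the spread, so the threshold adapts in their favour instead of rejecting them cleanly.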

Are High-Resolution Event Cameras Really Needed?


Neuromorphic Imaging with Super-Resolution

This paper introduces the first self-supervised neuromorphic super-resolution prototype. It can self-adapt to each input source from any low-resolution camera to estimate an optimal, high-resolution counterpart at any scale, without the need for side knowledge or prior training.

Interpolation-Based Event Visual Data Filtering Algorithms

This paper proposes a method for filtering event data that removes approximately 99% of noise while preserving the majority of the valid signal. Four algorithms based on a matrix of infinite-impulse-response (IIR) filters are proposed.
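One way such an IIR-based filter can work (a simplified sketch, not one of the paper's four algorithms; here the trace decays per event rather than per unit time): each event leaves an exponentially decaying trace at its pixel, and a new event is kept only if its neighbourhood shows enough recent activity.

```python
import numpy as np

def iir_denoise(events, width, height, alpha=0.5, support=0.3):
    """First-order IIR activity map: each event deposits a decaying trace at
    its pixel; a new event is kept only if the traces of its 8-neighbours
    show recent activity (isolated noise events have none)."""
    trace = np.zeros((height, width))
    kept = []
    for x, y in events:
        trace *= (1.0 - alpha)                       # IIR decay step
        y0, y1 = max(0, y - 1), min(height, y + 2)
        x0, x1 = max(0, x - 1), min(width, x + 2)
        neighbourhood = trace[y0:y1, x0:x1].sum() - trace[y, x]
        if neighbourhood >= support:
            kept.append((x, y))
        trace[y, x] += 1.0                           # register this event
    return kept

# A moving edge produces clustered events; the stray event at (9, 9) does not.
stream = [(2, 2), (3, 2), (9, 9), (3, 3), (4, 3)]
print(iir_denoise(stream, 12, 12))
```

The IIR recursion is what keeps the filter cheap: per event it only scales and reads a small neighbourhood, with no event history buffer.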

Event-Based Shape From Polarization

This paper tackles the speed-resolution trade-off using event cameras. Event cameras are efficient high-speed vision sensors that asynchronously measure changes in brightness intensity with microsecond resolution.
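The underlying shape-from-polarization measurement is a sinusoid fit: behind a rotating polarizer, pixel intensity follows I(θ) = ½·I_un·(1 + ρ·cos 2(θ − φ)), and recovering the phase φ gives the angle of linear polarization. A least-squares sketch of that fit (the event-based sampling itself is omitted here):

```python
import numpy as np

def aolp_from_samples(thetas, intensities):
    """Recover the angle of linear polarisation phi from intensity samples
    I(theta) = 0.5 * I_un * (1 + rho * cos(2*(theta - phi))) taken behind a
    rotating polariser, via linear least squares on a cos/sin basis."""
    A = np.column_stack([np.ones_like(thetas),
                         np.cos(2 * thetas), np.sin(2 * thetas)])
    c0, a, b = np.linalg.lstsq(A, intensities, rcond=None)[0]
    phi = 0.5 * np.arctan2(b, a)          # phase of the 2*theta sinusoid
    rho = np.hypot(a, b) / c0             # degree of linear polarisation
    return phi, rho

theta = np.linspace(0, np.pi, 50, endpoint=False)
I = 0.5 * 1.0 * (1 + 0.6 * np.cos(2 * (theta - 0.4)))
phi, rho = aolp_from_samples(theta, I)
print(round(phi, 3), round(rho, 3))  # recovers phi = 0.4, rho = 0.6
```

An event camera accelerates exactly this measurement: it samples the sinusoid's zero-crossings and slopes asynchronously instead of waiting for full frames at each polarizer angle.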

Pedestrian Detection with High-Resolution Event Camera

This paper compares two methods of processing event data with deep learning for the task of pedestrian detection: a representation in the form of video frames processed by convolutional neural networks, and asynchronous sparse convolutional neural networks.
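The frame-based representation in the first method amounts to accumulating events into a fixed-size tensor a CNN can consume; a minimal version (the two-channel ON/OFF layout is one common convention, not necessarily the paper's):

```python
import numpy as np

def events_to_frame(events, width, height):
    """Accumulate (x, y, polarity) events into a 2-channel count image
    (channel 0: ON events, channel 1: OFF events) usable as CNN input."""
    frame = np.zeros((2, height, width), dtype=np.float32)
    for x, y, p in events:
        frame[0 if p > 0 else 1, y, x] += 1.0
    return frame

evts = [(1, 0, +1), (1, 0, +1), (2, 3, -1)]
f = events_to_frame(evts, 4, 4)
print(f[0, 0, 1], f[1, 3, 2])  # 2.0 1.0
```

The trade-off the paper examines follows directly: this densification lets standard CNNs run unchanged, while sparse convolutional networks instead exploit the fact that most of `f` stays zero.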

YCB-Ev 1.1: Event-vision dataset for 6DoF object pose estimation

This work introduces the YCB-Ev dataset, which contains synchronized RGB-D frames and event data that enables evaluating 6DoF object pose estimation algorithms using these modalities. This dataset provides ground truth 6DoF object poses for the same 21 YCB objects that were used in the YCB-Video (YCB-V) dataset, allowing for cross-dataset algorithm performance evaluation.

EvTTC: An Event Camera Dataset for Time-to-Collision Estimation

To explore the potential of event cameras in such challenging cases, this paper proposes EvTTC, the first multi-sensor dataset focusing on time-to-collision (TTC) tasks under high-relative-speed scenarios. EvTTC consists of data collected using standard cameras and event cameras, covering various potential collision scenarios in daily driving and involving multiple collision objects.
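TTC itself comes from a simple geometric relation: under constant closing speed, time-to-collision equals the object's apparent size divided by its rate of growth on the image plane. A finite-difference sketch of that relation (not the dataset's estimation method; the approximation is only accurate for small size changes between samples):

```python
def ttc_from_scale(sizes_px, dt_s):
    """Estimate time-to-collision from the apparent size of an approaching
    object in consecutive measurements: under constant closing speed,
    TTC ≈ s / (ds/dt) evaluated at the latest sample."""
    s_prev, s_now = sizes_px[-2], sizes_px[-1]
    growth = (s_now - s_prev) / dt_s        # ds/dt in px/s
    return s_now / growth

# Apparent size grows 100 -> 101 px in 10 ms: about one second to contact
print(ttc_from_scale([100.0, 101.0], 0.01))  # ≈ 1.01 s
```

High relative speeds shrink the usable `dt_s`, which is precisely where an event camera's microsecond timing helps compared to fixed-rate frames.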

Event-based vision in magneto-optic Kerr effect microscopy

This paper explores the use of event cameras as an add-on to traditional MOKE microscopy to enhance time resolution for observing magnetic domains. Event cameras improve temporal resolution to 1 µs, enabling real-time monitoring and post-processing of fast magnetic dynamics. A proof-of-concept feedback control experiment demonstrated a latency of just 25 ms, highlighting the potential for dynamic material research. Limitations of current event cameras in this application are also discussed.

Learned Event-based Visual Perception for Improved Space Object Detection

This paper presents a hybrid image- and event-based architecture for detecting dim space objects in geosynchronous orbit using dynamic vision sensing. Combining conventional and point-cloud feature extractors like PointNet, the approach enhances detection performance in high-background activity scenes. An event-based imaging simulator is also developed for model training and sensor parameter optimization, demonstrating improved recall for dim objects in challenging conditions.

Dataset collection from a SubT environment

This paper introduces a dataset from a subterranean (SubT) environment, captured with state-of-the-art sensors like RGB, RGB-D, event-based, and thermal cameras, along with 2D/3D lidars, IMUs, and UWB positioning systems. Synchronized raw data is provided in ROS message format, enabling evaluations of navigation, localization, and mapping algorithms.
