RESEARCH PAPERS

Photonic Neuromorphic Accelerators for Event-Based Imaging Flow Cytometry

In this work, we present experimental results from a high-speed, label-free imaging cytometry system that seamlessly merges the high capture rate and data sparsity of an event-based CMOS camera with lightweight photonic neuromorphic processing. The results confirm that neuromorphic sensing and neuromorphic computing can be efficiently merged into a unified bio-inspired system, offering a holistic enhancement in emerging bio-imaging applications.

Re-interpreting the Step-Response Probability Curve to Extract Fundamental Physical Parameters of Event-Based Vision Sensors

In this work, we detail the method for generating accurate S-curves by applying an appropriate stimulus and sensor configuration to decouple 2nd-order effects from the parameter being studied. We use an EVS pixel simulation to demonstrate how noise and other physical constraints can lead to error in the measurement, and develop two techniques that are robust enough to obtain accurate estimates.
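
For readers who want to experiment with the idea: the S-curve is the probability that a pixel emits at least one event as a function of the applied contrast step; its midpoint estimates the pixel's contrast threshold and its spread reflects noise. Below is a minimal sketch, not the paper's estimator, that fits a cumulative Gaussian to synthetic per-step event probabilities:

```python
# Minimal sketch: fit a cumulative-Gaussian "S-curve" to per-pixel event
# probabilities measured at different contrast steps. The fitted mean
# estimates the contrast threshold; the spread reflects pixel noise.
# Synthetic data stands in for real EVS measurements.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def s_curve(contrast, theta, sigma):
    # Probability of firing >=1 event for a log-contrast step `contrast`.
    return norm.cdf((contrast - theta) / sigma)

contrasts = np.linspace(0.0, 0.5, 26)
true_theta, true_sigma = 0.22, 0.04
rng = np.random.default_rng(0)
p_meas = s_curve(contrasts, true_theta, true_sigma)
p_meas = np.clip(p_meas + rng.normal(0, 0.03, p_meas.shape), 0, 1)

(theta_hat, sigma_hat), _ = curve_fit(s_curve, contrasts, p_meas, p0=[0.2, 0.05])
print(f"estimated threshold={theta_hat:.3f}, noise sigma={sigma_hat:.3f}")
```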

Event Cameras Meet SPADs for High-Speed, Low-Bandwidth Imaging

We introduce a sensor fusion framework to combine single-photon avalanche diodes (SPADs) with event cameras to improve the reconstruction of high-speed, low-light scenes while reducing the high bandwidth cost associated with using every SPAD frame. Our evaluation, on both synthetic and real sensor data, demonstrates significant enhancements (> 5 dB PSNR) in reconstructing low-light scenes at high temporal resolution (100 kHz) compared to conventional cameras. Event-SPAD fusion shows great promise for real-world applications, such as robotics or medical imaging.
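
As background on the SPAD side of the fusion: each binary SPAD frame reports, per pixel, whether at least one photon arrived, and under Poisson arrivals the per-frame flux can be recovered from the fraction of "on" frames. A minimal sketch of that standard quanta-imaging estimate (not the paper's fusion method) on synthetic data:

```python
# For Poisson photon arrivals with per-frame flux lam, P(bit=1) = 1 - exp(-lam),
# so the maximum-likelihood estimate from N binary frames is
# lam_hat = -ln(1 - mean_bit). Synthetic frames stand in for sensor data.
import numpy as np

rng = np.random.default_rng(1)
H, W, N = 4, 4, 2000
lam_true = rng.uniform(0.05, 1.5, size=(H, W))          # per-frame photon flux
bits = rng.random((N, H, W)) < (1 - np.exp(-lam_true))  # binary SPAD frames

p_hat = bits.mean(axis=0)
p_hat = np.clip(p_hat, None, 1 - 1e-6)  # avoid log(0) in saturated pixels
lam_hat = -np.log1p(-p_hat)             # MLE of flux per frame
print("max abs flux error:", np.abs(lam_hat - lam_true).max())
```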

Towards a Dynamic Vision Sensor-based Insect Camera Trap

This paper introduces a real-time visual insect monitoring approach capable of detecting and tracking tiny, fast-moving objects in cluttered wildlife conditions using an RGB-DVS stereo-camera system. Our study suggests that DVS-based sensing enables reliable real-time insect detection in wildlife conditions while significantly reducing data storage, manual labour, and energy requirements.
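
As a flavor of why event data suits this task: a small, fast-moving insect shows up as a compact spatiotemporal blob of events, which generic density-based clustering can isolate from sparse background noise. A hedged toy sketch (not the paper's detector), with time scaled so 1 ms is comparable to 1 pixel:

```python
# Toy detection baseline: cluster DVS events in (x, y, t) with DBSCAN.
# A moving "insect" trace plus uniform background noise stand in for data.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(2)
t = np.linspace(0, 50_000, 300)  # microseconds
insect = np.stack([100 + 0.002 * t + rng.normal(0, 1, t.size),
                   80 + 0.001 * t + rng.normal(0, 1, t.size), t], axis=1)
noise = np.stack([rng.uniform(0, 640, 300), rng.uniform(0, 480, 300),
                  rng.uniform(0, 50_000, 300)], axis=1)
events = np.vstack([insect, noise])

feats = events * np.array([1.0, 1.0, 1e-3])  # scale t: 1000 us -> 1 unit
labels = DBSCAN(eps=5.0, min_samples=10).fit_predict(feats)
for k in set(labels) - {-1}:
    c = events[labels == k]
    print(f"cluster {k}: {len(c)} events, "
          f"centroid xy=({c[:, 0].mean():.0f}, {c[:, 1].mean():.0f})")
```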

Spiking Neural Networks for Fast-Moving Object Detection on Neuromorphic Hardware Devices Using an Event-Based Camera

In this paper, we propose a novel solution that combines an event-based camera with Spiking Neural Networks (SNNs) for ball detection. We use multiple state-of-the-art SNN frameworks and develop an SNN architecture for each of them, complying with their corresponding constraints. Additionally, we implement the SNN solution across multiple neuromorphic edge devices, comparing their accuracies and run-times.
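
For context on what those frameworks implement under the hood, the basic unit is a leaky integrate-and-fire (LIF) neuron: the membrane potential leaks, integrates weighted input spikes, and fires and resets on crossing a threshold. A minimal numpy sketch of one LIF layer (illustrative only, not the paper's architecture):

```python
# Minimal LIF layer: leak + integrate weighted input spikes, fire on
# threshold crossing, hard-reset after a spike.
import numpy as np

def lif_layer(spikes_in, W, beta=0.9, v_th=1.0):
    # spikes_in: (T, n_in) binary spikes; W: (n_out, n_in) weights.
    T, n_out = spikes_in.shape[0], W.shape[0]
    v = np.zeros(n_out)
    spikes_out = np.zeros((T, n_out))
    for t in range(T):
        v = beta * v + W @ spikes_in[t]  # leak + integrate
        fired = v >= v_th
        spikes_out[t] = fired
        v[fired] = 0.0                   # hard reset after a spike
    return spikes_out

rng = np.random.default_rng(7)
out = lif_layer(rng.random((100, 16)) < 0.2, rng.normal(0, 0.5, (8, 16)))
print("output spike counts per neuron:", out.sum(axis=0))
```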

APEIRON: a Multimodal Drone Dataset Bridging Perception and Network Data in Outdoor Environments

In this work, we introduce APEIRON, a rich multimodal aerial dataset that simultaneously collects perception data from a stereo camera and an event-based camera, along with measurements of wireless network links obtained using an LTE module. The assembled dataset consists of both perception and network data, making it suitable for typical perception or communication applications.

Helios: An extremely low power event-based gesture recognition for always-on smart eyewear

This paper introduces Helios, the first extremely low-power, real-time, event-based hand gesture recognition system designed for all-day use on smart eyewear. Helios can recognize seven classes of gestures, including subtle microgestures like swipes and pinches, with 91% accuracy. We also demonstrate real-time performance across 20 users at a remarkably low latency of 60 ms.

SEVD: Synthetic Event-based Vision Dataset for Ego and Fixed Traffic Perception

In this paper, we present SEVD, a first-of-its-kind synthetic event-based dataset for both ego and fixed multi-view traffic perception, generated using multiple dynamic vision sensors within the CARLA simulator. We evaluate the dataset using state-of-the-art event-based (RED, RVT) and frame-based (YOLOv8) methods for traffic participant detection tasks and provide baseline benchmarks for assessment.
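
CARLA ships a DVS camera blueprint (sensor.camera.dvs) of the kind such synthetic datasets are recorded with. The sketch below shows how one might attach it to a vehicle and receive events; it assumes a running CARLA server with at least one spawned vehicle, and the attribute and method names follow CARLA's documentation but should be verified against your CARLA version:

```python
# Hedged sketch: capture events with CARLA's built-in DVS camera.
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(5.0)
world = client.get_world()

bp = world.get_blueprint_library().find("sensor.camera.dvs")
bp.set_attribute("positive_threshold", "0.3")  # contrast thresholds
bp.set_attribute("negative_threshold", "0.3")

# Assumes at least one vehicle already exists in the world.
vehicle = world.get_actors().filter("vehicle.*")[0]
transform = carla.Transform(carla.Location(x=1.5, z=2.4))
dvs = world.spawn_actor(bp, transform, attach_to=vehicle)

def on_events(data):
    # data is a carla.DVSEventArray of (x, y, t, pol) events.
    arr = data.to_array()
    print(f"{len(arr)} events at frame {data.frame}")

dvs.listen(on_events)
```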

Optimization of Event Camera Bias Settings for a Neuromorphic Driver Monitoring System

This research is the first to investigate the impact of bias modifications on the output of an event-based DMS and to propose an approach for evaluating and comparing DMS performance. The study examines the impact of pixel-bias alteration on the DMS features: face tracking, blink counting, and head-pose and gaze estimation. The results indicate that the DMS's performance improves with proper bias tuning based on the proposed metrics.
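
In spirit, such a study boils down to sweeping bias settings and scoring the DMS under each. The sketch below illustrates that loop with a synthetic score_dms stand-in; a real system would apply the biases through the camera vendor's API and run the actual face-tracking, blink, head-pose, and gaze pipeline:

```python
# Toy bias-tuning loop: grid-search two contrast-threshold biases and keep
# the setting that maximizes a DMS quality score. score_dms is a synthetic
# stand-in for evaluating the real DMS features under each bias setting.
import itertools

def score_dms(diff_on, diff_off):
    # Stand-in for running the real DMS (face tracking, blink counting,
    # head pose, gaze) and combining its metrics into one score.
    return -(diff_on - 35) ** 2 - (diff_off - 30) ** 2

def tune(diff_on_vals, diff_off_vals):
    return max(itertools.product(diff_on_vals, diff_off_vals),
               key=lambda b: score_dms(*b))

best = tune(range(20, 60, 5), range(20, 60, 5))
print("best (bias_diff_on, bias_diff_off):", best)
```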

eTraM: Event-based Traffic Monitoring for Resource-Efficient Detection and Tracking Across Varied Lighting Conditions

This study proposes an innovative approach that leverages neuromorphic sensor technology to enhance traffic monitoring efficiency while remaining robust under difficult conditions. The quantitative evaluation of the ability of event-based models to generalize to nighttime and unseen scenes further substantiates the compelling potential of event cameras for traffic monitoring, opening new avenues for research and application.

Feasibility study of in-line particle image velocimetry

This article describes recent tests and developments of imaging and evaluation techniques for particle image velocimetry (PIV) that exploit the forward scattering of tracer particles by placing the camera in-line with the illuminating light source, such as a laser or a light-emitting diode. The study highlights the most promising of the various recording configurations and evaluation techniques.
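
Whatever the recording configuration, PIV evaluation ultimately cross-correlates interrogation windows between two exposures and reads the displacement off the correlation peak. A minimal FFT-based sketch on synthetic particle images:

```python
# Estimate particle displacement between two interrogation windows from
# the peak of their FFT-based cross-correlation.
import numpy as np

rng = np.random.default_rng(3)
win1 = np.zeros((64, 64))
ys, xs = rng.integers(8, 56, 40), rng.integers(8, 56, 40)
win1[ys, xs] = 1.0
win2 = np.roll(win1, shift=(3, -2), axis=(0, 1))  # true displacement (3, -2)

corr = np.fft.ifft2(np.fft.fft2(win1).conj() * np.fft.fft2(win2)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
# Map circular FFT indices to signed displacements.
dy = dy - corr.shape[0] if dy > corr.shape[0] // 2 else dy
dx = dx - corr.shape[1] if dx > corr.shape[1] // 2 else dx
print(f"estimated displacement: ({dy}, {dx})")
```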

Complementing Event Streams and RGB Frames for Hand Mesh Reconstruction

In this paper, we propose EvRGBHand – the first approach for 3D hand mesh reconstruction with an event camera and an RGB camera compensating for each other. By fusing two modalities of data across time, space, and information dimensions, EvRGBHand can tackle overexposure and motion blur issues in RGB-based HMR and foreground scarcity as well as background overflow issues in event-based HMR.

Observations of Naturally Occurring Lightning with Event-Based Vision Sensors

This paper demonstrates the effectiveness of event-based vision sensors in lightning research by presenting data collected during a full lightning storm and provides examples of how event-based data can be used to interpret various lightning features. We conclude that the event-based vision sensor has the potential to improve on high-speed imagery thanks to its lower cost, reduced data output, and ease of deployment, ultimately establishing it as an excellent complementary tool for lightning observation.

EventPS: Real-Time Photometric Stereo Using an Event Camera

EventPS seamlessly integrates with both optimization-based and deep-learning-based photometric stereo techniques to offer a robust solution for non-Lambertian surfaces. Extensive experiments validate the effectiveness and efficiency of EventPS compared to frame-based counterparts. Our algorithm runs at over 30 fps in real-world scenarios, unleashing the potential of EventPS in time-sensitive and high-speed downstream applications.
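
For reference, the classic Lambertian baseline that photometric stereo methods, including EventPS, build on recovers a per-pixel normal by solving I = L(ρn) in the least-squares sense, given known light directions L. A minimal sketch for a single pixel (the frame-based formulation, not EventPS's event-domain one):

```python
# Lambertian photometric stereo for one pixel: with known light directions
# L and observed intensities I, solve I = L @ (rho * n) by least squares,
# then split the solution into albedo (norm) and normal (direction).
import numpy as np

L = np.array([[0.0, 0.0, 1.0],
              [0.8, 0.0, 0.6],
              [0.0, 0.8, 0.6],
              [-0.6, -0.6, 0.53]])        # 4 light directions
L /= np.linalg.norm(L, axis=1, keepdims=True)

n_true = np.array([0.2, -0.3, 0.93])
n_true /= np.linalg.norm(n_true)
albedo = 0.7
I = albedo * np.clip(L @ n_true, 0, None)  # rendered intensities

g, *_ = np.linalg.lstsq(L, I, rcond=None)  # g = albedo * normal
print("albedo:", np.linalg.norm(g), "normal:", g / np.linalg.norm(g))
```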

Time Lens++: Event-Based Frame Interpolation With Parametric Non-Linear Flow and Multi-Scale Fusion

In this work, we introduce multi-scale feature-level fusion and one-shot computation of non-linear inter-frame motion from events and images, which can be efficiently sampled for image warping. We also collect the first large-scale events-and-frames dataset, consisting of more than 100 challenging scenes with depth variations, captured with a new experimental setup based on a beamsplitter.

eTraM: Event-Based Traffic Monitoring Dataset

eTraM: Event-Based Traffic Monitoring Dataset

eTraM offers 10 hours of data from different traffic scenarios in various lighting and weather conditions, providing a comprehensive overview of real-world situations. With 2M bounding box annotations, it covers eight distinct classes of traffic participants, ranging from vehicles to pedestrians and micro-mobility.

Time Lens: Event-Based Video Frame Interpolation

In this work, we introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and warping-based interpolation. We extensively evaluate our method on three synthetic and two real benchmarks, showing an up to 5.21 dB improvement in PSNR over state-of-the-art frame-based and event-based methods. We also release a new large-scale dataset of highly dynamic scenarios, aimed at pushing the limits of existing methods.
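
The PSNR figures quoted in such evaluations are the peak signal-to-noise ratio between an interpolated frame and ground truth, in decibels:

```python
# PSNR between a predicted frame and ground truth (values in [0, peak]).
import numpy as np

def psnr(pred, gt, peak=1.0):
    mse = np.mean((pred.astype(np.float64) - gt.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

gt = np.random.default_rng(5).random((256, 256))
pred = np.clip(gt + np.random.default_rng(6).normal(0, 0.05, gt.shape), 0, 1)
print(f"PSNR: {psnr(pred, gt):.2f} dB")
```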

Mamba-FETrack: Frame-Event Tracking via State Space Model

This paper proposes Mamba-FETrack, a novel RGB-Event tracking framework based on the State Space Model (SSM) that achieves high-performance tracking while effectively reducing computational cost. Specifically, we adopt two modality-specific Mamba backbone networks to extract features from RGB frames and event streams.
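
At the heart of Mamba-style backbones is a discretized linear state-space recurrence, x_k = Ā x_{k-1} + B̄ u_k, y_k = C x_k. Below is a minimal numpy sketch of that recurrence using a simple first-order (Euler) discretization, leaving out Mamba's input-dependent selectivity and hardware-aware scan:

```python
# Minimal linear state-space scan: discretize continuous (A, B) with a
# first-order Euler step, then run the recurrence over an input sequence.
import numpy as np

def ssm_scan(u, A, B, C, dt):
    n = A.shape[0]
    A_bar = np.eye(n) + dt * A  # Euler discretization of A
    B_bar = dt * B
    x = np.zeros(n)
    ys = []
    for u_k in u:               # sequential scan over the input
        x = A_bar @ x + B_bar * u_k
        ys.append(C @ x)
    return np.array(ys)

A = np.array([[-0.5, 1.0], [0.0, -0.5]])
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])
y = ssm_scan(np.sin(np.linspace(0, 10, 200)), A, B, C, dt=0.05)
print(y[:5])
```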

Event-based Vision Contactless Fault Diagnosis With Neuromorphic Computing

This letter presents a novel dynamic-vision-enabled contactless cross-domain fault diagnosis method built on neuromorphic computing. An event-based camera is adopted to capture the machine's vibration states visually. A specially designed bio-inspired deep transfer spiking neural network (SNN) model processes the event streams for feature extraction and fault diagnosis.

Low-latency automotive vision with event cameras

Here we propose a hybrid event- and frame-based object detector that preserves the advantages of each modality and thus does not suffer from the bandwidth-latency trade-off of either sensor alone. Our method exploits the high temporal resolution and sparsity of events and the rich but low-temporal-resolution information in standard images to generate efficient, high-rate object detections, reducing perceptual and computational latency.

DSEC: A Stereo Event Camera Dataset for Driving Scenarios

We propose DSEC, a new dataset that contains demanding illumination conditions and provides a rich set of sensory data. DSEC offers data from a wide-baseline stereo setup of two color frame cameras and two high-resolution monochrome event cameras. In addition, we collect lidar data and RTK GPS measurements, both hardware synchronized with all camera data.
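
DSEC distributes its event streams as HDF5 files. The sketch below shows how one might read a slice with h5py; the dataset paths (events/x, events/y, events/t, events/p, plus a t_offset scalar) follow DSEC's published layout, but verify them against the dataset documentation before relying on them:

```python
# Hedged sketch: read a slice of DSEC events with h5py.
import h5py
import numpy as np

def load_events(path, start=0, stop=1_000_000):
    with h5py.File(path, "r") as f:
        ev = f["events"]
        x = ev["x"][start:stop]
        y = ev["y"][start:stop]
        # Per-event timestamps are stored relative to a global offset.
        t = ev["t"][start:stop].astype(np.int64) + int(f["t_offset"][()])
        p = ev["p"][start:stop]
    return x, y, t, p

# Usage (path is illustrative):
# x, y, t, p = load_events("interlaken_00_c/events/left/events.h5")
```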
