TRUSTED BY INDUSTRY LEADERS

>10kHz

Equivalent temporal precision

>140dB

Dynamic range

<1ms

End-to-end latency

4x5mm

GenX320 camera module footprint

0.13g

GenX320 camera module weight

<2mW

GenX320 sensing power / 16 μW in standby mode

MULTIMODAL RGB-EVS DRONE DETECTION

This study explores drone detection using event cameras, focusing on three fusion strategies: pooling-based, asymmetric modality injection, and symmetric fusion.

Results show that event-based models outperform RGB alone, and that combining both modalities improves performance further by overcoming each sensor's individual limitations.
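As a rough illustration of the three strategies, here is a minimal PyTorch-style sketch. The module names, feature shapes, and specific operators (element-wise max, residual injection, 1x1 mixing convolution) are illustrative assumptions, not the study's actual architecture.

```python
# Minimal sketch of three RGB-EVS feature-fusion strategies (illustrative only).
import torch
import torch.nn as nn

class PoolingFusion(nn.Module):
    """Pooling-based fusion: merge the two modalities with an element-wise reduction."""
    def forward(self, f_rgb, f_evs):
        return torch.maximum(f_rgb, f_evs)  # max-pool across modalities

class AsymmetricInjection(nn.Module):
    """Asymmetric modality injection: EVS features modulate a primary RGB branch."""
    def __init__(self, channels):
        super().__init__()
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)
    def forward(self, f_rgb, f_evs):
        return f_rgb + self.proj(f_evs)  # RGB stays primary, EVS adds a residual

class SymmetricFusion(nn.Module):
    """Symmetric fusion: both modalities are treated equally, then mixed."""
    def __init__(self, channels):
        super().__init__()
        self.mix = nn.Conv2d(2 * channels, channels, kernel_size=1)
    def forward(self, f_rgb, f_evs):
        return self.mix(torch.cat([f_rgb, f_evs], dim=1))

# Dummy feature maps: batch 1, 64 channels, 32x32 spatial resolution
f_rgb, f_evs = torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)
print(SymmetricFusion(64)(f_rgb, f_evs).shape)  # torch.Size([1, 64, 32, 32])
```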

GPS-DENIED DRONE NAVIGATION

Current GPS systems are vulnerable to interference from large buildings, dense foliage, extreme weather or signal jamming.

The EVS outputs a real-time terrain fingerprint of the patch of land the vehicle is passing over. This fingerprint is then compared against a database of terrain fingerprints, generated from satellite imagery and stored on board the vehicle.
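A minimal sketch of the matching step, assuming one vector descriptor per pre-surveyed map tile and cosine similarity as the match score (both are assumptions for illustration, not Prophesee's published method):

```python
# Illustrative terrain-fingerprint matching against an onboard database.
import numpy as np

def normalize(v):
    """Zero-mean, unit-norm descriptor so the dot product is cosine similarity."""
    v = v - v.mean()
    return v / (np.linalg.norm(v) + 1e-9)

def best_match(live_fp, database):
    """Return (tile_index, score) of the stored fingerprint closest to the live one."""
    live = normalize(live_fp)
    scores = [float(normalize(db_fp) @ live) for db_fp in database]
    idx = int(np.argmax(scores))
    return idx, scores[idx]

# Dummy 256-D fingerprints, one per satellite-derived map tile
database = [np.random.randn(256) for _ in range(10_000)]
live_fp = database[4242] + 0.1 * np.random.randn(256)  # noisy live observation
print(best_match(live_fp, database))  # -> (4242, score close to 1.0)
```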

Resilient Absolute Positioning 

Absolute positioning information over land in GNSS-denied environments

Resistant to Interference

Robust to external signals, disturbances, or influences

Minimal Processing

Achieves optimal performance while using minimal processing resources.

Multitude of Platforms

Broad operational envelope in a low size, weight & power (SWaP) solution.

Passive

Remains covert: no active signals are generated or emitted.

Minimised Data Storage

Innovative data compression minimises storage requirements.

MULTIMODAL SENSING PLATFORM

Prophesee’s event-based Metavision sensors, integrated into the Cambrian M-Series multimodal sensing platform, revolutionize imaging with low-latency performance and microsecond-precision object tracking.

By capturing only scene changes at the pixel level, they drastically enhance energy efficiency, handle challenging lighting with exceptional dynamic range, and reduce data output for real-time analytics.
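The underlying pixel-level principle can be sketched in a few lines: a pixel fires an event only when its log intensity changes by more than a contrast threshold, so static scenes produce almost no data. The threshold value and the frame-based simulation below are illustrative; a real sensor does this asynchronously in analog circuitry.

```python
# Toy event-generation model: per-pixel log-intensity change detection.
import numpy as np

C = 0.15  # contrast threshold in log-intensity units (illustrative value)

def emit_events(ref_log_i, frame, t):
    """Return (x, y, polarity, t) events where the scene changed, and update
    the per-pixel reference levels for the pixels that fired."""
    log_i = np.log(frame.astype(np.float64) + 1e-3)
    diff = log_i - ref_log_i
    ys, xs = np.nonzero(np.abs(diff) >= C)
    polarities = np.sign(diff[ys, xs])
    ref_log_i[ys, xs] = log_i[ys, xs]  # only fired pixels reset their reference
    events = [(int(x), int(y), int(p), t) for x, y, p in zip(xs, ys, polarities)]
    return events, ref_log_i

# A static scene yields zero events; only the one changed pixel fires.
ref = np.log(np.full((4, 4), 100.0))
frame = np.full((4, 4), 100.0)
frame[2, 1] = 140.0  # brightness step at pixel (x=1, y=2)
events, ref = emit_events(ref, frame, t=1_000)
print(events)  # [(1, 2, 1, 1000)]
```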

DRONE DETECTION PLATFORM

Neurobus revolutionizes onboard surveillance with neuromorphic technology, combining event-based vision sensors and AI processors inspired by the human brain for real-time detection and tracking.

Their system enables ultra-fast, autonomous decision-making with minimal latency and power consumption by leveraging edge computing and processing data directly on the device. Designed for drone surveillance, defense, GPS-denied navigation, and high-speed object detection, Neurobus ensures reliable and efficient situational awareness in any environment.

AI-BASED TARGET RECOGNITION

Flysight uses Prophesee event cameras for AI-based target recognition in challenging scenarios, such as persons or drones moving in low light, camouflaged, or against complex backgrounds.

In this work, they cut the time to first detection from 12 seconds to 2 seconds using Prophesee sensors in a challenging camouflaged-personnel detection scenario.

DRONE OBSTACLE AVOIDANCE

Vision-based autonomous flight in unknown environments is difficult in part due to sensor limitations of traditional onboard cameras.

Event cameras, however, promise nearly zero motion blur and high dynamic range. By leveraging depth prediction, the team pre-trained a reactive obstacle avoidance policy and achieved high-speed obstacle avoidance in both indoor and outdoor settings.
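As a toy illustration of the reactive idea only (not the team's learned policy), a predicted depth map can be reduced to a steering command by heading toward the most open image column:

```python
# Minimal reactive steering from a depth map: fly toward the deepest free space.
import numpy as np

def steer_command(depth, fov_deg=90.0):
    """Map an HxW depth map (meters) to a yaw command in degrees."""
    column_clearance = depth.min(axis=0)      # worst-case depth in each column
    best = int(np.argmax(column_clearance))   # most open viewing direction
    w = depth.shape[1]
    return (best - w / 2) / w * fov_deg       # offset from image center -> yaw

depth = np.full((60, 80), 10.0)   # 10 m of clearance everywhere...
depth[:, :60] = 1.5               # ...except an obstacle covering the left 3/4
print(round(steer_command(depth), 1))  # 22.5 -> yaw right toward open space
```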

ASYNCHRONOUS LASER PULSE DETECTION

Detect, localize and decode multiple laser pulses simultaneously, in real time, in compliance with STANAG 3733 (a decoding sketch follows the list below).

– Compatibility with infrared lasers
– Code identification per STANAG 3733
– Improved laser pulse localization precision
– Simultaneous detection of multiple designations
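To illustrate only the decoding principle: pulsed laser designators repeat at a coded interval, so measuring the pulse repetition interval (PRI) from detected pulse timestamps identifies the code. The code table below is invented for the example; actual STANAG 3733 code values are not reproduced here.

```python
# Illustrative PRI-based code identification from pulse timestamps.
import numpy as np

CODE_TABLE = {1111: 88.85e-3, 2222: 71.43e-3}  # code -> PRI in seconds (made up)

def decode_pri(pulse_times_s, tolerance_s=0.5e-3):
    """Estimate the PRI from pulse timestamps and look up the nearest known
    code; returns (code, measured_pri), with code=None if nothing matches."""
    pri = float(np.median(np.diff(np.sort(pulse_times_s))))
    for code, ref_pri in CODE_TABLE.items():
        if abs(pri - ref_pri) <= tolerance_s:
            return code, pri
    return None, pri

# Simulated designator pulsing every 71.43 ms with 20 us timing jitter
t = np.cumsum(np.full(20, 71.43e-3) + np.random.randn(20) * 20e-6)
print(decode_pri(t))  # -> (2222, ~0.07143)
```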

SPACE-BASED OBSERVATION

Event-based sensors have many attractive characteristics for space flight applications, including very low size, weight, power and bandwidth requirements.

Falcon ODIN is a breakthrough space experiment leveraging Prophesee's HD sensor, designed to observe both natural phenomena (lightning and sprites) and man-made objects on the ground, at sea, in the air and in space.

SPACE SITUATIONAL AWARENESS

Astrosite, a world-first neuromorphic-inspired mobile telescope observatory developed by the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University, uses Prophesee event-based sensing as a more efficient, low-power alternative for Space Situational Awareness.

140dB Dynamic Range enabling day/night operations

High-Speed µs temporal resolution for spacecraft / satellite / debris tracking and identification

Low-bandwidth communication: 10 to 1,000x less data

The International Centre for Neuromorphic Systems (ICNS) used their Astrosite Mobile Neuromorphic Observatory to record this striking footage of the full lunar eclipse through a Prophesee event-based camera.

The biology-inspired pixels work independently of one another, allowing some to see the bright parts of the moon while others observe much darker regions.

The above video shows the raw output from the event-based camera (top left) as the telescope is moved to keep a satellite (a CZ-4 rocket body) in the field of view. The location of the satellite is marked with a red circle in both the raw data and the generated star map. The star map is built continuously from the stars as they pass through the field of view (shown with a red square). The algorithm is robust to both the movement of the tracked object and the movement of the telescope itself, as demonstrated by the significant change in the object's x-position visible in the top graph.
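For illustration, the gist of such a tracker can be sketched as an event-driven gated centroid update: events inside a gating circle pull the estimate toward the target, while background and star events outside it are ignored. The gate radius and update rate below are arbitrary, and the actual ICNS algorithm is considerably more sophisticated.

```python
# Toy event-driven tracker: gated exponential-moving-average centroid.
import numpy as np

class CircleTracker:
    def __init__(self, x, y, radius=12.0, alpha=0.05):
        self.pos = np.array([x, y], dtype=float)
        self.radius, self.alpha = radius, alpha  # gate size, update rate

    def update(self, ev_x, ev_y):
        ev = np.array([ev_x, ev_y], dtype=float)
        if np.linalg.norm(ev - self.pos) <= self.radius:
            self.pos += self.alpha * (ev - self.pos)  # pull gate toward target
        return self.pos

tracker = CircleTracker(x=160.0, y=120.0)
for ev_x, ev_y in [(162, 121), (165, 118), (300, 50), (168, 117)]:
    pos = tracker.update(ev_x, ev_y)  # the (300, 50) event falls outside the gate
print(pos.round(1))  # estimate nudged toward the target events
```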

FIND OUT WHAT PROPHESEE METAVISION® TECHNOLOGIES CAN BRING TO YOUR DEFENSE AND AEROSPACE PROJECTS
