APPLICATIONS

Prophesee’s Metavision® Sensing and Software solutions set a new standard for computer vision

 

We establish new benchmarks in data efficiency, dynamic range, speed and power efficiency. We successfully tackle major computer vision tasks that today pose severe challenges to conventional systems across many fields of application.

Discover below a curated list of public applications; contact us for more information.

MOBILE

Deblur in low light, Deblur high-speed action

IMAGE DEBLURRING

The Prophesee Metavision sensor captures continuous motion, pixel by pixel with microsecond precision, during the exposure time of your phone’s image sensor, repairing motion blur even in the most challenging scenes.

LEARN MORE

Low light performance down to 25 lux 

10,000 images per second
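The approach above can be sketched with the event-double-integral idea from the Event-Based Vision literature: events recorded during the exposure describe how each pixel’s intensity evolved, so the blurred frame (a time average) can be divided back out. Below is a minimal Python sketch on synthetic one-row data; the `deblur` helper and the contrast sensitivity `c` are illustrative assumptions, not Prophesee’s actual pipeline.

```python
import numpy as np

def deblur(blurred, events, c=0.32):
    """Sketch of event-double-integral deblurring for one pixel row.
    During the exposure, the latent intensity at sub-interval k is
    L_k = L_0 * exp(c * signed event count up to k), so the blurred
    pixel equals L_0 times the mean of those exponential factors.
    Dividing the blur by that mean recovers the sharp start frame."""
    expo = np.exp(c * np.cumsum(events, axis=0))  # (N, W) relative intensity
    return blurred / expo.mean(axis=0)

# Toy example: a bright dot (5x the background) sweeping right across
# 6 pixels during the exposure; about 5 contrast events of +/-1 encode
# the ln(5) intensity step as the dot enters/leaves a pixel.
sharp = np.array([5.0, 1.0, 1.0, 1.0, 1.0, 1.0])   # dot at pixel 0
events = np.array([[-5, +5, 0, 0, 0, 0],            # dot moves 0 -> 1
                   [0, -5, +5, 0, 0, 0],            # 1 -> 2
                   [0, 0, -5, +5, 0, 0]])           # 2 -> 3
blurred = sharp * np.exp(0.32 * np.cumsum(events, axis=0)).mean(axis=0)
print(np.round(deblur(blurred, events), 2))  # recovers `sharp`
```

The key design point is that the sensor’s per-pixel timestamps make the exponential integral computable independently for every pixel, which is why deblurring works even for non-uniform motion.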

CONSUMER

Eye tracking, Hand Tracking for AR, Motion detection and analysis, Intruder detection, Traffic data acquisition, Crowd management, People counting, Always-on visual input, Gesture Detection, Simultaneous Localization And Mapping,
Space Situational Awareness

EYE TRACKING

Typical use cases: Foveated rendering, user interaction

Unlock next-generation eye-tracking capabilities with ultra-low-power, high-refresh-rate Metavision® event-based sensors. Reach 1 ms sampling times for ultra-smooth eye-position tracking while optimizing system autonomy and thermal performance.
Video courtesy of ZinnLabs

20mW entire gaze-tracking system

1kHz or more eye position tracking rate

LEARN MORE

CONSTELLATION TRACKING

Achieve high-speed LED frequency detection in the tens of kHz with high tracking precision. Thanks to live frequency analysis, natively filter out parasitic flickering light for optimal tracking robustness.

>10kHz High-speed LED frequency detection

Native parasitic-frequency filtering for optimal tracking robustness
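As an illustration of the frequency-analysis idea, here is a minimal Python sketch (illustrative only, not the Metavision SDK API): it assumes each tracked LED yields one positive-polarity event per blink at a given pixel, estimates the blink frequency from inter-event intervals, and discards pixels whose frequency is far from the target.

```python
import numpy as np

def blink_frequency(timestamps_us):
    """Estimate an LED's blink frequency from the timestamps (in
    microseconds) of its positive-polarity events at one pixel.
    One positive event fires per ON transition, so the median
    inter-event interval approximates the blink period."""
    intervals = np.diff(np.sort(np.asarray(timestamps_us, dtype=float)))
    period_us = np.median(intervals)
    return 1e6 / period_us  # Hz

def keep_target_leds(pixels, target_hz, tol_hz=500.0):
    """Discard pixels whose event frequency is far from the target:
    a crude version of filtering out parasitic flicker."""
    return {xy: ts for xy, ts in pixels.items()
            if abs(blink_frequency(ts) - target_hz) <= tol_hz}

# Synthetic data: one LED blinking at 10 kHz (100 us period) and one
# mains-flicker source at 100 Hz (10,000 us period).
led = {(10, 20): list(range(0, 100_000, 100))}
flicker = {(50, 60): list(range(0, 100_000, 10_000))}
tracked = keep_target_leds({**led, **flicker}, target_hz=10_000)
print(sorted(tracked))  # only the 10 kHz pixel survives
```

Because parasitic sources blink at different rates than the constellation LEDs, the frequency check alone rejects them with no spatial reasoning required.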

GESTURE RECOGNITION

Typical use cases: Touchless interaction

Achieve robust and smooth gesture recognition and tracking thanks to Metavision® event-based sensors’ high dynamic range (>120dB), low-light cutoff (0.05lux), high power efficiency (down to the μW range) and low latency.
Video courtesy of Ultraleap

>120dB dynamic range 

Down to 36 μW power efficiency at sensor level

INSIDE-OUT TRACKING

Unlock ultra-fast and smooth inside-out tracking running at >10kHz and benefit from high robustness to lighting conditions (>120dB dynamic range, 0.05 lux low-light cutoff).

>10kHz high-speed pose estimation

>120dB dynamic range 

FALL DETECTION

Typical use cases: AI-enabled monitoring

Detect and classify activities in real time while respecting subjects’ privacy at the sensor level. Bring more intelligence to the edge and trigger alerts only on key events, such as a person falling in a hospital room, while generating 10-1000x less data and benefiting from high robustness to lighting conditions (>120dB dynamic range, 0.05 lux low-light cutoff).
Video courtesy of YunX

Privacy by design: Metavision sensors do not capture images

AI-enabled: Train your models on lighter datasets thanks to background & color invariability event properties

GESTURE DETECTION

CROWD DETECTION AND TRACKING

VISUAL ODOMETRY

TRAFFIC DATA ACQUISITION

INDUSTRIAL AUTOMATION

Industrial processes, Inspection, Monitoring, Object identification, Detection & tracking, Handling, High speed motion control/robotics, AGV

XYT MOTION ANALYSIS

Typical use cases: Movement analysis – Equipment health monitoring – Machine Behavior monitoring

Discover the power of time-space continuity for your application by visualizing your data with our XYT viewer.

See between the frames

Zoom in time and understand fine motion in the scene

ULTRA SLOW MOTION

Typical use cases: Kinematic Monitoring, Predictive Maintenance

Slow down time, down to the time-resolution equivalent of over 200,000 frames per second, live, while generating orders of magnitude less data than traditional approaches. Understand the finest motion dynamics hiding in ultra-fast and fleeting events.

Over 200,000 fps (time resolution equivalent)

OBJECT TRACKING

Typical use cases: Part pick and place – Robot Guidance – Trajectory monitoring

Track moving objects in the field of view. Leverage the low data-rate and sparse information provided by event-based sensors to track objects with low compute power.

Continuous tracking in time: no more “blind spots” between frame acquisitions

Native segmentation: analyze only motion, ignore the static background
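A toy version of the idea in Python (illustrative only, not the Metavision SDK tracker): events arrive as (timestamp, x, y) tuples, join the nearest live cluster or seed a new one, and clusters starved of events decay away, which is why the static background costs nothing.

```python
import math

def track(events, radius=5.0, decay_us=10_000):
    """Greedy nearest-cluster tracker over a sparse event stream.
    Each event is (t_us, x, y); an event joins the nearest live
    cluster within `radius` pixels, otherwise it seeds a new one.
    Clusters not updated for `decay_us` microseconds are dropped,
    so the static background never accumulates state."""
    clusters = []  # each: {"x":..., "y":..., "t":..., "n":...}
    for t, x, y in events:
        clusters = [c for c in clusters if t - c["t"] <= decay_us]
        best = min(clusters,
                   key=lambda c: math.hypot(c["x"] - x, c["y"] - y),
                   default=None)
        if best and math.hypot(best["x"] - x, best["y"] - y) <= radius:
            # Exponential moving average keeps the centroid smooth.
            best["x"] = 0.8 * best["x"] + 0.2 * x
            best["y"] = 0.8 * best["y"] + 0.2 * y
            best["t"], best["n"] = t, best["n"] + 1
        else:
            clusters.append({"x": x, "y": y, "t": t, "n": 1})
    return clusters

# Two objects drifting slowly: one near (10, 10), one near (100, 50).
stream = [(i * 100, 10 + i * 0.1, 10.0) for i in range(50)] + \
         [(i * 100 + 50, 100.0, 50 + i * 0.1) for i in range(50)]
stream.sort()
print(len(track(stream)))  # 2 clusters
```

Because only moving objects emit events, the tracker touches a handful of clusters per event instead of scanning full frames, which is the source of the low compute cost claimed above.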

OPTICAL FLOW

Typical use cases: Conveyor Speed Measurement – Part/Object Speed Measurement, Trajectory Monitoring, Trajectory Analysis, Trajectory Anticipation

Rediscover this fundamental computer vision building block, but with an event twist. Understand motion much more efficiently, through continuous pixel-by-pixel tracking rather than sequential frame-by-frame analysis.

17x less power compared to traditional image-based approaches 

Get features only on moving objects
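One classic way to compute event-based flow, sketched here in Python on synthetic data, is local plane fitting: a translating edge traces a plane in (x, y, t), and the plane’s slope gives the velocity. This is an illustrative sketch under that assumption, not Prophesee’s implementation.

```python
import numpy as np

def plane_fit_flow(events):
    """Fit the local surface t = a*x + b*y + c to a small
    neighborhood of events (t in seconds, x/y in pixels).
    A translating edge makes its events lie on such a plane;
    the flow is 1/|(a, b)| pixels per second along (a, b)."""
    ev = np.asarray(events, dtype=float)
    t, x, y = ev[:, 0], ev[:, 1], ev[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, _), *_ = np.linalg.lstsq(A, t, rcond=None)
    norm2 = a * a + b * b
    return np.array([a, b]) / norm2  # velocity vector, px/s

# Synthetic vertical edge moving right at 1000 px/s:
# the edge reaches column x at time t = x / 1000.
events = [(x / 1000.0, x, y) for x in range(20) for y in range(5)]
vx, vy = plane_fit_flow(events)
print(vx, vy)  # ~1000 px/s rightward, ~0 vertical
```

Since the fit consumes only the events a moving edge produces, flow comes out natively sparse, matching the “features only on moving objects” property above.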

HIGH-SPEED COUNTING

Typical use cases: Object counting and gauging – pharmaceutical pill counting – Mechanical part counting 

Count objects at unprecedented speed and high accuracy, while generating less data and without any motion blur. Objects are counted as they pass through the field of view, triggering each pixel independently as the object goes by.

READ MORE

>1,000 Obj/s. Throughput

>99.5% Accuracy @1,000 Obj/s.
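The per-pixel triggering described above suggests a very simple counting scheme, sketched here in Python on synthetic timestamps (illustrative only, not the Metavision SDK counting algorithm): events from one object arrive as a tight burst at the counting line, so bursts separated by quiet gaps can be counted directly.

```python
def count_objects(timestamps_us, gap_us=500):
    """Count objects passing a line of pixels from the event
    timestamps they trigger there. Consecutive events closer than
    `gap_us` microseconds belong to the same object; a longer
    quiet gap means the next object has arrived."""
    ts = sorted(timestamps_us)
    if not ts:
        return 0
    count = 1
    for prev, cur in zip(ts, ts[1:]):
        if cur - prev > gap_us:
            count += 1
    return count

# Synthetic stream: 1,000 objects in one second, each producing a
# 100 us burst of events (one burst every 1,000 us).
stream = [obj * 1000 + k * 10 for obj in range(1000) for k in range(10)]
print(count_objects(stream))  # 1000 objects
```

Because no frames are captured, throughput is limited only by the gap between objects, not by an exposure or readout time.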

SPATTER MONITORING

Typical use cases: Traditional milling, laser & process monitoring, Quality prediction

Track small particles with spatter-like motion. Thanks to the high time resolution and dynamic range of our Event-Based Vision sensor, small particles can be tracked in the most difficult and demanding environments.

Up to 200kHz tracking frequency (5µs time resolution)

Simultaneous XYT tracking of all particles

VIBRATION MONITORING

Typical use cases: Motion monitoring, Vibration monitoring, Frequency analysis for predictive maintenance

Monitor vibration frequencies continuously, remotely, with pixel precision, by tracking the temporal evolution of every pixel in a scene. For each event, the pixel coordinates, the polarity of the change and the exact timestamp are recorded, thus providing a global, continuous understanding of vibration patterns.

From 1Hz to kHz range

1 Pixel Accuracy
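The per-pixel timestamps described above are enough to read off a vibration frequency directly. Below is a minimal Python sketch on synthetic data, assuming one positive-polarity event per oscillation cycle at the pixel; it is illustrative, not the Metavision SDK’s vibration analysis.

```python
import numpy as np

def vibration_frequency(timestamps_us, bin_us=100):
    """Estimate a pixel's vibration frequency from the timestamps
    (microseconds) of its positive-polarity events, assuming one
    such event per oscillation cycle. Events are binned into a rate
    signal whose autocorrelation peaks at the vibration period."""
    ts = np.asarray(timestamps_us) // bin_us       # bin index per event
    n = int(ts.max()) + 1
    rate = np.bincount(ts.astype(int), minlength=n)
    ac = np.correlate(rate, rate, mode="full")[n:]  # lags 1 .. n-1
    period_us = (1 + int(np.argmax(ac))) * bin_us
    return 1e6 / period_us                          # Hz

# A surface vibrating at 125 Hz: one positive event per cycle,
# i.e. every 8,000 us, observed for one second.
events = np.arange(0, 1_000_000, 8_000)
print(vibration_frequency(events))  # 125.0
```

Running this independently per pixel yields exactly the kind of full-field frequency map the sensor enables: every pixel is its own contactless vibration probe.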

PARTICLE / OBJECT SIZE MONITORING

Typical use cases: High speed counting, Batch homogeneity & gauging

Control, count and measure the size of objects moving at very high speed in a channel or a conveyor.

Get instantaneous quality statistics in your production line, to control your process.

Up to 500,000 pixels/second speed

99% Counting precision

EDGELET TRACKING

Typical use cases: High speed location, Guiding and fitting for pick & place

Track 3D edges and/or fiducial markers for your AR/VR application. Benefit from the high temporal resolution of events to increase the accuracy and robustness of your edge tracking application.

Automated 3D object detection with geometrical prior

3D object real-time tracking

VELOCITY & FLUID MONITORING

Typical use cases: Fluid dynamics monitoring, Continuous process monitoring of liquid flow

CABLE / YARN VELOCITY & SLIPPING MONITORING

Typical use cases: Yarn quality control, Cable manufacturing monitoring

PLUME MONITORING

Typical use cases: Dispensing uniformity & Coverage control, Quality & efficiency of dispersion, Fluid dynamics analysis for inline process monitoring

NEUROMORPHIC VISION AND TOUCH COMBINED FOR ENHANCED ROBOTIC CAPABILITIES

Researchers at the Collaborative, Learning, and Adaptive Robots (CLeAR) Lab and TEE Research Group at National University of Singapore are taking advantage of the benefits of Prophesee’s Event-Based Vision, in combination with touch, to build new visual-tactile datasets for the development of better learning systems in robotics. The neuromorphic sensor fusion of touch and vision is being used to help robots grip and identify objects.

READ MORE

1,000x faster than human touch

0.08s rotational slip detection

AUTOMOTIVE & MOBILITY

Autonomous driving, Emergency braking assist, Driver assistance, Collision avoidance, Pedestrian protection, Occupant identification and classification, Driver Monitoring Systems

Xperi

DRIVER MONITORING SOLUTION

Leveraging event input from Prophesee’s Metavision sensing technologies, DTS, Inc. from Xperi Corporation developed a world-first neuromorphic driver monitoring solution (DMS), powered by the Prophesee Metavision® Event-Based Vision sensor. With better low-light performance for driver monitoring features, as well as never-before-seen capabilities such as saccadic eye movement and micro-expression monitoring, it is a breakthrough in next-generation in-cabin experiences and safety.

READ MORE

Xperi

ADVANCED EVENT-BASED VISION DRIVER ASSISTED SYSTEMS

VoxelFlow™, developed by Terranet AB in conjunction with Mercedes-Benz, uses the Prophesee Metavision® Event-Based Vision sensor so that autonomous driving (AD) and advanced driver-assistance systems (ADAS) can quickly and accurately understand what is in front of them. It enhances existing radar, lidar and camera systems, which particularly struggle within 30 to 40 meters, where an accident is most likely to take place.

READ MORE

HIGH-SPEED DETECTION AND TRACKING

FLICKERING LED DETECTION

HIGH DYNAMIC RANGE

LOW LIGHT

HIGH DYNAMIC RANGE DETECTION AND TRACKING

MEDICAL

Live sample sterility testing for gene therapy, Vision restoration, Blood cell tracking

SIGHT RESTORATION

Nature Medicine published the first case report of partial recovery of visual function in a blind patient with late-stage retinitis pigmentosa (RP).

The study combines gene therapy with a light-stimulating medical device in the form of goggles that uses Prophesee Metavision® Sensor.

“The light-stimulating goggles capture images from the visual world using a neuromorphic camera that detects changes in intensity, pixel by pixel, as distinct events.”

READ MORE

GENE THERAPY: SAMPLE STERILITY TESTING

Today’s state-of-the-art sterility testing relies on decades-old microbiology that takes 7-14 days, adding a substantial delay to the creation of life-saving cell therapies.

Using the Prophesee Metavision sensor and AI models to detect, track and classify cells, Cambridge Consultants built an automated sterility testing system, cutting required testing time from weeks to milliseconds.

READ MORE

HIGH-SPEED PARTICLE DETECTION & TRACKING IN MICROFLUIDIC DEVICES

Aided by Prophesee’s Metavision sensing technologies, researchers at the University of Glasgow, Heriot-Watt University and the University of Strathclyde have discovered ways to leverage Event-Based Vision’s high-speed particle detection capabilities to perform microfluidic analysis. This enables the detection of particles down to 1 µm in size across a wide range of fluid velocities, up to 1.54 m/s.

READ MORE

BEYOND

Space Situational Awareness, Flow field measurement

SPACE SITUATIONAL AWARENESS

The growing reliance on satellites has led to an increased risk of collisions between space objects. Accurate detection and tracking of satellites has become crucial.

Astrosite, a world-first neuromorphic-inspired mobile telescope observatory developed by the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University, is using Event-Based sensing as a more efficient and low-power alternative for Space Situational Awareness.

READ MORE

FLOW FIELD MEASUREMENT

Researchers at the DLR Institute of Propulsion Technology of the German Aerospace Center are exploring the possibilities of harnessing Event-Based Vision (EBV) for capturing flow fields in both air and water flows. Using a laser or other light source for illumination, the event camera captures the motion of small (micrometer-sized) particles carried with the flow.

READ MORE