STUDY AND DESIGN OF AN ENERGY EFFICIENT PERCEPTION MODULE COMBINING EVENT-BASED IMAGE SENSORS AND SPIKING NEURAL NETWORK WITH 3D INTEGRATION TECHNOLOGIES

UNIVERSITÉ GRENOBLE ALPES


Maxence Bouvier

ABSTRACT

Bio-inspired vision sensors and processors have started to attract attention as, after several decades of research, they are beginning to be broadly adopted for industrial purposes. These sensors, also called event-based, generate sparse data with three intrinsic characteristics that bring important advantages to many computer vision applications: event-driven acquisition produces sparse data, at acquisition speeds on the order of a microsecond, while preserving an exceptionally large dynamic range. Event-driven imagers are therefore well suited for deployment in situations where speed and application robustness are of high importance. However, event-based image sensors come with major drawbacks that make them nearly impractical in embedded settings: they are noisy, have low spatial resolution, and generate an enormous amount of data relative to that resolution.

This Ph.D. study therefore focuses on understanding how these sensors can be used and how their drawbacks can be alleviated. The work explores bio-inspired applications for tasks where frame-based methods are already successful but lack robustness, because classical frame-based imagers cannot be intrinsically both high speed and high dynamic range. The manuscript offers guidance for understanding why some algorithms are better matched than others to this novel data type. It also examines why these sensors cannot be used as they are, and how they could nevertheless be efficiently integrated into classical frame-based algorithmic pipelines and systems by applying motion compensation to the raw data.

In addition, a bio-inspired hardware solution that simultaneously reduces the output bandwidth and filters out noise, directly at the output of a grid of event-based pixels, is presented. It consists of the hardware implementation of a bio-inspired convolutional neural network accelerator, a neuromorphic processor, distributed near the sensor, which benefits greatly from being designed for three-dimensional integration. The system was designed to minimize its power budget at the 28 nm FDSOI node and demonstrates 2.86 pJ per synaptic operation, or 93.0 aJ per input event per pixel. Moreover, it scales to megapixel-resolution sensors without additional overhead.
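The thesis's exact motion-compensation scheme is not reproduced here. As a rough illustration of the general idea only, the sketch below (plain Python/NumPy; the helper names `motion_compensate` and `accumulate_frame` and the assumption of a constant image-plane flow are hypothetical, not taken from the manuscript) warps raw events to a common reference time and accumulates them into a frame that a classical frame-based pipeline could consume.

```python
import numpy as np

def motion_compensate(events, flow, t_ref):
    """Warp raw events to a common reference time t_ref.

    events : (N, 4) array of (x, y, t, polarity), one row per event
    flow   : (vx, vy) estimated image-plane velocity in pixels/second,
             assumed constant over the time window for this sketch
    t_ref  : reference timestamp the events are warped to
    """
    x, y, t, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    dt = t_ref - t
    # Shift each event along the estimated motion so that events caused
    # by the same moving edge pile up on the same pixel.
    x_w = np.round(x + flow[0] * dt)
    y_w = np.round(y + flow[1] * dt)
    return np.stack([x_w, y_w, t, p], axis=1)

def accumulate_frame(warped, shape):
    """Accumulate warped events into a signed count frame usable downstream."""
    frame = np.zeros(shape, dtype=np.int32)
    for x, y, _, p in warped:
        if 0 <= y < shape[0] and 0 <= x < shape[1]:
            frame[int(y), int(x)] += 1 if p > 0 else -1
    return frame
```

In practice the flow estimate would come from the events themselves or from an auxiliary sensor and would vary across the image; the point of the sketch is only that, once compensated, events generated by the same edge land on the same pixel, which sharpens the accumulated frame and makes it exploitable by frame-based algorithms.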
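Likewise, the near-sensor neuromorphic accelerator described above is a 28 nm FDSOI hardware design conceived for 3D integration, not software. The toy Python class below (its name and structure are illustrative assumptions, not the thesis design) only shows the event-driven convolution principle it relies on: each incoming pixel event triggers a kernel-sized burst of multiply-accumulates, so work and energy can be counted per synaptic operation rather than per frame.

```python
import numpy as np

class EventDrivenConv:
    """Toy event-driven 2D convolution: each event updates only the
    kernel-sized neighbourhood of the output map, so compute scales with
    event activity instead of frame rate."""

    def __init__(self, height, width, kernel):
        self.out = np.zeros((height, width), dtype=np.float32)
        self.kernel = np.asarray(kernel, dtype=np.float32)  # (kh, kw) weights
        self.kh, self.kw = self.kernel.shape

    def process_event(self, x, y, polarity):
        """Apply the kernel centred on the event location (x, y)."""
        for dy in range(self.kh):
            for dx in range(self.kw):
                oy = y + dy - self.kh // 2
                ox = x + dx - self.kw // 2
                if 0 <= oy < self.out.shape[0] and 0 <= ox < self.out.shape[1]:
                    # One multiply-accumulate here corresponds to one
                    # "synaptic operation" in the energy figures quoted above.
                    self.out[oy, ox] += polarity * self.kernel[dy, dx]
```

A usage example would feed each event of the sensor stream to `process_event` as it arrives; thresholding the accumulated map then yields the filtered, lower-bandwidth output the abstract describes, with the cost of each event bounded by the kernel size.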

Source: HAL Theses
