Neuromorphic

BIO-PERCEPTION - Next generation of smart vision systems for real-time processing with bio-inspired sensors

Our goal for this project is to lay the groundwork for a new generation of smart autonomous agents capable of real-time 3D perception using biologically inspired vision sensors. These sensors process all pixels independently and signal changes in the scene (events) only when the luminance at a given location changes substantially over time, which happens mainly at object contours and textures. This reduces the transmission of redundant information and hence the data bandwidth.
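The event-generation principle described above can be sketched in a few lines: comparing log-intensity between two instants and emitting an event only where the change crosses a contrast threshold. The threshold value, the log-intensity model, and the `(x, y, t, polarity)` tuple layout are illustrative assumptions, not the specification of any particular sensor.

```python
import numpy as np

def generate_events(prev_log_I, log_I, t, threshold=0.2):
    """Emit events where the log-intensity change exceeds a contrast threshold.

    Returns a list of (x, y, t, polarity) tuples; polarity is +1 for a
    brightness increase and -1 for a decrease.
    """
    diff = log_I - prev_log_I
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    return [(int(x), int(y), t, 1 if diff[y, x] > 0 else -1)
            for y, x in zip(ys, xs)]

# A moving edge: only the pixels whose brightness changed produce events,
# so static background regions transmit nothing.
prev = np.log(np.full((4, 4), 100.0))
curr = prev.copy()
curr[:, 2] = np.log(200.0)          # a vertical edge brightens one column
events = generate_events(prev, curr, t=0.001)
```

Here only the four pixels of the brightened column fire, all with positive polarity, illustrating how redundancy is suppressed at the sensor level.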

BRAINAV - BRAIn-inspired visual processing for real-time energy-efficient autonomous NAVigation

BRAINAV integrates visual processing pipelines for energy-efficient edge computing using neuromorphic strategies, applied to autonomous navigation. First, visual processing is a crucial component of perception systems, particularly for scene awareness in navigation: robotic agents must understand their context to plan and make decisions accordingly. However, computer vision is highly resource-demanding, and effective navigation requires low latencies to close perception-action loops in real time. Second, regarding 3D perception and scene understanding, state-of-the-art solutions focus on accuracy, but other qualities such as energy consumption must also be taken into account.

Dataset for visual navigation with event-based sensors

This work presents a dataset that provides both frame-free event data and conventional image, motion, and depth data, so that event-based methods can be evaluated and compared against conventional frame-based methods. We hope this will help researchers understand the potential of event-based vision as an emerging technology.

Real-time clustering and tracking for event-based sensors

This work presents a real-time clustering technique that takes advantage of the unique properties of event-based vision sensors. Our approach redefines the well-known mean-shift clustering method using asynchronous events instead of conventional frames.
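The idea of mean-shift over asynchronous events can be sketched as an online update: each incoming event either shifts the mean of the nearest cluster toward it or spawns a new cluster, with no frame ever being built. The `radius` and update-rate `alpha` parameters, the dictionary layout, and the exponential-moving-average rule are illustrative simplifications of the approach, not the paper's actual formulation.

```python
import math

def update_clusters(clusters, event, radius=5.0, alpha=0.1):
    """Assign an incoming event (x, y, t) to the nearest cluster within
    `radius`, shifting that cluster's mean toward the event; otherwise
    start a new cluster.
    """
    x, y, t = event
    best, best_d = None, radius
    for c in clusters:
        d = math.hypot(c["x"] - x, c["y"] - y)
        if d < best_d:
            best, best_d = c, d
    if best is None:
        clusters.append({"x": float(x), "y": float(y), "t": t, "n": 1})
    else:
        # Exponential moving average: the cluster mean drifts with the
        # event stream, so tracking comes for free with clustering.
        best["x"] += alpha * (x - best["x"])
        best["y"] += alpha * (y - best["y"])
        best["t"], best["n"] = t, best["n"] + 1
    return clusters

clusters = []
for ev in [(10, 10, 0.0), (11, 10, 0.001), (50, 50, 0.002), (12, 11, 0.003)]:
    update_clusters(clusters, ev)
```

With this stream, the three nearby events collapse into one cluster while the distant event starts a second one, processed event by event as each arrives.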

Contour detection and proto-segmentation with event sensors

This project presents an approach to learning the location of contours and their border ownership using Structured Random Forests on event-based features. Contour detection and boundary assignment are demonstrated in a proto-segmentation application.
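As a rough illustration of the learning setup, the sketch below trains a forest to separate contour patches from flat patches using summary features of local event activity. Everything here is an assumption for illustration: the synthetic features, their meaning, and the use of a plain classification forest (scikit-learn's `RandomForestClassifier`) standing in for a Structured Random Forest, which would instead predict whole contour patches with border-ownership labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for event-based patch features: each row summarizes
# a local patch (e.g. event counts per polarity, a timestamp-gradient
# statistic). Label 1 = patch contains a contour, 0 = flat region.
# Contours concentrate events, so their feature values are much larger.
n = 400
contour = rng.normal(loc=[8.0, 8.0, 1.0], scale=1.0, size=(n, 3))
flat = rng.normal(loc=[1.0, 1.0, 0.1], scale=1.0, size=(n, 3))
X = np.vstack([contour, flat])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Plain classification forest as a simplified proxy for the Structured
# Random Forest used in the project.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
pred = clf.predict([[8.0, 7.5, 0.9], [0.5, 1.2, 0.2]])
```

The forest cleanly separates the two synthetic classes; the real method additionally encodes structured output so that each leaf stores a local contour shape rather than a single label.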