Neuromorphic

BIO-PERCEPTION - Next generation of smart vision systems for real-time processing with bio-inspired sensors

Our goal in this project is to lay the foundation for a new generation of smart autonomous agents able to carry out 3D perception in real time using biologically inspired vision sensors. These sensors process every pixel independently and only signal changes in the scene (events) when the luminance at a given location varies substantially over time, which typically happens only at object contours and textures. This reduces the transmission of redundant information and, hence, the required data bandwidth.
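The sketch below illustrates this event-generation principle in simplified form: a pixel emits an event only when its log-intensity change exceeds a contrast threshold. The function name, threshold value, and the use of two frames to stand in for two points in time are assumptions for illustration; real event sensors operate asynchronously per pixel.

```python
import numpy as np

def generate_events(prev_frame, curr_frame, t, contrast_threshold=0.15):
    """Emit events for pixels whose log-intensity change exceeds the threshold.

    Illustrative model only: a real event sensor compares each pixel against
    its own last-event intensity asynchronously, not frame against frame.
    """
    eps = 1e-6  # avoid log(0)
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(delta) >= contrast_threshold)
    polarity = np.sign(delta[ys, xs]).astype(np.int8)  # +1 brighter, -1 darker
    # Each event is (t, x, y, polarity); only changing pixels are transmitted.
    return [(t, int(x), int(y), int(p)) for x, y, p in zip(xs, ys, polarity)]
```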

Dataset for visual navigation with event-based sensors

This work presents a dataset that provides both frame-free event data and classic image, motion, and depth data, so that different event-based methods can be assessed and compared against conventional frame-based methods. We hope this will help researchers understand the potential of event-based vision technology.
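As a hedged illustration of how such frame-free event data is commonly handled, the snippet below stores events as timestamped (t, x, y, polarity) records and slices the stream over a time window, e.g. to align it with a conventional frame. Field names and the windowing helper are assumptions, not the dataset's actual format.

```python
import numpy as np

# Hypothetical event record layout; the released dataset format may differ.
event_dtype = np.dtype([("t", np.float64),   # timestamp in seconds
                        ("x", np.uint16),    # pixel column
                        ("y", np.uint16),    # pixel row
                        ("p", np.int8)])     # polarity: +1 / -1

def events_in_window(events, t_start, t_end):
    """Select events falling inside [t_start, t_end), e.g. the interval
    spanned by one conventional frame's exposure."""
    mask = (events["t"] >= t_start) & (events["t"] < t_end)
    return events[mask]

# Toy example: three synthetic events, slice the first frame interval.
events = np.array([(0.010, 12, 7, 1), (0.012, 13, 7, -1), (0.031, 40, 22, 1)],
                  dtype=event_dtype)
print(events_in_window(events, 0.0, 0.02))  # first two events
```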

Real-time clustering and tracking for event-based sensors

This work presents a real-time clustering technique that takes advantage of the unique properties of event-based vision sensors. Our approach reformulates the well-known mean-shift clustering method to operate on asynchronous events instead of conventional frames.
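A much-simplified sketch of the event-driven idea is given below: each incoming event nudges the nearest cluster centre toward it with a Gaussian kernel weight, and events far from every centre spawn a new cluster. The class name, bandwidth, learning rate, and spawning rule are illustrative assumptions and do not reproduce the method in the paper.

```python
import numpy as np

class EventMeanShift:
    """Simplified event-driven take on mean-shift clustering:
    every event shifts the nearest cluster centre, kernel-weighted."""

    def __init__(self, bandwidth=10.0, learning_rate=0.05):
        self.bandwidth = bandwidth        # spatial kernel size in pixels
        self.learning_rate = learning_rate
        self.centers = []                 # list of np.array([x, y])

    def update(self, x, y):
        point = np.array([x, y], dtype=float)
        if not self.centers:
            self.centers.append(point)
            return 0
        dists = [np.linalg.norm(point - c) for c in self.centers]
        k = int(np.argmin(dists))
        if dists[k] > 3 * self.bandwidth:
            # Event far from every existing cluster: start a new one.
            self.centers.append(point)
            return len(self.centers) - 1
        # Gaussian kernel weight: closer events pull the centre more strongly.
        w = np.exp(-0.5 * (dists[k] / self.bandwidth) ** 2)
        self.centers[k] += self.learning_rate * w * (point - self.centers[k])
        return k

tracker = EventMeanShift()
for x, y in [(5, 5), (6, 4), (7, 6), (120, 80)]:   # toy event coordinates
    print(tracker.update(x, y))
```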

Contour detection and proto-segmentation with event sensors

This project presents an approach to learning the location of contours and their border ownership using Structured Random Forests on event-based features. Contour detection and boundary assignment are demonstrated in a proto-segmentation application.
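The sketch below is a stand-in only: it uses a plain random forest (scikit-learn's RandomForestClassifier, not a Structured Random Forest) to classify per-pixel event-based features as contour versus background. The feature layout, labels, and data are hypothetical and serve only to show the overall pipeline shape.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_pixels, n_features = 500, 8                 # e.g. local event counts, polarities
X = rng.normal(size=(n_pixels, n_features))   # event-based features per pixel
y = rng.integers(0, 2, size=n_pixels)         # 1 = contour, 0 = background (synthetic)

# Plain random forest as a simplified proxy for a Structured Random Forest.
clf = RandomForestClassifier(n_estimators=50, max_depth=10, random_state=0)
clf.fit(X, y)

# Predicted contour probabilities would then feed a border-ownership stage
# and a proto-segmentation step, as described above.
contour_prob = clf.predict_proba(X)[:, 1]
print(contour_prob[:5])
```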