
Toward Joint Approximate Inference of Visual Quantities on Cellular Processor Arrays

Type of publication Peer-reviewed
Form of publication Proceedings (peer-reviewed)
Author Julien N.P. Martel, Miguel Chau, Piotr Dudek, Matthew Cook
Project Biological Information in Cortical Communication


Title of proceedings Circuits and Systems (ISCAS), 2015 IEEE International Symposium on
Place Lisbon
DOI 10.1109/ISCAS.2015.7169083


The interacting visual maps (IVM) algorithm introduced in [1] performs joint approximate inference of several visual quantities, such as optic flow, gray-level intensity, and ego-motion, from the sparse input of a neuromorphic dynamic vision sensor (DVS). Features of the model, such as its intrinsic parallelism and the distributed nature of its computation, make it a natural candidate to benefit from the cellular processor array (CPA) hardware architecture. We have implemented the IVM algorithm on a general-purpose CPA simulator; here we present simulation results demonstrating that the IVM algorithm indeed fits the CPA architecture naturally. Our work indicates that extended versions of the IVM algorithm could benefit greatly from a dedicated hardware implementation, eventually yielding a high-speed, low-power visual odometry chip.
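To make the CPA mapping concrete, the following toy sketch shows the kind of computation such an architecture favors: every "pixel processor" holds a local state and updates it synchronously using only its immediate neighborhood plus sparse event input. This is purely illustrative under assumed dynamics; it does not reproduce the actual IVM update rules coupling the optic-flow, intensity, and ego-motion maps, and all names and parameters (`cpa_step`, `alpha`, `beta`, the event list) are hypothetical.

```python
import numpy as np

H, W = 16, 16
intensity = np.zeros((H, W))  # per-cell state, e.g. a reconstructed gray level

# Hypothetical sparse DVS-like events: (row, col, polarity) tuples.
events = [(4, 5, +1.0), (4, 6, +1.0), (10, 3, -1.0)]

def cpa_step(state, events, alpha=0.2, beta=0.5):
    """One synchronous CPA-style iteration: local smoothing plus event injection."""
    # Neighbor average via shifts: each cell reads only its 4 adjacent cells,
    # which is what lets the scheme map onto a processor-per-pixel array.
    up    = np.roll(state, -1, axis=0)
    down  = np.roll(state,  1, axis=0)
    left  = np.roll(state, -1, axis=1)
    right = np.roll(state,  1, axis=1)
    neighbor_avg = (up + down + left + right) / 4.0
    new_state = (1 - alpha) * state + alpha * neighbor_avg
    # Sparse events act as local evidence injected at individual cells.
    for r, c, pol in events:
        new_state[r, c] += beta * pol
    return new_state

# Relax the map over repeated local iterations.
for _ in range(50):
    intensity = cpa_step(intensity, events)
```

The point of the sketch is the communication pattern: each update touches only nearest neighbors and a cell-local input, so the whole grid can advance in lockstep with one processing element per pixel, which is precisely the regime a CPA accelerates.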