Analogue neuro-memristive convolutional dropout nets

O. Krestinskaya, Alex P. James

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Randomly switching neurons ON and OFF during training and inference is an interesting characteristic of biological neural networks that potentially underlies the inherent adaptability and creativity of the human mind. Dropout is inspired by this random switching behaviour, and in artificial neural networks it is used as a regularization technique to reduce over-fitting during training. Energy-efficient digital implementations of convolutional neural networks (CNNs) have been on the rise for edge-computing IoT applications, where pruning larger networks and optimizing for accuracy have been the main directions of work. In contrast to this approach, we propose to build a near-sensor analogue CNN with high-density memristor crossbar arrays. Since analogue designs use several active elements such as amplifiers, energy efficiency becomes a main challenge. To address this, we extend the use of dropout from training to the inference stage. CNN implementations require a subsampling layer, which is implemented as a mean-pooling layer in the design to ensure lower energy consumption. Along with dropout, we also investigate the effect of non-idealities of the memristor and of the network.
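The two software-level ideas in the abstract, dropout kept active at inference time and a mean-pooling subsampling layer, can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's analogue circuit design; the dropout rate of 0.5 and the toy feature-map size are assumptions, not values from the paper.

```python
import numpy as np

def dropout(x, p, rng):
    """Inverted dropout: zero each element with probability p and scale
    survivors by 1/(1-p) so the expected activation is preserved."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def mean_pool(x, k=2):
    """Non-overlapping k x k mean pooling, the subsampling layer the
    paper implements in analogue hardware for lower energy use."""
    h, w = x.shape
    return x.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

rng = np.random.default_rng(0)

# Toy 8x8 feature map standing in for a convolutional layer's output.
fmap = rng.random((8, 8))

# Unlike standard practice, dropout is applied here at inference as well,
# mirroring the paper's extension of dropout to the inference stage.
out = mean_pool(dropout(fmap, p=0.5, rng=rng))
print(out.shape)  # (4, 4)
```

In standard deep-learning frameworks, dropout is disabled at inference; the sketch simply keeps the random mask active at all times, which is the behavioural change the abstract proposes for energy saving in the analogue design.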
Original language: English (US)
Pages (from-to): 20200210
Journal: Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences
Volume: 476
Issue number: 2242
State: Published - Oct 14 2020
