An optimized recursive learning algorithm for three-layer feedforward neural networks for MIMO nonlinear system identifications

Daohang Sha, Vladimir Bajic

Research output: Contribution to journal › Article › peer-review


Abstract

Back-propagation with the gradient method is the most popular learning algorithm for feed-forward neural networks. However, the algorithm requires a proper fixed learning rate, which is difficult to determine. In this paper, an optimized recursive algorithm for online learning is derived analytically from matrix operations and optimization methods, avoiding the need to select a learning rate for the gradient method. A proof of weak convergence of the proposed algorithm is also given. Although the approach is presented for three-layer feed-forward neural networks, it can be extended to feed-forward networks with more layers. The effectiveness of the proposed algorithm is demonstrated by simulation experiments on the identification of a two-input, two-output non-linear dynamic system.
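The abstract's central idea is to replace the hand-tuned fixed learning rate of gradient back-propagation with a step size computed analytically from matrix operations at each iteration. The paper's own recursive algorithm is not reproduced here; as a loose illustration of that idea only, the sketch below trains a three-layer network on a hypothetical two-input, two-output nonlinear mapping and sets the step size with the Barzilai–Borwein rule, a different but related matrix-derived step formula. All data, layer sizes, and names are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-input, two-output nonlinear system to identify.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
Y = np.column_stack([np.sin(X[:, 0]) * X[:, 1], X[:, 0] ** 2 - X[:, 1]])

# Three-layer network: 2 inputs -> 8 tanh hidden units -> 2 linear outputs.
# All weights live in one flat vector so step-size formulas stay matrix/vector algebra.
sizes = [(2, 8), (8,), (8, 2), (2,)]
offsets = np.cumsum([0] + [int(np.prod(s)) for s in sizes])
w = rng.normal(0.0, 0.5, offsets[-1])

def unpack(w):
    return [w[offsets[i]:offsets[i + 1]].reshape(sizes[i]) for i in range(4)]

def loss_grad(w):
    """Mean squared error and its flat gradient via back-propagation."""
    W1, b1, W2, b2 = unpack(w)
    H = np.tanh(X @ W1 + b1)           # hidden-layer activations
    E = (H @ W2 + b2) - Y              # output-layer error
    n = X.shape[0]
    loss = 0.5 * np.sum(E ** 2) / n
    dP = E / n
    dH = (dP @ W2.T) * (1.0 - H ** 2)  # back-propagate through tanh
    grads = [X.T @ dH, dH.sum(0), H.T @ dP, dP.sum(0)]
    return loss, np.concatenate([g.ravel() for g in grads])

losses = []
loss, g = loss_grad(w)
w_prev, g_prev = w.copy(), g.copy()
w = w - 0.1 * g                        # first step uses a small seed rate
for _ in range(300):
    loss, g = loss_grad(w)
    losses.append(loss)
    # Barzilai-Borwein step: eta = (s.s)/(s.y), clipped for robustness on
    # this nonconvex problem; no fixed learning rate is tuned by hand.
    s, y = w - w_prev, g - g_prev
    denom = s @ y
    eta = np.clip((s @ s) / denom, 1e-3, 2.0) if denom > 1e-12 else 0.1
    w_prev, g_prev = w.copy(), g.copy()
    w = w - eta * g

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The point of the sketch is only that the step length is recomputed each iteration from quantities already produced by back-propagation, which is the kind of tuning-free behavior the abstract claims for the proposed recursive algorithm.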

Original language: English (US)
Pages (from-to): 133-147
Number of pages: 15
Journal: Intelligent Automation and Soft Computing
Volume: 17
Issue number: 2
DOIs
State: Published - Jan 1 2011

Keywords

  • Back Propagation
  • Gradient Descent Method
  • Learning Algorithms
  • Neural Networks

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Computational Theory and Mathematics
  • Artificial Intelligence
