Sensitivity-Based Adaptive Learning Rules for Binary Feedforward Neural Networks

Shuiming Zhong, Xiaoqin Zeng, Shengli Wu, Lixin Han

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)

Abstract

This paper proposes a set of adaptive learning rules for binary feedforward neural networks (BFNNs) by means of a sensitivity measure that is established to investigate the effect of a BFNN's weight variation on its output. The rules are based on three basic adaptive learning principles: the benefit principle, the minimal disturbance principle, and the burden-sharing principle. To follow the benefit and minimal disturbance principles, a neuron selection rule and a weight adaptation rule are developed; in addition, a learning control rule is developed to follow the burden-sharing principle. The advantage of these rules is that they can effectively guide the BFNN's learning to perform constructive adaptations and avoid destructive ones. With the rules, a sensitivity-based adaptive learning (SBALR) algorithm for BFNNs is presented. Experimental results on a number of benchmark data sets demonstrate that the SBALR algorithm achieves better learning performance than the Madaline rule II and backpropagation algorithms.
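The abstract describes the learning rules only at a high level, so the following is a minimal illustrative sketch, not the paper's SBALR algorithm: it trains a small single-hidden-layer binary network in an MRII-like fashion, using the magnitude of a hidden neuron's pre-activation as a stand-in for the sensitivity measure. Candidate neurons are tried in order of least disturbance, a flip is kept only if it corrects the output (the benefit principle), and the committed weight change is the minimum-norm adjustment that realizes the flip. All names (BinaryFFN, train_sample, the eta margin) are illustrative assumptions, not the authors' API.

# Illustrative sketch (not the paper's implementation): a single-hidden-layer
# binary feedforward network with signum activations, trained MRII-style.
# The "sensitivity" proxy here is the magnitude of a hidden neuron's
# pre-activation |net|: flipping a neuron with small |net| disturbs the
# learned mapping least (minimal disturbance), and a flip is committed
# only if it corrects the output on the current sample (benefit principle).
import numpy as np

rng = np.random.default_rng(0)

def sign(x):
    # Binary activation in {-1, +1}; ties broken toward +1.
    return np.where(x >= 0, 1.0, -1.0)

class BinaryFFN:
    def __init__(self, n_in, n_hidden):
        self.W1 = rng.normal(scale=0.5, size=(n_hidden, n_in + 1))  # +1 for bias
        self.W2 = rng.normal(scale=0.5, size=(1, n_hidden + 1))

    def forward(self, x):
        x1 = np.append(x, 1.0)                  # input with bias term
        net_h = self.W1 @ x1                    # hidden pre-activations
        h = sign(net_h)
        y = sign(self.W2 @ np.append(h, 1.0))[0]
        return x1, net_h, h, y

    def train_sample(self, x, target, eta=0.1):
        x1, net_h, h, y = self.forward(x)
        if y == target:
            return False                        # nothing to adapt
        # Neuron selection: try hidden neurons in order of increasing |net|
        # (least disturbance first) and keep the first flip that helps.
        for j in np.argsort(np.abs(net_h)):
            h_trial = h.copy()
            h_trial[j] = -h_trial[j]
            y_trial = sign(self.W2 @ np.append(h_trial, 1.0))[0]
            if y_trial == target:
                # Weight adaptation: minimum-norm change that pushes neuron j
                # just past its decision boundary so the desired flip occurs.
                desired_net = h_trial[j] * (abs(net_h[j]) + eta)
                delta = desired_net - net_h[j]
                self.W1[j] += delta * x1 / (x1 @ x1)
                return True
        # Fall back to adapting the output neuron directly.
        self.W2[0] += eta * target * np.append(h, 1.0)
        return True

# Toy usage: attempt to learn XOR on {-1,+1}^2 inputs.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
T = np.array([-1, 1, 1, -1], dtype=float)
net = BinaryFFN(n_in=2, n_hidden=4)
for epoch in range(200):
    for x, t in zip(X, T):
        net.train_sample(x, t)
errors = sum(net.forward(x)[3] != t for x, t in zip(X, T))
print("misclassified:", errors)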
Original language: English
Pages (from-to): 480-491
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 23
Issue number: 3
DOIs:
Publication status: Published (in print/issue) - Mar 2012
