TY - JOUR
T1 - Sensitivity-Based Adaptive Learning Rules for Binary Feedforward Neural Networks
AU - Zhong, Shuiming
AU - Zeng, Xiaoqin
AU - Wu, Shengli
AU - Han, Lixin
PY - 2012/3
Y1 - 2012/3
N2 - This paper proposes a set of adaptive learning rules for binary feedforward neural networks (BFNNs) by means of a sensitivity measure established to investigate the effect of a BFNN's weight variation on its output. The rules are based on three basic adaptive learning principles: the benefit principle, the minimal disturbance principle, and the burden-sharing principle. To follow the benefit principle and the minimal disturbance principle, a neuron selection rule and a weight adaptation rule are developed. In addition, a learning control rule is developed to follow the burden-sharing principle. The advantage of these rules is that they can effectively guide the BFNN's learning to conduct constructive adaptations and avoid destructive ones. With these rules, a sensitivity-based adaptive learning rule (SBALR) algorithm for BFNNs is presented. Experimental results on a number of benchmark datasets demonstrate that the SBALR algorithm has better learning performance than the Madaline rule II and backpropagation algorithms.
AB - This paper proposes a set of adaptive learning rules for binary feedforward neural networks (BFNNs) by means of a sensitivity measure established to investigate the effect of a BFNN's weight variation on its output. The rules are based on three basic adaptive learning principles: the benefit principle, the minimal disturbance principle, and the burden-sharing principle. To follow the benefit principle and the minimal disturbance principle, a neuron selection rule and a weight adaptation rule are developed. In addition, a learning control rule is developed to follow the burden-sharing principle. The advantage of these rules is that they can effectively guide the BFNN's learning to conduct constructive adaptations and avoid destructive ones. With these rules, a sensitivity-based adaptive learning rule (SBALR) algorithm for BFNNs is presented. Experimental results on a number of benchmark datasets demonstrate that the SBALR algorithm has better learning performance than the Madaline rule II and backpropagation algorithms.
U2 - 10.1109/TNNLS.2011.2177860
DO - 10.1109/TNNLS.2011.2177860
M3 - Article
VL - 23
SP - 480
EP - 491
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
SN - 2162-237X
IS - 3
ER -