Sensitivity analysis of multilayer perceptron to input and weight perturbations
- 1 November 2001
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 12 (6), 1358-1366
- https://doi.org/10.1109/72.963772
Abstract
An important issue in the design and implementation of a neural network is the sensitivity of its output to input and weight perturbations. In this paper, we discuss the sensitivity of the most popular and general feedforward neural network, the multilayer perceptron (MLP). The sensitivity is defined as the mathematical expectation of the output errors of the MLP caused by input and weight perturbations, taken over all input and weight values in a given continuous interval. The sensitivity of a single neuron is discussed first, and an approximate analytical expression is derived as a function of the absolute values of the input and weight perturbations. An algorithm is then given to compute the sensitivity of the entire MLP. As intuitively expected, the sensitivity increases with the input and weight perturbations, but the increase has an upper bound determined by the structural configuration of the MLP, namely the number of neurons per layer and the number of layers. There exists an optimal number of neurons per layer that yields the highest sensitivity value. The effect of the number of layers is quite unexpected: the sensitivity may decrease at first and then remain almost constant as the number of layers increases.
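Because the paper defines sensitivity as the expectation of the output error under interval-bounded input and weight perturbations, a Monte Carlo estimate makes the quantity concrete. The following is a minimal sketch, not the paper's analytical derivation: the tanh activation, the [-1, 1] sampling intervals, the network shape, and all function names are assumptions introduced for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, weights):
    """Forward pass through a fully connected MLP with tanh activations."""
    a = x
    for W in weights:
        a = np.tanh(W @ a)
    return a

def monte_carlo_sensitivity(weights, input_dim, delta_x, delta_w, n_samples=10_000):
    """Estimate E[|y_perturbed - y|], with inputs drawn uniformly from
    [-1, 1] and perturbations drawn uniformly from [-delta, delta]."""
    errors = np.empty(n_samples)
    for i in range(n_samples):
        x = rng.uniform(-1.0, 1.0, size=input_dim)
        dx = rng.uniform(-delta_x, delta_x, size=input_dim)
        perturbed = [W + rng.uniform(-delta_w, delta_w, size=W.shape)
                     for W in weights]
        y = mlp_forward(x, weights)
        y_pert = mlp_forward(x + dx, perturbed)
        errors[i] = np.abs(y_pert - y).mean()
    return errors.mean()

# Hypothetical example: a 3-4-1 MLP with weights drawn uniformly from [-1, 1].
weights = [rng.uniform(-1, 1, size=(4, 3)), rng.uniform(-1, 1, size=(1, 4))]
print(monte_carlo_sensitivity(weights, input_dim=3, delta_x=0.05, delta_w=0.05))
```

Sweeping `delta_x` and `delta_w` with such an estimator is one way to observe the bounded growth of sensitivity that the paper derives analytically.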