It is well known that offset errors in the multipliers of neural LSIs can have fatal effects on performance. The aim of this study is to understand theoretically how offset errors affect the performance of neural LSIs. We have used a single-layer perceptron as an example and compared our theoretically derived results with computer simulations. We have found that offset errors in the multipliers for the forward process can be canceled out through learning, but those for the updating process cannot. We have examined the asymptotic behavior of learning for the updating process and derived a mathematical expression for dL, the excess of the averaged loss function L above its minimum. The derived expression gives us a basis for estimating robustness with respect to the offset errors. Our analysis indicates that dL can be expressed as a quadratic form in the offset errors whose kernel is the inverse of the Hessian matrix of L. We have found that increasing the number of synapses degrades the performance. We have also learned that enlarging the input signal level and reducing the signal level of the desired response can be effective techniques for reducing the effects of offset errors in the updating process.
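The quadratic-form claim can be made concrete with a standard perturbation argument. The following is a minimal sketch, not the paper's derivation: it assumes the updating-process offsets act as a constant additive bias vector b in a gradient-descent weight update, and the symbols b, w*, H, and eta below are illustrative choices rather than the paper's notation.

\begin{align}
  w_{t+1} &= w_t - \eta\,\bigl(\nabla L(w_t) + b\bigr)
    && \text{weight update with constant offset } b \\
  \nabla L(w_\infty) &= -b
    \;\Rightarrow\; w_\infty \approx w^* - H^{-1} b,
    \qquad H = \nabla^2 L(w^*) \\
  dL &= L(w_\infty) - L(w^*)
    \approx \tfrac{1}{2}\,(w_\infty - w^*)^{\top} H\,(w_\infty - w^*)
    = \tfrac{1}{2}\, b^{\top} H^{-1} b
\end{align}

Under these assumptions each additional synapse contributes one more offset component to the quadratic form, consistent with the observation that more synapses degrade performance; likewise, enlarging the input signal level enlarges H (and hence shrinks H^{-1}) relative to a fixed hardware offset, consistent with the suggested countermeasures.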
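A small simulation also illustrates the asymmetry between forward-process and updating-process offsets. This too is a hedged sketch rather than the paper's experiment: the linear single-layer model, the LMS-style learning rule, and the offset magnitude 0.02 are assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)
n, steps, eta = 8, 20000, 0.05      # synapses, training steps, learning rate
w_true = rng.normal(size=n)         # teacher weights generating the desired response

def run(fwd_offset=0.0, upd_offset=0.0):
    """Train a linear unit y = w.x + b with optional multiplier offsets."""
    w, b = np.zeros(n), 0.0
    for _ in range(steps):
        x = rng.normal(size=n)
        d = w_true @ x                          # desired response
        y = w @ x + b + n * fwd_offset          # every forward product shifted by fwd_offset
        e = y - d
        w -= eta * (e * x + upd_offset)         # every update product shifted by upd_offset
        b -= eta * e                            # the bias weight is learned as well
    xs = rng.normal(size=(1000, n))             # fresh test inputs
    return np.mean((xs @ w + b + n * fwd_offset - xs @ w_true) ** 2)

print("no offsets      :", run())                  # ~0: exact convergence
print("forward offset  :", run(fwd_offset=0.02))   # ~0: the learned bias absorbs the shift
print("updating offset :", run(upd_offset=0.02))   # ~ n * 0.02**2: residual excess loss

In this toy setting the Hessian of L = E[e^2]/2 is the input covariance matrix (the identity here), so the residual mean-squared error under the updating offset comes out near n times the squared offset, in line with the quadratic form sketched above.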
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Fuyuki OKAMOTO, Hachiro YAMADA, "Analysis of the Effects of Offset Errors in Neural LSIs" in IEICE TRANSACTIONS on Fundamentals, vol. E80-A, no. 9, pp. 1640-1646, September 1997.
URL: https://globals.ieice.org/en_transactions/fundamentals/10.1587/e80-a_9_1640/_p
@ARTICLE{e80-a_9_1640,
author={Fuyuki OKAMOTO and Hachiro YAMADA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Analysis of the Effects of Offset Errors in Neural LSIs},
year={1997},
volume={E80-A},
number={9},
pages={1640-1646},
keywords={},
doi={},
ISSN={},
month={September},}
TY - JOUR
TI - Analysis of the Effects of Offset Errors in Neural LSIs
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1640
EP - 1646
AU - Fuyuki OKAMOTO
AU - Hachiro YAMADA
PY - 1997
DO -
JO - IEICE TRANSACTIONS on Fundamentals
SN -
VL - E80-A
IS - 9
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - September 1997
ER -