Offsets inherent in analog circuits have been a major obstacle to analog implementations of the backpropagation algorithm. In this article, the effects of analog multiplier offsets on on-chip learning are systematically analyzed. Offsets in a multiplier are mathematically modeled and incorporated into the backpropagation learning equations. The perturbed equations are investigated to show how the offsets degrade learning performance and under which conditions the neuron's output fails to converge. Simulation results agree well with analytic calculations.
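A minimal sketch of the kind of analysis described above, under an assumed offset model (the offset values `dx`, `dw`, `do` and the single-neuron setup are illustrative, not taken from the paper): an analog multiplier is modeled as `(x + dx)(w + dw) + do`, and this offset multiplier is used in both the forward pass and the weight update of gradient-descent learning.

```python
import numpy as np

# Hypothetical offset model for an analog four-quadrant multiplier:
# dx and dw are input offsets, do is an output offset (values illustrative).
def offset_mult(x, w, dx=0.02, dw=0.01, do=0.005):
    return (x + dx) * (w + dw) + do

# Train a single linear neuron y = w*x by stochastic gradient descent,
# routing both the forward pass and the weight update through the
# offset-afflicted multiplier, as an analog implementation would.
rng = np.random.default_rng(0)
w_true, w = 0.5, 0.0
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)
    t = w_true * x                 # ideal target
    y = offset_mult(x, w)          # forward pass through the analog multiplier
    e = t - y                      # output error
    w += 0.1 * offset_mult(e, x)   # update path also uses an offset multiplier

print(w)  # settles near w_true, shifted by an offset-dependent bias
```

Averaging the update over the input distribution shows the weight converges to a fixed point displaced from `w_true` by a term depending on the offsets, which is the qualitative effect the paper analyzes.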
Issue Date
1997-06-09
Language
ENG
Citation
Proceedings of the 1997 IEEE International Conference on Neural Networks, pp. 928-932