Although analog circuits offer attractive features for implementing neural networks, most analog implementations have been restricted to forward computation only. Since most of the computation occurs during learning, learning must be incorporated into the hardware to obtain truly fast neural hardware. In this thesis, subthreshold analog MOS circuits for artificial neural networks are presented with on-chip learning capability. Subthreshold operation provides low power consumption, and the chip implements both backpropagation and Hebbian learning. All the circuits adopt a modular architecture and are designed so that the numbers of neurons and layers can be increased through pin-to-pin connection of multiple chips.
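For readers unfamiliar with the two learning rules named above, the following is a minimal software sketch of their idealized (offset-free) update equations. The function names, learning rate, and sigmoid-neuron assumption are illustrative choices, not the thesis circuit implementation.

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.01):
    """Hebbian rule: strengthen each weight in proportion to the
    correlation of presynaptic input x and postsynaptic output y."""
    return w + lr * np.outer(y, x)

def backprop_update(w, x, y, target, lr=0.01):
    """Output-layer backpropagation (delta rule) for sigmoid
    neurons: w_ij += lr * delta_i * x_j, with delta_i including
    the sigmoid derivative y_i * (1 - y_i)."""
    delta = (target - y) * y * (1.0 - y)
    return w + lr * np.outer(delta, x)
```

In the analog chip, the multiplications in these updates are carried out by physical multiplier circuits, which is why multiplier offsets (analyzed below) enter directly into the learning dynamics.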
Previous researchers have pointed out that backpropagation learning can overcome several nonidealities of analog hardware, but offsets remain a problem. To quantify the effect of multiplier offsets on on-chip learning hardware, a systematic offset analysis is performed. The analysis shows that offsets cause phenomena such as static output errors, weight drift, errors that vary with the input training patterns, and premature output saturation; simulation results confirm these phenomena. The offset analysis yields a deeper understanding of practical analog on-chip learning hardware and provides guidelines for choosing target values and initial weight values to obtain the desired outputs.
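The mechanism behind the static output error can be sketched with a deliberately simplified model (a single linear weight trained by the delta rule, with an additive offset on the weight-update multiplier; all values here are hypothetical, not measurements from the chip). At equilibrium the expected update is zero, so the offset is balanced by a residual error, shifting the learned weight away from its ideal value.

```python
import numpy as np

def train(offset, steps=2000, lr=0.05):
    """Train one linear weight w toward target_w = 0.7 with the
    delta rule, where the error-times-input multiplier has an
    additive offset (simplified model of an analog multiplier)."""
    rng = np.random.default_rng(0)
    w, target_w = 0.0, 0.7
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)      # random training input
        err = target_w * x - w * x      # output error
        # ideal multiplier computes err * x; offset adds a bias term
        w += lr * (err * x + offset)
    return w
```

With `offset = 0` the weight converges to 0.7; with a positive offset it settles above 0.7 (analytically, at `0.7 + offset / E[x^2]` for this model), producing a persistent static error at the output even though learning has "converged". This is the kind of behavior the systematic offset analysis characterizes for the full multilayer case.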
A neuro-system consisting of the neuro-chips, a personal computer, and interface control logic is integrated. The fabricated chips are measured and tested in several ways to characterize their behavior and learning performance. Some experimental results are compared with the offset analysis and show good agreement.