The conventional synthesis procedure for discrete-time sparsely interconnected neural networks (DTSINNs) used as associative memories may generate cells with only self-feedback because of the sparsely interconnected structure. Although this problem can be solved by increasing the number of interconnections, doing so makes hardware implementation very difficult. In this letter, we propose a DTSINN system that stores the 2-dimensional discrete Walsh transforms (DWTs) of memory patterns. Since each DWT coefficient carries information about the whole pattern, our system can associate the desired memory patterns, which the conventional DTSINN fails to do.
Takeshi KAMIO Hisato FUJISAKA Mititada MORISUE
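To make the storage scheme above concrete, here is a minimal numerical sketch of the 2-dimensional DWT of a bipolar memory pattern. It assumes the natural-ordered Walsh-Hadamard matrix and Python with NumPy; the function names are ours, not from the letter.

import numpy as np

def hadamard(n):
    """Natural-ordered Walsh-Hadamard matrix of size n (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def dwt2(pattern):
    """2-D discrete Walsh transform: W = (1/N^2) H P H for an N x N pattern."""
    n = pattern.shape[0]
    H = hadamard(n)
    return H @ pattern @ H / (n * n)

def idwt2(coeffs):
    """Inverse transform: P = H W H (H is its own inverse up to scale)."""
    n = coeffs.shape[0]
    H = hadamard(n)
    return H @ coeffs @ H

# Each Walsh coefficient mixes every pixel, so a sparsely connected cell
# that sees one coefficient still carries global pattern information.
pattern = np.sign(np.random.randn(8, 8))  # bipolar 8x8 memory pattern
coeffs = dwt2(pattern)
assert np.allclose(idwt2(coeffs), pattern)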
Associative memories composed of sparsely interconnected neural networks (SINNs) are suitable for analog hardware implementation. However, the sparsely interconnected structure also reduces the capability of SINNs as associative memories. Although this problem can be solved by increasing the number of interconnections, the hardware cost rises rapidly. We therefore propose associative memories consisting of multilayer perceptrons (MLPs) with 3-valued weights and SINNs. Such MLPs are expected to be realizable at a lower cost than additional interconnections in SINNs, while giving each SINN neuron global information about the input pattern and thereby improving the storage capacity. Finally, simulations confirm that the proposed associative memories perform well.
Takeshi KAMIO Hisato FUJISAKA Mititada MORISUE
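As an illustration of the 3-valued weights mentioned above, the following sketch quantizes real-valued MLP weights to {-1, 0, +1} by a magnitude threshold. The threshold rule, layer sizes, and activation are our assumptions, not the authors' exact design.

import numpy as np

def quantize_ternary(w, threshold=0.3):
    """Map real-valued weights to {-1, 0, +1}: small magnitudes become 0."""
    q = np.zeros_like(w)
    q[w > threshold] = 1.0
    q[w < -threshold] = -1.0
    return q

def mlp_forward(x, W1, W2):
    """Two-layer perceptron with tanh units and ternary weights."""
    h = np.tanh(quantize_ternary(W1) @ x)
    return np.tanh(quantize_ternary(W2) @ h)

# A ternary-weight MLP needs only sign selection and wiring in hardware
# (no analog multipliers), which suggests why it can be cheaper than
# adding interconnections to the SINN itself.
rng = np.random.default_rng(0)
x = np.sign(rng.standard_normal(64))      # input pattern
W1 = rng.standard_normal((16, 64)) * 0.5  # real-valued master weights
W2 = rng.standard_normal((64, 16)) * 0.5
y = mlp_forward(x, W1, W2)                # global feature fed to SINN neurons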
The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. MFNNs have been used in many areas of signal and image processing because of their broad applicability. Although they have been implemented as analog, mixed analog-digital, and fully digital VLSI circuits, it is still difficult to implement them in hardware together with an efficient BP learning function. This paper describes a special BP algorithm for the logic-oriented neural network (LOGO-NN), which we have proposed as a kind of MFNN with quantized weights and multilevel threshold neurons. Both weights and neuron outputs are quantized to integer values in LOGO-NNs. Furthermore, the proposed BP algorithm reduces the need for high-precision calculations. It is therefore expected that LOGO-NNs with BP learning can be implemented as digital circuits more effectively than common MFNNs with classical BP. Finally, simulations show that the proposed BP algorithm for LOGO-NNs performs well in terms of convergence rate, convergence speed, and generalization capability.
Takeshi KAMIO Hiroshi NINOMIYA Hideki ASAI
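The paper's exact BP variant is not reproduced here; the sketch below only illustrates the general idea of training with integer-quantized weights and multilevel neuron outputs, using real-valued master weights updated through the quantizer (a straight-through-style scheme that is our assumption).

import numpy as np

def quantize(v, levels=7):
    """Round to integers in [-levels, levels] (multilevel quantization)."""
    return np.clip(np.rint(v), -levels, levels)

def forward(x, W):
    """One layer: integer weights, multilevel threshold activation."""
    a = quantize(W) @ x
    return quantize(a / 4.0)  # coarse multilevel threshold; scale assumed

def train_step(x, t, W, lr=0.01):
    """BP-style update on real master weights; quantized values are used
    in the forward pass (a sketch, not the paper's algorithm)."""
    y = forward(x, W)
    err = t - y                 # output error
    W += lr * np.outer(err, x)  # gradient step passed through the quantizer
    return y, W

rng = np.random.default_rng(1)
W = rng.standard_normal((4, 8))           # real-valued master weights
x = quantize(rng.standard_normal(8) * 3)  # integer-valued input
t = quantize(rng.standard_normal(4) * 3)  # integer-valued target
for _ in range(100):
    y, W = train_step(x, t, W)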
In this letter, we present an electronic circuit based on a neural network that computes the discrete Walsh transform. We show both analytically and by simulation that the circuit is guaranteed to settle into the correct values.
Masahiro YOSHIDA Takeshi KAMIO Hideki ASAI
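The letter's circuit is analog, but settling behavior of this kind can be emulated numerically. The sketch below integrates a gradient flow whose unique equilibrium is the Walsh spectrum of the input; the particular energy function, E(c) = ||Hc - x||^2 / 2, is our assumption about how such a convergence guarantee can be obtained.

import numpy as np

def hadamard(n):
    """Natural-ordered Walsh-Hadamard matrix of size n (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Gradient dynamics dc/dt = -H^T (H c - x): since H^T H = n I, the unique
# equilibrium is the Walsh spectrum c* = H x / n, so the state settles
# to the transform of the input.
n = 16
H = hadamard(n)
x = np.sign(np.random.randn(n))  # input signal
c = np.zeros(n)                  # circuit state (e.g., capacitor voltages)
dt = 0.01
for _ in range(2000):            # forward-Euler integration of the flow
    c += dt * (-(H.T @ (H @ c - x)))
assert np.allclose(c, H @ x / n, atol=1e-6)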
This report describes face image recognition using the 2-dimensional discrete Walsh transform and multilayer neural networks. Neural networks (NNs) are powerful tools for pattern recognition. In previous research on face image recognition with NNs, the gray level of each pixel of the face image was used as input data to the NN. However, because a face image usually has too many pixels, various approaches have been required to reduce the number of inputs. In this research, the 2-dimensional discrete Walsh transform is used to reduce the input data, and recognition is performed by multilayer neural networks. Finally, the validity of our method is verified.
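A minimal sketch of the data-reduction step: transform the face image with the 2-D DWT and keep only a small block of coefficients as the NN input vector. The block choice (a keep-by-keep corner in natural Hadamard ordering) and the sizes are our assumptions, not the report's exact configuration.

import numpy as np

def hadamard(n):
    """Natural-ordered Walsh-Hadamard matrix of size n (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def walsh_features(image, keep=8):
    """Reduce an N x N gray-level image to keep*keep Walsh coefficients."""
    n = image.shape[0]
    H = hadamard(n)
    coeffs = H @ image @ H / (n * n)     # 2-D discrete Walsh transform
    return coeffs[:keep, :keep].ravel()  # retained coefficient block

image = np.random.rand(64, 64)    # stand-in for a gray-level face image
features = walsh_features(image)  # 64 NN inputs instead of 4096 pixels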