Author Search Result

[Author] Takeshi KAMIO (5 hits)

  • Sparsely Interconnected Neural Networks for Associative Memories Applying Discrete Walsh Transform

    Takeshi KAMIO  Hideki ASAI  

     
    LETTER

    Vol: E82-A No:3  Page(s): 495-499

    The conventional synthesis procedure for discrete-time sparsely interconnected neural networks (DTSINNs) used as associative memories may generate cells with only self-feedback because of the sparsely interconnected structure. Although this problem can be solved by increasing the number of interconnections, doing so makes hardware implementation very difficult. In this letter, we propose a DTSINN system that stores the 2-dimensional discrete Walsh transforms (DWTs) of the memory patterns. Because each DWT coefficient carries information about the whole sample data, our system can recall the desired memory patterns, which the conventional DTSINN fails to do.
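
    As a rough illustration of the transform step only (a minimal sketch, not the authors' implementation), the 2-D discrete Walsh transform of a bipolar memory pattern can be computed with a Sylvester-ordered Walsh-Hadamard matrix; the function names below (hadamard, dwt2, idwt2) are illustrative assumptions, and a sequency (Walsh) ordering would merely permute the matrix rows:

    ```python
    # Sketch: 2-D discrete Walsh-Hadamard transform of a bipolar pattern.
    # Every coefficient mixes information from the whole pattern, which is
    # the property the letter exploits for sparsely interconnected storage.
    import numpy as np

    def hadamard(n: int) -> np.ndarray:
        """Sylvester-construction Hadamard matrix; n must be a power of two."""
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    def dwt2(pattern: np.ndarray) -> np.ndarray:
        """2-D transform: apply the 1-D transform along rows and columns."""
        n = pattern.shape[0]
        H = hadamard(n)
        return H @ pattern @ H.T / n

    def idwt2(coeffs: np.ndarray) -> np.ndarray:
        """The normalized transform is its own inverse."""
        n = coeffs.shape[0]
        H = hadamard(n)
        return H @ coeffs @ H.T / n

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        pattern = rng.choice([-1.0, 1.0], size=(8, 8))  # hypothetical +/-1 memory pattern
        coeffs = dwt2(pattern)
        assert np.allclose(idwt2(coeffs), pattern)      # exact reconstruction
    ```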

  • Associative Memories Using Interaction between Multilayer Perceptrons and Sparsely Interconnected Neural Networks

    Takeshi KAMIO  Hisato FUJISAKA  Mititada MORISUE  

     
    PAPER

    Vol: E85-A No:6  Page(s): 1220-1228

    Associative memories composed of sparsely interconnected neural networks (SINNs) are well suited to analog hardware implementation. However, the sparsely interconnected structure also reduces the capability of SINNs as associative memories. Although this problem can be solved by increasing the number of interconnections, the hardware cost rises rapidly. We therefore propose associative memories consisting of multilayer perceptrons (MLPs) with 3-valued weights and SINNs. Such MLPs are expected to be realizable at a lower cost than additional interconnections in SINNs and to give each neuron in the SINN global information about the input pattern, thereby improving the storage capacity. Finally, simulations confirm that the proposed associative memories perform well.
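
    The interaction can be pictured with a small, hedged sketch (not the paper's construction): an MLP whose weights are quantized to the three values {-1, 0, +1} sees the whole input pattern and returns a global signal that could drive the sparsely interconnected network. The names ternarize and mlp_forward, the threshold, and all dimensions are assumptions for illustration:

    ```python
    # Illustrative sketch only: a two-layer perceptron with 3-valued weights
    # that maps a full bipolar pattern to a global feedback signal.
    import numpy as np

    def ternarize(W: np.ndarray, threshold: float = 0.3) -> np.ndarray:
        """Map each real-valued weight to -1, 0 or +1 by sign and magnitude."""
        Q = np.sign(W)
        Q[np.abs(W) < threshold * np.abs(W).max()] = 0.0
        return Q

    def mlp_forward(x, W1, W2):
        """Two-layer perceptron with sign activations on bipolar inputs."""
        h = np.sign(W1 @ x)
        return np.sign(W2 @ h)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        W1 = ternarize(rng.normal(size=(16, 64)))   # hypothetical trained weights
        W2 = ternarize(rng.normal(size=(64, 16)))
        x = rng.choice([-1.0, 1.0], size=64)        # bipolar input pattern
        y = mlp_forward(x, W1, W2)                  # global signal for the SINN neurons
    ```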

  • Backpropagation Algorithm for LOGic Oriented Neural Networks with Quantized Weights and Multilevel Threshold Neurons

    Takeshi KAMIO  Hisato FUJISAKA  Mititada MORISUE  

     
    PAPER

    Vol: E84-A No:3  Page(s): 705-712

    The multilayer feedforward neural network (MFNN) trained by the backpropagation (BP) algorithm is one of the most significant models in artificial neural networks. MFNNs have been used in many areas of signal and image processing because of their broad applicability. Although they have been implemented as analog, mixed analog-digital and fully digital VLSI circuits, it is still difficult to implement them efficiently in hardware together with the BP learning function. This paper describes a special BP algorithm for the logic oriented neural network (LOGO-NN), which we have proposed as a type of MFNN with quantized weights and multilevel threshold neurons. In LOGO-NNs, both the weights and the neuron outputs are quantized to integer values. Furthermore, the proposed BP algorithm reduces the need for high-precision calculations. It is therefore expected that LOGO-NNs with BP learning can be implemented as digital circuits more effectively than common MFNNs with the classical BP. Finally, simulations show that the proposed BP algorithm for LOGO-NNs performs well in terms of convergence rate, convergence speed and generalization capability.
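
    For orientation only, the following generic sketch shows one common way to combine backpropagation with quantized weights: real-valued master weights are kept for the updates while an integer-quantized copy is used in the forward pass. This is an assumed stand-in, not the LOGO-NN algorithm of the paper, and tanh here merely takes the place of the multilevel threshold neurons; quantize, train_step and the learning rate are hypothetical:

    ```python
    # Generic sketch of BP with integer-quantized weights for a
    # single-hidden-layer network (not the paper's algorithm).
    import numpy as np

    def quantize(W, levels=7):
        """Round weights to integer levels in [-levels, +levels]."""
        return np.clip(np.round(W), -levels, levels)

    def train_step(x, t, W1, W2, lr=0.01):
        # Forward pass with the quantized copies of the weights.
        Q1, Q2 = quantize(W1), quantize(W2)
        h = np.tanh(Q1 @ x)
        y = np.tanh(Q2 @ h)
        # Backward pass: gradients use the quantized weights, but the
        # updates are applied to the real-valued master copies.
        delta_out = (y - t) * (1.0 - y ** 2)
        delta_hid = (Q2.T @ delta_out) * (1.0 - h ** 2)
        W2 -= lr * np.outer(delta_out, h)
        W1 -= lr * np.outer(delta_hid, x)
        return W1, W2

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        W1 = rng.normal(size=(8, 4))            # hypothetical master weights
        W2 = rng.normal(size=(2, 8))
        x, t = rng.normal(size=4), np.array([1.0, -1.0])
        for _ in range(100):
            W1, W2 = train_step(x, t, W1, W2)
    ```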

  • A Neural Net Approach to Discrete Walsh Transform

    Takeshi KAMIO  Hiroshi NINOMIYA  Hideki ASAI  

     
    LETTER

    Vol: E77-A No:11  Page(s): 1882-1886

    In this letter, we present an electronic circuit based on a neural net that computes the discrete Walsh transform. We show, both analytically and by simulation, that the circuit is guaranteed to settle to the correct values.

  • Face Image Recognition by 2-Dimensional Discrete Walsh Transform and Multi-Layer Neural Network

    Masahiro YOSHIDA  Takeshi KAMIO  Hideki ASAI  

     
    LETTER-Source Coding/Image Processing

    Vol: E86-A No:10  Page(s): 2623-2627

    This report describes face image recognition using the 2-dimensional discrete Walsh transform and multi-layer neural networks. The neural network (NN) is a powerful tool for pattern recognition. In previous research on face image recognition by NN, the gray level of each pixel of the face image has been used as input data to the NN. However, because a face image usually has too many pixels, a variety of approaches have been needed to reduce the number of inputs. In this research, the 2-dimensional discrete Walsh transform is used to reduce the input data, and the recognition is performed by multi-layer neural networks. Finally, the validity of our method is verified.
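
    As a hedged sketch of the data-reduction step only (not the report's code), a square gray-level image can be transformed with a 2-D Walsh-Hadamard matrix and a small block of coefficients kept as the input vector for the multilayer network; walsh_features, the block size keep, and the 128x128 image size are illustrative assumptions, and a sequency ordering would only permute the matrix rows:

    ```python
    # Sketch: shrink a face image to a short feature vector by keeping a
    # small block of its 2-D Walsh-Hadamard coefficients.
    import numpy as np

    def hadamard(n: int) -> np.ndarray:
        """Sylvester-construction Hadamard matrix; n must be a power of two."""
        H = np.array([[1.0]])
        while H.shape[0] < n:
            H = np.block([[H, H], [H, -H]])
        return H

    def walsh_features(image: np.ndarray, keep: int = 8) -> np.ndarray:
        """2-D transform followed by truncation to a keep-by-keep block."""
        n = image.shape[0]                    # assumes a square, power-of-two image
        H = hadamard(n)
        coeffs = H @ image @ H.T / n
        return coeffs[:keep, :keep].ravel()   # keep*keep inputs instead of n*n pixels

    if __name__ == "__main__":
        face = np.random.default_rng(2).random((128, 128))  # stand-in for a gray-level face image
        features = walsh_features(face)       # 64 values to feed the multi-layer NN
    ```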
