Author Search Result

[Author] Yonggwan WON (3 hits)

1-3 of 3 hits
  • Shift-Invariant Fuzzy-Morphology Neural Network for Automatic Target Recognition

    Yonggwan WON  

     
    PAPER-Neural Networks

    Vol: E81-A No:6
    Page(s): 1119-1127

    This paper describes a theoretical foundation for fuzzy morphological operations and an architectural extension of the shared-weight neural network (SWNN). The network performs shift-invariant filtering using fuzzy-morphological operations for feature extraction. The nodes in the feature-extraction stage employ the generalized-mean operator to implement the fuzzy-morphological operations. The parameters of the SWNN (the weights, the morphological structuring element, and the fuzziness) are optimized by the error back-propagation (EBP) training method. The parameter values of the trained SWNN are then implanted into the extended SWNN (ESWNN), which is a simple convolutional neural network. The ESWNN architecture dramatically reduces the amount of computation by avoiding a separate segmentation process. The neural network is applied to automatic recognition of a vehicle in visible images. The network is tested with several sequences of images that include targets ranging from no occlusion to almost full occlusion. The results demonstrate an ability to detect occluded targets even though the network was trained only with non-occluded ones. In comparison, the proposed network was superior to Minimum-Average Correlation filter systems and produced better results than the ordinary SWNN.
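As an illustration of the generalized-mean operator mentioned in this abstract, below is a minimal NumPy sketch of how a soft (fuzzy) morphological filter can be built from it. The sliding-window handling, the additive combination with the structuring element, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def generalized_mean(x, p, eps=1e-6):
    # M_p(x) = (mean(x^p))^(1/p): a large positive p approximates max
    # (soft dilation), a large negative p approximates min (soft erosion),
    # and intermediate p values give a fuzzy blend controlled by p.
    x = np.asarray(x, dtype=float) + eps  # keep values strictly positive
    return np.mean(x ** p) ** (1.0 / p)

def soft_morphology(image, structuring_element, p):
    # Shift-invariant filtering: slide the structuring element over the image
    # and apply the generalized mean to each (window + element) patch.
    kh, kw = structuring_element.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = image[i:i + kh, j:j + kw]
            out[i, j] = generalized_mean(window + structuring_element, p)
    return out
```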

  • Small Number of Hidden Units for ELM with Two-Stage Linear Model

    Hieu Trung HUYNH  Yonggwan WON  

     
    PAPER-Data Mining

    Vol: E91-D No:4
    Page(s): 1042-1049

    Single-hidden-layer feedforward neural networks (SLFNs) are frequently used in machine learning because they can form decision boundaries with arbitrary shapes when the activation function of the hidden units is chosen properly. Most gradient-descent-based learning algorithms for these networks are still slow because they require many learning steps. Recently, a learning algorithm called the extreme learning machine (ELM) has been proposed for training SLFNs to overcome this problem. It randomly chooses the input weights and hidden-layer biases, and analytically determines the output weights by a matrix inverse operation. This algorithm can achieve good generalization performance with high learning speed in many applications. However, it often requires a large number of hidden units and therefore takes a long time to classify new observations. In this paper, a new approach for training SLFNs called the least-squares extreme learning machine (LS-ELM) is proposed. Unlike the gradient-descent-based algorithms and the ELM, our approach analytically determines the input weights, hidden-layer biases, and output weights based on linear models. For training with a large number of input patterns, an online training scheme that processes sub-blocks of the training set is also introduced. Experimental results for real applications show that the proposed algorithm offers high classification accuracy with a smaller number of hidden units and extremely high speed in both learning and testing.
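For context, the baseline ELM step this abstract describes (random input weights and biases, output weights from a least-squares solve via the pseudo-inverse) can be sketched as follows. This is the standard ELM, not the LS-ELM proposed in the paper; the sigmoid activation and variable names are assumptions.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    # X: (n_samples, n_features) inputs, T: (n_samples, n_outputs) targets.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden-layer biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights by least squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    # Forward pass through the fixed random hidden layer and learned output weights.
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```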

  • Weighted Association Rule Mining for Item Groups with Different Properties and Risk Assessment for Networked Systems

    Jungja KIM  Heetaek CEONG  Yonggwan WON  

     
    PAPER-Data Mining

    Vol: E92-D No:1
    Page(s): 10-15

    In market-basket analysis, weighted association rule (WAR) discovery can mine rules that contain more beneficial information by reflecting the importance of particular items. In a point-of-sale database, each transaction is composed of items with similar properties, and the item weights are pre-defined and fixed by a factor such as profit. However, when the items are divided into more than one group and item importance must be measured independently for each group, traditional weighted association rule discovery cannot be used. To solve this problem, we propose a new weighted association rule mining methodology. The items are first divided into subgroups according to their properties, and the item importance, i.e. the item weight, is defined or calculated only from the items included in each subgroup. Then, the transaction weight is measured by appropriately summing the item weights from each subgroup, and the weighted support is computed as the sum of the weights of the transactions that contain the candidate items divided by the total weight of all transactions. As an example, the proposed methodology is applied to assessing the vulnerability of computer systems that provide networked services. Our algorithm provides both quantitative risk-level values and qualitative risk rules for the security assessment of networked computer systems using WAR discovery. It can also be applied to other applications whose data items are divided into distinct groups.
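To make the weighted-support definition in this abstract concrete, here is a small Python sketch under assumed data structures (transactions as item lists, item weights kept in per-group tables). The grouping scheme and all names are illustrative and do not reproduce the paper's exact notation.

```python
def transaction_weight(transaction, group_weights):
    # group_weights: {group_name: {item: weight}}, item weights defined per subgroup.
    # The transaction weight sums the item weights contributed by each subgroup.
    return sum(weights[item]
               for weights in group_weights.values()
               for item in transaction
               if item in weights)

def weighted_support(itemset, transactions, group_weights):
    # Weighted support = (sum of weights of transactions containing the itemset)
    #                    / (total weight of all transactions).
    itemset = set(itemset)
    total = sum(transaction_weight(t, group_weights) for t in transactions)
    hit = sum(transaction_weight(t, group_weights)
              for t in transactions if itemset <= set(t))
    return hit / total if total else 0.0
```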
