Author Search Result

[Author] Ashraf A. M. KHALAF (2 hits)

  • A Cascade Form Predictor of Neural and FIR Filters and Its Minimum Size Estimation Based on Nonlinearity Analysis of Time Series

    Ashraf A. M. KHALAF  Kenji NAKAYAMA  

     
    PAPER

    Vol: E81-A No:3
    Page(s): 364-373

    Time series prediction is an important technology in a wide variety of fields. Actual time series contain both linear and nonlinear properties, and the amplitude of the time series to be predicted is usually a continuous value. For these reasons, we combine nonlinear and linear predictors in a cascade form. The nonlinear prediction problem is reduced to a pattern classification problem: a set of the past samples x(n-1),...,x(n-N) is transformed into the output, which is the prediction of the next coming sample x(n). We therefore employ a multi-layer neural network with a sigmoidal hidden layer and a single linear output neuron for the nonlinear prediction; it is called a Nonlinear Sub-Predictor (NSP). The NSP is trained by a supervised learning algorithm using the sample x(n) as a target. However, it is rather difficult for the NSP to generate continuous amplitudes and to predict the linear property, so we employ a linear predictor after the NSP. An FIR filter is used for this purpose and is called a Linear Sub-Predictor (LSP). The LSP is trained by a supervised learning algorithm, also using x(n) as a target. In order to estimate the minimum size of the proposed predictor, we analyze the nonlinearity of the time series of interest. Prediction is equivalent to mapping a set of past samples onto the next coming sample, and a multi-layer neural network is well suited to this kind of pattern mapping. Still, difficult mappings may exist when several sets of very similar patterns are mapped onto very different samples. The degree of difficulty of the mapping is closely related to the nonlinearity, and the necessary number of past samples used for prediction is determined by this nonlinearity: a difficult mapping requires a large number of past samples. Computer simulations using the sunspot data and artificially generated discrete-amplitude data have demonstrated the efficiency of the proposed predictor and the nonlinearity analysis.
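A minimal Python sketch of the cascade structure described above may help make it concrete: a small multi-layer network with a sigmoidal hidden layer and a single linear output neuron (the NSP) is trained to predict x(n) from the past samples, and an FIR filter (the LSP) is then trained, also with x(n) as the target, on the NSP output sequence. The hidden-layer size, tap counts, learning rates, the use of plain back-propagation and LMS updates, and the choice to feed the LSP with delayed NSP outputs are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a cascade (hybrid) predictor: NSP (sigmoidal MLP) followed by
# LSP (FIR filter). Sizes, learning rates, and update rules are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

class NSP:
    """Multi-layer network: sigmoidal hidden layer, single linear output."""
    def __init__(self, n_taps, hidden=8, lr=0.05):
        self.W1 = rng.normal(scale=0.3, size=(hidden, n_taps))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(scale=0.3, size=hidden)
        self.b2 = 0.0
        self.lr = lr

    def forward(self, x):
        self.h = sigmoid(self.W1 @ x + self.b1)
        return self.w2 @ self.h + self.b2

    def update(self, x, err):
        # One back-propagation step on the squared prediction error.
        grad_h = err * self.w2 * self.h * (1.0 - self.h)
        self.w2 += self.lr * err * self.h
        self.b2 += self.lr * err
        self.W1 += self.lr * np.outer(grad_h, x)
        self.b1 += self.lr * grad_h

class LSP:
    """FIR filter trained with the LMS rule; the target is also x(n)."""
    def __init__(self, n_taps, lr=0.01):
        self.w = np.zeros(n_taps)
        self.lr = lr

    def forward(self, u):
        return self.w @ u

    def update(self, u, err):
        self.w += self.lr * err * u

def train_hybrid(x, n_taps=8, m_taps=8, epochs=20):
    """Train the NSP, then the LSP on the NSP output sequence (assumed wiring)."""
    nsp, lsp = NSP(n_taps), LSP(m_taps)
    for _ in range(epochs):
        nsp_out = np.zeros(len(x))
        for n in range(n_taps, len(x)):
            past = x[n - n_taps:n][::-1]              # x(n-1), ..., x(n-N)
            y = nsp.forward(past)
            nsp.update(past, x[n] - y)
            nsp_out[n] = y
        for n in range(n_taps + m_taps, len(x)):
            u = nsp_out[n - m_taps + 1:n + 1][::-1]   # NSP output sequence
            lsp.update(u, x[n] - lsp.forward(u))
    return nsp, lsp

# Example: train on a toy nonlinear series (illustration only).
t = np.arange(500)
series = np.sin(0.2 * t) * np.cos(0.05 * t)
nsp, lsp = train_hybrid(series)
```

Training the NSP before fitting the LSP mirrors the cascade ordering described in the abstract; the paper's own learning schedule and parameter choices may differ.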

  • A Hybrid Nonlinear Predictor: Analysis of Learning Process and Predictability for Noisy Time Series

    Ashraf A. M. KHALAF  Kenji NAKAYAMA  

     
    PAPER

    Vol: E82-A No:8
    Page(s): 1420-1427

    A nonlinear time series predictor was proposed in which a nonlinear sub-predictor (NSP) and a linear sub-predictor (LSP) are combined in a cascade form; this model is called a "hybrid predictor" here. A nonlinearity analysis method for the input time series was also proposed to estimate the network size. We have considered the nonlinear prediction problem as a pattern mapping problem. A multi-layer neural network, which consists of sigmoidal hidden neurons and a single linear output neuron, is employed as the nonlinear sub-predictor. Since the NSP includes nonlinear functions, it can predict the nonlinearity of the input time series. However, the prediction is not complete in some cases; therefore, the NSP prediction error is further compensated for by employing a linear sub-predictor after the NSP. In this paper, the prediction mechanism and the roles of the NSP and the LSP are theoretically and experimentally analyzed. The role of the NSP is to predict the nonlinear property, and some part of the linear property, of the time series, while the LSP works to predict the NSP prediction error. Furthermore, the predictability of the hybrid predictor for noisy time series is investigated. The sigmoidal functions used in the NSP can suppress noise effects through their saturation regions. Computer simulations using several kinds of nonlinear time series, together with comparisons against conventional predictor models, are demonstrated, and the theoretical analysis of the prediction mechanism is confirmed through these simulations. Furthermore, predictability is improved by slightly expanding or shifting the input potentials of the hidden neurons toward the saturation regions during the learning process.
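The noise-suppression argument, that the saturation regions of the sigmoidal functions attenuate input perturbations, can be illustrated with a short numerical sketch. The operating point, gain and shift values, and noise level below are arbitrary assumptions chosen only to show how expanding or shifting a hidden neuron's input potential toward saturation reduces the noise that reaches its output; this is not a reproduction of the paper's experiments.

```python
# Numerical illustration: a sigmoid's flat saturation region has a small
# local slope, so additive input noise is attenuated at the output.
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

rng = np.random.default_rng(1)
v = 1.0                                      # clean input potential (assumed)
noise = rng.normal(scale=0.1, size=10_000)   # additive input noise (assumed)

# (gain, shift) pairs: gain > 1 "expands" the potential, shift > 0 moves it
# toward the sigmoid's saturation region.
for gain, shift in [(1.0, 0.0), (4.0, 0.0), (1.0, 3.0)]:
    clean = sigmoid(gain * v + shift)
    noisy = sigmoid(gain * (v + noise) + shift)
    print(f"gain={gain}, shift={shift}: "
          f"output noise std = {np.std(noisy - clean):.4f}")
```

Larger gains or shifts push the operating point further into saturation, where the local slope of the sigmoid, and hence the propagated noise, shrinks.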
