We apply several variants of evolutionary computation to the fully-connected neural network model of associative memory. In particular, when we regard the model as a parameter optimization problem, we find that it has several favorable properties as a test function for evolutionary computation. Many test functions have been proposed for comparative studies; however, as Whitley and his colleagues pointed out, many of the common existing test functions are problematic for comparing and evaluating evolutionary computation methods. In this paper, we focus on the possibility of using the fully-connected neural network model as a test function for evolutionary computation.
We apply evolutionary algorithms to a neural network model of associative memory. In this model, appropriate configurations of the synaptic weights allow the network to store a number of patterns as an associative memory; the so-called Hebbian rule prescribes one such configuration. However, if the number of patterns to be stored exceeds a critical amount (the over-loaded case), the ability to store patterns collapses, and synaptic weights chosen at random have no such ability at all. In this paper, we describe a genetic algorithm that successfully evolves both random synapses and over-loaded Hebbian synapses into a functioning associative memory by adaptively pruning some of the synaptic connections. Although many authors have shown that the model is robust against pruning a fraction of its synaptic connections, improving performance by pruning has, as far as we know, not been explored.
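As an illustration of the setting described above, the following is a minimal sketch (not code from the paper) of the Hebbian prescription for a fully-connected associative memory, together with a random pruning mask of the kind a genetic algorithm could evolve. The network size, pattern count, and keep-fraction are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64  # number of neurons (illustrative size)
P = 5   # stored patterns; load P/N is well below the critical capacity ~0.14*N

# Random +/-1 patterns to be stored.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian rule: W_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, with zero diagonal.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# A binary pruning mask; in the paper's setting a genetic algorithm would
# adaptively evolve such a mask rather than draw it at random.
mask = (rng.random((N, N)) < 0.8).astype(float)  # keep ~80% of connections
np.fill_diagonal(mask, 0.0)

def recall(weights, state, steps=20):
    """Synchronous sign-updates until a fixed point (simple sketch)."""
    for _ in range(steps):
        new = np.sign(weights @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern by flipping a few bits, then recall it.
probe = patterns[0].copy()
flip = rng.choice(N, size=5, replace=False)
probe[flip] *= -1
recovered = recall(W, probe)
overlap = recovered @ patterns[0] / N  # 1.0 means perfect recall
```

Below the critical load, the corrupted probe falls back into the basin of attraction of the stored pattern, so the overlap approaches 1; in the over-loaded regime this breaks down, which is where pruning by a genetic algorithm becomes interesting.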