CHNN

Acronym   Definition
CHNN      Communist History Network Newsletter (UK)
CHNN      Continuous Hopfield Neural Network
References in periodicals archive
A TMQHNN needs only half the connection weight parameters of a CHNN and is expected to have a smaller storage capacity.
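As a rough, hedged check of that claim (the twin-multistate construction is assumed here: a TMQHNN replaces the N complex neurons of a CHNN with N_Q = N/2 quaternionic neurons, and each quaternionic weight carries four real parameters against a complex weight's two):

```latex
% Hedged parameter count; N_Q = N/2 is an assumption about the construction.
\text{CHNN: } N^2 \text{ complex weights} \;=\; 2N^2 \text{ real parameters}
\qquad
\text{TMQHNN: } N_Q^2 = (N/2)^2 \text{ quaternion weights} \;=\; 4(N/2)^2 = N^2 \text{ real parameters}
```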
[1] and Kobayashi [20] applied this technique to CHNNs and rotor Hopfield neural networks (RHNNs), respectively.
The rest of this paper is organized as follows: Sections 2 and 3 introduce CHNNs and TMQHNNs, respectively.
These neuron states and connection weights are denoted in the same way as those of CHNNs. The number of neurons in a TMQHNN is denoted as N_Q.
During the repeated attempts to escape the local minima and cycles, the state of the CHNN would drift far from the training pattern and would finally reach a rotated pattern. (Rotated copies of a stored pattern are themselves stable in a Hebbian-trained CHNN: the weighted sum is linear, and the phase quantizer commutes with multiplication by any of the K quantized phase factors.)
Figure 2: (a) A CHNN. (b) A CHNN represented with two layers.
Complex-valued Hopfield neural networks (CHNNs) are one of the most successful models of complex-valued neural networks [9].
CHNNs have two update modes: asynchronous and synchronous.
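The excerpts do not reproduce the update rule itself; the following is a minimal sketch of both modes for a K-state CHNN with phasor neurons and a complex Hebbian rule (K, the helper names, and the quantization scheme are illustrative assumptions, not the papers' code):

```python
import numpy as np

K = 8                                            # number of quantized phase states (assumed)
PHASES = np.exp(2j * np.pi * np.arange(K) / K)   # the K allowed neuron states

def quantize(h):
    """Map each complex activation to the nearest of the K unit phasors."""
    return PHASES[np.argmax((np.conj(PHASES)[:, None] * h).real, axis=0)]

def hebbian_weights(patterns):
    """Complex Hebbian rule; the diagonal (autoconnections) is zeroed."""
    W = sum(np.outer(p, np.conj(p)) for p in patterns) / len(patterns)
    np.fill_diagonal(W, 0)
    return W

def update_sync(W, x):
    """Synchronous mode: all neurons update simultaneously."""
    return quantize(W @ x)

def update_async(W, x, rng):
    """Asynchronous mode: one randomly chosen neuron updates per step."""
    x = x.copy()
    j = rng.integers(len(x))
    x[j] = quantize(np.array([W[j] @ x]))[0]
    return x
```

The distinction matters because, as in real-valued Hopfield networks, the asynchronous mode admits the usual neuron-by-neuron energy-descent argument, while synchronous updates can fall into cycles.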
Section 2 introduces CHNNs and proposes a new recall algorithm.
In the present section, we introduce complex-valued Hopfield neural networks (CHNNs).
After a noisy version of a training pattern was given to the CHNNs, retrieval of the original training pattern was attempted.
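A hedged sketch of that retrieval experiment, reusing the illustrative helpers above (the noise model, re-drawing a few neuron phases at random, is an assumption, not the papers' protocol):

```python
rng = np.random.default_rng(0)
N = 32
patterns = [PHASES[rng.integers(K, size=N)] for _ in range(3)]
W = hebbian_weights(patterns)

noisy = patterns[0].copy()
flipped = rng.choice(N, size=5, replace=False)   # corrupt 5 of the 32 neurons
noisy[flipped] = PHASES[rng.integers(K, size=5)]

x = noisy
for _ in range(100):                             # synchronous recall until a fixed point
    x_next = update_sync(W, x)
    if np.array_equal(x_next, x):
        break
    x = x_next
print("original pattern retrieved:", np.array_equal(x, patterns[0]))
```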
Thus, the autoconnections generate many fixed points, and the CHNNs are easily trapped in them.
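That observation is the usual motivation for zeroing the diagonal of the weight matrix, as the np.fill_diagonal(W, 0) line in the sketch above does: a positive autoconnection rewards a neuron for keeping its current state, so corrupted states become harder to escape.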