The GRNN performs well in function approximation and has a strong learning ability.
The first-layer neurons of the GRNN perform the same function as those of the RBF network.
According to the comparison, the fault-diagnosis model based on the BRNN takes less time owing to its fewer hidden-layer neurons, whereas the models based on the RBF network and the GRNN each contain 56 hidden-layer neurons.
The results summarized in Tables 4 and 5 show that the BW fault-diagnosis model using the Bayesian regularized neural network not only generalizes better but also recognizes BW fault patterns more accurately than the other algorithms, such as the RBF network, BPNN, and GRNN.
For example, at 2250 MHz the algorithm uses the GRNN trained at 2300 MHz.
Validation was performed by comparing the results of the GRNN method with those of the ML method.
The GRNN is used for prediction with the following functional form:
The results predicted by the GRNN are shown in Figure 6.
The figure suggests that, in contrast to the GRNN and RBNN methods, the HGA adapts better to discontinuities and sharp peaks in the curves, regardless of the noise level and the small number of samples.
For example, for test function 5 with 50 samples and SNR = 2 (bottom left of the figure), the GRNN achieves a slightly smaller median MSE than the HGA method, but the difference is not significant.
The GRNN model estimates the joint probability density function f(x, y) between a set of predictor variables x and a set of response variables y.
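As a concrete illustration of this idea, the sketch below implements the standard GRNN prediction rule (Specht's formulation): a Parzen-window estimate of the joint density f(x, y) with Gaussian kernels reduces to a kernel-weighted average of the training targets. This is a minimal NumPy sketch, not the code used in any of the papers cited here; the function name, the smoothing parameter `sigma`, and the toy sine-wave data are all illustrative assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Minimal GRNN sketch: each training sample acts as one
    pattern-layer neuron, and the prediction is the kernel-weighted
    average of the training targets (the regression that follows from
    a Gaussian Parzen estimate of the joint density f(x, y))."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    X_query = np.asarray(X_query, dtype=float)
    # Squared Euclidean distances, shape (n_query, n_train)
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # pattern-layer activations
    # Summation and output layers: normalized weighted average
    return (w @ y_train) / w.sum(axis=1)

# Toy usage (assumed data): noisy samples of sin(x)
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=50)
pred = grnn_predict(X, y, [[np.pi / 2.0]], sigma=0.3)
```

Note that `sigma` plays the role of the single smoothing parameter that GRNN training tunes: small values interpolate the samples closely, while large values average over many neighbors.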
In accordance with the description above, a set of MLP, RBF, GRNN, and RNN network models was designed by varying a series of parameters.