BPNN

(redirected from Back Propagation Neural Network)
Acronym / Definition
BPNN: Back Propagation Neural Network
BPNN: Business Partner Network Number
References in periodicals archive
A feed-forward back propagation neural network (FFBPN) is a well-established machine learning technique from the field of deep learning.
"Training back propagation neural networks with genetic algorithm for weather forecasting", IEEE 8th International Symposium on Intelligent Systems and Informatics, pp. 465-469.
The sigmoid function is Qnet's default transfer function and it is the most widely used function for back propagation neural networks. Another network design consideration concerns how to control the network's connections.
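As an illustration (not drawn from Qnet or any of the cited works), the logistic sigmoid and the derivative that back propagation uses during weight updates can be sketched in Python as:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(y):
    # Derivative expressed in terms of the sigmoid's output y = sigmoid(x);
    # this form is what back propagation evaluates at each hidden unit.
    return y * (1.0 - y)
```

The derivative's simple form in terms of the unit's own output is one reason the sigmoid became the default transfer function for back propagation networks.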
In this study, a four-layer back propagation neural network is used, built from databases compiled by TEPCO [8], with 18 periodic inspection records of RC beams as training patterns.
The Back Propagation Neural Network (BPNN) algorithm [15] [16] [17] [18] is one of the best-known learning algorithms.
In the training phase, predetermined compression methods and their compression ratios are prepared and used to train a back propagation neural network; in the testing phase, images are fed to the trained network, which selects a compression method and ratio for each.
(23) predicted surface roughness using feed-forward back propagation neural networks with different structures.
A back propagation neural network controller has been developed for detection of the relative crack location and relative crack depth (Figure 3).
A back propagation neural network (BPNN) can be used to predict outcomes based on previous results, called training files.
The main aim of the paper is to explore the suitability of a back propagation neural network in cryptography to increase security.
A feed-forward back propagation neural network was used to predict the mass loss quantities of A390 aluminium alloy.
The first one is a back propagation neural network (BP) with sigmoid transfer functions in the hidden layers and a linear transfer function in the output layer; the second is a radial basis network (RBN) with Gaussian activation functions.
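A minimal sketch of the first architecture described above (one sigmoid hidden layer, a linear output unit, trained by plain gradient descent) is shown below on XOR, a toy problem chosen here for illustration and not taken from any of the cited papers; the layer sizes, learning rate, and epoch count are all assumptions.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_bp(samples, n_hidden=4, lr=0.5, epochs=10000, seed=0):
    """Train a one-hidden-layer network: sigmoid hidden units, one linear output."""
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    # Each weight row carries a trailing bias weight (hence the +1).
    w_h = [[rng.uniform(-1.0, 1.0) for _ in range(n_in + 1)] for _ in range(n_hidden)]
    w_o = [rng.uniform(-1.0, 1.0) for _ in range(n_hidden + 1)]

    for _ in range(epochs):
        for x, t in samples:
            xb = x + [1.0]                                   # input plus bias
            h = [sigmoid(sum(w * xi for w, xi in zip(ws, xb))) for ws in w_h]
            hb = h + [1.0]                                   # hidden plus bias
            y = sum(w * hi for w, hi in zip(w_o, hb))        # linear output unit
            err = y - t
            # Hidden deltas use the pre-update output weights, propagating the
            # error back through the sigmoid derivative h * (1 - h).
            deltas = [err * w_o[j] * h[j] * (1.0 - h[j]) for j in range(n_hidden)]
            for j in range(n_hidden + 1):
                w_o[j] -= lr * err * hb[j]
            for j in range(n_hidden):
                for i in range(n_in + 1):
                    w_h[j][i] -= lr * deltas[j] * xb[i]

    def predict(x):
        h = [sigmoid(sum(w * xi for w, xi in zip(ws, x + [1.0]))) for ws in w_h]
        return sum(w * hi for w, hi in zip(w_o, h + [1.0]))

    return predict

# XOR: a classic non-linearly-separable toy target for such networks.
xor = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0), ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]
predict = train_bp(xor)
```

The linear output unit makes this a regression-style network of the kind the excerpt describes; swapping the Gaussian activation in for the sigmoid would move it toward the radial basis variant.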