GNA

(redirected from Gauss-Newton algorithm)
Acronym    Definition
GNA    Ghana News Agency
GNA    Galanthus Nivalis Agglutinin
GNA    Global Names Architecture
GNA    Google News Alert
GNA    Geriatric Nursing Assistant
GNA    Georgia Nurses Association
GNA    Globewide Network Academy
GNA    Gendarmeria Nacional Argentina (Spanish: Argentine National Gendarmerie; Argentine security force)
GNA    Good Neighbor Award
GNA    Grand National Alliance (Pakistan)
GNA    Grounds for Non-Acceptance
GNA    Global Needs Assessment (UN High Commissioner for Refugees)
GNA    Greater Nanticoke Area (school district, Pennsylvania)
GNA    Gladstein, Neandross & Associates (environmental consulting firm; Santa Monica, CA)
GNA    Guam Nikkei Association (est. 2007)
GNA    Grassroots Netroots Alliance (Organic Consumers Fund)
GNA    Gone No Address (undelivered/returned mail)
GNA    Gauss-Newton algorithm
GNA    Grasonia, Nashville, and Ashdown Railroad Company
GNA    Great North Alliance (Minnesota)
GNA    Goldwater-Nichols DoD (Department of Defense) Reorganization Act (US DoD)
GNA    Guru Nanak Academy
References in periodicals archive
The Gauss-Newton algorithm is a well-known classical iterative algorithm for solving nonlinear least squares problems [13].
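A minimal sketch, in Python, of the iteration these excerpts refer to; the function names, stopping rule, and the use of a least-squares solve for the step are my assumptions, not details taken from any of the cited papers:

```python
import numpy as np

def gauss_newton(residual, jacobian, beta0, tol=1e-10, max_iter=50):
    """Minimal Gauss-Newton sketch for min_beta ||residual(beta)||^2.

    residual(beta) -> residual vector r; jacobian(beta) -> Jacobian J of r.
    All names and defaults here are illustrative assumptions.
    """
    beta = np.asarray(beta0, dtype=float)
    for _ in range(max_iter):
        r = residual(beta)
        J = jacobian(beta)
        # Gauss-Newton step: solve the linearized problem min ||J @ delta + r||,
        # equivalent to the normal equations (J^T J) delta = -J^T r.
        delta, *_ = np.linalg.lstsq(J, -r, rcond=None)
        beta = beta + delta
        if np.linalg.norm(delta) < tol:
            break
    return beta
```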
So, given the complementary characteristics of PSO and the Gauss-Newton algorithm, a combination of the two methods is proposed to solve the above nonlinear optimization problem [14].
Using gb as the initial value for the Gauss-Newton iteration, the solution steps are as follows (see the sketch below).
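A hedged sketch of what such a hybrid might look like: a bare-bones particle swarm supplies the global best gb, which then seeds the locally convergent Gauss-Newton iteration sketched earlier. The PSO coefficients and every other detail here are illustrative assumptions, not the scheme of [14]:

```python
import numpy as np

def particle_swarm(cost, lo, hi, n_particles=30, n_iter=100, seed=0):
    """Bare-bones PSO returning the global best gb (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))  # positions
    v = np.zeros_like(x)                                  # velocities
    pbest, pcost = x.copy(), np.array([cost(p) for p in x])
    gb = pbest[np.argmin(pcost)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2,) + x.shape)
        # Assumed coefficients: inertia 0.7, cognitive/social weights 1.5.
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gb - x)
        x = x + v
        c = np.array([cost(p) for p in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        gb = pbest[np.argmin(pcost)].copy()
    return gb

# Hybrid use: gb seeds the Gauss-Newton refinement sketched earlier, e.g.
# beta = gauss_newton(residual, jacobian, particle_swarm(cost, lo, hi))
```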
In this paper, we employ the Gauss-Newton algorithm to solve this problem [21].
If the loss function L(θ) decreases rapidly, λ takes a small value, and the LMA behaves much like the Gauss-Newton algorithm; if the loss function L(θ) decreases very slowly, λ can be increased, giving a step closer to the gradient descent direction.
If the reduction of L(d) is rapid, a smaller value of the damping factor λ can be used, bringing the algorithm closer to the Gauss-Newton algorithm, whereas if an iteration gives insufficient reduction in the residual, λ can be increased, giving a step closer to the gradient descent direction.
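The damping logic described in these two excerpts might be sketched as follows; the accept/reject rule and the factor-of-10 updates of λ are common textbook choices that I am assuming here, not details from the quoted sources:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, beta0, lam=1e-3, max_iter=100):
    """LM sketch: solve (J^T J + lam*I) delta = -J^T r with adaptive lam.

    Small lam -> near the Gauss-Newton step; large lam -> a short step along
    the negative gradient -J^T r. The update rule below is an assumed choice.
    """
    beta = np.asarray(beta0, dtype=float)
    cost = residual(beta) @ residual(beta)
    for _ in range(max_iter):
        r, J = residual(beta), jacobian(beta)
        delta = np.linalg.solve(J.T @ J + lam * np.eye(beta.size), -(J.T @ r))
        trial = beta + delta
        trial_cost = residual(trial) @ residual(trial)
        if trial_cost < cost:    # rapid reduction: accept step, shrink lam
            beta, cost, lam = trial, trial_cost, lam / 10.0
        else:                    # insufficient reduction: reject, grow lam
            lam *= 10.0
        if np.linalg.norm(delta) < 1e-10:
            break
    return beta
```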
The Gauss-Newton algorithm converges in 13 iterations, but to a very unsatisfactory approximation with a squared residual of 6.9470 (there are two poles inside the interval).
An important application of the Gauss-Newton algorithm is to parameter estimation problems in data analysis.
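As a concrete (entirely made-up) instance of such a parameter estimation problem, the gauss_newton sketch above can fit a two-parameter exponential model to data; the model, synthetic data, and starting point are all illustrative assumptions:

```python
import numpy as np

# Hypothetical data from y = 2.0 * exp(-1.5 * x) plus small noise.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x) + 0.01 * np.random.default_rng(0).normal(size=x.size)

def residual(beta):                 # r_i = model(x_i; beta) - y_i
    return beta[0] * np.exp(beta[1] * x) - y

def jacobian(beta):                 # columns: dr/dbeta0, dr/dbeta1
    e = np.exp(beta[1] * x)
    return np.column_stack([e, beta[0] * x * e])

beta_hat = gauss_newton(residual, jacobian, beta0=[1.0, -1.0])
# beta_hat should land close to (2.0, -1.5) for this synthetic data.
```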
which is a sum of squares in the nonlinear parameters β only, so that, at least formally, the Gauss-Newton algorithm can be applied.
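One standard way such a reduced sum of squares arises (my gloss; the excerpt does not show the derivation) is variable projection: in a separable model that is linear in α, the linear parameters can be eliminated exactly, leaving a residual that depends on β alone:

```latex
\min_{\alpha,\beta} \|y - \Phi(\beta)\,\alpha\|_2^2
\;\Longrightarrow\;
\hat\alpha(\beta) = \Phi(\beta)^{+} y,
\qquad
\min_{\beta} \bigl\| \bigl(I - \Phi(\beta)\Phi(\beta)^{+}\bigr) y \bigr\|_2^2 .
```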
This quantity determines the first-order convergence multiplier of the Gauss-Newton algorithm. The key to the good large-sample behaviour is the result