backpropagation



back·prop·a·ga·tion

 (băk′prŏp′ə-gā′shən)
n.
A common method of training a neural net in which the initial system output is compared to the desired output, and the system is adjusted until the difference between the two is minimized.
American Heritage® Dictionary of the English Language, Fifth Edition. Copyright © 2016 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
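As a concrete illustration of the training loop the definition describes, here is a minimal NumPy sketch: a forward pass produces the system's output, the output is compared to the desired output, and the error is propagated backwards to adjust the weights. The toy XOR task, network size, and learning rate are illustrative assumptions, not part of the entry.

```python
import numpy as np

# Toy task and network size are illustrative assumptions (XOR).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))  # input -> hidden weights
W2 = rng.normal(size=(4, 1))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass: the current system output.
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # Compare to the desired output.
    err = out - y

    # Backward pass: propagate the error through each layer.
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Adjust the system to shrink the difference.
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

print(out.round(2))  # typically close to [[0], [1], [1], [0]]
```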
References in periodicals archive
For example, the heuristics within the genetic algorithm are inspired by the natural process of evolution; deep learning is inspired by the human brain and built using a heuristic-based backpropagation algorithm; and the simulated annealing algorithm uses heuristics inspired by the heat-treatment process used in metallurgy to introduce structural changes within metals.
Take, for instance, backpropagation, a core method used to train deep (multilayer) neural network (DNN) models, which attempt to learn complex data representations by mapping inputs to outputs (i.e., predictions) through mathematical manipulations at multiple layers.
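That "mapping inputs to outputs through mathematical manipulations at multiple layers" is what modern frameworks automate. A minimal PyTorch sketch, in which the layer sizes, random data, and learning rate are illustrative assumptions: the forward pass produces predictions, and loss.backward() backpropagates the error through every layer.

```python
import torch

# A small two-layer model; sizes and data are illustrative assumptions.
model = torch.nn.Sequential(
    torch.nn.Linear(8, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 1),
)
x = torch.randn(32, 8)        # inputs
target = torch.randn(32, 1)   # desired outputs

pred = model(x)                                    # forward: inputs -> predictions
loss = torch.nn.functional.mse_loss(pred, target)  # difference to minimize
loss.backward()                                    # backpropagation through all layers
with torch.no_grad():
    for p in model.parameters():
        p -= 0.01 * p.grad                         # adjust each layer's weights
        p.grad = None                              # clear gradients for the next step
```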
* An evolutionary ANN that uses genetic algorithms for topology design and integrates the backpropagation algorithm with genetic algorithms for network learning.
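One plausible reading of such a hybrid, sketched below with scikit-learn's MLPClassifier as the backpropagation-trained inner learner: a genetic loop searches over topologies while backpropagation fits each candidate's weights. The population size, mutation scheme, and candidate layer widths are illustrative assumptions, not the cited system's design.

```python
import random
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

random.seed(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
Xtr, Xval, ytr, yval = train_test_split(X, y, random_state=0)

def fitness(hidden):
    """Train a candidate topology with backpropagation; score on held-out data."""
    net = MLPClassifier(hidden_layer_sizes=hidden, max_iter=300, random_state=0)
    net.fit(Xtr, ytr)
    return net.score(Xval, yval)

# Population of candidate topologies (tuples of hidden-layer widths).
pop = [tuple(random.choice([4, 8, 16, 32])
             for _ in range(random.randint(1, 3))) for _ in range(6)]

for _ in range(3):  # a few generations
    survivors = sorted(pop, key=fitness, reverse=True)[:3]
    # Next generation: survivors plus mutated copies (widths nudged by +/-4).
    pop = survivors + [tuple(max(2, w + random.choice([-4, 0, 4])) for w in s)
                       for s in survivors]

print("best topology:", max(pop, key=fitness))
```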
For a period of roughly 10 years starting in the mid-1980s, the field of machine learning exploded with the advent of nonlinear neural networks, backpropagation, the work of John Holland in genetic algorithms, work in Hidden Markov Models, and precursor work for neuromorphic computing performed by Carver Mead.
In Optica, The Optical Society's journal for high-impact research, Stanford University researchers report a method for training these networks directly in the device by implementing an optical analogue of the backpropagation algorithm, the standard way to train conventional neural networks.
Much of this machine learning comes from a process called backpropagation, originally outlined by Geoffrey Hinton. Hinton has since dubbed the method a failed experiment, stating that this form of iterative machine learning wouldn't lead to the development of a true AI.
For training multilayer perceptron networks, the backpropagation algorithm is traditionally adopted to optimize predictive capacity.
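For reference, scikit-learn's MLPClassifier fits a multilayer perceptron exactly this way, with backpropagation driving an iterative optimizer; the synthetic data and hyperparameters below are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Synthetic data stands in for a real dataset (assumption).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), solver='sgd',
                    learning_rate_init=0.05, max_iter=500, random_state=0)
clf.fit(X, y)  # each iteration backpropagates the loss and updates the weights

print(f"training accuracy: {clf.score(X, y):.2f}")
print(f"final loss: {clf.loss_curve_[-1]:.4f}")  # loss shrinks over iterations
```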
LMS tracked the goat in 98 sec, whereas RTM tracked it in 107 sec and backpropagation in 101 sec.
The data, after being extracted, is processed by both a supervised-learning backpropagation algorithm [9] and an SOM (Self-Organising Map) [10].
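The backpropagation stage is sketched earlier; for the SOM stage, a minimal Kohonen update rule in NumPy might look as follows. The map size, neighbourhood function, and placeholder features are illustrative assumptions, not the cited pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
features = rng.random((200, 3))  # placeholder for the extracted data

grid = rng.random((5, 5, 3))     # 5x5 map of 3-D prototype vectors
for t, x in enumerate(features):
    lr = 0.5 * (1 - t / len(features))  # decaying learning rate
    # Best-matching unit: the prototype closest to the sample.
    d = np.linalg.norm(grid - x, axis=2)
    bi, bj = np.unravel_index(d.argmin(), d.shape)
    # Pull the BMU and its grid neighbours toward the sample.
    for i in range(5):
        for j in range(5):
            dist2 = (i - bi) ** 2 + (j - bj) ** 2
            grid[i, j] += lr * np.exp(-dist2 / 2.0) * (x - grid[i, j])
```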