What is meant by back propagation in an ANN compared to a biological neural network?

This article examines the back-propagation algorithm in artificial neural networks and its role in training networks with multiple hidden layers, and compares it with the signal back propagation observed during learning in biological neurons. It analyses the similarities and differences between learning in biological neurons and back propagation in artificial neural networks, and discusses biologically plausible equivalents of back propagation, such as Contrastive Hebbian Learning. It also gives a brief overview of applications of back propagation and its relationship to biological processes.


Reposted from Quora: https://www.quora.com/What-is-meant-by-back-propagation-in-an-ANN-compared-to-a-biological-neural-network

What is back propagation in an artificial neural network?


  • Back propagation in an artificial neural network (ANN) is a method of training a network that contains hidden neurons (i.e., a network with one or more hidden layers). Using training data for which both the input and the desired output are known, the difference (error) between the desired output and the actual output is computed and propagated back through the hidden layers of the network, and the connection weights are adjusted so as to reduce that error. [1] A minimal numerical sketch of this update is given below.
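
The following is a minimal sketch of that idea on a tiny network with one hidden layer. The layer sizes, sigmoid activations, squared-error loss, and learning rate are illustrative assumptions for the example, not details taken from the source.

```python
# A minimal sketch of error back propagation on a network with one hidden layer.
# Sizes, activations, and learning rate are illustrative choices.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))   # input (2 units) -> hidden (3 units)
W2 = rng.normal(scale=0.5, size=(1, 3))   # hidden (3 units) -> output (1 unit)
lr = 0.5

x = np.array([0.1, 0.9])   # known input
t = np.array([1.0])        # known (desired) output

for step in range(1000):
    # forward pass
    h = sigmoid(W1 @ x)    # hidden activations
    y = sigmoid(W2 @ h)    # actual output

    # error at the output layer
    delta_out = (y - t) * y * (1 - y)

    # propagate the error back into the hidden layer
    delta_hid = (W2.T @ delta_out) * h * (1 - h)

    # adjust the weights so the output error decreases
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_hid, x)
```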

What is the role of signal back propagation observed within some types of biological neurons?

  • Learning in biological neurons, which essentially boils down to the strengthening or weakening of connections between neurons, in some instances involves a form of signal back propagation that assists in adjusting those connection strengths. This back-propagating signal, however, only influences the connection between adjacent neurons, unlike in an ANN, where the error is propagated back across all layers of the network. When a biological neuron A is near enough to excite neuron B, the firing of A triggers a back-propagating signal within the excited neuron B that serves to reinforce (or weaken) the connection strength between A and B (a local update of the kind sketched below). [2][11]
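
To make the contrast with ANN back propagation concrete, the sketch below implements a local, spike-timing-dependent update of the kind this mechanism supports: the A-to-B weight changes only as a function of when A and B themselves fire, with no error signal relayed from distant layers. The time constants and amplitudes are illustrative, not values from the cited papers.

```python
# A local, spike-timing-dependent plasticity (STDP)-style update: the change in
# the A->B connection depends only on the relative timing of A's and B's spikes.
# Amplitudes (a_plus, a_minus) and the time constant (tau) are illustrative.
import math

def stdp_weight_change(t_pre, t_post, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Change in the A->B connection strength given one pre-synaptic spike
    (neuron A) at t_pre and one post-synaptic spike (neuron B) at t_post,
    with times in milliseconds."""
    dt = t_post - t_pre
    if dt > 0:       # A fired just before B: potentiate (strengthen)
        return a_plus * math.exp(-dt / tau)
    elif dt < 0:     # A fired after B: depress (weaken)
        return -a_minus * math.exp(dt / tau)
    return 0.0

w_ab = 0.3
w_ab += stdp_weight_change(t_pre=10.0, t_post=15.0)   # strengthens A->B
w_ab += stdp_weight_change(t_pre=30.0, t_post=22.0)   # weakens A->B
print(round(w_ab, 4))
```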

Is there a biological equivalent to ANN back propagation?

  • There have been many papers presenting biologically plausible mechanisms for ANN back propagation, while others argue that it is biologically implausible. [3], [4], [5], [9], [10]
  • One argument proposes that ANN back propagation and a form of learning (Contrastive Hebbian Learning), presumed to occur in a part of the brain, may be equivalent in principle. However, this form of learning does not require back propagation of errors: the output is clamped at the desired levels, and the effect of this clamping is allowed to spread through feedback connections across the entire network (see the two-phase sketch after this list). [6], [7], [8]
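
The sketch below illustrates the two-phase idea behind Contrastive Hebbian Learning on a single hidden layer with symmetric feedback weights: a "free" phase in which the network settles on its own, and a "clamped" phase in which the output is held at the target and the effect spreads back through the feedback connections. The weight update uses only the difference of local activity products; no error signal is back-propagated. The settling scheme, feedback gain, and learning rate are illustrative assumptions, not details taken from [6].

```python
# A minimal sketch of Contrastive Hebbian Learning with one hidden layer and
# symmetric feedback weights (feedback = W2.T). Settling steps, feedback gain,
# and learning rate are illustrative choices.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def settle(x, W1, W2, y_clamp=None, steps=30, gamma=0.5):
    """Let hidden/output activities settle under feedforward plus feedback input.
    If y_clamp is given, the output units are held ('clamped') at that value."""
    h = np.zeros(W1.shape[0])
    y = np.zeros(W2.shape[0])
    for _ in range(steps):
        y = y_clamp if y_clamp is not None else sigmoid(W2 @ h)
        h = sigmoid(W1 @ x + gamma * (W2.T @ y))   # feedback from the output layer
    return h, y

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(3, 2))
W2 = rng.normal(scale=0.5, size=(1, 3))
lr, x, t = 0.2, np.array([0.1, 0.9]), np.array([1.0])

for step in range(200):
    h_free, y_free = settle(x, W1, W2)                # free ("minus") phase
    h_clamp, y_clamp = settle(x, W1, W2, y_clamp=t)   # clamped ("plus") phase
    # contrastive Hebbian update: plus-phase products minus minus-phase products
    W2 += lr * (np.outer(y_clamp, h_clamp) - np.outer(y_free, h_free))
    W1 += lr * (np.outer(h_clamp, x) - np.outer(h_free, x))
```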




Figure 1. Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network, Neural Computation 2003 [Open Access]




References

  1. R. Rojas, Neural Networks, Chapter 7, 1996 [Open Access]. A detailed introduction to back propagation.
  2. A synaptically controlled, associative signal for Hebbian plasticity in hippocampal neurons, Science, 1997
  3. Frequency-based error back-propagation in a cortical network [Open Access]
  4. A more biologically plausible learning rule than back propagation applied to a network model of cortical area 7A, PNAS, 1991 [Open Access]
  5. Biologically Plausible Error-driven Learning using Local Activation Differences: The Generalized Recirculation Algorithm, CMU, 1996 [Open Access]
  6. Equivalence of Backpropagation and Contrastive Hebbian Learning in a Layered Network, Neural Computation, 2003 [Open Access]
  7. Contrastive Hebbian learning [Open Access]
  8. Conditional Routing of Information to the Cortex: A Model of the Basal Ganglia's Role in Cognitive Coordination [Open Access]
  9. Backpropagating action potentials in neurones: measurement, mechanisms and potential function, Progress in Biophysics and Molecular Biology, 2005 [Open Access]
  10. Non-Hebbian spike-timing-dependent plasticity in cerebellar circuits, Neural Circuits, 2012 [Open Access]
  11. Regulation of Synaptic Efficacy by Coincidence of Postsynaptic APs and EPSPs, Science, 1997 [Open Access]