 
  
  
  
  
-  The error $(t - o)$ in the delta rule is replaced by the error term $\delta$.
-  For output unit $k$: $\delta_k = o_k(1 - o_k)(t_k - o_k)$. It is the familiar $(t_k - o_k)$ from the delta rule multiplied by $o_k(1 - o_k)$, which is the derivative of the sigmoid squashing function.
-  For hidden unit $h$: the derivative component is the same, but there is no target value directly available, so you sum the error terms $\delta_k$ for each output unit $k$ influenced by $h$, weighting each $\delta_k$ by the weight $w_{kh}$ from the hidden unit $h$ to the output unit $k$: $\delta_h = o_h(1 - o_h)\sum_{k} w_{kh}\,\delta_k$.
-  This weight $w_{kh}$ characterizes the degree to which each hidden unit $h$ is responsible for the error in output unit $k$.
 
Patricia Riddle 
Fri May 15 13:00:36 NZST 1998