uses back propagation!
uses the hyperbolic tangent as the activation function; its derivative is d/dx tanh(x) = 1 - tanh(x)^2 (equivalently 4/(e^x + e^-x)^2)
uses gradient descent with no momentum (because it's not necessary for 1 neuron :)
source: http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html
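
Below is a minimal sketch of what the points above amount to for a single tanh neuron: a forward pass, backpropagation of the error through tanh'(z) = 1 - tanh(z)^2, and a plain gradient-descent update with no momentum term. The training data, learning rate, and variable names are illustrative assumptions, not taken from this repo.

```python
# Minimal single-neuron sketch: tanh activation, backpropagation,
# plain gradient descent (no momentum). Hyperparameters are assumptions.
import math
import random

def tanh_derivative(z):
    # d/dz tanh(z) = 1 - tanh(z)^2  (equivalently 4 / (e^z + e^-z)^2)
    t = math.tanh(z)
    return 1.0 - t * t

# Toy data (assumed): learn y = tanh(2*x1 - x2) from random samples.
random.seed(0)
data = [((x1, x2), math.tanh(2 * x1 - x2))
        for x1, x2 in [(random.uniform(-1, 1), random.uniform(-1, 1))
                       for _ in range(50)]]

# One neuron: two weights and a bias.
w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
b = 0.0
lr = 0.1  # learning rate; no momentum term anywhere

for epoch in range(200):
    for (x1, x2), target in data:
        # Forward pass
        z = w[0] * x1 + w[1] * x2 + b
        y = math.tanh(z)
        # Backward pass: dE/dz for squared error E = (y - target)^2 / 2
        delta = (y - target) * tanh_derivative(z)
        # Plain gradient descent update
        w[0] -= lr * delta * x1
        w[1] -= lr * delta * x2
        b    -= lr * delta
```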