Wikipedia:Reference desk/Archives/Computing/2019 June 24



June 24

neural net question

Hi. I wonder if anyone can point me in the direction of a good resource that will help me understand how to implement backpropagation in the simple neural net simulator that I am trying to write.

I have an arbitrary number of layers in my net; between adjacent layers with n units feeding m units there are n·m connections. The input of one layer is the output of the previous one, and the units use a sigmoid activation function. I'm having a bit of trouble figuring out how to calculate the delta for each of the weights in the hidden layers. Do I have this correct:

In the output layer, a weight is increased by the corresponding input × learning rate × (expected − actual output)² × the derivative of the sigmoid (meaning output × (1 − output)).

In the hidden layers, the (expected − actual) term is replaced by the sum of the error terms in the previous layer.

Am I in the neighborhood of the correct answer?

Thanks!!

Duomillia (talk) 00:40, 24 June 2019 (UTC)[reply]
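For comparison, here is the standard textbook form of those updates for the squared-error loss E = ½(t − y)², written in LaTeX notation. The symbols (t for the target, y for the sigmoid output, η for the learning rate, h for a hidden activation) are chosen here for illustration, not taken from the post above:

\Delta w = \eta \,(t - y)\, y(1 - y)\, x \qquad \text{(output layer)}

\delta_h = h(1 - h) \sum_k w_{hk}\, \delta_k, \qquad \Delta w_{ih} = \eta\, \delta_h\, x_i \qquad \text{(hidden layer)}

Note that differentiating the squared loss leaves the error factor (t − y) linear in the update: the square lives in the loss, not in the gradient.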

If you're taking a class, I would ask the instructor, as that's rather too specific a Q to be likely to find an answer here. There may even be multiple answers/techniques, in which case you will want to use the one your instructor recommends. SinisterLefty (talk) 17:58, 24 June 2019 (UTC)[reply]
The article Artificial neural network has sections about Hebbian learning and Backpropagation. The main article Backpropagation explains training methods using the derivatives of activation functions that are known at design time. DroneB (talk) 18:05, 24 June 2019 (UTC)[reply]
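As a concrete illustration of those design-time derivatives, here is a minimal NumPy sketch of one backpropagation step for a fully connected net with sigmoid activations. The layer sizes, learning rate, weight initialisation, and the toy training pair are all assumptions made for this example, not taken from the question:

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative shape: 3 inputs -> 4 hidden -> 2 outputs (an assumption).
sizes = [3, 4, 2]
weights = [rng.normal(0.0, 0.5, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(m) for m in sizes[1:]]
lr = 0.5  # learning rate, chosen arbitrarily

def train_step(x, target):
    # Forward pass, keeping every layer's activation for the backward pass.
    activations = [x]
    for W, b in zip(weights, biases):
        activations.append(sigmoid(W @ activations[-1] + b))
    y = activations[-1]

    # Output-layer delta for E = 1/2 * (target - y)^2:
    # the squared loss differentiates to a *linear* (target - y) factor.
    delta = (target - y) * y * (1.0 - y)

    for i in reversed(range(len(weights))):
        a_prev = activations[i]
        W = weights[i]
        # Hidden-layer delta: weighted sum of the downstream deltas, times
        # the derivative of this layer's sigmoid output. Computed with the
        # old weights, before they are updated.
        prev_delta = (W.T @ delta) * a_prev * (1.0 - a_prev)
        # Each weight moves by lr * (its input) * (its unit's delta).
        weights[i] = W + lr * np.outer(delta, a_prev)
        biases[i] = biases[i] + lr * delta
        delta = prev_delta

# One toy update: nudge the net toward mapping [0, 1, 0] to [1, 0].
train_step(np.array([0.0, 1.0, 0.0]), np.array([1.0, 0.0]))

Keeping the per-layer activations from the forward pass is what lets the backward loop run in a single sweep; note also that each hidden delta is computed from the pre-update weights of the layer above it.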