


Question

The multi-layer network below is being trained using backpropagation. The current input/output pair is x_p(k) = (1.0, 1.0, 1.0)^t and d_p(k) = (0.0, 0.5, 1.0)^t. The weights and node outputs are given in the table below (for ease of making the table, the letter "w" is used for all weights, with the nodes numbered sequentially). Assume the logistic sigmoid activation function for each node with a = 1. Using a learning rate of 0.1 and a momentum term of 0.4, compute delta_6(k), delta_4(k), and w_64(k+1), assuming that the previous weight change was 0.08. First, write out the backpropagation formulas.

Explanation / Answer

Start with the backpropagation formulas. For the logistic activation f(net) = 1 / (1 + e^(-a*net)) with a = 1, the derivative can be written in terms of the node's output y as f'(net) = y(1 - y). The deltas and the momentum-based weight update are then:

Output node j: delta_j(k) = (d_j - y_j) * y_j * (1 - y_j)

Hidden node j: delta_j(k) = y_j * (1 - y_j) * sum over downstream nodes m of [delta_m(k) * w_mj(k)]

Weight update with momentum: Delta_w_ji(k) = eta * delta_j(k) * y_i(k) + alpha * Delta_w_ji(k-1), and w_ji(k+1) = w_ji(k) + Delta_w_ji(k)

With the given values eta = 0.1, alpha = 0.4, and Delta_w_64(k-1) = 0.08, the requested update is

w_64(k+1) = w_64(k) + 0.1 * delta_6(k) * y_4(k) + 0.4 * 0.08

Here delta_6(k) comes from the output-node formula (using the target for node 6 taken from d_p(k)) and delta_4(k) comes from the hidden-node formula, summing delta_m * w_m4 over the output nodes that node 4 feeds. Since the table of weights and node outputs is not reproduced here, the numeric values of delta_6(k), delta_4(k), and w_64(k+1) depend on those table entries.
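The steps above can be sketched in code. This is a minimal illustration of one backprop-with-momentum update for a single weight; the node outputs y6 and y4, the current weight w64, and the assumption that node 4 feeds only node 6 are all hypothetical placeholders, since the question's table is not reproduced here. The learning rate, momentum term, and previous weight change are the values given in the question.

```python
def logistic_deriv(y):
    """Derivative of the logistic sigmoid (a = 1), written via the output y."""
    return y * (1.0 - y)

def output_delta(d, y):
    """delta_j for an output node j: (d_j - y_j) * y_j * (1 - y_j)."""
    return (d - y) * logistic_deriv(y)

def hidden_delta(y, downstream):
    """delta_j for a hidden node j: y_j(1 - y_j) * sum_m delta_m * w_mj.

    `downstream` is a list of (delta_m, w_mj) pairs for the nodes m fed by j.
    """
    return logistic_deriv(y) * sum(dm * wmj for dm, wmj in downstream)

def weight_update(w, eta, delta_j, y_i, alpha, prev_dw):
    """w_ji(k+1) = w_ji(k) + eta*delta_j*y_i + alpha*Delta_w_ji(k-1)."""
    dw = eta * delta_j * y_i + alpha * prev_dw
    return w + dw, dw

# Values given in the question:
eta, alpha, prev_dw = 0.1, 0.4, 0.08
d6 = 0.0   # target for node 6, assumed to be the first entry of d_p(k)

# HYPOTHETICAL table values (not from the question -- its table is missing here):
y6, y4, w64 = 0.8, 0.6, 0.35

delta6 = output_delta(d6, y6)                      # -0.128
# Assumes node 4 feeds only node 6; in the real problem, sum over all
# output nodes that node 4 connects to.
delta4 = hidden_delta(y4, [(delta6, w64)])         # -0.010752
w64_next, dw64 = weight_update(w64, eta, delta6, y4, alpha, prev_dw)
print(delta6, delta4, w64_next)                    # dw64 = 0.02432, w64_next = 0.37432
```

Note that the momentum term (0.4 * 0.08 = 0.032) dominates the gradient term here, so the weight still moves in the direction of the previous update even though the gradient contribution is negative.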