This is a machine learning problem. Write down the update rule for Stochastic Gradient Descent.
Question
This is a machine learning problem.
Write down the update rule for Stochastic Gradient Descent.
The weight vector is w ∈ R^n
x(i) is the ith sample from the training data
The loss function is L(w, x) (assume it is continuous and differentiable)
Observe example x(i) and predict using the current hypothesis w
Then update the weight vector
Express the new weight vector as a function of w, L(w, x(i)), and the learning rate R
w <- ...
Explanation / Answer
w(t) <- w(t-1) - R * ∂L(w(t-1), x(i)) / ∂w

where ∂L/∂w is the gradient (the vector of partial derivatives) of the loss L with respect to the weight vector w, evaluated at the current weights w(t-1) on the single observed sample x(i), and R is the learning rate.
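As a concrete illustration, here is a minimal sketch of one SGD step in Python. The function names (sgd_step, grad_squared_error) and the squared-error loss are assumptions made for this example, not part of the original problem; any continuous, differentiable loss with a gradient function would plug in the same way.

```python
import numpy as np

def sgd_step(w, x_i, grad_L, R):
    """One stochastic gradient descent update: w <- w - R * dL(w, x_i)/dw.

    w      : current weight vector, shape (n,)
    x_i    : a single training sample
    grad_L : function returning the gradient of L(w, x) with respect to w
    R      : learning rate
    """
    return w - R * grad_L(w, x_i)

# Example gradient for an assumed squared-error loss
# L(w, x) = 0.5 * (w . features - target)^2,
# where the last entry of each sample is the target value.
def grad_squared_error(w, x_i):
    features, target = x_i[:-1], x_i[-1]
    residual = features @ w - target
    return residual * features

w = np.zeros(3)                          # initial hypothesis
x_i = np.array([1.0, 2.0, -1.0, 0.5])    # one sample: 3 features + target
w = sgd_step(w, x_i, grad_squared_error, R=0.1)
```

In practice this step is applied repeatedly while looping over shuffled training samples (or mini-batches), which is exactly the update w <- w - R * ∂L(w, x(i))/∂w performed once per observed example.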