Problem 6: (20 points) Suppose we wish to predict the values of a random variable…
Question
Problem 6: (20 points) Suppose we wish to predict the values of a random variable Y by observing the values of another random variable X. In particular, the available data in Figure 1 suggest that a good prediction model for Y is the linear predictor YP = αX + β. Now, although Y may be related to X, the values it takes on may be influenced by other sources that do not affect X. Thus, in general, |ρ_{X,Y}| ≠ 1, and we expect that there will be an error between the predicted value of Y, that is, YP, and the value that Y actually assumes. Determine the coefficients α and β that minimize the mean-square error E[(Y − YP)²]. Hint: if x* is a minimizer of a function f(x), then f′(x*) = 0.
Explanation / Answer
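For reference, the quantity to be minimized, restated in display form (the derivation below works with its sample analogue over the observed data pairs):

```latex
\min_{\alpha,\beta}\; \mathbb{E}\!\left[(Y - Y_P)^2\right]
  \;=\; \min_{\alpha,\beta}\; \mathbb{E}\!\left[(Y - \alpha X - \beta)^2\right]
```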
Let a and b be the estimates of α and β, respectively, that minimize the mean-square error E[(Y − YP)²]. Working from the n observed data pairs (xi, yi), this amounts to minimizing the sum of squared errors S defined below.
Then the predicted value yP of y at a given value x is
yP = a·x + b,   (1)
and the sum of squared errors is
S = Σ_{i=1..n} (yi − a·xi − b)².   (2)
The values of a and b that minimize S are the least-squares estimates.
The partial derivative of S with respect to b is
∂S/∂b = −2 Σ_{i=1..n} (yi − a·xi − b).
Setting this equal to zero and solving for b gives n·b = Σ_{i=1..n} yi − a·Σ_{i=1..n} xi, and dividing both sides by n,
b = ȳ − a·x̄,   (3)
so the fitted line passes through the point of means (x̄, ȳ).
Substituting (3) into (2):
S = Σ_{i=1..n} (yi − ȳ + a·x̄ − a·xi)² = Σ_{i=1..n} [(yi − ȳ) − a(xi − x̄)]².
The partial derivative of S with respect to a is
∂S/∂a = −2 Σ_{i=1..n} [(yi − ȳ) − a(xi − x̄)](xi − x̄) = −2 Σ_{i=1..n} (yi − ȳ)(xi − x̄) + 2a·Σ_{i=1..n} (xi − x̄)².   (4)
Setting (4) equal to zero and solving for a,
a = Σ_{i=1..n} (yi − ȳ)(xi − x̄) / Σ_{i=1..n} (xi − x̄)².   (5)
Writing Sxy = Σ_{i=1..n} (yi − ȳ)(xi − x̄) and Sxx = Σ_{i=1..n} (xi − x̄)², this becomes
a = Sxy / Sxx.   (6)
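Note that Sxy and Sxx are n times the sample covariance of X and Y and the sample variance of X (up to the choice of divisor). As a standard fact not spelled out in the original answer, these sample estimates mirror the population-level coefficients that minimize E[(Y − αX − β)²]:

```latex
\alpha = \frac{\operatorname{Cov}(X,Y)}{\operatorname{Var}(X)}
       = \rho_{X,Y}\,\frac{\sigma_Y}{\sigma_X},
\qquad
\beta = \mathbb{E}[Y] - \alpha\,\mathbb{E}[X]
```

which also explains the role of ρ_{X,Y} mentioned in the problem statement.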
Thus, the estimators of α and β are, respectively,
a = Sxy / Sxx (the slope) and b = ȳ − a·x̄ (the intercept).
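As a quick numerical sanity check (a minimal sketch; the data values below are made up for illustration and are not the data from Figure 1), the closed-form estimates can be compared against numpy.polyfit, which fits the same least-squares line:

```python
import numpy as np

# Illustrative data only -- assumed for this sketch, not taken from Figure 1.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

xbar, ybar = x.mean(), y.mean()

# Sxy = sum of (yi - ybar)(xi - xbar), Sxx = sum of (xi - xbar)^2
Sxy = np.sum((y - ybar) * (x - xbar))
Sxx = np.sum((x - xbar) ** 2)

a = Sxy / Sxx          # slope estimate, equation (6)
b = ybar - a * xbar    # intercept estimate, equation (3)

# numpy.polyfit with degree 1 returns [slope, intercept] of the least-squares line.
slope, intercept = np.polyfit(x, y, 1)

print(a, b)                # closed-form estimates
print(slope, intercept)    # should agree up to floating-point error
```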