True or False: In the simple regression model, both ordinary least squares (OLS) and Method of Moments estimators produce identical estimates. Explain.
The likelihood function associated with the sample is just
$$L(X_1, \ldots, X_n) = \prod_{i=1}^{n} f(X_i; \lambda_1, \ldots, \lambda_k).$$
For example, if the distribution is $N(\mu, \sigma^2)$, then
$$L(X_1, \ldots, X_n; \hat{\mu}, \hat{\sigma}^2) = \frac{1}{(2\pi)^{n/2}\,\hat{\sigma}^n} \exp\left( -\frac{1}{2\hat{\sigma}^2} \left[ (X_1 - \hat{\mu})^2 + \cdots + (X_n - \hat{\mu})^2 \right] \right).$$
The likelihood function measures how likely $(X_1, \ldots, X_n)$ is to have come from the distribution under particular values of the hidden parameters; the more likely this is, the closer one would expect those particular choices of hidden parameters to be to the true values. The MLE produces a biased estimator.
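To make the last claim concrete (this is the standard normal-sample computation, added here rather than taken from the original answer): maximizing the likelihood above over $\hat{\mu}$ and $\hat{\sigma}^2$ gives
$$\hat{\mu} = \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (X_i - \bar{X})^2,$$
and taking expectations,
$$E\left[\hat{\sigma}^2\right] = \frac{n-1}{n}\,\sigma^2 \neq \sigma^2,$$
so the MLE of $\sigma^2$ is biased; the unbiased sample variance divides by $n-1$ instead of $n$.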
The error for the line is
$$E := \sum_{i=1}^{n} (Y_i - \lambda_1 X_i - \lambda_2)^2.$$
Setting the partial derivatives to zero gives the normal equations
$$0 = \frac{\partial E}{\partial \lambda_1} = \sum_{i=1}^{n} 2(Y_i - \lambda_1 X_i - \lambda_2)(-X_i), \qquad 0 = \frac{\partial E}{\partial \lambda_2} = \sum_{i=1}^{n} 2(Y_i - \lambda_1 X_i - \lambda_2)(-1).$$
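Solving these two equations (a standard step, written out here for completeness rather than taken from the original answer) gives the familiar least-squares slope and intercept:
$$\hat{\lambda}_1 = \frac{\sum_{i=1}^{n}(X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n}(X_i - \bar{X})^2}, \qquad \hat{\lambda}_2 = \bar{Y} - \hat{\lambda}_1 \bar{X},$$
where $\bar{X}$ and $\bar{Y}$ denote the sample means.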
In the same way, $y = \lambda_1 f_1(x) + \cdots + \lambda_k f_k(x)$ can be made a best-fit curve to a set of data points $(X_1, Y_1), \ldots, (X_n, Y_n)$; an example of the general normal equations follows below.
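For this general curve fit (the same differentiation as above, spelled out here for clarity), setting $\partial E / \partial \lambda_m = 0$ for each $m$ gives $k$ linear normal equations in $\lambda_1, \ldots, \lambda_k$:
$$\sum_{i=1}^{n} \left( Y_i - \sum_{j=1}^{k} \lambda_j f_j(X_i) \right) f_m(X_i) = 0, \qquad m = 1, \ldots, k.$$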
Thus the two estimates are not equal, so the answer is false.
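As a minimal numerical sketch of the distinction the answer leans on (this illustration is added here and is not part of the original solution; the data and parameter values are made up): the least-squares line can be computed directly from the normal equations above, while the likelihood-based (MLE) estimate of the error variance divides the residual sum of squares by $n$, whereas the unbiased estimate divides by $n-2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: Y = 2*X + 1 + noise (illustrative values, not from the original problem)
n = 50
X = rng.uniform(0, 10, size=n)
Y = 2.0 * X + 1.0 + rng.normal(0.0, 1.5, size=n)

# Least-squares slope and intercept from the normal equations
Xbar, Ybar = X.mean(), Y.mean()
lam1 = np.sum((X - Xbar) * (Y - Ybar)) / np.sum((X - Xbar) ** 2)  # slope
lam2 = Ybar - lam1 * Xbar                                         # intercept

# Residual sum of squares for the fitted line
resid = Y - (lam1 * X + lam2)
ssr = np.sum(resid ** 2)

# MLE of the error variance divides by n (biased downward);
# the usual unbiased estimate divides by n - 2 (two fitted parameters).
sigma2_mle = ssr / n
sigma2_unbiased = ssr / (n - 2)

print(f"slope = {lam1:.3f}, intercept = {lam2:.3f}")
print(f"MLE variance estimate      = {sigma2_mle:.3f}")
print(f"Unbiased variance estimate = {sigma2_unbiased:.3f}")
```

For any finite $n$ the MLE variance estimate is systematically smaller than the unbiased one, which is the bias referred to above.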