Question

True or False: In the simple regression model, both ordinary least squares (OLS) and Method of Moments estimators produce identical estimates. Explain.

Answer #1

The likelihood function associated to the sample is just L(X_1, \dots, X_n) = \prod_{i=1}^n f(X_i; \lambda_1, \dots, \lambda_k). For example, if the distribution is N(\mu, \sigma^2), then

L(X_1, \dots, X_n; \hat{\mu}, \hat{\sigma}^2) = \frac{1}{(2\pi)^{n/2}\hat{\sigma}^n} \exp\!\left(-\frac{1}{2\hat{\sigma}^2}\bigl[(X_1 - \hat{\mu})^2 + \cdots + (X_n - \hat{\mu})^2\bigr]\right).

The likelihood function measures how likely (X_1, \dots, X_n) is to have come from the distribution assuming particular values for the hidden parameters; the more likely this is, the closer one would expect those particular choices of hidden parameters to be to the true values. Note that the MLE can produce a biased estimator (for instance, the MLE of \sigma^2 in the normal model).
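The bias remark can be made concrete for the normal model above: maximizing the likelihood in \sigma^2 gives an estimator with divisor n, whose expectation falls short of \sigma^2:

```latex
\hat{\sigma}^2_{\mathrm{MLE}} = \frac{1}{n}\sum_{i=1}^{n} (X_i - \bar{X})^2,
\qquad
\mathbb{E}\!\left[\hat{\sigma}^2_{\mathrm{MLE}}\right] = \frac{n-1}{n}\,\sigma^2 \;<\; \sigma^2 .
```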

The error (sum of squared residuals) is E := \sum_{i=1}^n (Y_i - \lambda_1 X_i - \lambda_2)^2.

Setting the partial derivatives to zero gives the normal equations:

0 = \frac{\partial E}{\partial \lambda_1} = \sum_{i=1}^n 2(Y_i - \lambda_1 X_i - \lambda_2)(-X_i),
0 = \frac{\partial E}{\partial \lambda_2} = \sum_{i=1}^n 2(Y_i - \lambda_1 X_i - \lambda_2)(-1).

so y = \lambda_1 f_1(x) + \cdots + \lambda_k f_k(x) is a best-fit curve to a set of data points (X_1, Y_1), \dots, (X_n, Y_n) (here k = 2, with f_1(x) = x and f_2(x) = 1).

Dividing each normal equation by n shows they are exactly the sample analogues of the moment conditions E[u] = 0 and E[Xu] = 0, so the Method of Moments estimator solves the same two equations as OLS. Thus both estimates are equal. The answer is
**true**
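A quick numerical sketch of this equivalence (the data here are made up for illustration, not part of the question): solving the OLS normal equations and the sample moment conditions E[u] = 0, E[xu] = 0 gives the same coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)  # illustrative data

# OLS: minimize sum (y_i - b0 - b1 x_i)^2 via the textbook formulas
b1_ols = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0_ols = y.mean() - b1_ols * x.mean()

# Method of Moments: impose the sample analogues of E[u] = 0 and E[x u] = 0,
# i.e. solve  mean(y - b0 - b1 x) = 0  and  mean(x (y - b0 - b1 x)) = 0
A = np.array([[1.0, x.mean()], [x.mean(), np.mean(x ** 2)]])
b = np.array([y.mean(), np.mean(x * y)])
b0_mom, b1_mom = np.linalg.solve(A, b)

# The two routes solve the same pair of equations, so the estimates agree
print(np.isclose(b0_ols, b0_mom), np.isclose(b1_ols, b1_mom))
```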

True or False. Explain your answer:
d) Least squares estimates of the regression coefficients b0, b1, ..., bn are chosen to maximize R^2.
e) If all the explanatory variables are uncorrelated, the variance inflation factor (VIF) for each explanatory variable will be 1.
f) b0 and b1 from a simple linear regression model are independent.
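Part e) can be checked numerically. A minimal sketch with a hypothetical design matrix whose columns are made exactly uncorrelated by orthogonalization; VIF_j = 1/(1 - R^2_j), where R^2_j comes from regressing column j on the rest:

```python
import numpy as np

# Hypothetical design: center, then orthogonalize so columns are uncorrelated
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
X -= X.mean(axis=0)
Q, _ = np.linalg.qr(X)  # orthogonal, mean-zero columns -> pairwise uncorrelated
X = Q

def vif(X, j):
    """VIF_j = 1 / (1 - R^2_j), with R^2_j from regressing column j
    on the remaining columns (plus an intercept)."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

print([round(vif(X, j), 6) for j in range(3)])  # each VIF is 1.0: R^2_j = 0
```

With uncorrelated columns, each auxiliary regression explains nothing (R^2_j = 0), so every VIF equals 1, which is why statement e) is true.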

Consider the simple linear regression model whose population regression equation can be written in conventional notation as

y_i = \beta_1 x_i + \beta_2 x_i z_i^2 + u_i.

Derive the ordinary least squares (OLS) estimator of \beta, i.e. \hat{\beta}.
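One way to sketch the derivation, reading the model as y_i = \beta_1 x_i + \beta_2 x_i z_i^2 + u_i with no intercept (an assumption about the intended notation): stack the two regressors into a design matrix and minimize the sum of squares.

```latex
S(\beta_1,\beta_2) = \sum_{i=1}^{n} \bigl(y_i - \beta_1 x_i - \beta_2 x_i z_i^2\bigr)^2,
\qquad
\frac{\partial S}{\partial \beta_1} = \frac{\partial S}{\partial \beta_2} = 0
\;\Longrightarrow\;
\hat{\beta} = (X^{\top} X)^{-1} X^{\top} y,
\quad
X = \begin{pmatrix} x_1 & x_1 z_1^2 \\ \vdots & \vdots \\ x_n & x_n z_n^2 \end{pmatrix}.
```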

In the multiple linear regression model with estimation by ordinary least squares, is it really necessary to check the normality of the residuals? What if the errors are not normal? How should one proceed with the tests if the errors have a Student's t distribution with 5 degrees of freedom? (Do not confuse model errors with residuals!)
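A Monte Carlo sketch of the large-sample answer (sample sizes and seeds here are illustrative, not from the question): with t(5) errors, which have finite variance, the usual OLS t-test keeps roughly its nominal 5% size in moderate samples, because the CLT takes over.

```python
import numpy as np

# Simulate simple regression under the null (true slope 0) with t(5) errors
rng = np.random.default_rng(2)
n, reps, rejections = 200, 2000, 0
x = rng.normal(size=n)
for _ in range(reps):
    u = rng.standard_t(df=5, size=n)
    y = 1.0 + 0.0 * x + u
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    resid = y - b0 - b1 * x
    s2 = resid @ resid / (n - 2)
    se = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))
    if abs(b1 / se) > 1.972:  # ~5% two-sided critical value for t(198)
        rejections += 1
print(rejections / reps)      # should be near the nominal 0.05
```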

Showing that the residuals, e_i, from the least squares fit of the simple linear regression model sum to zero.

True or false? The least squares regression line always passes through the point (x̄, ȳ).
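Both of the facts asked about above follow from the intercept normal equation, and can be verified numerically (data below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=30)
y = 4.0 - 1.5 * x + rng.normal(size=30)  # illustrative data

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)

# Residuals sum to zero: the intercept normal equation is sum(e_i) = 0
print(round(resid.sum(), 10))                    # 0.0 up to rounding

# The fitted line passes through (x̄, ȳ): plug x̄ into the line
print(np.isclose(b0 + b1 * x.mean(), y.mean()))  # True
```

Note both properties rely on the model containing an intercept; without one, neither needs to hold.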

What are the pitfalls of simple linear regression? True or False
for each
Lacking an awareness of the assumptions of least squares
regression.
Not knowing how to evaluate the assumptions of least squares
regressions.
Not knowing the alternatives to least squares regression if a
particular assumption is violated.
Using a regression model without knowledge of the subject
matter.
Extrapolating outside the relevant range of the X and Y
variables.
Concluding that an identified significant relationship always
reflects a cause-and-effect relationship.

In the multiple linear regression model with estimation by ordinary least squares, why should we examine the scatter plot of each covariate x_ij, j = 1, 2, ..., p, against the residuals e_i?
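One reason, sketched below with made-up data: with an intercept, the normal equations force the residuals to be exactly uncorrelated with every included covariate, so any visible pattern in the e_i vs. x_ij scatter signals misspecification (e.g. a missing nonlinear term) rather than mere correlation.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 ** 2 + rng.normal(size=n)  # truth is nonlinear in x2

# Fit y on (1, x1, x2) by OLS -- x2 enters only linearly (misspecified)
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta

# Normal equations give exact orthogonality X'e = 0, so the sample
# correlation of e with each included covariate is zero...
print(round(np.corrcoef(e, x2)[0, 1], 8))        # ≈ 0

# ...yet the scatter of e against x2 is clearly curved; a cheap numeric
# proxy: e correlates strongly with the omitted term x2^2
print(abs(np.corrcoef(e, x2 ** 2)[0, 1]) > 0.5)
```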

1) Which is NOT a fundamental assumption of OLS (Ordinary Least Squares)?
a) The regression model is nonlinear in the coefficients and error term.
b) Observations of the error term are uncorrelated with each other.
c) No independent variable is a perfect linear function of any other explanatory variables.
d) The error term has homoscedasticity.
e) All independent variables will be uncorrelated with the error term.
-----------------------------------------------------------------------------------------------------------------------------------------------
2) You test a model that...

Answer true or false and state why.
1. The least-squares estimators are always BLUE and BUE.
2. In large samples the usual standardized ratios follow the t-distribution.
3. If we rescale the dependent variable in regression by dividing by 100, the new coefficients and their estimates will be multiplied by 100.
4. In choosing between models we always seek to maximize R^2.
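Statement 3 can be tested directly. A minimal sketch with made-up data: dividing y by 100 divides, rather than multiplies, both coefficients by 100, since OLS formulas are linear in y.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=40)
y = 10.0 + 5.0 * x + rng.normal(size=40)  # illustrative data

def fit(x, y):
    """Simple-regression OLS: return (intercept, slope)."""
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    return y.mean() - b1 * x.mean(), b1

b0, b1 = fit(x, y)
c0, c1 = fit(x, y / 100.0)  # rescale the dependent variable

# Dividing y by 100 divides (not multiplies) both coefficients by 100
print(np.isclose(c0, b0 / 100.0), np.isclose(c1, b1 / 100.0))
```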

In simple linear regression, the method of least squares determines the line that minimizes the sum of squared deviations between the observed y values and:
a. the average of the y values
b. the average of the x values
c. the fitted line
d. the line of residual errors
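A small numeric illustration of why choice (c) is right (data made up): the fitted line's sum of squared deviations can never exceed that of any other line, including the horizontal line at ȳ from choice (a).

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.uniform(size=50)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=50)  # illustrative data

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

sse_fit = np.sum((y - (b0 + b1 * x)) ** 2)  # deviations from the fitted line
sse_mean = np.sum((y - y.mean()) ** 2)      # deviations from the average of y

# The horizontal line at ȳ is itself a candidate line (slope 0), so the
# least squares minimizer can only do at least as well
print(sse_fit <= sse_mean)  # True
```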
