Question

Assume linear model y=Xβ+ε

Assume cov(ε)=σ^{2}I (I is the identity matrix)

Show that cov(β-hat) = σ^{2}(X^{T}X)^{-1} and discuss its meaning.

Answer #1

cov(β-hat) = σ^{2}(X^{T}X)^{-1} is the variance-covariance matrix of the estimated beta coefficients.

Derivation: since β-hat = (X^{T}X)^{-1}X^{T}y and y = Xβ + ε, we have β-hat = β + (X^{T}X)^{-1}X^{T}ε. Therefore

cov(β-hat) = (X^{T}X)^{-1}X^{T} cov(ε) X(X^{T}X)^{-1} = σ^{2}(X^{T}X)^{-1}(X^{T}X)(X^{T}X)^{-1} = σ^{2}(X^{T}X)^{-1},

using cov(ε) = σ^{2}I and the symmetry of (X^{T}X)^{-1}.

Meaning: the diagonal elements give the variances of the individual coefficient estimates (the basis for standard errors and t-tests), and the off-diagonal elements give the covariances between pairs of estimates.
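A quick Monte Carlo sketch of this result (illustrative only, not part of the matrix proof): simulate many samples of y = Xβ + ε with cov(ε) = σ^{2}I, compute β-hat for each, and compare the empirical covariance of the estimates with σ^{2}(X^{T}X)^{-1}. All data here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 200, 2.0
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # includes intercept
beta = np.array([1.0, -0.5, 2.0])

XtX_inv = np.linalg.inv(X.T @ X)
theoretical = sigma**2 * XtX_inv          # sigma^2 (X'X)^{-1}

reps = 20000
eps = sigma * rng.normal(size=(n, reps))  # each column: one error draw
Y = (X @ beta)[:, None] + eps             # each column: one simulated y
B = XtX_inv @ X.T @ Y                     # OLS estimates, one per column
empirical = np.cov(B)                     # 3x3 empirical covariance of beta-hat

print(np.max(np.abs(empirical - theoretical)))  # small: empirical matches theory
```

The empirical covariance of the simulated estimates converges to σ^{2}(X^{T}X)^{-1} as the number of replications grows.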

Based on the definition of the linear regression model in its matrix form, i.e., y = Xβ + ε, the assumption that ε ~ N(0, σ^{2}I), and the general formula for the point estimators of the model's parameters, b = (X^{T}X)^{-1}X^{T}y, show:

how to derive the formula for the point estimators of the model's parameters by means of Least Squares Estimation (LSE). [Hint: you must minimize e^{T}e]

that the LSE estimator, i.e., b = (X^{T}X)^{-1}X^{T}y, is unbiased. [Hint: E[b] = β]

Based on the definition of the linear regression model in its matrix form, i.e., y = Xβ + ε, the assumption that ε ~ N(0, σ^{2}I), the general formula for the point estimators of the model's parameters, b = (X^{T}X)^{-1}X^{T}y, and the definition

var(b) = E[(b − E[b])(b − E[b])^{T}],

show that

var(b) = σ^{2}(X^{T}X)^{-1}.

Note: the derivations here need to be done in matrix form. A simple algebraic method will not be allowed.
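A numerical illustration of the unbiasedness claim E[b] = β (a sketch on simulated data, not the required matrix derivation): average the OLS estimator b = (X^{T}X)^{-1}X^{T}y over many independent error draws and check that the average approaches the true β.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([2.0, 3.0])             # true parameters (chosen for the demo)

reps = 5000
b_sum = np.zeros(2)
for _ in range(reps):
    y = X @ beta + rng.normal(size=n)   # eps ~ N(0, I)
    b_sum += np.linalg.solve(X.T @ X, X.T @ y)  # b = (X'X)^{-1} X'y
b_mean = b_sum / reps

print(b_mean)  # close to [2, 3]: the estimator is unbiased
```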

Consider the multiple linear regression model
y = β0 + β1x1 + β2x2 + β3x3 + β4x4 + ε.
Using the procedure for testing a general linear hypothesis, show how to test
a. H0: β1 = β2 = β3 = β4 = β
b. H0: β1 = β2, β3 = β4
c. H0: β1 − 2β2 = 4β3 and β1 + 2β2 = 0
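Each of these hypotheses can be written as H0: Tβ = c and tested with the general linear hypothesis F-statistic, F = (Tb − c)^{T}[T(X^{T}X)^{-1}T^{T}]^{-1}(Tb − c) / (q s²), where q is the number of restrictions and s² is the residual variance estimate. A sketch for part (c), on simulated data chosen so that H0 holds:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120
X = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])  # columns for beta0..beta4
beta_true = np.array([1.0, 2.0, -1.0, 1.0, 0.5])            # satisfies both restrictions
y = X @ beta_true + rng.normal(size=n)

# H0: beta1 - 2*beta2 - 4*beta3 = 0 and beta1 + 2*beta2 = 0, i.e. T beta = c
T = np.array([[0., 1., -2., -4., 0.],
              [0., 1.,  2.,  0., 0.]])
c = np.zeros(2)

b = np.linalg.solve(X.T @ X, X.T @ y)       # OLS estimates
resid = y - X @ b
p = X.shape[1]
q = T.shape[0]
s2 = resid @ resid / (n - p)                # estimate of sigma^2
M = T @ np.linalg.inv(X.T @ X) @ T.T
F = (T @ b - c) @ np.linalg.solve(M, T @ b - c) / (q * s2)

print(F)  # under H0, F ~ F(q, n - p); a large value rejects H0
```

Parts (a) and (b) work the same way with different T matrices (e.g., for (b), rows encoding β1 − β2 = 0 and β3 − β4 = 0).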

Suppose that your linear regression model includes a constant
term, so that in the linear regression model
Y = Xβ + ε
The matrix of explanatory variables X can be partitioned as X = [i X1], where i is a column of ones. The OLS estimator of β can thus be partitioned accordingly into b' = [b0 b1'], where b0 is the OLS estimator of the constant term and b1 is the OLS estimator of the slope coefficients.
a) Use partitioned regression to derive formulas for...
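The partitioned-regression (Frisch-Waugh-Lovell) result for X = [i X1] can be checked numerically: the slope block b1 equals OLS of the demeaned y on the demeaned X1, and b0 = ȳ − x̄'b1. A sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 80
X1 = rng.normal(size=(n, 2))
y = 1.5 + X1 @ np.array([2.0, -1.0]) + rng.normal(size=n)

# Full regression with a constant column i
X = np.column_stack([np.ones(n), X1])
b = np.linalg.solve(X.T @ X, X.T @ y)          # b = [b0, b1']

# Partitioned form: regress demeaned y on demeaned X1
X1c = X1 - X1.mean(axis=0)
yc = y - y.mean()
b1 = np.linalg.solve(X1c.T @ X1c, X1c.T @ yc)  # slope coefficients
b0 = y.mean() - X1.mean(axis=0) @ b1           # recovered constant term

print(np.allclose(b[1:], b1), np.isclose(b[0], b0))  # True True
```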

Calculate Cov(e, y-hat) in the linear regression model; show your work. Why should you have known this answer without doing the calculation, assuming normal error terms? Why does the assumption of normality matter?
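The key algebraic fact is that the OLS residual vector e is orthogonal to the column space of X, while y-hat = Xb lies in that space, so e^{T}(y-hat) = 0 exactly in every sample. A quick check on simulated data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

H = X @ np.linalg.solve(X.T @ X, X.T)  # hat (projection) matrix
y_hat = H @ y                          # fitted values: projection of y onto col(X)
e = y - y_hat                          # residuals: orthogonal to col(X)

print(e @ y_hat)  # zero up to floating-point error
```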

1) Show that if AB = I (where I is the identity matrix), then A is non-singular and B is non-singular (both A and B are n×n matrices).
2) Given that det(A) = 3 and det(B) = 2, evaluate (give a numerical answer) each of the following, or state that it is not possible to determine the value.
a) det(A^2)
b) det(A’) (transpose determinant)
c) det(A+B)
d) det(A^-1) (inverse determinant)
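The determinant rules behind parts a), b), and d) can be spot-checked numerically: det(A²) = det(A)², det(A') = det(A), and det(A⁻¹) = 1/det(A), while det(A+B) is not determined by det(A) and det(B) alone. The `with_det` helper below is a hypothetical construction that rescales a random matrix to a target determinant.

```python
import numpy as np

rng = np.random.default_rng(5)

def with_det(target, n=4):
    # Hypothetical helper: build a random n x n matrix with det = target.
    M = rng.normal(size=(n, n))
    d = np.linalg.det(M)
    M = M / abs(d) ** (1.0 / n)                  # now det(M) = +-1
    M[0] *= np.sign(np.linalg.det(M)) * target   # scale one row: det = target
    return M

A = with_det(3.0)   # det(A) = 3
B = with_det(2.0)   # det(B) = 2

print(np.linalg.det(A @ A))                 # det(A^2) = det(A)^2 = 9
print(np.linalg.det(A.T))                   # det(A') = det(A) = 3
print(np.linalg.det(np.linalg.inv(A)))      # det(A^-1) = 1/det(A) = 1/3
print(np.linalg.det(A + B))                 # not pinned down by det(A), det(B)
```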

1. Given β = X^{T}_{1×n} A_{n×n} X_{n×1}, show that the gradient of β with respect to X has the following form: ∇β = X^{T}(A + A^{T}). Also, simplify the above result when A is symmetric. (Hint: β can be written as Σ_{j=1}^{n} Σ_{i=1}^{n} a_{ij} x_i x_j.)
2. In this problem, we consider a probabilistic view of linear regression, y^{(i)} = θ^{T}x^{(i)} + ε^{(i)}, i = 1, . . . , n, which...
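The gradient formula in part 1 can be verified numerically: for β = x^{T}Ax, the gradient with respect to x is (A + A^{T})x (written as the row vector x^{T}(A + A^{T})), which reduces to 2Ax when A is symmetric. A central-difference check on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5
A = rng.normal(size=(n, n))   # general (not symmetric) matrix
x = rng.normal(size=n)

beta = lambda v: v @ A @ v    # beta = x' A x
analytic = (A + A.T) @ x      # claimed gradient

# Central finite differences along each coordinate direction
h = 1e-6
numeric = np.array([
    (beta(x + h * e) - beta(x - h * e)) / (2 * h)
    for e in np.eye(n)
])

print(np.max(np.abs(numeric - analytic)))  # essentially zero
```

Since β is a quadratic form, the central difference is exact up to floating-point rounding, so the two gradients agree very closely.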

In the linear regression model y = β0 + β1x + ε, let y be the selling price of a house in dollars and x be its living area in square feet. Define a new variable x* = x − 1000 (that is, x* is the square footage in excess of 1000), and estimate the model y = β0* + β1*x* + ε.
a] Show the relationship between the OLS estimators β̂1* and β̂1.
....
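The relationship in part a] is that shifting the regressor by a constant leaves the slope unchanged and shifts the intercept: β̂1* = β̂1 and β̂0* = β̂0 + 1000·β̂1. A sketch on simulated house-price numbers (all values made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 60
x = rng.uniform(800, 3000, size=n)              # living area in square feet
y = 50000 + 120 * x + rng.normal(0, 20000, size=n)  # simulated prices

def ols(x, y):
    # Simple-regression OLS via the normal equations; returns (b0, b1)
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.solve(X.T @ X, X.T @ y)

b0, b1 = ols(x, y)            # original model
b0s, b1s = ols(x - 1000, y)   # model in x* = x - 1000

print(np.isclose(b1s, b1), np.isclose(b0s, b0 + 1000 * b1))  # True True
```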

1. Which one of the following equations contains the true error term? Pick 1.
Y-hat = α + βX
None of the above
Y-hat = a + bX
Y = α + βX + ε
All of the above
Y = a + bX + e
2. After running a regression, we calculate the coefficient of determination r² = 0.94. How should we interpret this r²? Pick 1.
The variation of Y is 94%.
94% of the variation in Y is explained by...
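The r² interpretation in question 2 comes from the decomposition r² = 1 − SSE/SST: the fraction of the total variation in Y accounted for by the fitted regression. A minimal sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 100
x = rng.normal(size=n)
y = 2 + 3 * x + rng.normal(size=n)   # strong linear signal plus noise

X = np.column_stack([np.ones(n), x])
b = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ b

sse = np.sum((y - y_hat) ** 2)       # unexplained (residual) variation
sst = np.sum((y - y.mean()) ** 2)    # total variation in y
r2 = 1 - sse / sst

print(r2)  # e.g. r2 = 0.94 would mean 94% of Y's variation is explained
```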

Let X ∼ Beta(α, β).
(a) Show that E[X²] = (α + 1)α / [(α + β + 1)(α + β)].
(b) Use the fact that E[X] = α/(α + β) and your answer to the previous part to show that Var X = αβ / [(α + β)²(α + β + 1)].
(c) Suppose X is the proportion of free-throws made over the lifetime of a randomly sampled kid, and assume that X ∼ Beta(2, 8).
....
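The moment formulas in parts (a) and (b) can be sanity-checked by simulation for the Beta(2, 8) case of part (c): E[X] = 2/10 = 0.2, E[X²] = 3·2/(11·10) = 6/110, and Var X = 16/(100·11) = 16/1100.

```python
import numpy as np

rng = np.random.default_rng(9)
a, b = 2.0, 8.0
x = rng.beta(a, b, size=500_000)   # Monte Carlo sample from Beta(2, 8)

ex_formula = a / (a + b)                                # E[X]
ex2_formula = (a + 1) * a / ((a + b + 1) * (a + b))     # part (a)
var_formula = a * b / ((a + b) ** 2 * (a + b + 1))      # part (b)

print(x.mean(), ex_formula)        # both near 0.2
print((x ** 2).mean(), ex2_formula)
print(x.var(), var_formula)
```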
