Question

Consider the model Yi = 2 + βxi + εi, i = 1, 2, . . . , n, where ε1, . . . , εn is a sequence of i.i.d. observations from N(0, σ²) and x1, . . . , xn are given constants.

(i) Find the MLE for β and call it β̂.

(ii) For a sample of size n = 7, let (x1, . . . , x7) = (0, 0, 1, 2, 3, 5, 6) and (y1, . . . , y7) = (2, 4, 5, 8, 8, 10, 13). Calculate β̂.
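For part (ii), the estimate can be checked numerically. A minimal sketch, assuming the standard derivation: since εi ∼ N(0, σ²), maximizing the likelihood is equivalent to minimizing Σ(yi − 2 − βxi)², which has the closed form β̂ = Σ xi(yi − 2) / Σ xi².

```python
# Sketch: MLE of beta in the model Y_i = 2 + beta * x_i + eps_i,
# eps_i ~ N(0, sigma^2) with known intercept 2.
# Maximizing the normal likelihood reduces to least squares, giving
#   beta_hat = sum x_i (y_i - 2) / sum x_i^2.

x = [0, 0, 1, 2, 3, 5, 6]
y = [2, 4, 5, 8, 8, 10, 13]

num = sum(xi * (yi - 2) for xi, yi in zip(x, y))  # sum x_i (y_i - 2)
den = sum(xi * xi for xi in x)                    # sum x_i^2
beta_hat = num / den
print(beta_hat)  # 139/75 ≈ 1.8533
```

Note the intercept is fixed at 2 by the model, so the usual x̄, ȳ centering of simple linear regression does not apply here; only β is estimated.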

Similar Questions
X1, …, Xn are i.i.d. random variables with E(Xi) = 3β, Var(Xi) = 3β², i = 1, …, n, β > 0. Two estimators of β are defined as β̂₁ = X̄/3 and β̂₂ = (n/(3n+1)) X̄. Show that MSE(β̂₂) < MSE(β̂₁) for a sample size of n = 3.
X1, ..., X81 ∼ N(0, 1) and Y1, ..., Y81 ∼ N(3, 2). For each i, Corr(Xi, Yi) = 1/2. Let Zi = Xi + Yi. 1. Compute Var(Zi). 2. Approximate P(∑ᵢ Zi > 243). (Explain your answer.)
Let β > 0 and let X1, X2, …, Xn be a random sample from the distribution with probability density function f(x; β) = β/(1 + x)^(β+1), x > 0, zero otherwise. (i) Obtain the maximum likelihood estimator of β, β̂. (ii) Suppose n = 5, and x1 = 0.3, x2 = 0.4, x3 = 1.0, x4 = 2.0, x5 = 4.0. Obtain the maximum likelihood...
Let X1, ..., Xn be a random sample from Exponential(β) with pdf f(x) = (1/β) e^(−x/β) I(0, ∞)(x), β > 0, where β is an unknown parameter. Find the UMVUE of β².
Review: Central Limit Theorem. 1 point possible (graded). The Central Limit Theorem states that if X1, …, Xn are i.i.d. and E[X1] = μ < ∞, Var(X1) = σ² < ∞, then √n [(1/n) ∑ᵢ₌₁ⁿ Xi − μ] → W in distribution as n → ∞, where W ∼ N(0, ?). What is Var(W)? (Express your answer in terms of n, μ and σ.)
(a) Let Xi for i = 1, 2, ..., n be random variables with E[Xi] = μi (not necessarily independent). Show from the definition that E[∑ᵢ₌₁ⁿ Xi] = ∑ᵢ₌₁ⁿ μi. (b) Suppose that random variables Yi for i = 1, 2, ..., n are independent and identically distributed with E[Yi] = γ (gamma) and Var[Yi] = σ². Use part (a) to show that E[Ȳ] = γ. (c) Suppose that random variables Yi for i = 1, 2, ..., n are independent and identically distributed with E[Yi] = γ (gamma) and Var[Yi]...
Consider the sequence (xn)n given by x1 = 2, x2 = 2 and xn+1 = 2(xn + xn−1). (a) Let u, w be the solutions of the equation x² − 2x − 2 = 0, so that x² − 2x − 2 = (x − u)(x − w). Show that u + w = 2 and uw = −2. (b) Possibly using (a) to aid your calculations, show that xn = uⁿ + wⁿ.
(a) Let Y1, Y2, ..., Yn be i.i.d. with geometric distribution P(Y = y) = p(1 − p)^(y−1), y = 1, 2, ..., 0 < p < 1. Find a sufficient statistic for p. (b) Let Y1, ..., Yn be a random sample of size n from a beta distribution with parameters α = θ and β = 2. Find the sufficient statistic for θ.
Consider the model Yi = β0 + β1 X1,i + β2 X2,i + Ui, where sorting the residuals on X1,i and X2,i gives Goldfeld–Quandt statistics of 1.362 (X1) and 4.527 (X2). If there is heteroskedasticity present at the 5% critical F-value of 1.624, then choose the most appropriate heteroskedasticity correction method. A. Heteroskedastic correction based on X2. B. Heteroskedastic correction based on X1. C. No heteroskedastic correction needed. D. White's heteroskedastic-consistent standard errors. E. Not enough information.