Question

Suppose X and Y are independent random variables with E(X) = E(Y) = θ, Var(X) = 2, and Var(Y) = 4. Consider two estimators of θ: W1 = (1/2)X + (1/2)Y and W2 = (3/4)X + (1/4)Y.

(1) Are W1 and W2 unbiased estimators of θ? (2) Which estimator is more efficient (i.e., which has the smaller variance)?
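
For independent X and Y, E(aX + bY) = aE(X) + bE(Y) and Var(aX + bY) = a²Var(X) + b²Var(Y), so both parts come down to plugging in the coefficients. Below is a minimal Python sketch of that computation, plus an optional Monte Carlo check that assumes X and Y are normal purely for illustration (the question does not specify their distributions):

```python
import numpy as np

# Given in the question: E(X) = E(Y) = theta, Var(X) = 2, Var(Y) = 4, X and Y independent.
VAR_X, VAR_Y = 2.0, 4.0
theta = 1.0  # any fixed value works for checking unbiasedness

def lin_comb_moments(a, b):
    """Mean and variance of aX + bY when X and Y are independent."""
    mean = (a + b) * theta                # E(aX + bY) = (a + b) * theta
    var = a**2 * VAR_X + b**2 * VAR_Y     # Var(aX + bY) = a^2 Var(X) + b^2 Var(Y)
    return mean, var

for name, (a, b) in [("W1", (0.5, 0.5)), ("W2", (0.75, 0.25))]:
    mean, var = lin_comb_moments(a, b)
    print(f"{name}: mean = {mean} (theta = {theta}), variance = {var}")

# Optional simulation check; the normal distributions here are an assumption for illustration only.
rng = np.random.default_rng(0)
x = rng.normal(theta, np.sqrt(VAR_X), size=1_000_000)
y = rng.normal(theta, np.sqrt(VAR_Y), size=1_000_000)
for name, w in [("W1", 0.5 * x + 0.5 * y), ("W2", 0.75 * x + 0.25 * y)]:
    print(f"{name} simulated: mean ~ {w.mean():.3f}, variance ~ {w.var():.3f}")
```

With the numbers from the question, both means equal θ, Var(W1) = (1/4)(2) + (1/4)(4) = 1.5, and Var(W2) = (9/16)(2) + (1/16)(4) = 1.375, which is what the sketch prints.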

Similar Questions
Suppose that X and Y are random samples of observations from a population with mean μ...
Suppose that X and Y are random samples of observations from a population with mean μ and variance σ². Consider the following two unbiased point estimators of μ: A = (2/3)X + (1/3)Y and B = (5/4)X − (1/4)Y. Find Var(A) and Var(B). Which is the efficient and unbiased point estimator of μ?
VERY URGENT !!!! Suppose that X and Y are random samples of observations from a population...
VERY URGENT !!!! Suppose that X and Y are random samples of observations from a population with mean μ and variance σ². Consider the following two unbiased point estimators of μ: A = (7/4)X − (3/4)Y and B = (1/3)X + (2/3)Y. [Give your answers as a ratio (e.g., number1 / number2) and DO NOT make any cancellation.] 1. Find the variance of A: Var(A) = (Answer)·σ². 2. Find the variance of B: Var(B) = (Answer)·σ². 3. Efficient and unbiased point...
Suppose that E[X] = E[Y] = μ, where μ is a fixed unknown number. We have independent...
Suppose that E[X] = E[Y] = μ, where μ is a fixed unknown number. We have independent simple random samples of size n each from the distributions of X and Y, respectively. Suppose that Var[X] = 2·Var[Y]. Consider the following estimators of μ: m1 = X̄, m2 = Ȳ/2, and m3 = (3/4)X̄ + (2/8)Ȳ, where X̄ and Ȳ are the sample means of the X and Y values, respectively. Which of the estimators are unbiased?
1) Suppose that X and Y are two random variables, which may be dependent and Var(X)=Var(Y)....
1) Suppose that X and Y are two random variables, which may be dependent, and Var(X) = Var(Y). Assume that 0 < Var(X+Y) < ∞ and 0 < Var(X−Y) < ∞. Which of the following statements are NOT true? (There may be more than one correct answer.) a. E(XY) = E(X)E(Y); b. E(X/Y) = E(X)/E(Y); c. (X+Y) and (X−Y) are correlated; d. (X+Y) and (X−Y) are not correlated. 2) S.D.(X ± Y) is equal to (where S.D. means standard deviation): a. S.D.(X) ± S.D.(Y); b. Var(X) ± Var(Y); c. Square...
Problem 3. Let Y1, Y2, and Y3 be independent, identically distributed random variables from a population...
Problem 3. Let Y1, Y2, and Y3 be independent, identically distributed random variables from a population with mean µ = 12 and variance σ² = 192. Let Ȳ = (1/3)(Y1 + Y2 + Y3) denote the average of these three random variables. A. What is the expected value of Ȳ, i.e., E(Ȳ) = ? Is Ȳ an unbiased estimator of µ? B. What is the variance of Ȳ, i.e., Var(Ȳ) = ? C. Consider a different estimator...
If X, Y are random variables with E(X) = 2, Var(X) = 3, E(Y) = 1,...
If X, Y are random variables with E(X) = 2, Var(X) = 3, E(Y) = 1, Var(Y) = 2, and correlation ρ(X,Y) = −0.5: (a) For Z = 3X − 1, find µZ, σZ. (b) For T = 2X + Y, find µT, σT. (c) For U = X^3, find approximate values of µU, σU.
Consider two random variables X and Y such that E(X)=E(Y)=120, Var(X)=14, Var(Y)=11, Cov(X,Y)=0. Compute an upper...
Consider two random variables X and Y such that E(X) = E(Y) = 120, Var(X) = 14, Var(Y) = 11, and Cov(X,Y) = 0. Compute an upper bound on P(|X−Y| > 16).
Let X ~ N(1,3) and Y~ N(5,7) be two independent random variables. Find... Var(X + Y...
Let X ~ N(1, 3) and Y ~ N(5, 7) be two independent random variables. Find Var(X + Y + 32), Var(X − Y), and Var(2X − 4Y).
Let X and Y be independent and normally distributed random variables with expected values E(X)...
Let X and Y be independent and normally distributed random variables with expected values E(X) = 3, E(Y) = 4 and variances V(X) = 2 and V(Y) = 3. a) Determine the expected value µ and variance σ² of 2X − Y. b) Determine the expected value and variance of ln(1 + X²). c) Determine the expected value and variance of X/Y.
Let X1, X2, . . . , Xn be iid random variables with pdf f(x|θ) =...
Let X1, X2, . . . , Xn be iid random variables with pdf f(x|θ) = θx^(θ−1), 0 < x < 1, θ > 0. Is there an unbiased estimator of some function γ(θ) whose variance attains the Cramér-Rao lower bound?