Question

Suppose that X1, X2, and X3 are independent random variables with common mean E(Xi) = μ and common variance Var(Xi) = σ². Let V = X2 − X3 and W = X1 − 2X2 + X3.

(a) Find E(V) and E(W).

(b) Find Var(V) and Var(W).

(c) Find Cov(V,W).

(d) Find the correlation coefficient ρ(V,W). Are V and W independent?
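Since V and W are linear combinations of independent variables with common variance, parts (a)–(d) reduce to dot products of coefficient vectors: for independent Xi with variance σ², Cov(Σ ai Xi, Σ bi Xi) = σ² (a · b). A minimal sketch of that check (mu and sigma^2 stand for the μ and σ² above):

```python
v = [0, 1, -1]    # coefficients of V = X2 - X3
w = [1, -2, 1]    # coefficients of W = X1 - 2*X2 + X3

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# (a) Expectations: E(V) = mu * sum(v), E(W) = mu * sum(w)
print("E(V) =", sum(v), "* mu")              # 0
print("E(W) =", sum(w), "* mu")              # 0

# (b), (c) Variances and covariance, in units of sigma^2:
print("Var(V) =", dot(v, v), "* sigma^2")    # 2
print("Var(W) =", dot(w, w), "* sigma^2")    # 6
print("Cov(V,W) =", dot(v, w), "* sigma^2")  # -3

# (d) Correlation coefficient (sigma^2 cancels):
rho = dot(v, w) / (dot(v, v) * dot(w, w)) ** 0.5
print("rho(V,W) =", rho)   # -3/sqrt(12) = -sqrt(3)/2, nonzero, so not independent
```

Because ρ(V, W) ≠ 0, V and W are correlated and therefore cannot be independent.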

Answer #1

Suppose that X1, X2, ..., Xn are independent identically distributed random variables with variance σ². Let Y1 = X2 + X3, Y2 = X1 + X3, and Y3 = X1 + X2. Find the following (in terms of σ²):
(a) Var(Y1)
(b) Cov(Y1, Y2)
(c) Cov(X1, Y1)
(d) Var[(Y1 + Y2 + Y3)/2]
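A coefficient-vector sketch for this one, writing each Yi as a linear combination of (X1, X2, X3) and using Cov(Σ ai Xi, Σ bi Xi) = σ² (a · b) for independent Xi with common variance:

```python
y1 = [0, 1, 1]   # Y1 = X2 + X3
y2 = [1, 0, 1]   # Y2 = X1 + X3
y3 = [1, 1, 0]   # Y3 = X1 + X2
x1 = [1, 0, 0]   # X1 itself

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

print("Var(Y1) =", dot(y1, y1), "* sigma^2")      # 2
print("Cov(Y1,Y2) =", dot(y1, y2), "* sigma^2")   # 1
print("Cov(X1,Y1) =", dot(x1, y1), "* sigma^2")   # 0

# (Y1+Y2+Y3)/2 = (2*X1 + 2*X2 + 2*X3)/2 = X1 + X2 + X3
half_sum = [(a + b + c) / 2 for a, b, c in zip(y1, y2, y3)]
print("Var[(Y1+Y2+Y3)/2] =", dot(half_sum, half_sum), "* sigma^2")  # 3.0
```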

Suppose that X1, X2, X3 are independent, each with mean 0 and Var(Xi) = i. Find Cov(X1 − X2, X2 + X3).

For the same X1, X2, X3, find ρ(X1 − X2, X2 + X3).
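With unequal variances the same bilinearity still applies: for independent Xi, Cov(Σ ai Xi, Σ bi Xi) = Σ ai bi Var(Xi). A quick sketch using the stated Var(Xi) = i:

```python
var = [1, 2, 3]   # Var(X1)=1, Var(X2)=2, Var(X3)=3
a = [0 + 1, -1, 0]  # coefficients of X1 - X2
b = [0, 1, 1]       # coefficients of X2 + X3

cov = sum(ai * bi * v for ai, bi, v in zip(a, b, var))
var_a = sum(ai * ai * v for ai, v in zip(a, var))   # Var(X1 - X2)
var_b = sum(bi * bi * v for bi, v in zip(b, var))   # Var(X2 + X3)
rho = cov / (var_a * var_b) ** 0.5

print("Cov =", cov)   # -2 (only the shared X2 term survives)
print("rho =", rho)   # -2/sqrt(3*5) = -2/sqrt(15)
```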

Let x1, x2, x3, ... be a sequence of independent and identically distributed random variables, each having finite mean E[xi] and variance Var(xi).
a) Calculate Var(x1 + x2).
b) Calculate Var(x̄n), where x̄n is the sample mean of x1, ..., xn (the usual estimator of E[xi]).
c) As n → ∞, what happens to Var(x̄n)?
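Assuming part (b) asks for the variance of the sample mean, both answers follow from Var(Σ ci xi) = σ² Σ ci² for independent xi with common variance σ². A small sketch (values in units of σ²):

```python
def var_of_combo(coeffs):
    """Variance of sum(ci * xi), in units of sigma^2, for iid xi."""
    return sum(c * c for c in coeffs)

print(var_of_combo([1, 1]))              # Var(x1 + x2) = 2 * sigma^2

for n in (2, 10, 100, 10_000):
    mean_coeffs = [1 / n] * n            # xbar_n = (x1 + ... + xn) / n
    print(n, var_of_combo(mean_coeffs))  # sigma^2 / n, which -> 0 as n grows
```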

Suppose X1, X2, X3, and X4 are independent and identically distributed random variables with mean 10 and variance 16. In addition, suppose that Y1, Y2, Y3, Y4, and Y5 are independent and identically distributed random variables with mean 15 and variance 25. Suppose further that X1, X2, X3, X4 and Y1, Y2, Y3, Y4, Y5 are independent. Find Cov[X̄ + Ȳ + 10, 2X̄ − Ȳ], where X̄ is the sample mean of X1, X2, X3, X4 and Ȳ...
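Assuming (as the truncated text suggests) that Ȳ is the sample mean of Y1, ..., Y5, the constant 10 drops out of the covariance and the X̄-Ȳ cross terms vanish by independence, leaving Cov = 2 Var(X̄) − Var(Ȳ). A one-line check:

```python
# Var of a sample mean of n iid terms is sigma^2 / n.
var_xbar = 16 / 4   # Var(Xbar) = 4.0
var_ybar = 25 / 5   # Var(Ybar) = 5.0

# Cov(Xbar + Ybar + 10, 2*Xbar - Ybar) = 2*Var(Xbar) - Var(Ybar)
cov = 2 * var_xbar - var_ybar
print(cov)          # 3.0
```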

Let X and Y be independent and identically distributed random variables with mean μ and variance σ². Find the following:
a) E[(X + 2)²]
b) Var(3X + 4)
c) E[(X − Y)²]
d) Cov{(X + Y), (X − Y)}
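Each part follows from E[X²] = Var(X) + (E[X])² and bilinearity of covariance. A sketch with hypothetical values μ = 3, σ² = 4, chosen only to sanity-check the algebra:

```python
mu, sigma2 = 3.0, 4.0            # hypothetical numbers, not from the question
EX2 = sigma2 + mu ** 2           # E[X^2] = Var(X) + (E[X])^2

# a) E[(X+2)^2] = E[X^2] + 4*E[X] + 4
a = EX2 + 4 * mu + 4
assert a == sigma2 + (mu + 2) ** 2   # same value in completed-square form

# b) Var(3X + 4) = 9 * Var(X)  (the shift by 4 has no effect)
b = 9 * sigma2

# c) E[(X-Y)^2] = E[X^2] - 2*E[X]*E[Y] + E[Y^2] = 2*sigma2 for iid X, Y
c = 2 * EX2 - 2 * mu * mu

# d) Cov(X+Y, X-Y) = Var(X) - Var(Y) = 0 for identically distributed X, Y
d = sigma2 - sigma2

print(a, b, c, d)   # 29.0 36.0 8.0 0.0
```

In symbols: a) σ² + (μ + 2)², b) 9σ², c) 2σ², d) 0.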

Let X1, X2 be two normal random variables, each with population mean µ and population variance σ². Let σ₁₂ denote the covariance between X1 and X2, and let X̄ denote the sample mean of X1 and X2.
(a) List the condition that needs to be satisfied in order for X̄ to be an unbiased estimate of µ.
(b) [3] As carefully as you can, without skipping steps, show that both X1 and X̄ are unbiased estimators of...
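A sketch of the expectation step. Note that only E[X1] = E[X2] = µ is used; normality and the covariance σ₁₂ play no role in unbiasedness:

```latex
\mathbb{E}[\bar{X}]
  = \mathbb{E}\!\left[\tfrac{1}{2}(X_1 + X_2)\right]
  = \tfrac{1}{2}\left(\mathbb{E}[X_1] + \mathbb{E}[X_2]\right)
  = \tfrac{1}{2}(\mu + \mu)
  = \mu .
```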

Let X1, X2, X3, and X4 be mutually independent random variables
from the same distribution. Let
S = X1 + X2 + X3 + X4. Suppose we know that S is a Chi-Square
random variable with 2 degrees of freedom. What
is the distribution of each of the Xi?
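Not part of the original post, but the standard route here is via moment generating functions: if S = X1 + X2 + X3 + X4 ~ χ²₂, then by independence M_S(t) is the fourth power of the common MGF, so

```latex
M_S(t) = (1 - 2t)^{-1}
       = \left[M_{X_i}(t)\right]^{4}
\quad\Longrightarrow\quad
M_{X_i}(t) = (1 - 2t)^{-1/4},
```

which is the MGF of a Gamma distribution with shape 1/4 and scale 2, i.e. a "chi-square with 1/2 degree of freedom."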

Let X1, X2, ... be a sequence of independent random variables distributed exponentially with mean 1. Suppose that N is a random variable, independent of the Xi's, that has a Poisson distribution with mean λ > 0. What is the expected value of X1 + X2 + ··· + X_{N²}?
(A) N²
(B) λ + λ²
(C) λ²
(D) 1/λ²
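For what it is worth, conditioning on N (valid because N is independent of the Xi's) and using E[Xi] = 1 gives:

```latex
\mathbb{E}\!\left[\sum_{i=1}^{N^2} X_i\right]
  = \mathbb{E}\!\left[\,\mathbb{E}\!\left[\sum_{i=1}^{N^2} X_i \,\middle|\, N\right]\right]
  = \mathbb{E}\!\left[N^2 \cdot \mathbb{E}[X_1]\right]
  = \mathbb{E}[N^2]
  = \operatorname{Var}(N) + \left(\mathbb{E}[N]\right)^2
  = \lambda + \lambda^2 ,
```

matching option (B).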

Let X1, X2, X3, and X4 be a random sample of observations from a population with mean μ and variance σ². Consider the following estimator of μ:
μ̂ = 0.15 X1 + 0.35 X2 + 0.20 X3 + 0.30 X4.
Is this a biased estimator for the mean? What is the variance of the estimator? Can you find a more efficient estimator? (10 Marks)
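For any weighted estimator μ̂ = Σ wi Xi of the mean of an iid sample, E[μ̂] = μ Σ wi and Var(μ̂) = σ² Σ wi². A minimal sketch checking the given weights (values in units of σ²):

```python
import math

w = [0.15, 0.35, 0.20, 0.30]

# Unbiased iff the weights sum to 1, since E[mu_hat] = mu * sum(w).
print("weights sum to 1:", math.isclose(sum(w), 1.0))   # True -> unbiased

var_units = sum(wi ** 2 for wi in w)
print("Var(mu_hat) =", var_units, "* sigma^2")          # ~0.275

# The sample mean uses equal weights 1/4 and achieves the smallest
# variance among unbiased linear estimators: sigma^2 / 4 = 0.25 * sigma^2.
print("Var(sample mean) =", 1 / 4, "* sigma^2")         # 0.25, more efficient
```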

a) Let Xi for i = 1, 2, ..., n be random variables with E[Xi] = μi (not necessarily independent). Show from the definition that E[Σᵢ₌₁ⁿ Xi] = Σᵢ₌₁ⁿ μi.
b) Suppose that random variables Yi for i = 1, 2, ..., n are independent and identically distributed with E[Yi] = γ (gamma) and Var[Yi] = σ². Use part (a) to show that E[Ȳ] = γ.
c) Suppose that random variables Yi for i = 1, 2, ..., n are independent and identically distributed with E[Yi] = γ (gamma) and Var[Yi]...
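A sketch of how the two steps fit together (using only linearity of expectation, with γ and σ² as defined above; independence is not needed for either expectation):

```latex
\mathbb{E}\!\left[\sum_{i=1}^{n} X_i\right]
  = \sum_{i=1}^{n} \mathbb{E}[X_i]
  = \sum_{i=1}^{n} \mu_i ,
\qquad
\mathbb{E}[\bar{Y}]
  = \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} Y_i\right]
  = \frac{1}{n}\sum_{i=1}^{n} \gamma
  = \gamma .
```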
