Question

Let X1, . . . , Xn and Y1, . . . , Yn be two random samples with the same mean µ and variance σ^2. (The pdfs of the Xi and Yj are not specified.)

Show that T = (1/2)X̄ + (1/2)Ȳ is an unbiased estimator of µ.

Evaluate MSE(T; µ).

Answer #1
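A sketch of the argument: E[T] = (1/2)E[X̄] + (1/2)E[Ȳ] = (1/2)µ + (1/2)µ = µ, so T is unbiased. Assuming the two samples are independent (the problem does not say otherwise), MSE(T; µ) = Var(T) = (1/4)(σ^2/n) + (1/4)(σ^2/n) = σ^2/(2n). A quick simulation check (the normal distribution below is an arbitrary illustration, since the pdfs are unspecified; the claim only uses the common mean and variance):

```python
import numpy as np

# Check: T = (Xbar + Ybar)/2 has E[T] = mu and, for independent samples,
# MSE(T) = Var(T) = sigma^2 / (2n).
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 50, 100_000

X = rng.normal(mu, sigma, size=(reps, n))
Y = rng.normal(mu, sigma, size=(reps, n))
T = 0.5 * X.mean(axis=1) + 0.5 * Y.mean(axis=1)

print(T.mean())                # close to mu = 2.0
print(((T - mu) ** 2).mean())  # close to sigma^2/(2n) = 0.09
```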

Let X1, X2, ..., Xn be a random sample (of size n) from U(0,θ).
Let Yn be the maximum of X1, X2, ..., Xn.
(a) Give the pdf of Yn.
(b) Find the mean of Yn.
(c) One estimator of θ that has been proposed is Yn. You may
note from your answer to part (b) that Yn is a biased estimator of
θ. However, cYn is unbiased for some constant c. Determine c.
(d) Find the variance of cYn,...
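For reference on parts (a)-(c): F_Yn(y) = (y/θ)^n gives the pdf n y^(n−1)/θ^n on (0, θ), so E[Yn] = nθ/(n+1) and c = (n+1)/n. A simulation sketch of the bias-correction claim (θ and n are arbitrary choices):

```python
import numpy as np

# Check that c = (n+1)/n makes c*Yn unbiased for theta, where Yn is the
# maximum of n iid Uniform(0, theta) draws.
rng = np.random.default_rng(1)
theta, n, reps = 5.0, 10, 200_000

Yn = rng.uniform(0, theta, size=(reps, n)).max(axis=1)
c = (n + 1) / n

print(Yn.mean())        # close to n*theta/(n+1) = 4.545...
print((c * Yn).mean())  # close to theta = 5.0
```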

Let (X1, Y1), . . . , (Xn, Yn) be a random sample from a bivariate normal distribution with parameters µ1, µ2, σ1^2, σ2^2, ρ. (Note: (X1, Y1), . . . , (Xn, Yn) are independent.) What is the joint distribution of (X̄, Ȳ)?
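Since linear combinations of jointly normal variables are normal, (X̄, Ȳ) should again be bivariate normal, with means (µ1, µ2), variances σ1^2/n and σ2^2/n, and the same correlation ρ. A simulation sketch (all parameter values below are arbitrary):

```python
import numpy as np

# Check the moments of (Xbar, Ybar) for bivariate normal samples.
rng = np.random.default_rng(2)
mu1, mu2, s1, s2, rho, n, reps = 1.0, -2.0, 2.0, 3.0, 0.6, 25, 100_000

cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
samples = rng.multivariate_normal([mu1, mu2], cov, size=(reps, n))
means = samples.mean(axis=1)       # shape (reps, 2): (Xbar, Ybar) pairs

print(means.mean(axis=0))          # close to (mu1, mu2)
print(means.var(axis=0) * n)       # close to (sigma1^2, sigma2^2)
print(np.corrcoef(means.T)[0, 1])  # close to rho
```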

Let X1, ..., Xn be i.i.d. N(µ, σ^2).
We know that S^2 is an unbiased estimator for σ^2. Show that
S^2 is a consistent estimator for σ^2.
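The usual route: for normal data Var(S^2) = 2σ^4/(n−1), which tends to 0, so S^2 → σ^2 in probability by Chebyshev. A simulation sketch showing the MSE of S^2 shrinking (σ and the n grid are arbitrary):

```python
import numpy as np

# The sampling spread of S^2 around sigma^2 shrinks as n grows,
# roughly at rate 2*sigma^4/(n-1) for normal data.
rng = np.random.default_rng(3)
sigma, reps = 2.0, 10_000

mses = []
for n in (10, 100, 1000):
    s2 = rng.normal(0.0, sigma, size=(reps, n)).var(axis=1, ddof=1)
    mses.append(((s2 - sigma**2) ** 2).mean())

print(mses)  # decreasing toward 0
```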

Let X1, X2, . . . , Xn be iid following an exponential distribution
with parameter λ, whose pdf is f(x|λ) = λ^(−1) exp(−x/λ), x > 0,
λ > 0.
(a) With X(1) = min{X1, . . . , Xn}, find an unbiased estimator
of λ; denote it by λ̂.
(b) Use the Lehmann-Scheffé theorem to show that ∑ Xi/n is the UMVUE of
λ.
(c) By the definition of completeness of ∑ Xi or other tool(s),
show that E(λ̂ | ∑ Xi)...
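For part (a): in this mean-λ parameterization the rates add, so X(1) ~ Exp(mean λ/n) and λ̂ = n·X(1) is unbiased. A simulation check (λ and n are arbitrary), which also shows why X̄ is preferred (much smaller variance):

```python
import numpy as np

# X(1) = min of n iid Exp(mean lambda) is Exp(mean lambda/n),
# so lambda_hat = n * X(1) is unbiased for lambda.
rng = np.random.default_rng(4)
lam, n, reps = 3.0, 8, 200_000

X = rng.exponential(lam, size=(reps, n))
lam_hat = n * X.min(axis=1)
xbar = X.mean(axis=1)

print(lam_hat.mean())             # close to lambda = 3.0
print(xbar.mean())                # close to lambda = 3.0
print(lam_hat.var(), xbar.var())  # the UMVUE Xbar has far smaller variance
```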

Let Y1, Y2, Y3, ..., Yn be a random sample from a normal
distribution with mean µ and standard deviation 1. Find the
MVUE (minimum-variance unbiased estimator) of each of the parameters
µ^2 and µ(µ+1).
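One standard answer: Ȳ is complete sufficient and E[Ȳ^2] = µ^2 + Var(Ȳ) = µ^2 + 1/n, so Ȳ^2 − 1/n is the MVUE of µ^2, and adding Ȳ gives Ȳ^2 − 1/n + Ȳ for µ(µ+1) = µ^2 + µ. A simulation sketch of the unbiasedness (µ and n chosen arbitrarily):

```python
import numpy as np

# Unbiasedness check: with sigma = 1, E[Ybar^2] = mu^2 + 1/n.
rng = np.random.default_rng(5)
mu, n, reps = 1.5, 20, 200_000

ybar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
est_mu2 = ybar**2 - 1 / n
est_mu_mu1 = est_mu2 + ybar

print(est_mu2.mean())     # close to mu^2 = 2.25
print(est_mu_mu1.mean())  # close to mu*(mu+1) = 3.75
```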

Let Xi, i = 1, ..., n, be independent exponential r.v.s with mean
1/ui. Define Yn = min(X1, ..., Xn), Zn = max(X1, ..., Xn).
1. Derive the CDFs of Yn and Zn.
2. What is E(Zn)?
3. Show that the probability that Xi is the smallest among
X1, ..., Xn is equal to ui/(u1 + ... + un).
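Mean 1/ui means rate ui, and part 3 is the standard "competing exponentials" fact: P(Xi is the minimum) = ui/∑uj. A simulation sketch (the rate values are arbitrary):

```python
import numpy as np

# Empirical frequency with which each X_i is the minimum should match
# u_i / sum(u). NumPy's exponential takes scale = 1/rate.
rng = np.random.default_rng(6)
u = np.array([1.0, 2.0, 5.0])
reps = 200_000

X = rng.exponential(1.0 / u, size=(reps, len(u)))  # one draw per rate per row
freq = np.bincount(X.argmin(axis=1), minlength=len(u)) / reps

print(freq)  # close to u / u.sum() = [0.125, 0.25, 0.625]
```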

Suppose that X is Poisson with unknown mean λ, and let X1, ...,
Xn be a random sample from X.
a. Find the CRLB for the variance of estimators based on X1,
..., Xn.
b. Verify that λ̂ = X̄ is an unbiased estimator for λ, and then
show that λ̂ is an efficient estimator for λ.
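For reference: the Fisher information per observation is 1/λ, so the CRLB is λ/n, and Var(X̄) = λ/n exactly, which is what efficiency means here. A simulation sketch (λ and n are arbitrary):

```python
import numpy as np

# Xbar for Poisson data: unbiased for lambda, with variance equal to
# the CRLB lambda/n.
rng = np.random.default_rng(7)
lam, n, reps = 4.0, 30, 200_000

xbar = rng.poisson(lam, size=(reps, n)).mean(axis=1)

print(xbar.mean())  # close to lambda = 4.0
print(xbar.var())   # close to CRLB lambda/n = 0.1333...
```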

Let X1, X2, . . . , Xn be iid random variables with pdf
f(x|θ) = θx^(θ−1) , 0 < x < 1, θ > 0.
Is there an unbiased estimator of some function γ(θ) whose
variance attains the Cramér-Rao lower bound?
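One standard answer (hedged sketch, not the only way to phrase it): the score is n/θ + ∑ log Xi, and −log X ~ Exp(mean 1/θ), so W = −(1/n)∑ log Xi is unbiased for γ(θ) = 1/θ with Var(W) = 1/(nθ^2), which equals the CRLB for estimating 1/θ. A simulation check (θ and n are arbitrary):

```python
import numpy as np

# For f(x|theta) = theta * x^(theta-1) on (0,1): F(x) = x^theta, so
# X = U^(1/theta) (inverse CDF), and W = -(1/n) * sum(log X_i) is
# unbiased for 1/theta with variance 1/(n*theta^2).
rng = np.random.default_rng(8)
theta, n, reps = 2.0, 40, 100_000

X = rng.random(size=(reps, n)) ** (1.0 / theta)
W = -np.log(X).mean(axis=1)

print(W.mean())  # close to 1/theta = 0.5
print(W.var())   # close to 1/(n*theta^2) = 0.00625
```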

A random sample X1, X2, . . . , Xn is drawn from a population
with pdf f(x; β) = 3x^2/β^3 for 0 ≤ x ≤ β, and 0 otherwise.
(a) [6] Find the pdf of Yn, the nth order statistic of the
sample.
(b) [4] Find E[Yn].
(c) [4] Find Var[Yn].
(d)[3] Find the mean squared error of Yn when Yn is used as a
point estimator for β
(e) [2] Find an unbiased estimator for β.
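For reference on (a), (b), and (e): F(y) = y^3/β^3 gives Yn the pdf 3n y^(3n−1)/β^(3n) on (0, β), so E[Yn] = 3nβ/(3n+1), and (3n+1)Yn/(3n) is unbiased for β. A simulation sketch (β and n are arbitrary):

```python
import numpy as np

# Sample via inverse CDF: X = beta * U^(1/3). Then the maximum Yn has
# E[Yn] = 3n*beta/(3n+1), and (3n+1)/(3n) * Yn is unbiased for beta.
rng = np.random.default_rng(9)
beta, n, reps = 2.0, 6, 200_000

Yn = (beta * rng.random(size=(reps, n)) ** (1 / 3)).max(axis=1)

print(Yn.mean())                            # close to 3n*beta/(3n+1) = 36/19
print(((3 * n + 1) / (3 * n) * Yn).mean())  # close to beta = 2.0
```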

Let X1, …, Xn be a sample of iid Exp(θ1, θ2) random variables with common pdf
f(x; θ1, θ2) = (1/θ1) e^(−(x−θ2)/θ1) for x > θ2, and Θ = ℝ × ℝ+.
a) Show that S = (X(1), ∑_{i=1}^n Xi) is jointly sufficient for (θ1, θ2).
b) Determine the pdf of X(1).
c) Determine E[X(1)].
d) Determine E[X(1)^2].
e) Is X(1) an MSE-consistent estimator of θ2?
f) Given that S = (X(1), ∑_{i=1}^n Xi) is a complete sufficient statistic...
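For part (e): X(1) is distributed as θ2 + Exp(mean θ1/n), so its bias is θ1/n, its variance is (θ1/n)^2, and MSE(X(1); θ2) = 2θ1^2/n^2 → 0, i.e. X(1) is MSE-consistent for θ2. A simulation sketch (parameter values are arbitrary; the shifted-exponential form is taken from the pdf above):

```python
import numpy as np

# X(1) = theta2 + Exp(mean theta1/n); its MSE about theta2 should decay
# like 2*theta1^2/n^2.
rng = np.random.default_rng(10)
t1, t2, reps = 2.0, 1.0, 20_000

for n in (5, 50, 500):
    x1 = t2 + rng.exponential(t1, size=(reps, n)).min(axis=1)
    print(n, x1.mean(), ((x1 - t2) ** 2).mean())  # MSE ~ 2*t1^2/n^2
```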
