Question


Let X_1,…, X_n  be a random sample from the Bernoulli distribution, say P[X=1]=θ=1-P[X=0].

and

the Cramér–Rao lower bound of θ(1−θ) is

(1−2θ)^2 θ(1−θ) / n.
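For reference, here is a short sketch of how that bound is obtained, assuming the usual regularity conditions and the standard Fisher information for a single Bernoulli observation:

I(θ) = 1/[θ(1−θ)]   (Fisher information per observation)

g(θ) = θ(1−θ),  g'(θ) = 1 − 2θ

CRLB = [g'(θ)]^2 / [n I(θ)] = (1−2θ)^2 θ(1−θ) / n.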

Find the UMVUE of θ(1-θ) if such exists.

Can you prove part (b), step by step, using the Lehmann–Scheffé theorem, to show that [∑X_i − nX̄^2]/(n−1) is the UMVUE? I have the key solution below.

∑X_i (equivalently X̄) is complete and sufficient for θ.

S^2 = ∑(X_i − X̄)^2/(n−1) is an unbiased estimator of θ(1−θ), since the sample variance is an unbiased estimator of the population variance Var(X_i) = θ(1−θ). Furthermore,

S^2 = [∑X_i^2 − nX̄^2]/(n−1) = [∑X_i − nX̄^2]/(n−1) is a function of ∑X_i; hence, by the Lehmann–Scheffé theorem, S^2 is the UMVUE of θ(1−θ).
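The key algebraic step, spelled out (it uses only the fact that each X_i is 0 or 1, so X_i^2 = X_i):

∑(X_i − X̄)^2 = ∑X_i^2 − nX̄^2 = ∑X_i − nX̄^2,

so S^2 = (∑X_i − nX̄^2)/(n−1), and since X̄ = ∑X_i/n, S^2 depends on the sample only through ∑X_i.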

Homework Answers

Answer #1

It is known that X_i follows Bernoulli(θ), with pmf

f(x; θ) = θ^x (1−θ)^(1−x), x ∈ {0, 1}.

This can be written as f(x; θ) = exp{ x ln(θ/(1−θ)) + ln(1−θ) }, which is of the form of a one-parameter exponential family,

where the natural statistic is T(x) = x and the natural parameter is ln(θ/(1−θ)).

Therefore, by the one-parameter exponential family result, ∑X_i is complete and sufficient for θ.
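For completeness, the same factorization written for the whole sample (a standard exponential-family argument, not spelled out in the posted answer):

∏ f(x_i; θ) = θ^(∑x_i) (1−θ)^(n−∑x_i) = (1−θ)^n exp{ (∑x_i) ln(θ/(1−θ)) },

so the natural statistic for the sample is T = ∑X_i, and the completeness and sufficiency of T follow from the full-rank one-parameter exponential family result.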

Now, Var(X_i) = E(X_i^2) − [E(X_i)]^2 = θ − θ^2 = θ(1−θ), because E(X_i) = E(X_i^2) = θ.

We know that the sample variance is an unbiased estimator of the population variance. Here

S^2 = ∑(X_i − X̄)^2/(n−1)

denotes the sample variance, so

E(S^2) = Var(X_i) = θ(1−θ).

Hence, S^2 is an unbiased estimator of θ(1−θ).
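A quick numerical sanity check of this unbiasedness claim (a minimal simulation sketch; the particular values of theta, n, and the number of replications below are arbitrary choices for illustration, not part of the original problem):

import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 20, 200_000   # illustrative values only

# Draw `reps` Bernoulli(theta) samples of size n and compute S^2 for each.
x = rng.binomial(1, theta, size=(reps, n))
s2 = x.var(axis=1, ddof=1)          # sample variance with divisor n - 1

print("mean of S^2      :", s2.mean())          # should be close to theta*(1 - theta)
print("theta*(1 - theta):", theta * (1 - theta))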

The Lehmann–Scheffé theorem says that if T is a complete sufficient statistic for a parameter p, and T' is an unbiased estimator of g(p) for all p, then E(T'|T) is the UMVUE of g(p).

Here,

p = θ, g(p) = θ(1−θ),

T' = S^2, which is an unbiased estimator of θ(1−θ), and

T = ∑X_i is complete and sufficient for θ.

Therefore,

E(S^2 | ∑X_i) = S^2, because S^2 is a function of ∑X_i alone.

Therefore, S^2 is the UMVUE of θ(1−θ) by the Lehmann–Scheffé theorem.
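As a follow-up tying this back to the Cramér–Rao bound quoted in the question, one can also simulate the variance of this UMVUE and compare it with that bound (again a minimal sketch with arbitrary illustrative values of theta and n):

import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.3, 20, 500_000   # illustrative values only

# The UMVUE S^2 = (sum X_i - n*Xbar^2)/(n-1) is just the sample variance here.
x = rng.binomial(1, theta, size=(reps, n))
s2 = x.var(axis=1, ddof=1)

crlb = (1 - 2 * theta) ** 2 * theta * (1 - theta) / n
print("simulated Var(S^2):", s2.var())
print("CRLB              :", crlb)

By definition the UMVUE's variance can be no smaller than the bound; running the comparison shows how close it comes for the chosen θ and n.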

  
