Question

Let X1, X2, ..., Xn be i.i.d. N(θ, 1).

a) Find the Cramér–Rao lower bound for the variance of an unbiased estimator of θ.

b) Find the Cramér–Rao lower bound for the variance of an unbiased estimator of θ².

c) Find the Cramér–Rao lower bound for the variance of an unbiased estimator of P(X > 0).
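A sketch of the standard computation, assuming the usual regularity conditions; part (c) uses the fact that P(X > 0) = Φ(θ), where Φ and φ denote the standard normal CDF and density.

```latex
% Cramer-Rao lower bounds for X_1, ..., X_n i.i.d. N(theta, 1).
% One observation: log f(x|theta) = -(1/2) log(2 pi) - (1/2)(x - theta)^2,
% so the score is (x - theta) and I_1(theta) = E[(X - theta)^2] = 1.
% For an unbiased estimator of g(theta) the bound is [g'(theta)]^2 / (n I_1(theta)).
\[
\text{(a)}\; g(\theta)=\theta:\qquad
  \mathrm{Var}(\hat g)\;\ge\;\frac{1^{2}}{n\cdot 1}=\frac{1}{n}.
\]
\[
\text{(b)}\; g(\theta)=\theta^{2}:\qquad
  \mathrm{Var}(\hat g)\;\ge\;\frac{(2\theta)^{2}}{n}=\frac{4\theta^{2}}{n}.
\]
\[
\text{(c)}\; g(\theta)=P_\theta(X>0)=\Phi(\theta):\qquad
  \mathrm{Var}(\hat g)\;\ge\;\frac{\phi(\theta)^{2}}{n}.
\]
```

For (a) the bound is attained by the sample mean X̄, whose variance is exactly 1/n.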

Similar Questions
Let X1, X2, . . . , Xn be iid random variables with pdf f(x|θ) = θx^(θ−1), 0 < x < 1, θ > 0. Is there an unbiased estimator of some function γ(θ) whose variance attains the Cramér–Rao lower bound?
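A brief sketch for the question above, under the usual regularity conditions; the specific choice γ(θ) = 1/θ below is read off from the form of the score and is not stated in the original question.

```latex
% Score for one observation from f(x|theta) = theta x^{theta-1}, 0 < x < 1:
%   d/dtheta log f(x|theta) = 1/theta + log x.
% The sample score is linear in T = -(1/n) sum log X_i:
%   sum_i (1/theta + log X_i) = -n ( T - 1/theta ),
% and -log X_i ~ Exp(theta), so E[T] = 1/theta and Var(T) = 1/(n theta^2).
% Hence T is unbiased for gamma(theta) = 1/theta and attains the Cramer-Rao bound:
\[
T=-\frac{1}{n}\sum_{i=1}^{n}\log X_i,\qquad
\mathrm{Var}_\theta(T)=\frac{1}{n\theta^{2}}
  =\frac{[\gamma'(\theta)]^{2}}{n I_1(\theta)},\qquad
I_1(\theta)=\frac{1}{\theta^{2}}.
\]
```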
Let X1, X2, . . . , Xn be iid exponential random variables with unknown mean β. (1) Find the maximum likelihood estimator of β. (2) Determine whether the maximum likelihood estimator is unbiased for β. (3) Find the mean squared error of the maximum likelihood estimator of β. (4) Find the Cramer-Rao lower bound for the variances of unbiased estimators of β. (5) What is the UMVUE (uniformly minimum variance unbiased estimator) of β? What is your reason? (6)...
Let X1, ..., Xn be i.i.d. N(µ, σ^2). We know that S^2 is an unbiased estimator for σ^2. Show that S^2_X is a consistent estimator for σ^2.
Let X1, X2, ..., Xn be a random sample from N(θ, 1). Find the best unbiased estimator of θ^2.
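A short sketch for the N(θ, 1) question above; it assumes the standard facts that X̄ is complete sufficient for θ and that Var(X̄) = 1/n.

```latex
% Xbar is complete sufficient for theta in the N(theta, 1) model, and
%   E_theta[ Xbar^2 ] = Var(Xbar) + (E Xbar)^2 = 1/n + theta^2,
% so by Lehmann-Scheffe the best (minimum variance) unbiased estimator of theta^2 is
\[
\widehat{\theta^{2}} \;=\; \bar X^{2} - \frac{1}{n}.
\]
```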
Let X1, . . . , Xn be i.i.d. from the pmf f(x|λ) = (e^(−λ) λ^x)/x!, λ > 0, x = 0, 1, 2, . . . (a) Find the MoM (method of moments) estimator of λ. (b) Show that the MoM estimator you found in (a) is minimal sufficient for λ. (c) Now we split the sample into two parts, X1, . . . , Xm and Xm+1, . . . , Xn. Show that (Sum of Xi from 1 to m, Sum...
Let X1, X2, ..., Xn be a random sample (of size n) from U(0,θ). Let Yn be the maximum of X1, X2, ..., Xn. (a) Give the pdf of Yn. (b) Find the mean of Yn. (c) One estimator of θ that has been proposed is Yn. You may note from your answer to part (b) that Yn is a biased estimator of θ. However, cYn is unbiased for some constant c. Determine c. (d) Find the variance of cYn,...
Let X1, X2, ..., Xn be a random sample from a distribution with probability density function f(x; θ) = (θ^4/6) x^3 e^(−θx) for 0 < x < ∞ and 0 otherwise, where θ > 0. a. Justify the claim that Y = X1 + X2 + ... + Xn is a complete sufficient statistic for θ. b. Compute E(1/Y) and find the function of Y which is the unique minimum variance unbiased estimator of θ.
Let X1, X2, . . . , Xn be iid following the exponential distribution with parameter λ, whose pdf is f(x|λ) = λ^(−1) exp(−x/λ), x > 0, λ > 0. (a) With X(1) = min{X1, . . . , Xn}, find an unbiased estimator of λ; denote it by λ(hat). (b) Use the Lehmann–Scheffé theorem to show that ∑ Xi/n is the UMVUE of λ. (c) By the definition of completeness of ∑ Xi or other tool(s), show that E(λ(hat) | ∑ Xi)...
Let X1, ..., Xn be iid from Poisson(theta), and set k(theta) = exp(−theta). (a) What is the MLE of k(theta)? Is it unbiased? (b) Obtain the CRLB for any unbiased estimator of k(theta).
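A sketch for the Poisson question above; the bias check uses the moment generating function of the Poisson sum, which is an added step not spelled out in the original question.

```latex
% (a) By invariance, the MLE of k(theta) = e^{-theta} is e^{-Xbar}.
%     With S = sum_i X_i ~ Poisson(n theta),
%       E[ e^{-Xbar} ] = E[ e^{-S/n} ] = exp{ n theta (e^{-1/n} - 1) } != e^{-theta},
%     so the MLE is biased for finite n.
% (b) I_1(theta) = 1/theta and k'(theta) = -e^{-theta}, so the CRLB is
\[
\mathrm{Var}(\hat k)\;\ge\;\frac{[k'(\theta)]^{2}}{n I_1(\theta)}
  \;=\;\frac{\theta\, e^{-2\theta}}{n}.
\]
```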
Let X1, X2, ..., Xn be a random sample from a population with probability density function f(x) = theta(1−x)^(theta−1), 0 < x < 1, where theta is a positive unknown parameter. a) Find the method of moments estimator of theta. b) Find the maximum likelihood estimator of theta. c) Show that the log-likelihood function is maximized at theta(hat).