Question


Let X1, X2, ..., Xn be a random sample (of size n) from U(0,θ). Let Yn be the maximum of X1, X2, ..., Xn.

(a) Give the pdf of Yn.

(b) Find the mean of Yn.

(c) One estimator of θ that has been proposed is Yn. You may note from your answer to part (b) that Yn is a biased estimator of θ. However, cYn is unbiased for some constant c. Determine c.

(d) Find the variance of cYn, where c is the constant you determined in part (c).
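One way to work parts (a) through (d), assuming X1, X2, ..., Xn are iid U(0, θ):

(a) Since Yn ≤ y exactly when every Xi ≤ y, P(Yn ≤ y) = (y/θ)^n for 0 ≤ y ≤ θ; differentiating gives the pdf f(y) = n y^(n−1)/θ^n for 0 < y < θ, and 0 otherwise.

(b) E[Yn] = ∫0^θ y · n y^(n−1)/θ^n dy = nθ/(n + 1).

(c) E[cYn] = c · nθ/(n + 1) equals θ when c = (n + 1)/n.

(d) E[Yn^2] = nθ^2/(n + 2), so Var(Yn) = nθ^2/(n + 2) − (nθ/(n + 1))^2 = nθ^2/((n + 1)^2(n + 2)), and Var(cYn) = ((n + 1)/n)^2 Var(Yn) = θ^2/(n(n + 2)).

A quick Monte Carlo check of (c) and (d) in Python (the values θ = 2, n = 5, and the seed are arbitrary choices for illustration):

import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000

samples = rng.uniform(0.0, theta, size=(reps, n))  # reps samples of size n from U(0, theta)
yn = samples.max(axis=1)                           # Y_n = sample maximum in each replication
c = (n + 1) / n                                    # unbiasing constant from part (c)

print(np.mean(c * yn))   # should be close to theta = 2.0
print(np.var(c * yn))    # should be close to theta^2 / (n*(n+2)) = 4/35 ≈ 0.114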

Similar Questions
Let X1, X2, ..., Xn be a random sample from the distribution with pdf f(x) = (θ + 1)x^θ, 0 < x < 1, θ > −1. Find an estimator for θ using the maximum likelihood method.
Let X1, X2, ..., Xn be iid random variables with pdf f(x|θ) = θx^(θ−1), 0 < x < 1, θ > 0. Is there an unbiased estimator of some function γ(θ) whose variance attains the Cramér-Rao lower bound?
Let X1, X2, ..., Xn be a random sample from N(θ, 1). Find the best unbiased estimator of θ^2.
Let X1, X2, ..., Xn be a random sample from a distribution with probability density function f(x; θ) = (θ^4/6)x^3 e^(−θx) if 0 < x < ∞ and 0 otherwise, where θ > 0. a. Justify the claim that Y = X1 + X2 + ... + Xn is a complete sufficient statistic for θ. b. Compute E(1/Y) and find the function of Y which is the unique minimum variance unbiased estimator of θ.
1. Let X1, X2, ..., Xn be a random sample from a distribution with pdf f(x; θ) = (1/(3θ^4))x^3 e^(−x/θ), where 0 < x < ∞ and 0 < θ < ∞. Find the maximum likelihood estimator ˆθ of θ.
A random sample X1, X2, ..., Xn is drawn from a population with pdf f(x; β) = 3x^2/β^3 for 0 ≤ x ≤ β, and 0 otherwise. (a) [6] Find the pdf of Yn, the nth order statistic of the sample. (b) [4] Find E[Yn]. (c) [4] Find Var[Yn]. (d) [3] Find the mean squared error of Yn when Yn is used as a point estimator for β. (e) [2] Find an unbiased estimator for β.
Let θ > 1 and let X1, X2, ..., Xn be a random sample from the distribution with probability density function f(x; θ) = 1/(x ln θ), 1 < x < θ. c) Let Zn = n ln Y1. Find the limiting distribution of Zn. d) Let Wn = n ln(θ/Yn). Find the limiting distribution of Wn.
6. Let θ > 1 and let X1, X2, ..., Xn be a random sample from the distribution with probability density function f(x; θ) = 1/(x ln θ), 1 < x < θ. a) Obtain the maximum likelihood estimator of θ, ˆθ. b) Is ˆθ a consistent estimator of θ? Justify your answer.
Let X1, X2, ..., Xn be a random sample from the distribution f(x; θ) = (θ + 1)x^(−θ−2), x > 1, θ > 0. Find the maximum likelihood estimator of θ based on this random sample of size n.
Let X1, ..., Xn and Y1, ..., Yn be two random samples with the same mean µ and variance σ^2. (The pdfs of the Xi and Yj are not specified.) Show that T = (1/2)Xbar + (1/2)Ybar is an unbiased estimator of µ, and evaluate MSE(T; µ).