Question

Let X1, X2, ..., Xn be a random sample from a distribution with probability density function f(x; θ) = (θ^4/6) x^3 e^(−θx) for 0 < x < ∞, and 0 otherwise, where θ > 0.

a. Justify the claim that Y = X1 + X2 + ... + Xn is a complete sufficient statistic for θ.
b. Compute E(1/Y) and find the function of Y which is the unique minimum variance unbiased estimator of θ.
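
Not part of the original post, but a quick numerical sanity check of part (b): since Γ(4) = 3! = 6, each Xi is Gamma(shape 4, rate θ), so Y ~ Gamma(4n, rate θ), E(1/Y) = θ/(4n − 1), and (4n − 1)/Y is the candidate MVUE. A minimal simulation sketch (the values θ = 2 and n = 5 are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000   # illustrative values, not from the question

# Each Xi ~ Gamma(shape=4, rate=theta), so Y = X1 + ... + Xn ~ Gamma(4n, rate=theta).
Y = rng.gamma(shape=4 * n, scale=1 / theta, size=reps)

# For Gamma(alpha, rate beta): E[1/Y] = beta/(alpha - 1) = theta/(4n - 1),
# so (4n - 1)/Y is unbiased for theta and a function of the complete sufficient Y.
est = (4 * n - 1) / Y
print(np.mean(1 / Y), theta / (4 * n - 1))  # should agree
print(est.mean(), theta)                    # should agree
```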

Related Questions

Let X1, X2, ..., Xn be iid random variables with pdf
f(x|θ) = θx^(θ−1), 0 < x < 1, θ > 0.
Is there an unbiased estimator of some function γ(θ) whose variance attains the Cramér-Rao lower bound?
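
One natural candidate (an illustrative sketch, not the only answer): for γ(θ) = 1/θ, the estimator T = −(1/n)Σ ln Xi is unbiased with variance 1/(nθ²), which equals the Cramér-Rao bound since I(θ) = 1/θ² here. A simulation check with illustrative values θ = 3, n = 10:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 10, 100_000   # illustrative values

# Sample from f(x|theta) = theta * x**(theta - 1) on (0, 1) via the
# inverse CDF: F(x) = x**theta, so X = U**(1/theta).
X = rng.random((reps, n)) ** (1 / theta)

# Candidate estimator of gamma(theta) = 1/theta: T = -(1/n) * sum(log Xi);
# note -log(Xi) ~ Exponential(rate theta).
T = -np.log(X).mean(axis=1)

# CRLB for gamma(theta) = 1/theta: (gamma')^2 / (n * I(theta)) = 1/(n*theta^2).
crlb = 1 / (n * theta**2)
print(T.mean(), 1 / theta)       # unbiased for 1/theta
print(T.var(ddof=1), crlb)       # variance attains the bound
```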

Let X1, ..., Xn be a random sample from a distribution with pdf
fX(x) = e^(−(x−θ)) for x > θ, and 0 otherwise.
Find the sufficient statistic for θ.
Find the maximum likelihood estimator of θ.
Find the MVUE of θ, θ̂.
Is θ̂ a consistent estimator of θ?
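
For reference: X(1) = min Xi is the MLE here, and X(1) − θ ~ Exponential(rate n), so E[X(1)] = θ + 1/n and X(1) − 1/n is the unbiased correction. A hedged simulation sketch (θ = 1.5 and n = 8 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 1.5, 8, 200_000   # illustrative values

# X = theta + E with E ~ Exp(1), matching f(x) = exp(-(x - theta)), x > theta.
X = theta + rng.exponential(1.0, size=(reps, n))

mle = X.min(axis=1)     # the MLE is the sample minimum X_(1)
mvue = mle - 1 / n      # X_(1) - theta ~ Exp(rate n), so E[X_(1)] = theta + 1/n

print(mle.mean(), theta + 1 / n)  # biased upward by 1/n
print(mvue.mean(), theta)         # bias-corrected
```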

6. Let θ > 1 and let X1, X2, ..., Xn be a random sample from the distribution with probability density function f(x; θ) = 1/(x ln θ), 1 < x < θ.
a) Obtain the maximum likelihood estimator of θ, θ̂.
b) Is θ̂ a consistent estimator of θ? Justify your answer.
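
For orientation: the likelihood 1/((∏xi)(ln θ)^n) is decreasing in θ subject to θ ≥ max xi, so θ̂ = max Xi, and its consistency shows up numerically as the maximum crowding up against θ. A small sketch with an illustrative θ = 5:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = 5.0   # illustrative value

# Inverse CDF: F(x) = ln(x)/ln(theta) on (1, theta), so X = theta**U.
def sample(n):
    return theta ** rng.random(n)

# The MLE theta_hat = max(Xi) should concentrate at theta as n grows.
for n in (10, 100, 10_000):
    print(n, sample(n).max())
```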

Let X1, X2, ..., Xn be a random sample from a population with probability density function f(x) = θ(1−x)^(θ−1), 0 < x < 1, where θ is a positive unknown parameter.
a) Find the method of moments estimator of θ.
b) Find the maximum likelihood estimator of θ.
c) Show that the log-likelihood function is maximized at θ̂.
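
A sketch of where the estimators land: 1 − X ~ Beta(θ, 1), so E[X] = 1/(θ + 1), giving the method-of-moments estimator 1/x̄ − 1, while the MLE is −n/Σ ln(1 − xi). A simulation check (θ = 2.5 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n = 2.5, 100_000   # illustrative values

# 1 - X ~ Beta(theta, 1), i.e. X = 1 - U**(1/theta) for U ~ Uniform(0, 1).
X = 1 - rng.random(n) ** (1 / theta)

# Method of moments: E[X] = 1/(theta + 1)  =>  theta_mm = 1/xbar - 1.
theta_mm = 1 / X.mean() - 1

# MLE: theta_ml = -n / sum(log(1 - Xi)).
theta_ml = -n / np.log(1 - X).sum()

print(theta_mm, theta_ml, theta)  # both near theta
```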

Let X1, X2, ..., Xn be a random sample (of size n) from U(0,θ).
Let Yn be the maximum of X1, X2, ..., Xn.
(a) Give the pdf of Yn.
(b) Find the mean of Yn.
(c) One estimator of θ that has been proposed is Yn. You may
note from your answer to part (b) that Yn is a biased estimator of
θ. However, cYn is unbiased for some constant c. Determine c.
(d) Find the variance of cYn,...
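
For parts (c) and (d): E[Yn] = nθ/(n + 1) gives c = (n + 1)/n, and then Var(cYn) = θ²/(n(n + 2)). A hedged simulation check (θ = 4 and n = 6 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 4.0, 6, 200_000   # illustrative values

# Yn is the maximum of n Uniform(0, theta) draws.
Y = rng.uniform(0, theta, size=(reps, n)).max(axis=1)

c = (n + 1) / n   # since E[Yn] = n*theta/(n + 1), c*Yn is unbiased
print((c * Y).mean(), theta)
print((c * Y).var(ddof=1), theta**2 / (n * (n + 2)))  # Var(c*Yn)
```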

Suppose X1, X2, ..., Xn are a random sample from a population with probability density function f(x) = 3x^2/θ^3, 0 < x < θ.
a. Determine the constant c so that cX̄ is an unbiased estimator of θ.
b. Find a 95% confidence interval for θ.
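
For part (a): E[X] = ∫₀^θ 3x³/θ³ dx = 3θ/4, so c = 4/3; and Var(X) = 3θ²/80 is what a normal-approximation interval in (b) would use. A simulation sketch (θ = 2 and n = 50 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 2.0, 50, 100_000   # illustrative values

# Inverse CDF: F(x) = (x/theta)**3, so X = theta * U**(1/3).
X = theta * rng.random((reps, n)) ** (1 / 3)

xbar = X.mean(axis=1)
c = 4 / 3   # E[X] = 3*theta/4, so (4/3)*xbar is unbiased for theta
print((c * xbar).mean(), theta)

# Var(X) = 3*theta^2/80 feeds the normal-approximation 95% CI in part (b).
print(X.var(ddof=1), 3 * theta**2 / 80)
```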

6. Let X1, X2, ..., Xn be a random sample of a random variable X from a distribution with density
f(x) = (θ + 1)x^θ, 0 ≤ x ≤ 1,
where θ > −1. Obtain:
a) the method of moments estimator (MME) of the parameter θ;
b) the maximum likelihood estimator (MLE) of the parameter θ.
c) A random sample of size 5 yields data x1 = 0.92, x2 = 0.7, x3 = 0.65, x4 = 0.4 and x5 = 0.75. Compute the ML estimate...
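
For part (c): setting the score to zero for f(x) = (θ + 1)x^θ gives θ̂ = −n/Σ ln xi − 1, which the given data pin down numerically. A short computation sketch:

```python
import math

data = [0.92, 0.70, 0.65, 0.40, 0.75]

# MLE for f(x) = (theta + 1) * x**theta on [0, 1]:
# d/dtheta [n*log(theta + 1) + theta*sum(log x)] = 0
#   =>  theta_hat = -n / sum(log x) - 1
n = len(data)
theta_hat = -n / sum(math.log(x) for x in data) - 1
print(round(theta_hat, 3))
```

With these data the estimate comes out to about 1.41.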

Consider a random sample X1, X2, ..., Xn from the pdf
f(x; θ) = 0.5(1 + θx) for −1 ≤ x ≤ 1, and 0 otherwise,
where −1 ≤ θ ≤ 1 (this distribution arises in particle physics).
Find the method of moments estimator of θ.
Compute the variance of your estimator. Hint: compute the variance of X and then apply the formula for the variance of X̄.
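
A sketch of the target answers: E[X] = θ/3, so the MoM estimator is 3X̄, and Var(3X̄) = 9 Var(X)/n = (3 − θ²)/n. One way to sample this density for 0 ≤ θ ≤ 1 (an assumption made for the sketch) is as a mixture of a uniform and a triangular law, since 0.5(1 + θx) = (1 − θ)·0.5 + θ·(1 + x)/2:

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 0.4, 25, 200_000   # illustrative values; mixture needs theta in [0, 1]

# With prob (1 - theta) draw Uniform(-1, 1); with prob theta draw from the
# triangular density (1 + x)/2 on [-1, 1], i.e. X = 2*sqrt(U) - 1.
def sample(size):
    u = rng.random(size)
    tri = 2 * np.sqrt(rng.random(size)) - 1
    unif = rng.uniform(-1, 1, size)
    return np.where(u < theta, tri, unif)

xbar = sample((reps, n)).mean(axis=1)
mom = 3 * xbar   # E[X] = theta/3  =>  theta_hat = 3*xbar
print(mom.mean(), theta)
print(mom.var(ddof=1), (3 - theta**2) / n)  # Var(3*xbar) = (3 - theta^2)/n
```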

Let X1, X2, ..., Xn be a random sample from an exponential distribution with f(x) = (1/θ)e^(−x/θ) for x ≥ 0. Show that the likelihood ratio test of H0 : θ = θ0 against H1 : θ ≠ θ0 is based on the statistic ∑_{i=1}^n Xi.
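
A sketch of the reduction: the MLE is θ̂ = x̄, which maximizes L(θ) = θ^(−n) e^(−Σxi/θ), so the likelihood ratio is

```latex
\Lambda
= \frac{L(\theta_0)}{L(\hat\theta)}
= \frac{\theta_0^{-n}\, e^{-\sum x_i/\theta_0}}{\bar{x}^{-n}\, e^{-n}}
= \left(\frac{\bar{x}}{\theta_0}\right)^{\!n} e^{\,n - \sum x_i/\theta_0},
\qquad \hat\theta = \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i .
```

Since Λ depends on the data only through Σ xi, rejecting for small Λ is equivalent to rejecting when Σ Xi is too small or too large.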

Let X1, ..., Xn be a random sample from the pdf f(x; θ) = θx^(θ−1), 0 ≤ x ≤ 1, 0 < θ < ∞. Find the method of moments estimator of θ.
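
As in the similar problem above, E[X] = θ/(θ + 1), so the MoM estimator solves x̄ = θ/(θ + 1), i.e. θ̂ = x̄/(1 − x̄). A simulation sketch (θ = 2 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)
theta, n = 2.0, 200_000   # illustrative values

# Inverse CDF: F(x) = x**theta, so X = U**(1/theta).
X = rng.random(n) ** (1 / theta)

# E[X] = theta/(theta + 1); solving xbar = theta/(theta + 1) gives
# theta_hat = xbar / (1 - xbar).
xbar = X.mean()
theta_hat = xbar / (1 - xbar)
print(theta_hat, theta)
```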
