Question

Suppose X_1, X_2, …, X_n is a random sample from a population with density f(x) = (2/theta)*x*e^(-x^2/theta) for x greater than or equal to zero.

- Find the Maximum Likelihood Estimator Theta-Hat_1 for theta.
- Find the Method of Moment Estimator Theta-Hat_2 for theta.

Answer #1

a) MLE :

b) Method of Moment:
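A sketch of the standard derivations for parts a) and b), for reference:

```latex
% a) MLE: maximize the log-likelihood
\ell(\theta) = n\ln 2 - n\ln\theta + \sum_{i=1}^n \ln x_i - \frac{1}{\theta}\sum_{i=1}^n x_i^2,
\qquad
\frac{d\ell}{d\theta} = -\frac{n}{\theta} + \frac{1}{\theta^2}\sum_{i=1}^n x_i^2 = 0
\;\Rightarrow\;
\hat\theta_1 = \frac{1}{n}\sum_{i=1}^n X_i^2 .

% b) MOM: equate the first population moment to the sample mean
E[X] = \int_0^\infty x\,\frac{2x}{\theta}e^{-x^2/\theta}\,dx = \frac{\sqrt{\pi\theta}}{2},
\qquad
\bar X = \frac{\sqrt{\pi\hat\theta_2}}{2}
\;\Rightarrow\;
\hat\theta_2 = \frac{4\bar X^2}{\pi} .
```

This is a Rayleigh-type density (Rayleigh with sigma^2 = theta/2), which is where the mean formula comes from.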

Suppose X_1, X_2, …, X_n is a random sample from a population
with density f(x) = (2/theta)*x*e^(-x^2/theta) for x greater than
or equal to zero.

- Determine whether Theta-Hat_1 (MLE) is a minimum variance unbiased estimator for theta.
- Determine whether Theta-Hat_2 (MOM) is a minimum variance unbiased estimator for theta.

Let X_1, ..., X_n be a random sample from a normal
distribution, N(0, theta). Is theta_hat a UMVUE of
theta?
The above question is from chapter 9 problem 23b of Introduction
to Probability and Mathematical Statistics (for which you have a
solution posted on this website). I'm confused about the part in
the posted solution where we go from the line that says E(X^4)
- 2\theta E(X^2) + E(\theta^2) to the line that says
(3\theta^2 - 2\theta^2 + \theta^2). Could you please explain this...
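A sketch of the step in question: for X ~ N(0, θ), the variance is θ, so E(X^2) = θ; the fourth moment of a mean-zero normal is three times the squared variance, so E(X^4) = 3θ^2; and θ is a constant, so E(θ^2) = θ^2. Substituting term by term:

```latex
E\!\left[(X^2-\theta)^2\right]
= E(X^4) - 2\theta\,E(X^2) + \theta^2
= 3\theta^2 - 2\theta^2 + \theta^2
= 2\theta^2 .
```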

Let X_1, …, X_n be a random sample from the Bernoulli
distribution, say P[X = 1] = θ = 1 − P[X = 0], and recall the
Cramér–Rao lower bound for estimating θ(1 − θ):
CRLB = ((1 − 2θ)^2 θ(1 − θ))/n
Find the UMVUE of θ(1 − θ), if such exists.
Can you prove part (b) step by step using the Lehmann–Scheffé
theorem, i.e., show that [∑X_i^2 − n·X̄^2]/(n − 1) is the UMVUE?
I have the key solution below:
X̄ is complete and sufficient.
S^2 = ∑[X_i − X̄]^2/(n − 1) is an unbiased estimator of θ(1 − θ), since
the sample variance is an unbiased estimator of the...
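A sketch of the Lehmann–Scheffé argument for the standard Bernoulli setup above: T = ∑X_i is complete and sufficient, and since X_i^2 = X_i for Bernoulli data, the sample variance is a function of T alone.

```latex
T = \sum_{i=1}^n X_i \ \text{is complete and sufficient for } \theta,
\qquad
S^2 = \frac{\sum_i X_i^2 - n\bar X^2}{n-1}
    = \frac{T - T^2/n}{n-1}
    = \frac{n}{n-1}\,\bar X(1-\bar X).

E[S^2] = \operatorname{Var}(X_1) = \theta(1-\theta),
\ \text{so } S^2 \text{ is an unbiased function of } T
\ \Rightarrow\ S^2 \text{ is the UMVUE of } \theta(1-\theta)
\ \text{by Lehmann--Scheff\'e}.
```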

6. Let X1, X2, ..., Xn be a random sample of a random variable X
from a distribution with density
f(x) = (θ + 1)x^θ, 0 ≤ x ≤ 1,
where θ > -1. Obtain:
a) Method of Moments Estimator (MME) of parameter θ.
b) Maximum Likelihood Estimator (MLE) of parameter θ.
c) A random sample of size 5 yields data x1 = 0.92, x2 = 0.7, x3 =
0.65, x4 = 0.4 and x5 = 0.75. Compute ML Estimate...
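A quick numerical sketch of part (c), assuming the reconstructed density f(x) = (θ + 1)x^θ on [0, 1]: setting the derivative of the log-likelihood ℓ(θ) = n·ln(θ + 1) + θ·∑ln(x_i) to zero gives the closed form θ̂ = −n/∑ln(x_i) − 1.

```python
import math

# Sample from part (c)
x = [0.92, 0.7, 0.65, 0.4, 0.75]
n = len(x)

# d/dtheta [n*ln(theta + 1) + theta*sum(ln x_i)] = 0
# => theta_hat = -n / sum(ln x_i) - 1
sum_log = sum(math.log(xi) for xi in x)
theta_hat = -n / sum_log - 1

print(round(theta_hat, 3))  # prints 1.41
```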

Suppose the random variable X has pdf f(x; α, β) = αβ x^(β−1) e^(−α x^β) for
x ≥ 0; α, β > 0.
a) Find the maximum likelihood estimator for α, assuming that β
is known.
b) Suppose α and β are both unknown. Write down the equations
that would be solved simultaneously to find the maximum likelihood
estimators of α and β.
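Assuming the usual Weibull-type parameterization f(x; α, β) = αβ x^(β−1) e^(−α x^β) (a common reading of this problem), the part (a) estimator and the part (b) score equations are:

```latex
% log-likelihood:
% \ell = n\ln\alpha + n\ln\beta + (\beta-1)\sum_i \ln x_i - \alpha\sum_i x_i^\beta
% a) beta known:
\frac{\partial \ell}{\partial \alpha} = \frac{n}{\alpha} - \sum_{i=1}^n x_i^\beta = 0
\;\Rightarrow\;
\hat\alpha = \frac{n}{\sum_{i=1}^n x_i^\beta}.

% b) both unknown: solve simultaneously
\frac{\partial \ell}{\partial \alpha} = \frac{n}{\alpha} - \sum_i x_i^\beta = 0,
\qquad
\frac{\partial \ell}{\partial \beta} = \frac{n}{\beta} + \sum_i \ln x_i
  - \alpha \sum_i x_i^\beta \ln x_i = 0.
```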

Let X1, ..., Xn be a random sample from a
distribution with pdf as follows:
f_X(x) = e^(−(x − θ)), x > θ;
0 otherwise.
Find the sufficient statistic for θ.
Find the maximum likelihood estimator of θ.
Find the MVUE of θ, θ̂.
Is θ̂ a consistent estimator of θ?
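A small simulation sketch for the consistency part, under the assumption that the MLE works out to the sample minimum X_(1) (the likelihood is increasing in θ up to min x_i): for this shifted exponential, X_(1) − θ is Exponential with rate n, so the minimum closes in on θ as n grows. The value θ = 2.5 below is an illustrative choice.

```python
import random

random.seed(0)
theta = 2.5  # illustrative true value of the shift parameter

def mle_min(n):
    """Sample minimum of n draws from f(x) = e^(-(x - theta)), x > theta."""
    return min(theta + random.expovariate(1.0) for _ in range(n))

# The error of the MLE shrinks as the sample size grows (consistency).
for n in (10, 1000, 100000):
    print(n, round(mle_min(n) - theta, 6))
```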

6. Let θ > 1 and let X1, X2, ..., Xn be a random sample from
the distribution with probability density function f(x; θ) =
1/(x ln θ), 1 < x < θ.
a) Obtain the maximum likelihood estimator of θ, θ̂.
b) Is θ̂ a consistent estimator of θ? Justify your answer.
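For this support-dependent family the likelihood is decreasing in θ, so θ is pushed down to the smallest value consistent with the data; a sketch of both parts:

```latex
% a) the likelihood decreases in theta, subject to theta >= max x_i
L(\theta) = \prod_{i=1}^n \frac{1}{x_i \ln\theta}
= \frac{1}{(\ln\theta)^n \prod_i x_i},
\quad \theta \ge x_{(n)}
\;\Rightarrow\;
\hat\theta = X_{(n)} = \max_i X_i .

% b) consistency, using F(x) = ln x / ln theta on (1, theta)
P\!\left(\hat\theta \le \theta-\varepsilon\right)
= \left(\frac{\ln(\theta-\varepsilon)}{\ln\theta}\right)^{\!n}
\longrightarrow 0,
\quad\text{so } \hat\theta \xrightarrow{P} \theta .
```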

Suppose that the joint probability density function of the
random variables X and Y is
f(x, y) = x + c·y^2 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
(a) Sketch the region of non-zero probability density and show
that c = 3/2.
(b) Find P(X + Y < 1), P(X + Y = 1) and P(X + Y > 1).
(c) Compute the marginal density function of X and Y...
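A quick check of part (a): integrating the density over the unit square must give 1.

```latex
\int_0^1\!\!\int_0^1 \left(x + c\,y^2\right)dx\,dy
= \int_0^1 \left(\tfrac{1}{2} + c\,y^2\right)dy
= \tfrac{1}{2} + \tfrac{c}{3} = 1
\;\Rightarrow\; c = \tfrac{3}{2} .
```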

Suppose we draw a random sample of size n from a Poisson
distribution with parameter λ. Show that the maximum likelihood
estimator for λ is an efficient estimator.
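The standard argument: the MLE is X̄, and its variance attains the Cramér–Rao bound exactly, so it is efficient.

```latex
\ell(\lambda) = -n\lambda + \Big(\sum_i x_i\Big)\ln\lambda - \sum_i \ln(x_i!),
\qquad \hat\lambda = \bar X .

I_n(\lambda) = -E\!\left[\frac{\partial^2 \ell}{\partial\lambda^2}\right]
= \frac{n\,E[X]}{\lambda^2} = \frac{n}{\lambda},
\qquad
\operatorname{Var}(\bar X) = \frac{\lambda}{n} = \frac{1}{I_n(\lambda)} .
```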

Suppose that X is a discrete random variable with P(X = 1) = θ
and P(X = 2) = 1 − θ. Three independent observations of X are made:
(X_1, X_2, X_3) = (1, 2, 2).
a. Estimate θ through the sample mean (this is an example of the
"method of moments" for estimating a parameter).
b. Find the likelihood function and MLE for θ.
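A numeric sketch for this sample, assuming P(X = 1) = θ: the likelihood of observing (1, 2, 2) is L(θ) = θ(1 − θ)^2, and both the moment estimate (θ = 2 − x̄ = 2 − 5/3) and the likelihood maximizer land at θ = 1/3.

```python
# Likelihood of observing (1, 2, 2) when P(X=1) = theta, P(X=2) = 1 - theta.
def likelihood(theta):
    return theta * (1 - theta) ** 2

# Method of moments: E[X] = theta + 2*(1 - theta) = 2 - theta, so theta = 2 - xbar.
xbar = (1 + 2 + 2) / 3
theta_mom = 2 - xbar

# Crude grid search for the MLE over (0, 1).
grid = [i / 100000 for i in range(1, 100000)]
theta_mle = max(grid, key=likelihood)

print(theta_mom, theta_mle)  # both close to 1/3
```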
