Question

6. Let θ > 1 and let X1, X2, ..., Xn be a random sample from the distribution with probability density function f(x; θ) = 1/(x ln θ), 1 < x < θ.

a) Obtain the maximum likelihood estimator of θ, θ̂.

b) Is θ̂ a consistent estimator of θ? Justify your answer.
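For part (a), the likelihood is (ln θ)^(−n) ∏ 1/xᵢ, which requires θ ≥ max xᵢ and is decreasing in θ, so the MLE is θ̂ = max(X1, ..., Xn). A minimal simulation sketch (assuming an illustrative true value θ = 3, which is not part of the problem) suggests the consistency asked about in (b):

```python
import random

random.seed(0)

def sample(theta, n):
    # Inverse-CDF sampling: F(x) = ln x / ln θ on (1, θ),
    # so X = θ**U with U ~ Uniform(0, 1).
    return [theta ** random.random() for _ in range(n)]

theta = 3.0  # illustrative true value (assumption, not from the problem)
for n in (10, 100, 10000):
    # MLE: likelihood (ln θ)^(-n) ∏ 1/x_i is decreasing in θ for θ ≥ max x_i,
    # so it is maximized at the sample maximum.
    theta_hat = max(sample(theta, n))
    print(n, round(theta_hat, 4))
```

The estimates approach 3 as n grows, consistent with θ̂ = X(n) being a consistent estimator.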

Answer #1

6. Let X1, X2, ..., Xn be a random sample of a random variable X
from a distribution with density
f(x) = (θ + 1)x^θ, 0 ≤ x ≤ 1,
where θ > −1. Obtain,
a) Method of Moments Estimator (MME) of parameter θ.
b) Maximum Likelihood Estimator (MLE) of parameter θ.
c) A random sample of size 5 yields data x1 = 0.92, x2 = 0.7, x3 =
0.65, x4 = 0.4 and x5 = 0.75. Compute ML Estimate...
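A minimal sketch of parts (a)–(c), assuming the density is f(x) = (θ + 1)x^θ on [0, 1]: the MoM estimator comes from solving x̄ = E[X] = (θ+1)/(θ+2), and the MLE from maximizing the log-likelihood.

```python
import math

data = [0.92, 0.7, 0.65, 0.4, 0.75]
n = len(data)

# MoM: E[X] = (θ+1)/(θ+2)  =>  θ̂ = (2x̄ − 1)/(1 − x̄)
xbar = sum(data) / n
theta_mom = (2 * xbar - 1) / (1 - xbar)

# MLE: log L(θ) = n ln(θ+1) + θ Σ ln x_i  =>  θ̂ = −n / Σ ln x_i − 1
theta_mle = -n / sum(math.log(x) for x in data) - 1

print(round(theta_mom, 3), round(theta_mle, 3))  # → 1.165 1.41
```

With this sample the ML estimate is about 1.41 and the MoM estimate about 1.16.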

Let X1, X2, ..., Xn be a random sample from a distribution with
probability density function f(x; θ) = (θ^4/6) x^3 e^(−θx) if 0 < x < ∞
and 0 otherwise, where θ > 0.
a. Justify the claim that Y = X1 + X2 + ... + Xn is a complete
sufficient statistic for θ.
b. Compute E(1/Y) and find the function of Y which is the unique
minimum variance unbiased estimator of θ.
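A sketch of part (b), assuming the density above so that each Xᵢ ~ Gamma(4, rate θ) and hence Y ~ Gamma(4n, rate θ):

```latex
E\!\left(\tfrac{1}{Y}\right)
  = \int_0^\infty \frac{1}{y}\,\frac{\theta^{4n}}{\Gamma(4n)}\,y^{4n-1}e^{-\theta y}\,dy
  = \frac{\theta\,\Gamma(4n-1)}{\Gamma(4n)}
  = \frac{\theta}{4n-1}
```

so (4n − 1)/Y is unbiased for θ, and since it is a function of the complete sufficient statistic Y, it is the unique MVUE by Lehmann–Scheffé.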

Let θ > 1 and let X1, X2, ..., Xn be a random sample from the
distribution with probability density function f(x; θ) = 1/(x ln θ),
1 < x < θ.
c) Let Zn = n ln Y1, where Y1 = min(X1, ..., Xn). Find the limiting distribution of Zn.
d) Let Wn = n ln(θ/Yn), where Yn = max(X1, ..., Xn). Find the limiting distribution of Wn.
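A hedged sketch for (c) and (d), assuming Y1 and Yn denote the sample minimum and maximum: since F(x) = ln x / ln θ on (1, θ),

```latex
P(Z_n > z) = P\big(Y_1 > e^{z/n}\big) = \Big(1 - \frac{z}{n\ln\theta}\Big)^{n} \longrightarrow e^{-z/\ln\theta},
\qquad
P(W_n > w) = P\big(Y_n < \theta e^{-w/n}\big) = \Big(1 - \frac{w}{n\ln\theta}\Big)^{n} \longrightarrow e^{-w/\ln\theta}
```

so both Zn and Wn converge in distribution to an exponential distribution with mean ln θ.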

Let X1, X2, ..., Xn be a random sample from the Bernoulli
distribution with parameter Θ. Under the condition 1/2 ≤ Θ ≤ 1, find a
maximum-likelihood estimator of Θ.

Let X1, X2, ..., Xn be a random sample from a population with
probability density function f(x) = θ(1 − x)^(θ−1), 0 < x < 1,
where θ is a positive unknown parameter.
a) Find the method of moments estimator of θ.
b) Find the maximum likelihood estimator of θ.
c) Show that the log-likelihood function is maximized at θ̂.
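A hedged sketch for parts (a)–(c), assuming f(x) = θ(1 − x)^(θ−1): here E[X] = 1/(θ+1) gives the MoM estimator, and the log-likelihood yields θ̂ = −n/Σ ln(1 − xᵢ). The sample below is made up purely for illustration (the problem gives no data).

```python
import math

# Hypothetical sample (not from the problem statement).
data = [0.1, 0.3, 0.25, 0.5, 0.15]
n = len(data)

# MoM: E[X] = 1/(θ+1)  =>  θ̂ = 1/x̄ − 1
theta_mom = 1 / (sum(data) / n) - 1

# MLE: log L(θ) = n ln θ + (θ−1) Σ ln(1−x_i)  =>  θ̂ = −n / Σ ln(1−x_i)
s = sum(math.log(1 - x) for x in data)
theta_mle = -n / s

def loglik(theta):
    return n * math.log(theta) + (theta - 1) * s

# Part (c): l''(θ) = −n/θ² < 0, so the log-likelihood is strictly concave
# and the stationary point θ̂ is its maximum; spot-check nearby values.
assert loglik(theta_mle) > loglik(theta_mle - 0.1)
assert loglik(theta_mle) > loglik(theta_mle + 0.1)
print(round(theta_mom, 3), round(theta_mle, 3))
```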

Let β > 0 and let X1, X2, ..., Xn be a random sample from
the distribution with probability density function
f(x; β) = β/(1 + x)^(β+1), x > 0, zero otherwise.
(i) Obtain the maximum likelihood estimator of β, β̂.
(ii) Suppose n = 5, and x1 = 0.3, x2 = 0.4, x3 = 1.0, x4 =
2.0, x5 = 4.0. Obtain the maximum likelihood...
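For (i)–(ii), assuming the Pareto-type density f(x; β) = β/(1 + x)^(β+1): the log-likelihood is n ln β − (β + 1) Σ ln(1 + xᵢ), so setting its derivative to zero gives β̂ = n/Σ ln(1 + xᵢ). With the given data:

```python
import math

data = [0.3, 0.4, 1.0, 2.0, 4.0]
n = len(data)

# MLE: d/dβ [n ln β − (β+1) Σ ln(1+x_i)] = 0  =>  β̂ = n / Σ ln(1+x_i)
beta_hat = n / sum(math.log(1 + x) for x in data)
print(round(beta_hat, 3))  # → 1.25
```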

Let X1, ..., Xn be a random sample from a
distribution with pdf as follows:
fX(x) = e^(−(x−θ)), x > θ; 0 otherwise.
Find the sufficient statistic for θ.
Find the maximum likelihood estimator of θ.
Find the MVUE of θ, θ̂.
Is θ̂ a consistent estimator of θ?
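A simulation sketch for this shifted-exponential problem, assuming an illustrative true value θ = 2 (not from the problem): the MLE is the sample minimum X(1), and since X(1) − θ ~ Exp(n) with mean 1/n, subtracting 1/n gives an unbiased estimator; the shrinking bias also suggests consistency.

```python
import random

random.seed(0)

def estimators(theta, n):
    # X = θ + Exp(1), so the MLE is X_(1) = min(X_i); the minimum of n
    # Exp(1) variables is Exp(n) with mean 1/n, so X_(1) − 1/n is unbiased.
    xs = [theta + random.expovariate(1.0) for _ in range(n)]
    mle = min(xs)
    return mle, mle - 1.0 / n

theta = 2.0  # illustrative true value (assumption)
for n in (10, 100, 10000):
    mle, unbiased = estimators(theta, n)
    print(n, round(mle, 4), round(unbiased, 4))
```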

Let X1, X2, ..., Xn denote a random sample
from a discrete uniform distribution over the integers - θ, - θ +
1, ... , -1, 0, 1, ... , θ - 1, θ,
where θ is a positive integer. What is the maximum
likelihood estimator of θ?
A) min[X1, ..., Xn]
B) max[X1, ..., Xn]
C) −min[X1, ..., Xn]
D) (max[X1, ..., Xn] − min[X1, ..., Xn]) / 2
E) max[|X1|, ..., |Xn|]
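A brute-force check of the reasoning: the pmf is 1/(2θ + 1) on {−θ, ..., θ}, so the likelihood (2θ + 1)^(−n) is nonzero only for θ ≥ max|xᵢ| and decreasing in θ, which points to choice E. The sample below is made up for illustration.

```python
# Hypothetical observed sample (for illustration only).
xs = [-3, 0, 2, 5, -1]
n = len(xs)

def likelihood(theta):
    # pmf is 1/(2θ+1) on {−θ, ..., θ}; likelihood is zero if any |x_i| > θ.
    if any(abs(x) > theta for x in xs):
        return 0.0
    return (2 * theta + 1) ** (-n)

# The likelihood is maximized at the smallest feasible θ, i.e. max|x_i|.
best = max(range(1, 20), key=likelihood)
print(best, max(abs(x) for x in xs))  # → 5 5, matching choice E
```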

Let X1, X2, ..., Xn be a random sample (of size n) from U(0,θ).
Let Yn be the maximum of X1, X2, ..., Xn.
(a) Give the pdf of Yn.
(b) Find the mean of Yn.
(c) One estimator of θ that has been proposed is Yn. You may
note from your answer to part (b) that Yn is a biased estimator of
θ. However, cYn is unbiased for some constant c. Determine c.
(d) Find the variance of cYn,...
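A numeric sketch of (b)–(c), assuming illustrative values θ = 1 and n = 5: integrating y · n y^(n−1)/θⁿ over (0, θ) gives E[Yn] = nθ/(n + 1), so c = (n + 1)/n makes cYn unbiased, which a simulation roughly confirms.

```python
import random

random.seed(0)

n = 5
theta = 1.0  # illustrative true value (assumption)
c = (n + 1) / n  # unbiasing constant, since E[Y_n] = nθ/(n+1)

trials = 200_000
total = 0.0
for _ in range(trials):
    yn = max(random.uniform(0, theta) for _ in range(n))
    total += c * yn
# The empirical mean of c·Y_n should be close to θ.
print(round(total / trials, 2))  # ≈ 1.0
```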

Let X1, X2, ..., Xn be a random sample from an exponential
distribution f(x) = (1/θ)e^(−x/θ) for x ≥ 0. Show that the likelihood
ratio test of H0: θ = θ0 against H1: θ ≠ θ0 is based on the
statistic Σᵢ₌₁ⁿ Xi.
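A sketch of the reduction, using the unrestricted MLE θ̂ = x̄:

```latex
\Lambda = \frac{L(\theta_0)}{L(\hat\theta)}
        = \frac{\theta_0^{-n}\, e^{-\sum x_i/\theta_0}}{\bar{x}^{-n}\, e^{-n}}
        = \left(\frac{\bar{x}}{\theta_0}\right)^{\!n} e^{\,n - \sum x_i/\theta_0}
```

which depends on the data only through Σ xᵢ, so the likelihood ratio test rejects based on Σᵢ₌₁ⁿ Xi.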
