Question

Suppose Y1, . . . , Yn are independent random variables with
common density f_Y(y) = e^{μ−y}, y > μ.

1. Find the Method of Moments Estimator for μ.

2. Find the MLE for μ. Then find the bias of that estimator.
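For reference, the standard answers (μ̃ = Ȳ − 1, since E[Y] = μ + 1, and μ̂ = Y(1), whose bias is E[Y(1)] − μ = 1/n) can be checked with a small Monte Carlo sketch; the true μ, sample size, and replicate count below are arbitrary demo choices, not part of the problem.

```python
import random

random.seed(0)
mu_true = 2.0   # arbitrary true value of mu, chosen only for the demo
n = 50          # sample size per replicate
reps = 20000    # Monte Carlo replicates

bias_mom = 0.0
bias_mle = 0.0
for _ in range(reps):
    # Y = mu + Exp(1), since f(y) = e^{mu - y} for y > mu
    ys = [mu_true + random.expovariate(1.0) for _ in range(n)]
    mom = sum(ys) / n - 1.0   # method of moments: E[Y] = mu + 1
    mle = min(ys)             # MLE: the smallest observation
    bias_mom += mom - mu_true
    bias_mle += mle - mu_true

bias_mom /= reps
bias_mle /= reps
# Theory: the MoM estimator is unbiased; the MLE has bias 1/n
print(bias_mom, bias_mle)
```

The simulated biases should come out near 0 and near 1/n = 0.02, respectively.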

Answer #1

Suppose that Y1, . . . , Yn are iid random variables from the
pdf
f(y | θ) = (6y^5/θ^6) I(0 ≤ y ≤ θ).
(a) Prove that Y(n) = max(Y1, . . . , Yn) is sufficient for θ.
(b) Find the MLE of θ.
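For part (a), one standard route (a sketch, not necessarily the intended one) is the factorization criterion:

```latex
L(\theta; y_1,\dots,y_n)
  = \prod_{i=1}^{n} \frac{6 y_i^5}{\theta^6}\, I(0 \le y_i \le \theta)
  = \underbrace{\Bigl(6^n \prod_{i=1}^{n} y_i^5\Bigr)}_{h(y_1,\dots,y_n)}
    \underbrace{\theta^{-6n}\, I\bigl(y_{(n)} \le \theta\bigr)}_{g\bigl(y_{(n)};\,\theta\bigr)}
```

Since θ^{−6n} I(y_(n) ≤ θ) is decreasing in θ on θ ≥ y_(n), the likelihood is maximized at θ̂ = Y(n), which answers (b).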

Let Y1, Y2, . . . , Yn be independent, uniformly distributed random
variables on the interval [0, θ], and let Y(n) = max(Y1, Y2, . . . , Yn),
which is considered as an estimator of θ. Explain why Y(n) is a good
estimator for θ when the sample size is large.
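The intuition can be made concrete: P(θ − Y(n) > ε) = (1 − ε/θ)^n → 0, so the sample maximum piles up just below θ as n grows. A quick simulation sketch (the true θ is an arbitrary choice for the demo):

```python
import random

random.seed(1)
theta = 3.0  # arbitrary true value, chosen only for this demo
gaps = {}
for n in (10, 100, 10000):
    sample_max = max(random.uniform(0, theta) for _ in range(n))
    # gap below theta shrinks: P(gap > eps) = (1 - eps/theta)^n -> 0
    gaps[n] = theta - sample_max
print(gaps)
```

The gap at n = 10000 should be essentially zero, illustrating consistency (Y(n) is biased low, but the bias θ/(n + 1) vanishes).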

Let Y1, Y2, . . ., Yn be a
random sample from a Laplace distribution with density function
f(y|θ) = (1/(2θ)) e^{−|y|/θ} for −∞ < y < ∞,
where θ > 0. The first two moments of the distribution are
E(Y) = 0 and E(Y^2) = 2θ^2.
a) Find the likelihood function of the sample.
b) What is a sufficient statistic for θ?
c) Find the maximum likelihood estimator of θ.
d) Find the maximum likelihood estimator of the standard
deviation...
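For parts (a)–(c), one standard derivation (a sketch, using the density as printed):

```latex
L(\theta) = \prod_{i=1}^{n} \frac{1}{2\theta} e^{-|y_i|/\theta}
          = (2\theta)^{-n} \exp\Bigl(-\tfrac{1}{\theta}\textstyle\sum_{i=1}^{n}|y_i|\Bigr),
\qquad
\ell(\theta) = -n\log(2\theta) - \frac{1}{\theta}\sum_{i=1}^{n}|y_i|,
\qquad
\ell'(\theta) = -\frac{n}{\theta} + \frac{1}{\theta^2}\sum_{i=1}^{n}|y_i| = 0
\;\Rightarrow\;
\hat\theta = \frac{1}{n}\sum_{i=1}^{n} |y_i|.
```

The factorization shows the likelihood depends on the data only through Σ|y_i|, so that sum is sufficient for θ (part b). For part (d): the standard deviation is √(E(Y^2)) = θ√2, so by invariance the MLE of the standard deviation is θ̂√2.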

5. Let Y1, Y2, . . . , Yn be independent and identically distributed with
f(y; α) = (1/6) α^8 y^3 e^{−α^2 y}, 0 ≤ y < ∞, 0 < α < ∞.
(a) (8 points) Find an expression for the Method of Moments
estimator of α, α̃. Show all work.
(b) (8 points) Find an expression for the Maximum Likelihood
estimator for α, α̂. Show all work.
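The density as printed is garbled, but the constant (1/6)α^8 is exactly the normalizer of the Gamma(shape 4, rate α^2) density f(y; α) = (1/6) α^8 y^3 e^{−α^2 y}; under that reading (an assumption), one possible route is:

```latex
% Method of moments: the Gamma(4, \alpha^2) mean is shape/rate, so
E[Y] = \frac{4}{\alpha^2}
\quad\Rightarrow\quad
\bar{Y} = \frac{4}{\tilde{\alpha}^2}
\quad\Rightarrow\quad
\tilde{\alpha} = \frac{2}{\sqrt{\bar{Y}}}.
% Maximum likelihood:
\ell(\alpha) = n\log\frac{\alpha^8}{6} + 3\sum_{i}\log y_i - \alpha^2 \sum_{i} y_i,
\qquad
\ell'(\alpha) = \frac{8n}{\alpha} - 2\alpha\sum_{i} y_i = 0
\;\Rightarrow\;
\hat{\alpha}^2 = \frac{4n}{\sum_i y_i}
\;\Rightarrow\;
\hat{\alpha} = \frac{2}{\sqrt{\bar{Y}}}.
```

Under this reading the two estimators happen to coincide.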

1. (a) Y1, Y2, . . . , Yn form a random sample from a probability
distribution with cumulative distribution function F_Y(y) and
probability density function f_Y(y). Let Y(1) = min{Y1, Y2, . . . , Yn}.
Write the cumulative distribution function for Y(1) in terms of F_Y(y),
and hence show that the probability density function of Y(1) is
f_{Y(1)}(y) = n{1 − F_Y(y)}^{n−1} f_Y(y). [8 marks]
(b) An engineering system consists of 5 components connected in
series, so, if one component fails, the system fails. The
lifetimes (measured in...
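Part (b) is truncated, but the result from (a) can be checked numerically for a 5-component series system, whose lifetime is the minimum of the component lifetimes. The Exp(1) lifetime model and the time point below are assumptions made purely for the demonstration, not taken from the problem.

```python
import math
import random

random.seed(2)
n = 5          # components in series (from the problem)
rate = 1.0     # assumed Exp(1) component lifetimes, demo only
t = 0.3        # arbitrary time point
reps = 50000

# empirical P(Y_(1) <= t), i.e. the system has failed by time t
hits = sum(
    min(random.expovariate(rate) for _ in range(n)) <= t
    for _ in range(reps)
)
empirical = hits / reps
# theory from (a): F_{Y(1)}(t) = 1 - {1 - F(t)}^n = 1 - e^{-n*rate*t}
theory = 1 - math.exp(-n * rate * t)
print(empirical, theory)
```

The empirical and theoretical failure probabilities should agree to about two decimal places.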

Let Y1, . . . , Yn be IID Poisson(λ) random variables. Argue that
Ȳ, the sample mean, is a sufficient statistic for λ by using the
factorization criterion. Assuming that Ȳ is a complete sufficient
statistic, explain why Ȳ is the minimum variance unbiased
estimator.
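The factorization the question asks for can be sketched as:

```latex
L(\lambda; y_1,\dots,y_n)
  = \prod_{i=1}^{n} \frac{e^{-\lambda}\lambda^{y_i}}{y_i!}
  = \underbrace{e^{-n\lambda}\,\lambda^{n\bar{y}}}_{g(\bar{y};\,\lambda)}
    \;\underbrace{\prod_{i=1}^{n}\frac{1}{y_i!}}_{h(y_1,\dots,y_n)}
```

The λ-dependent factor involves the data only through ȳ, so Ȳ is sufficient. Since E(Ȳ) = λ, it is also unbiased, and the Lehmann–Scheffé theorem says an unbiased estimator that is a function of a complete sufficient statistic is the MVUE.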

Suppose that X1 and X2 are independent continuous random
variables with the same probability density function
f(x) = x/2 for 0 < x < 2, and 0 otherwise. Let a new random variable be
Y = min(X1, X2).
a) Use the distribution function method to find the probability
density function of Y, f_Y(y).
b) Compute P(Y > 1).
c) Compute E(Y).
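For a quick numeric check of (b) and (c): with F(x) = x^2/4, the survival-function argument gives P(Y > 1) = {1 − F(1)}^2 = 9/16, and integrating y f_Y(y) gives E(Y) = 16/15. A simulation sketch (the replicate count is arbitrary):

```python
import random

random.seed(3)
reps = 100000

def draw_x():
    # inverse-CDF sampling: F(x) = x^2/4 on (0, 2), so X = 2*sqrt(U)
    return 2.0 * random.random() ** 0.5

ys = [min(draw_x(), draw_x()) for _ in range(reps)]
p_gt_1 = sum(y > 1 for y in ys) / reps
mean_y = sum(ys) / reps
# theory: P(Y > 1) = (3/4)^2 = 9/16 = 0.5625;  E(Y) = 16/15
print(p_gt_1, mean_y)
```

Both simulated values should land within about 0.01 of the theoretical answers.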

Let X and Y be independent random variables with density functions given by f_X(x) = 1/2, −1 ≤ x ≤ 1, and f_Y(y) = 1/2, 3 ≤ y ≤ 5. Find the density function of X − Y.
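One way to sanity-check an answer here: Z = X − Y is the sum of a Uniform(−1, 1) variable and a Uniform(−5, −3) variable, so the convolution of two equal-width uniforms gives a triangular density on [−6, −2] peaking at −4, f(z) = (1/2)(1 − |z + 4|/2). A Monte Carlo sketch (replicate count arbitrary):

```python
import random

random.seed(4)
reps = 100000

zs = [random.uniform(-1, 1) - random.uniform(3, 5) for _ in range(reps)]
mean_z = sum(zs) / reps
frac_middle = sum(-5 < z < -3 for z in zs) / reps
# triangular theory: E(Z) = -4, and P(-5 < Z < -3) = 3/4
print(mean_z, frac_middle)
```

The sample mean should sit near −4 and about three quarters of the draws should fall in (−5, −3), matching the triangular density.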

Let ?? ~ ????(2?, 4?) be independent random variables for i = 1, 2, . . . , n.
a) Find an estimator for ? by the method of moments.
b) Find an estimator for ? by maximum likelihood estimation (MLE).
