Question

Suppose Y1, . . . , Yn are independent random variables with common density f_Y(y) = e^(μ−y), y > μ.

1. Find the Method of Moments Estimator for μ.

2. Find the MLE for μ. Then find the bias of this estimator.
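For reference, a sketch of the standard derivation (this is an outline, not a posted answer): since Y − μ ~ Exp(1), the first moment pins down the MoM estimator; the likelihood is increasing in μ on its support, so the MLE is the sample minimum.

```latex
% f_Y(y) = e^{\mu - y}, y > \mu, i.e. Y = \mu + E with E \sim \mathrm{Exp}(1).
\begin{align*}
E[Y] &= \mu + 1
  &&\Rightarrow\ \tilde{\mu}_{\mathrm{MoM}} = \bar{Y} - 1,\\
L(\mu) &= e^{n\mu - \sum_i y_i}\,\mathbf{1}\{\mu \le y_{(1)}\}
  \ \text{(increasing in } \mu\text{)}
  &&\Rightarrow\ \hat{\mu}_{\mathrm{MLE}} = Y_{(1)} = \min_i Y_i,\\
Y_{(1)} - \mu &\sim \mathrm{Exp}(n)
  &&\Rightarrow\ \operatorname{Bias}(\hat{\mu}) = E[Y_{(1)}] - \mu = \tfrac{1}{n}.
\end{align*}
```

The last line uses the fact that the minimum of n iid Exp(1) variables is Exp(n), so the MLE overshoots μ by 1/n on average.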

Homework Answers
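As a quick numerical sanity check of the claim that the MLE μ̂ = min(Y1, . . . , Yn) has bias 1/n, here is a small Monte Carlo sketch; the particular values of μ, n, and the replication count are arbitrary choices for the check.

```python
# Monte Carlo check of the bias of the MLE  mu_hat = min(Y_i)
# under the shifted-exponential density f(y) = e^(mu - y), y > mu.
# Since Y = mu + E with E ~ Exp(1), we simulate Y that way directly.
import random

random.seed(0)
mu, n, reps = 2.0, 10, 100_000  # arbitrary illustration values

bias_sum = 0.0
for _ in range(reps):
    sample_min = min(mu + random.expovariate(1.0) for _ in range(n))
    bias_sum += sample_min - mu

empirical_bias = bias_sum / reps
print(empirical_bias)  # should be close to 1/n = 0.1
```

The empirical bias settles near 1/n, matching the theoretical result that Y(1) − μ ~ Exp(n).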

Similar Questions
Suppose that Y1, . . . , Yn are iid random variables from the pdf f(y | θ) = 6y^5/θ^6 · I(0 ≤ y ≤ θ). (a) Prove that Y(n) = max(Y1, . . . , Yn) is sufficient for θ. (b) Find the MLE of θ.
Suppose we have a random sample Y1, . . . , Yn from a continuous random variable with density f_Y(y; θ) = θ(y + 1)^(−(θ+1)) for y > 0, θ > 1. Find the MME and MLE for θ.
Let Y1, Y2, . . . , Yn be independent, uniformly distributed random variables on the interval [0, θ], and let Y(n) = max(Y1, Y2, . . . , Yn), which is considered as an estimator of θ. Explain why Y(n) is a good estimator for θ when the sample size is large.
Let Y1, Y2, . . . , Yn be a random sample from a Laplace distribution with density function f(y | θ) = (1/(2θ))e^(−|y|/θ) for −∞ < y < ∞, where θ > 0. The first two moments of the distribution are E(Y) = 0 and E(Y^2) = 2θ^2. a) Find the likelihood function of the sample. b) What is a sufficient statistic for θ? c) Find the maximum likelihood estimator of θ. d) Find the maximum likelihood estimator of the standard deviation...
5. Let Y1, Y2, . . . , Yn be independent and identically distributed with f(y; α) = (1/6)α^8 y^3 · e^(−α^2 y), 0 ≤ y < ∞, 0 < α < ∞. (a) (8 points) Find an expression for the Method of Moments estimator of α, α̃. Show all work. (b) (8 points) Find an expression for the Maximum Likelihood estimator of α, α̂. Show all work.
1. (a) Y1, Y2, . . . , Yn form a random sample from a probability distribution with cumulative distribution function F_Y(y) and probability density function f_Y(y). Let Y(1) = min{Y1, Y2, . . . , Yn}. Write the cumulative distribution function for Y(1) in terms of F_Y(y), and hence show that the probability density function for Y(1) is f_Y(1)(y) = n{1 − F_Y(y)}^(n−1) f_Y(y). [8 marks] (b) An engineering system consists of 5 components connected in series, so, if one component fails, the system fails. The lifetimes (measured in...
Suppose Y1, . . . , Yn are independent Gamma(2, β) random variables. (a) Write down the likelihood function for β based on Y1, . . . , Yn. (b) Write down the log-likelihood function for β based on Y1, . . . , Yn. (c) Find an expression for the MLE of β. (d) Give the MoM estimator of β.
Let Y1, . . . , Yn be iid Poisson(λ) random variables. Argue that Ȳ, the sample mean, is a sufficient statistic for λ by using the factorization criterion. Assuming that Ȳ is a complete sufficient statistic, explain why Ȳ is the minimum variance unbiased estimator.
Suppose that X1 and X2 are independent continuous random variables with the same probability density function f(x) = x/2 for 0 < x < 2, and 0 otherwise. Let a new random variable be Y = min(X1, X2). a) Use the distribution function method to find the probability density function of Y, f_Y(y). b) Compute P(Y > 1). c) Compute E(Y).