Question


Let X1, X2, ..., Xn be independent, but not identically distributed, samples. All the Xi's are assumed to be normally distributed with

Xi ~ N(θci, σ²), i = 1, 2, ..., n,

where θ is an unknown parameter, σ² is known, and the ci's are known constants (not all of them zero). We wish to estimate θ.

(a) Write down the likelihood function, i.e., the joint density function of (X1, . . ., Xn), and identify a sufficient statistic.

(b) Use your sufficient statistic to construct the MVUE of θ.

Homework Answers

Answer #1
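The posted solution is not preserved in this copy. A sketch of the standard approach, using only the setup stated in the question: factor the likelihood to identify the sufficient statistic for part (a), then apply an unbiasedness plus Lehmann-Scheffé argument for part (b).

```latex
% (a) Joint density of (X_1,\dots,X_n), factored in exponential-family form:
L(\theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}}
  \exp\!\left(-\frac{(x_i-\theta c_i)^2}{2\sigma^2}\right)
= (2\pi\sigma^2)^{-n/2}
  \exp\!\left(-\frac{\sum_{i=1}^{n} x_i^2}{2\sigma^2}\right)
  \exp\!\left(\frac{\theta}{\sigma^2}\sum_{i=1}^{n} c_i x_i
              -\frac{\theta^2}{2\sigma^2}\sum_{i=1}^{n} c_i^2\right)

% By the factorization theorem, T = \sum_{i=1}^{n} c_i X_i is sufficient.

% (b) T \sim N\!\left(\theta \sum_{i} c_i^2,\; \sigma^2 \sum_{i} c_i^2\right),
% so the estimator
\hat{\theta} = \frac{\sum_{i=1}^{n} c_i X_i}{\sum_{i=1}^{n} c_i^2}
% is unbiased and a function of the complete sufficient statistic T
% (one-parameter exponential family); by Lehmann-Scheffé it is the MVUE, with
\operatorname{Var}(\hat{\theta}) = \frac{\sigma^2}{\sum_{i=1}^{n} c_i^2}.
```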

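As a quick numerical sanity check (a hypothetical illustration, not part of any posted answer), one can simulate Xi ~ N(θci, σ²) for chosen θ, σ, and ci, and confirm that the estimator θ̂ = ∑ciXi / ∑ci² averages out near the true θ:

```python
import random

def mvue_estimate(xs, cs):
    """MVUE of theta: sum(c_i * x_i) / sum(c_i^2)."""
    return sum(c * x for c, x in zip(cs, xs)) / sum(c * c for c in cs)

random.seed(0)
theta, sigma = 2.5, 1.0          # true parameter and known std. dev. (chosen for the demo)
cs = [1.0, 2.0, -0.5, 3.0]       # known constants c_i, not all zero

# Average the estimator over many simulated samples of (X_1, ..., X_n).
reps = 20000
total = 0.0
for _ in range(reps):
    xs = [random.gauss(theta * c, sigma) for c in cs]
    total += mvue_estimate(xs, cs)
mean_est = total / reps

print(mean_est)  # should be close to theta = 2.5, since the estimator is unbiased
```

The estimator's variance here is σ²/∑ci² = 1/14.25 ≈ 0.07, so the average of 20,000 replications lands very close to θ.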

Similar Questions
Let X1, X2, ..., Xn be i.i.d. (independent and identically distributed) from the uniform distribution U(μ, μ+1), where μ ∈ R is unknown. Find a minimal sufficient statistic for the parameter μ.
Suppose that X1, X2, ..., Xn are independent identically distributed random variables with variance σ². Let Y1 = X2 + X3, Y2 = X1 + X3, and Y3 = X1 + X2. Find the following (in terms of σ²): (a) Var(Y1) (b) Cov(Y1, Y2) (c) Cov(X1, Y1) (d) Var[(Y1 + Y2 + Y3)/2]
Let X1, X2, ..., Xn be a random sample from an exponential distribution f(x) = (1/θ)e^(−x/θ) for x ≥ 0. Show that the likelihood ratio test of H0: θ = θ0 against H1: θ ≠ θ0 is based on the statistic ∑(i=1 to n) Xi.
Let x1, x2, x3, ... be a sequence of independent and identically distributed random variables, each having finite mean E[xi] and variance Var(xi). a) Calculate Var(x1 + x2). b) Calculate Var(E[xi]). c) If n → ∞, what is Var(E[xi])?
Let X1, ..., Xn be i.i.d. Gamma(α, β), with β > 0 known and α > 0 unknown. (a) Find the sufficient statistic for α. (b) Use the sufficient statistic found in (a) to find the MVUE of α.
Let X1, ..., Xn be a random sample from a distribution with pdf fX(x) = e^(−(x−θ)) for x > θ, and 0 otherwise. Find the sufficient statistic for θ. Find the maximum likelihood estimator of θ. Find the MVUE θ̂ of θ. Is θ̂ a consistent estimator of θ?
Let X1, X2, ..., Xn−1, Xn be independent exponentially distributed variables with mean β. a) Find the sampling distribution of the first order statistic. b) Is this an exponential distribution? If yes, why? c) If n = 5 and β = 2, find P(Y1 ≤ 3.6). d) Find the probability distribution of Y1 = max(X1, X2, ..., Xn).
Let X1, X2, ..., Xn be iid following an exponential distribution with parameter λ whose pdf is f(x|λ) = λ^(−1) exp(−x/λ), x > 0, λ > 0. (a) With X(1) = min{X1, ..., Xn}, find an unbiased estimator of λ, denoted λ̂. (b) Use the Lehmann-Scheffé theorem to show that ∑Xi/n is the UMVUE of λ. (c) By the completeness of ∑Xi or other tool(s), show that E(λ̂ | ∑Xi)...
Consider n independent variables, {X1, X2, . . . , Xn} uniformly distributed over the unit interval, (0, 1). Introduce two new random variables, M = max (X1, X2, . . . , Xn) and N = min (X1, X2, . . . , Xn). (A) Find the joint distribution of a pair (M, N). (B) Derive the CDF and density for M. (C) Derive the CDF and density for N. (D) Find moments of first and second order for...
Let X1, X2, ..., Xn be a random sample from a distribution with probability density function f(x; θ) = (θ^4/6) x³ e^(−θx) if 0 < x < ∞, and 0 otherwise, where θ > 0. a. Justify the claim that Y = X1 + X2 + ... + Xn is a complete sufficient statistic for θ. b. Compute E(1/Y) and find the function of Y which is the unique minimum variance unbiased estimator of θ.