Question


Let Xi, i = 1, ..., n, be independent exponential random variables with mean 1/ui (i.e., rate ui). Define Yn = min(X1, ..., Xn) and Zn = max(X1, ..., Xn).

1. Find the CDFs of Yn and Zn.

2. What is E(Zn)?

3. Show that the probability that Xi is the smallest among X1, ..., Xn equals ui/(u1 + ... + un).
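Since no answer has been posted, here is a Monte Carlo sanity check of the two distributional facts behind parts 1 and 3: the minimum Yn of independent exponentials is itself exponential with rate u1 + ... + un, and P(Xi is smallest) = ui/(u1 + ... + un). This is only a sketch; the rates (1, 2, 3) and the evaluation point t = 0.2 are arbitrary illustrative choices.

```python
import math
import random

# Monte Carlo sketch of two claims about independent exponentials.
# Theory used below:
#   Y_n = min(X_1,...,X_n) ~ Exp(u_1 + ... + u_n)
#   P(X_1 is the smallest) = u_1 / (u_1 + ... + u_n)
rates = [1.0, 2.0, 3.0]   # u_1, u_2, u_3 (assumed example values)
trials = 200_000
t = 0.2                   # point at which to check the CDF of Y_n
random.seed(0)

min_is_first = 0
y_below_t = 0
for _ in range(trials):
    xs = [random.expovariate(u) for u in rates]
    y = min(xs)
    min_is_first += (y == xs[0])
    y_below_t += (y <= t)

total = sum(rates)
print(min_is_first / trials, rates[0] / total)       # both ≈ 1/6
print(y_below_t / trials, 1 - math.exp(-total * t))  # both ≈ 0.6988
```

The same simulation extends to part 2: E(Zn) has no equally clean closed form for unequal rates, but it can be written by inclusion–exclusion over subsets of the rates, and a `sum(max(...))/trials` average will agree with that formula.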

Similar Questions
Let X1, X2, ..., Xn−1, Xn be independent exponentially distributed variables with mean beta. (a) Find the sampling distribution of the first order statistic. (b) Is this an exponential distribution? If yes, why? (c) If n = 5 and beta = 2, find P(Y1 ≤ 3.6). (d) Find the probability distribution of Y1 = max(X1, X2, ..., Xn).
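For part (c) of this one, a one-liner suffices once part (a) is known: the minimum of n i.i.d. exponentials with mean beta is exponential with rate n/beta (a sketch, using only that standard fact):

```python
import math

# Minimum of n = 5 i.i.d. exponentials with mean beta = 2 is
# exponential with rate n/beta = 2.5, so
# P(Y1 <= 3.6) = 1 - exp(-(n/beta) * 3.6) = 1 - e^(-9).
n, beta, t = 5, 2.0, 3.6
p = 1 - math.exp(-(n / beta) * t)
print(round(p, 6))  # 1 - e^(-9) ≈ 0.999877
```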
Let X1, X2, ... be i.i.d. r.v. and N an independent nonnegative integer-valued r.v. Let SN = X1 + ... + XN. Assume that the m.g.f. of the Xi, denoted MX(t), and the m.g.f. of N, denoted MN(t), are finite in some interval (−δ, δ) around the origin. 1. Express the m.g.f. MS_N(t) of SN in terms of MX(t) and MN(t). 2. Give an alternate proof of Wald's identity by computing the expectation E[SN] as M'S_N(0). 3. Express the second moment E[SN^2] in terms...
Let X1, X2, . . . , Xn be iid following an exponential distribution with parameter λ whose pdf is f(x|λ) = λ^(−1) exp(−x/λ), x > 0, λ > 0. (a) With X(1) = min{X1, . . . , Xn}, find an unbiased estimator of λ; denote it by λ(hat). (b) Use Lehmann–Scheffé to show that ∑ Xi/n is the UMVUE of λ. (c) By the definition of completeness of ∑ Xi or other tool(s), show that E(λ(hat) | ∑ Xi)...
Let X1, . . . , Xn and Y1, . . . , Yn be two random samples with the same mean µ and variance σ^2. (The pdfs of the Xi and Yj are not specified.) Show that T = (1/2)Xbar + (1/2)Ybar is an unbiased estimator of µ. Evaluate MSE(T; µ).
Consider n independent variables, {X1, X2, . . . , Xn} uniformly distributed over the unit interval, (0, 1). Introduce two new random variables, M = max (X1, X2, . . . , Xn) and N = min (X1, X2, . . . , Xn). (A) Find the joint distribution of a pair (M, N). (B) Derive the CDF and density for M. (C) Derive the CDF and density for N. (D) Find moments of first and second order for...
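Parts (B) and (C) of this one are easy to check numerically: for n i.i.d. Uniform(0,1) variables, P(M ≤ m) = m^n and P(N ≤ x) = 1 − (1 − x)^n. A quick simulation sketch (n = 4 and the evaluation points 0.8 and 0.2 are arbitrary choices):

```python
import random

# Check the uniform order-statistic CDFs:
#   P(M <= m) = m**n,  P(N <= x) = 1 - (1 - x)**n
random.seed(1)
n, trials = 4, 100_000
m_hits = n_hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    m_hits += max(xs) <= 0.8
    n_hits += min(xs) <= 0.2
print(m_hits / trials, 0.8 ** n)      # theory: 0.8^4 = 0.4096
print(n_hits / trials, 1 - 0.8 ** n)  # theory: 1 - 0.8^4 = 0.5904
```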
Let X1, X2, · · · , Xn be a random sample from an exponential distribution f(x) = (1/θ)e^(−x/θ) for x ≥ 0. Show that the likelihood ratio test of H0 : θ = θ0 against H1 : θ ≠ θ0 is based on the statistic ∑_{i=1}^{n} Xi.
Let xi, i = 1, . . . , n be iid exponential with rate λ. Find a conjugate prior for the exponential likelihood.
Let X1, X2, . . ., Xn be independent, but not identically distributed, samples. All these Xi's are assumed to be normally distributed with Xi ∼ N(θci, σ^2), i = 1, 2, . . ., n, where θ is an unknown parameter, σ^2 is known, and the ci's are known constants (not all zero). We wish to estimate θ. (a) Write down the likelihood function, i.e., the joint density function of (X1, ....
Let {Xi} be i.i.d. random variables with P(Xi = −1) = P(Xi = 1) = 1/2. Let Sn = 1 + X1 + . . . + Xn be a symmetric simple random walk with initial point S0 = 1. Find the probability that Sn eventually hits the point 0. Hint: Define the events A = {Sn = 0 for some n} and, for M > 1, AM = {Sn hits 0 before hitting M}. Show that AM ↗ A.
Suppose that X1, X2, . . ., Xn and Y1, Y2, . . ., Yn are independent random samples from populations with means μ1 and μ2 and variances σ1^2 and σ2^2, respectively. It can be shown that the random variable Un = ((X̄ − Ȳ) − (μ1 − μ2)) / √((σ1^2 + σ2^2)/n) satisfies the conditions of the central limit theorem, and thus the distribution function of Un converges to a standard normal distribution function as n → ∞. An experiment is designed to test...