Question

Continuous random variables X1 and X2 with joint density fX1,X2(x1, x2) are independent and identically distributed with expected value μ.

Prove that E[X1 + X2] = E[X1] + E[X2].

Homework Answers

Answer #1

Solution:
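A sketch of the standard argument under the stated setup. In fact only linearity of the double integral is needed; the i.i.d. assumption is used only to evaluate the final answer as 2μ:

\[
\begin{aligned}
E[X_1 + X_2] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \;+\; \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_2\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
&= \int_{-\infty}^{\infty} x_1\, f_{X_1}(x_1)\, dx_1 \;+\; \int_{-\infty}^{\infty} x_2\, f_{X_2}(x_2)\, dx_2 \\
&= E[X_1] + E[X_2],
\end{aligned}
\]

where the third line integrates out the other variable to obtain the marginal densities fX1 and fX2. Since X1 and X2 are identically distributed with mean μ, both terms equal μ, so E[X1 + X2] = 2μ.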

Similar Questions
Suppose X and Y are continuous random variables with joint density function fX,Y(x, y) = x + y on the square [0, 3] × [0, 3]. Compute E[X], E[Y], E[X² + Y²], and Cov(3X - 4, 2Y + 3).
Suppose X1, X2, X3, and X4 are independent and identically distributed random variables with mean 10 and variance 16. In addition, suppose that Y1, Y2, Y3, Y4, and Y5 are independent and identically distributed random variables with mean 15 and variance 25. Suppose further that X1, X2, X3, X4 and Y1, Y2, Y3, Y4, Y5 are independent. Find Cov[X̄ + Ȳ + 10, 2X̄ - Ȳ], where X̄ is the sample mean of X1, X2, X3, X4 and Ȳ...
You are given that X1 and X2 are two independent and identically distributed random variables with a Poisson distribution with mean 2. Let Y = max{X1, X2}. Find P(Y = 1).
Let X and Y be random variables with the joint probability density function fX,Y(x, y) = cx²y for 0 < x² < y < x (x > 0), and 0 otherwise. Compute the marginal probability density functions fX(x) and fY(y). Are the random variables X and Y independent?
Suppose that X1 and X2 are independent continuous random variables with the same probability density function f(x) = x/2 for 0 < x < 2, and 0 otherwise. Let a new random variable be Y = min(X1, X2). (a) Use the distribution function method to find the probability density function of Y, fY(y). (b) Compute P(Y > 1). (c) Compute E(Y).
Let fX,Y be the joint density function of the random variables X and Y, equal to fX,Y(x, y) = x + y if 0 < x, y < 1, and 0 otherwise. Compute the probability density function of X + Y. Referring to the problem above, compute the marginal probability density functions fX(x) and fY(y). Are the random variables X and Y independent?
Suppose that X1, X2, . . . , Xn are independent identically distributed random variables with variance σ². Let Y1 = X2 + X3, Y2 = X1 + X3, and Y3 = X1 + X2. Find the following, in terms of σ²: (a) Var(Y1) (b) Cov(Y1, Y2) (c) Cov(X1, Y1) (d) Var[(Y1 + Y2 + Y3)/2]
Let X and Y be continuous random variables with joint density function f(x, y) and marginal density functions fX(x) and fY(y), respectively. Further, the support of both marginal density functions is the interval (0, 1). Which of the following statements is always true? (Note: there may be more than one.)
E[X²Y³] = (∫₀¹ x² dx)(∫₀¹ y³ dy)
E[X²Y³] = ∫₀¹ ∫₀¹ x²y³ f(x, y) dy dx
E[Y³] = ∫₀¹ y³ fX(x) dx
E[XY] = (∫₀¹ x fX(x)...
Let X1, X2, X3, . . . be a sequence of independent and identically distributed random variables, each having finite mean E[Xi] and variance Var(Xi). (a) Calculate Var(X1 + X2). (b) Calculate Var(X̄n), where X̄n is the sample mean of X1, . . . , Xn. (c) As n → ∞, what is Var(X̄n)?