Question

Suppose X and Y are continuous random variables with joint
density function f_{X,Y}(x, y) = x + y on the square [0, 3] × [0, 3].
Compute E[X], E[Y], E[X^2 + Y^2], and Cov(3X - 4, 2Y + 3).
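One way to sanity-check the computation is symbolic integration. A sketch with sympy, under the assumption that the density is normalized as f(x, y) = (x + y)/27 so that it integrates to 1 over the square (x + y on its own does not):

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
# assumed normalization: (x + y)/27 integrates to 1 over [0, 3] x [0, 3]
f = (x + y) / 27

def E(g):
    """Expectation of g(X, Y) under the joint density f."""
    return sp.integrate(g * f, (x, 0, 3), (y, 0, 3))

EX = E(x)                    # E[X] = 7/4
EY = E(y)                    # E[Y] = 7/4 by symmetry
E_sq = E(x**2 + y**2)        # E[X^2 + Y^2] = 15/2
cov_XY = E(x * y) - EX * EY  # Cov(X, Y) = -1/16
# bilinearity: Cov(3X - 4, 2Y + 3) = 3 * 2 * Cov(X, Y)
cov_lin = 6 * cov_XY         # -3/8
print(EX, EY, E_sq, cov_lin)
```

The additive constants drop out of the last step because shifting a random variable does not change covariance.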

Answer #1

Let X and Y be continuous random variables with joint density
function f(x, y) and marginal density functions fX(x) and fY(y),
respectively. Further, the support of both marginal density
functions is the interval (0, 1).
Which of the following statements is always true? (Note there
may be more than one.)
E[X^2 Y^3] = (∫_0^1 x^2 dx)(∫_0^1 y^3 dy)
E[X^2 Y^3] = ∫_0^1 ∫_0^1 x^2 y^3 f(x, y) dy dx
E[Y^3] = ∫_0^1 y^3 fX(x) dx
E[XY] = (∫_0^1 x fX(x)...

Suppose X and Y are continuous random variables with joint
density function f(x, y) = x + y for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1.
(a). Compute the joint CDF F(x, y).
(b). Compute the marginal densities of X and Y.
(c). Compute Cov(X, Y). Are X and Y independent?
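Parts (a)-(c) of the preceding problem can be checked symbolically; a minimal sympy sketch:

```python
import sympy as sp

x, y, s, t = sp.symbols('x y s t', nonnegative=True)
f = x + y  # integrates to 1 over the unit square, so no constant is needed

# (a) joint CDF for 0 <= x, y <= 1
F = sp.integrate(s + t, (s, 0, x), (t, 0, y))  # x^2*y/2 + x*y^2/2
# (b) marginal densities
fX = sp.integrate(f, (y, 0, 1))  # x + 1/2
fY = sp.integrate(f, (x, 0, 1))  # y + 1/2
# (c) covariance
EX = sp.integrate(x * fX, (x, 0, 1))  # 7/12
EY = sp.integrate(y * fY, (y, 0, 1))  # 7/12
EXY = sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))  # 1/3
cov = sp.simplify(EXY - EX * EY)  # -1/144
print(F, fX, fY, cov)
```

Since Cov(X, Y) = -1/144 ≠ 0, X and Y cannot be independent (equivalently, f(x, y) ≠ fX(x) fY(y)).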


Let X and Y be random variables with joint probability
density function f_{X,Y}(x, y) = c x^2 y for 0 < x^2 < y < x
(with x > 0), and 0 otherwise. Compute the marginal probability
density functions fX(x) and fY(y). Are the random variables X and Y
independent?
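The constant and both marginals follow from integrating over the region 0 < x^2 < y < x, which forces 0 < x < 1. A sympy sketch under that reading of the support:

```python
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
f = c * x**2 * y

# 0 < x^2 < y < x forces 0 < x < 1; integrate y over (x^2, x) first
c_val = sp.solve(sp.integrate(f, (y, x**2, x), (x, 0, 1)) - 1, c)[0]  # 35
f = f.subs(c, c_val)

fX = sp.integrate(f, (y, x**2, x))        # (35/2)(x^4 - x^6) on (0, 1)
# for fixed y the constraints read y < x < sqrt(y)
fY = sp.integrate(f, (x, y, sp.sqrt(y)))  # (35/3)(y^(5/2) - y^4) on (0, 1)
print(c_val, sp.expand(fX), sp.expand(fY))
```

The support is not a product set, so f(x, y) cannot factor as fX(x) fY(y): X and Y are not independent.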

The random variables X and Y have a joint density function
given by f_{X,Y}(x, y) = 1/y for 0 < y < 1, 0 < x < y, and 0
otherwise. Compute (a) Cov(X, Y) and (b) Corr(X, Y).
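For this density, Y is uniform on (0, 1) and, given Y = y, X is uniform on (0, y), which makes the moments easy to verify symbolically. A sketch:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 1 / y  # joint density on 0 < x < y < 1

def E(g):
    """Integrate x over (0, y) first, then y over (0, 1)."""
    return sp.integrate(sp.integrate(g * f, (x, 0, y)), (y, 0, 1))

EX, EY, EXY = E(x), E(y), E(x * y)    # 1/4, 1/2, 1/6
cov = EXY - EX * EY                   # (a) Cov(X, Y) = 1/24
var_x = E(x**2) - EX**2               # 7/144
var_y = E(y**2) - EY**2               # 1/12
corr = cov / sp.sqrt(var_x * var_y)   # (b) sqrt(3/7)
print(cov, sp.simplify(corr))
```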

Let f_{X,Y} be the joint density function of the random variables X
and Y, given by f_{X,Y}(x, y) = x + y if 0 < x, y < 1, and 0
otherwise. Compute the probability density function of X + Y.
Referring to the problem above, compute the marginal probability
density functions fX(x) and fY(y). Are the random variables X and
Y independent?
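For the density of the sum Z = X + Y, the convolution-style integral of f(x, z - x) splits at z = 1 because both x and z - x must lie in (0, 1). A sympy sketch of the two pieces:

```python
import sympy as sp

x, z = sp.symbols('x z', positive=True)
# f(x, z - x) = x + (z - x) = z wherever both coordinates are in (0, 1)

f_low = sp.integrate(z, (x, 0, z))       # 0 < z < 1: x ranges over (0, z)
f_high = sp.integrate(z, (x, z - 1, 1))  # 1 < z < 2: x ranges over (z-1, 1)
print(f_low, sp.expand(f_high))          # z**2 and z*(2 - z)
```

So f_Z(z) = z^2 on (0, 1) and z(2 - z) on (1, 2). The marginals are fX(x) = x + 1/2 and fY(y) = y + 1/2, whose product is not x + y, so X and Y are dependent.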

Continuous random variables X1 and X2 with joint density
f_{X1,X2}(x1, x2) are independent and identically distributed with
expected value μ.
Prove that E[X1 + X2] = E[X1] + E[X2].
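One line of attack: write the expectation against the joint density, split the integral, and reduce each piece to a marginal. (Independence is not actually needed here; expectation is linear for any joint density.) In LaTeX notation:

```latex
\begin{aligned}
E[X_1 + X_2]
  &= \iint (x_1 + x_2)\, f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
  &= \iint x_1 f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2
   + \iint x_2 f_{X_1,X_2}(x_1, x_2)\, dx_1\, dx_2 \\
  &= \int x_1 \Big( \int f_{X_1,X_2}(x_1, x_2)\, dx_2 \Big) dx_1
   + \int x_2 \Big( \int f_{X_1,X_2}(x_1, x_2)\, dx_1 \Big) dx_2 \\
  &= \int x_1 f_{X_1}(x_1)\, dx_1 + \int x_2 f_{X_2}(x_2)\, dx_2
   = E[X_1] + E[X_2].
\end{aligned}
```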

1. Let (X, Y) be a continuous random vector with joint
probability density function
f_{X,Y}(x, y) = k(x + y^2) if 0 < x < 1 and 0 < y < 1,
and 0 otherwise.
Find the following:
I: The expectation of XY, E(XY).
J: The covariance of X and Y, Cov(X, Y).
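Both requested quantities follow once k is fixed by normalization. A sympy sketch:

```python
import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
f = k * (x + y**2)

# normalization over the unit square fixes k
k_val = sp.solve(sp.integrate(f, (x, 0, 1), (y, 0, 1)) - 1, k)[0]  # 6/5
f = f.subs(k, k_val)

def E(g):
    return sp.integrate(g * f, (x, 0, 1), (y, 0, 1))

EXY = E(x * y)           # I: E(XY) = 7/20
cov = EXY - E(x) * E(y)  # J: Cov(X, Y) = -1/100
print(k_val, EXY, cov)
```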

Let X and Y be random variables with joint probability
density function f_{X,Y}(x, y) = e^{-x-y} for 0 < x, y < ∞, and 0
otherwise.
a. Let W = max(X, Y). Compute the probability density function of W.
b. Let U = min(X, Y). Compute the probability density function of U.
c. Compute the probability density function of X + Y.
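Since the density factors, X and Y are independent Exp(1) variables, so the standard order-statistic and convolution identities apply. A sketch deriving all three densities with sympy:

```python
import sympy as sp

w, u, s, t = sp.symbols('w u s t', positive=True)
# common CDF of X and Y: F(t) = 1 - e^{-t} (independent Exp(1))

# a. max: P(W <= w) = F(w)^2; differentiate for the density
f_W = sp.diff((1 - sp.exp(-w))**2, w)  # 2 e^{-w} (1 - e^{-w})
# b. min: P(U > u) = (1 - F(u))^2 = e^{-2u}
f_U = sp.diff(1 - sp.exp(-2*u), u)     # 2 e^{-2u}, i.e. Exp(2)
# c. sum: convolution of the two Exp(1) densities
f_S = sp.integrate(sp.exp(-t) * sp.exp(-(s - t)), (t, 0, s))  # s e^{-s}
print(sp.expand(f_W), f_U, f_S)
```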

A joint density function is given by f_{X,Y}(x, y) = kx for
0 < x < 1, 0 < y < 1, and 0 otherwise.
(a) Calculate k.
(b) Calculate the marginal density function fX(x).
(c) Calculate the marginal density function fY(y).
(d) Compute P(X < 0.5, Y < 0.1).
(e) Compute P(X < Y).
(f) Compute P(X < Y | X < 0.5).
(g) Are X and Y independent random variables? Show your
reasoning (no credit for a yes/no answer).
(h)...
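Parts (a)-(g) reduce to straightforward integrals over the unit square; a sympy sketch (for part (f), the joint event splits at y = 1/2):

```python
import sympy as sp

x, y, k = sp.symbols('x y k', positive=True)
half = sp.Rational(1, 2)
f = k * x  # on 0 < x < 1, 0 < y < 1

# (a) normalization gives k = 2
k_val = sp.solve(sp.integrate(f, (x, 0, 1), (y, 0, 1)) - 1, k)[0]
f = f.subs(k, k_val)

fX = sp.integrate(f, (y, 0, 1))  # (b) 2x
fY = sp.integrate(f, (x, 0, 1))  # (c) 1, so Y is uniform on (0, 1)

# (d) P(X < 0.5, Y < 0.1)
p_d = sp.integrate(f, (x, 0, half), (y, 0, sp.Rational(1, 10)))  # 1/40
# (e) P(X < Y)
p_e = sp.integrate(sp.integrate(f, (x, 0, y)), (y, 0, 1))        # 1/3
# (f) P(X < Y | X < 1/2): joint event split at y = 1/2
p_joint = (sp.integrate(sp.integrate(f, (x, 0, y)), (y, 0, half))
           + sp.integrate(sp.integrate(f, (x, 0, half)), (y, half, 1)))
p_f = p_joint / sp.integrate(fX, (x, 0, half))                   # 2/3
# (g) f(x, y) = fX(x) * fY(y) on a product support: independent
print(k_val, p_d, p_e, p_f)
```

Comparing (e) and (f): conditioning on X < 1/2 raises P(X < Y) from 1/3 to 2/3, as expected, since a smaller X is more likely to fall below Y.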
