Question

Let X be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2; let Y be a random variable, independent of X, that takes the value 1 with probability 1/2, and takes the value −1 with probability 1/2; and let Z = XY. Show that Cov(X, Z) = 0. Show that X and Z are dependent.

Homework Answers

Answer #1

X = 1 with p = 1/2
    0 with p = 1/2

Y =  1 with p = 1/2
    -1 with p = 1/2

Z = XY

Joint distribution of X and Y:

X \ Y    1      -1
0       0.25    0.25
1       0.25    0.25

Distribution of Z = XY:

Z       p
0       0.5
1       0.25
-1      0.25
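A quick way to check the table for Z is to enumerate the four equally likely (X, Y) pairs, e.g. in Python (a minimal sketch; the names here are just illustrative):

# Sketch: tally Z = X*Y over the four equally likely (X, Y) outcomes.
from fractions import Fraction
from collections import defaultdict

pmf_Z = defaultdict(Fraction)
for x in (0, 1):                        # X = 0 or 1, each with probability 1/2
    for y in (1, -1):                   # Y = 1 or -1, each with probability 1/2
        pmf_Z[x * y] += Fraction(1, 4)  # by independence, each pair has probability 1/4

for z, p in sorted(pmf_Z.items()):
    print(f"P(Z = {z:2d}) = {p}")       # P(Z=-1) = 1/4, P(Z=0) = 1/2, P(Z=1) = 1/4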

Cov(X,Z) = E(XZ) - E(X)E(Z)

Joint distribution of X and Z:

X \ Z    0      1      -1
0       0.5     0       0
1        0     0.25    0.25

From this table, E(XZ) = (1)(1)(0.25) + (1)(−1)(0.25) = 0, E(X) = 0.5, and E(Z) = (1)(0.25) + (−1)(0.25) = 0.

Hence

Cov(X,Z) = E(XZ) - E(X)E(Z) = 0 - (0.5)(0) = 0
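The same arithmetic can be verified with exact fractions; a minimal sketch, assuming the joint distribution of (X, Z) given above:

# Sketch: compute E(XZ), E(X), E(Z) and Cov(X,Z) from the joint pmf of (X, Z).
from fractions import Fraction

joint = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 4), (1, -1): Fraction(1, 4)}

E_XZ = sum(p * x * z for (x, z), p in joint.items())
E_X  = sum(p * x for (x, z), p in joint.items())
E_Z  = sum(p * z for (x, z), p in joint.items())

print(E_XZ - E_X * E_Z)   # Cov(X,Z) = 0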

Joint distribution of X and Z, with marginals:

X \ Z      0      1      -1     P(X = x)
0         0.5     0       0       0.5
1          0     0.25    0.25     0.5
P(Z = z)  0.5    0.25    0.25      1

P(X = 0, Z = 0) = 0.5

P(X = 0) = 0.5 and P(Z = 0) = 0.5

P(X = 0) * P(Z = 0) = 0.25

Since P(X = 0, Z = 0) ≠ P(X = 0) * P(Z = 0),

X and Z are dependent.
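A short sketch of the same comparison, using the joint distribution of (X, Z) from above:

# Sketch: compare the joint probability with the product of the marginals at (X, Z) = (0, 0).
from fractions import Fraction

joint = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 4), (1, -1): Fraction(1, 4)}

p_joint = joint[(0, 0)]                                  # P(X = 0, Z = 0) = 1/2
p_X0 = sum(p for (x, z), p in joint.items() if x == 0)   # P(X = 0) = 1/2
p_Z0 = sum(p for (x, z), p in joint.items() if z == 0)   # P(Z = 0) = 1/2

print(p_joint, p_X0 * p_Z0, p_joint != p_X0 * p_Z0)      # 1/2 1/4 True -> dependent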

