Question

Uncorrelated and Gaussian does not imply independent unless jointly Gaussian. Let X ∼ N(0, 1) and Y = WX, where p(W = −1) = p(W = 1) = 0.5. It is clear that X and Y are not independent, since Y is a function of X.

a. Show Y ∼ N(0, 1).
b. Show cov[X, Y] = 0.

Thus X and Y are uncorrelated but dependent, even though they are Gaussian. Hint: use the definition of covariance, cov[X, Y] = E[XY] − E[X] E[Y], and the rule of iterated expectation, E[XY] = E[E[XY | W]].
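A quick Monte Carlo sketch of this setup (assuming, as the hint and the answer below imply, that W is drawn independently of X; the sample size, seed, and variable names are arbitrary choices for illustration):

```python
# Empirical check: with X ~ N(0, 1) and Y = W * X for an independent sign W
# (P(W = 1) = P(W = -1) = 0.5), Y should look standard normal and the sample
# covariance of X and Y should be close to zero, even though Y is a
# deterministic function of (X, W).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)            # X ~ N(0, 1)
w = rng.choice([-1.0, 1.0], size=n)   # W = +/-1 with prob 0.5, independent of X
y = w * x                             # Y = W X

print("mean of Y:", y.mean())                     # ~ 0
print("var of Y:", y.var())                       # ~ 1
print("sample cov(X, Y):", np.cov(x, y)[0, 1])    # ~ 0

# Dependence is still visible: |Y| equals |X| exactly.
print("max |abs(Y) - abs(X)|:", np.max(np.abs(np.abs(y) - np.abs(x))))
```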

Homework Answers

Answer #1

For (b), the independence of X and W is used instead of the law of iterated expectation.
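A sketch of the argument this answer alludes to, assuming W is independent of X (which the exercise leaves implicit); it shows both the iterated-expectation route from the hint and the direct-independence route mentioned above:

```latex
% (a) Y ~ N(0, 1): condition on the sign W and use the symmetry of X.
\begin{align*}
P(Y \le y)
  &= P(Y \le y \mid W = 1)\,P(W = 1) + P(Y \le y \mid W = -1)\,P(W = -1) \\
  &= \tfrac{1}{2}\,P(X \le y) + \tfrac{1}{2}\,P(-X \le y)
   = \Phi(y) \qquad \text{since } -X \sim \mathcal{N}(0, 1).
\end{align*}

% (b) cov[X, Y] = 0: E[X] = E[Y] = 0, so only E[XY] is needed.
\begin{align*}
\mathrm{cov}[X, Y] = E[XY] - E[X]\,E[Y]
  &= E\bigl[E[XY \mid W]\bigr]          % rule of iterated expectation
   = E\bigl[W \, E[X^2 \mid W]\bigr] \\
  &= E[W] \, E[X^2] = 0 \cdot 1 = 0.    % uses W independent of X and E[W] = 0
\end{align*}
```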

Similar Questions
Calculate the quantity of interest please. a) Let X, Y be jointly continuous random variables generated as follows: Select X = x as a uniform random variable on [0,1]. Then, select Y as a Gaussian random variable with mean x and variance 1. Compute E[Y]. b) Let X, Y be jointly Gaussian, with mean E[X] = E[Y] = 0, variances Var[X] = 1, Var[Y] = 1 and covariance Cov[X, Y] = 0.4. Compute E[(X + 2Y)^2].
Let X and Y be jointly distributed random variables with means E(X) = 1, E(Y) = 0, variances Var(X) = 4, Var(Y) = 5, and covariance Cov(X, Y) = 2. Let U = 3X − Y + 2 and W = 2X + Y. Obtain the following: A) Var(U); B) Var(W); C) Cov(U, W). The answers should be 29, 29, and 21, but I need help showing how to solve.
Let random variables X and Y follow a bivariate Gaussian distribution, where X and Y are independent and Cov(X,Y) = 0. Show that Y|X ~ Normal(E[Y|X], V[Y|X]). What are E[Y|X] and V[Y|X]?
1. Let (X, Y) be a continuous random vector with joint probability density function f_{X,Y}(x, y) = k(x + y^2) if 0 < x < 1 and 0 < y < 1, and 0 otherwise. Find the following: I: The expectation of XY, E(XY). J: The covariance of X and Y, Cov(X, Y).
Show that if X ∼ N(0,1) and Y ∼ chi-squared with n degrees of freedom are independent, then X/sqrt(Y/n) is Student t with n degrees of freedom.
Let n be a positive integer and p and r two real numbers in the interval (0,1). Two random variables X and Y are defined on the same sample space. All we know about them is that X∼Geom(p) and Y∼Bin(n,r). (In particular, we do not know whether X and Y are independent.) For each expectation below, decide whether it can be calculated with this information, and if it can, give its value (in terms of p, n, and r)....
Let two independent random vectors x and z have Gaussian distributions: p(x) = N(x|µx,Σx) and p(z) = N(z|µz,Σz). Now consider y = x + z. Use the results for the Gaussian linear system to find the distribution p(y) of y. Hint: consider p(x) and p(y|x). Please prove it rather than directly giving the result.
Suppose that X, Y, and Z are independent, with E[X] = E[Y] = E[Z] = 2, and E[X^2] = E[Y^2] = E[Z^2] = 5. Find cov(XY, XZ). (Enter a numerical answer.) cov(XY, XZ) = Let X be a standard normal random variable. Another random variable is determined as follows. We flip a fair coin (independent from X). In case of Heads, we let Y = X. In case of Tails, we let Y = −X. Is Y normal? Justify your answer. yes / no / not enough information to determine. Compute Cov(X, Y). Cov(X, Y) = Are X and Y independent? yes / no / not...
Suppose that the joint probability density function of the random variables X and Y is f(x, y) = x + cy^2 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise. (a) Sketch the region of non-zero probability density and show that c = 3/2. (b) Find P(X + Y < 1), P(X + Y = 1) and P(X + Y > 1). (c) Compute the marginal density function of X and Y...
Suppose that X and Y are continuous and jointly distributed by f(x, y) = c(x + y)^2 on the triangular region defined by 0 ≤ y ≤ x ≤ 1. a. Find c so that we have a joint pdf. b. Find the marginal for X. c. Find the marginal for Y. d. Find E[X] and V[X]. e. Find E[Y] and V[Y]. f. Find E[XY]. g. Find cov(X, Y). h. Find the correlation coefficient for the two variables. i. Prove...