Question

Let X ∼ Unif[−1, 1]. Consider the functions g, h : [−1, 1] → [−1, 1] given by g(x) = 1 − x for x ∈ [0, 1] and g(x) = x for x ∈ [−1, 0), and h(x) = x for x ∈ [0, 1] and h(x) = −(x + 1) for x ∈ [−1, 0).

Then Y = g(X) and Z = h(X) are both uniform: Y, Z ∼ Unif[−1, 1].

(c) Prove that the random vectors (X, Y) and (X, Z) do not have the same joint distribution. This can be done by finding a subset B ⊂ R^2 such that P((X, Y) ∈ B) ≠ P((X, Z) ∈ B).
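
One way to approach (c), sketched below as a quick Monte Carlo check rather than a proof: pick a candidate set, say B = [1/2, 1] × [1/2, 1] (this particular B is an illustrative choice, not given in the question), and compare the two probabilities. On [0, 1] we have Y = 1 − X but Z = X, so X ∈ [1/2, 1] forces Y ∈ [0, 1/2] while Z = X ∈ [1/2, 1]; the exact probabilities are 0 and 1/4, and a simulation should reproduce them approximately.

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000_000)   # X ~ Unif[-1, 1]

# Y = g(X) and Z = h(X) as defined in the question
y = np.where(x >= 0, 1.0 - x, x)
z = np.where(x >= 0, x, -(x + 1.0))

# candidate set B = [1/2, 1] x [1/2, 1] -- an illustrative choice, not part of the question
in_b_xy = (x >= 0.5) & (y >= 0.5)
in_b_xz = (x >= 0.5) & (z >= 0.5)

print("P((X, Y) in B) ~", in_b_xy.mean())   # close to the exact value 0
print("P((X, Z) in B) ~", in_b_xz.mean())   # close to the exact value 1/4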

Similar Questions
Let X ∼ UNIF(0, 1). Find the pdf of Y = 1 − X using the...
Let X ∼ UNIF(0, 1). Find the pdf of Y = 1 − X using the distribution-function technique. Also indicate the support of Y.
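A brief sketch for the question above (just the standard CDF argument, not a full write-up): for 0 < y < 1, F_Y(y) = P(1 − X ≤ y) = P(X ≥ 1 − y) = 1 − (1 − y) = y, so f_Y(y) = F_Y'(y) = 1 on the support 0 < y < 1, i.e. Y ∼ Unif(0, 1) as well.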
Let X and Y have the joint PDF (I really just need g and h if that makes it easier)...
Let X and Y have the joint PDF (I really just need g and h if that makes it easier) f(x, y) = c(y + x^2) for 0 < x < 1 and 0 < y < 1; 0 elsewhere. a) Find c such that this is a PDF. b) What is P(X ≤ .4, Y ≤ .2)? c) Find the marginal distribution of X, f(x). d) Find the marginal distribution of Y, f(y). e) Are X and Y...
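A brief sketch for part a) of the question above (the remaining parts use the same integrals): ∫₀¹ ∫₀¹ (y + x^2) dy dx = 1/2 + 1/3 = 5/6, so c = 6/5; the marginals then come out as f(x) = ∫₀¹ (6/5)(y + x^2) dy = 3/5 + (6/5)x^2 for 0 < x < 1 and f(y) = ∫₀¹ (6/5)(y + x^2) dx = (6/5)y + 2/5 for 0 < y < 1.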
Let X_i ∼ Unif(0, 1) for 1 ≤ i ≤ n be IID (independent identically...
Let X_i ∼ Unif(0, 1) for 1 ≤ i ≤ n be IID (independent identically distributed) random variables. Let Y = max(X_1, …, X_n). What is E(Y)?
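A brief sketch for the question above: independence gives P(Y ≤ y) = P(X_1 ≤ y, …, X_n ≤ y) = y^n for y ∈ [0, 1], so f_Y(y) = n y^(n−1) and E(Y) = ∫₀¹ y · n y^(n−1) dy = n/(n + 1).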
Let X and Y be independent random variables, with X following uniform distribution in the interval...
Let X and Y be independent random variables, with X following a uniform distribution on the interval (0, 1) and Y having an Exp(1) distribution. a) Determine the joint distribution of Z = X + Y and Y. b) Determine the marginal distribution of Z. c) Can we say that Z and Y are independent?
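A brief sketch for the question above, via the change of variables (X, Y) → (Z, Y), which has unit Jacobian: the joint density is f(z, y) = f_X(z − y) f_Y(y) = e^(−y) for y > 0 and 0 < z − y < 1, and 0 otherwise; integrating out y gives f_Z(z) = 1 − e^(−z) for 0 < z < 1 and f_Z(z) = e^(−(z − 1)) − e^(−z) for z ≥ 1, and Z and Y are not independent because the support condition z − 1 < y < z couples them.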
Let f and g be continuous functions from C to C and let D be a...
Let f and g be continuous functions from C to C and let D be a dense subset of C, i.e., the closure of D equals C. Prove that if f(z) = g(z) for all z ∈ D, then f = g on C.
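A brief sketch for the question above: for any z ∈ C choose a sequence z_n ∈ D with z_n → z (possible since D is dense); continuity of f and g then gives f(z) = lim f(z_n) = lim g(z_n) = g(z), so f = g on all of C.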
Assume that X ∼ N(0, 1), Y ∼ N(0, 1), and X and Y are independent variables. Let Z =...
Assume that X ∼ N(0, 1), Y ∼ N(0, 1), and X and Y are independent variables. Let Z = X + Y, and the joint density of Y and Z is expressed as f(y, z) = g(z|y)·h(y), where g(z|y) is the conditional density of Z given Y = y and h(y) is the density of Y. How can I get f(y, z)?
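A brief sketch for the question above: given Y = y, Z = X + y with X ∼ N(0, 1) independent of Y, so g(z|y) is the N(y, 1) density (1/√(2π)) e^(−(z − y)^2/2) and h(y) = (1/√(2π)) e^(−y^2/2); multiplying them gives f(y, z) = (1/(2π)) e^(−((z − y)^2 + y^2)/2).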
Let two independent random vectors x and z have Gaussian distributions: p(x) = N(x | µ_x, Σ_x), and p(z)...
Let two independent random vectors x and z have Gaussian distributions: p(x) = N(x | µ_x, Σ_x) and p(z) = N(z | µ_z, Σ_z). Now consider y = x + z. Use the results for the Gaussian linear system to find the distribution p(y) of y. Hint: consider p(x) and p(y|x). Please prove it rather than directly giving the result.
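A brief outline for the question above (only an outline, not the full proof it asks for): with y = x + z and z independent of x, the conditional is p(y|x) = N(y | x + µ_z, Σ_z); combining this with p(x) = N(x | µ_x, Σ_x) through the standard marginalization result for Gaussian linear systems gives p(y) = N(y | µ_x + µ_z, Σ_x + Σ_z).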
1. Let f : R^2 → R^2, f(x, y) = ⟨g(x, y), h(x, y)⟩ where g, h : R^2 → R....
1. Let f : R^2 → R^2, f(x, y) = ⟨g(x, y), h(x, y)⟩ where g, h : R^2 → R. Show that f is continuous at p₀ ⇐⇒ both g, h are continuous at p₀.
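A brief sketch for the question above: ‖f(p) − f(p₀)‖^2 = |g(p) − g(p₀)|^2 + |h(p) − h(p₀)|^2, so each of |g(p) − g(p₀)| and |h(p) − h(p₀)| is at most ‖f(p) − f(p₀)‖, and conversely ‖f(p) − f(p₀)‖ ≤ |g(p) − g(p₀)| + |h(p) − h(p₀)|; either inequality lets an ε–δ argument pass between f and its component functions.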
Let X and Y be continuous random variables with joint distribution function F(x, y), and let...
Let X and Y be continuous random variables with joint distribution function F(x, y), and let g(X, Y) and h(X, Y) be functions of X and Y. Prove the following: (a) E[cg(X, Y)] = cE[g(X, Y)]. (b) E[g(X, Y) + h(X, Y)] = E[g(X, Y)] + E[h(X, Y)]. (c) Var(a + X) = Var(X). (d) Var(aX) = a^2 Var(X). (e) Var(aX + bY) = a...
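A brief sketch for parts (c) and (d) of the question above, which follow directly from the definition: Var(a + X) = E[(a + X − a − E[X])^2] = E[(X − E[X])^2] = Var(X), and Var(aX) = E[(aX − aE[X])^2] = a^2 E[(X − E[X])^2] = a^2 Var(X); parts (a) and (b) are linearity of expectation.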