Question

Let V be a vector space and let U1, U2 be two subspaces of V. Show that U1 ∩ U2 is a subspace of V. By giving an example, show that U1 ∪ U2 is in general not a subspace of V.
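The standard counterexample can be sanity-checked numerically: take V = R², U1 the x-axis, and U2 the y-axis. The sketch below (an illustration with our own helper names, not part of the question) shows that U1 ∪ U2 fails closure under addition:

```python
# Illustrative check: V = R^2, U1 = x-axis, U2 = y-axis.
# Both are subspaces, but their union is not closed under addition.

def in_u1(v):          # U1 = {(x, 0)}: the x-axis
    return v[1] == 0

def in_u2(v):          # U2 = {(0, y)}: the y-axis
    return v[0] == 0

def in_union(v):
    return in_u1(v) or in_u2(v)

u, w = (1, 0), (0, 1)           # u in U1, w in U2, so both lie in U1 ∪ U2
s = (u[0] + w[0], u[1] + w[1])  # s = u + w = (1, 1)

print(in_union(u), in_union(w))  # True True
print(in_union(s))               # False: (1, 1) is in neither axis
```

Since u and w belong to the union but u + w does not, U1 ∪ U2 is not a subspace; the intersection, by contrast, inherits closure from both U1 and U2.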

Similar Questions
Let U1, U2 be subspaces of a vector space V. Prove that the union of U1 and U2 is a subspace if and only if either U1 is a subset of U2 or U2 is a subset of U1.
5. Let U1, U2, U3 be subspaces of a vector space V. Prove that U1, U2, U3 are direct-summable if and only if (i) the intersection of U1 and U2 is {0}, and (ii) the intersection of U1 + U2 and U3 is {0}. A detailed explanation would be greatly appreciated :)
Let U and V be subspaces of the vector space W. Recall that U ∩ V is the set of all vectors v in W that are in both U and V, and that U ∪ V is the set of all vectors v in W that are in at least one of U or V. i: Prove: U ∩ V is a subspace of W. ii: Consider the statement: "U ∪ V is a subspace of W...
If v1 and v2 are linearly independent vectors in vector space V, and u1, u2, and u3 are each a linear combination of them, prove that {u1, u2, u3} is linearly dependent. Do NOT use the theorem which states, "If S = {v1, v2, ..., vn} is a basis for a vector space V, then every set containing more than n vectors in V is linearly dependent." Prove without...
Let U and W be subspaces of a finite dimensional vector space V such that U ∩ W = {0}. Define their sum U + W := {u + w | u ∈ U, w ∈ W}. (1) Prove that U + W is a subspace of V. (2) Let U = {u1, ..., ur} and W = {w1, ..., ws} be bases of U and W respectively. Prove that U ∪ W...
Let B = {u1, ..., un} be an orthonormal basis for an inner product space V and v be any vector in V. Prove that v = c1u1 + c2u2 + ... + cnun, where c1 = <v,u1>, c2 = <v,u2>, ..., cn = <v,un>.
Let S, U, and W be subspaces of a vector space V, where U ⊆ W. Show that U + (W ∩ S) = W ∩ (U + S)
9. Let S and T be two subspaces of some vector space V. (b) Define S + T to be the subset of V whose elements have the form (an element of S) + (an element of T). Prove that S + T is a subspace of V. (c) Suppose {v1, ..., vi} is a basis for the intersection S ∩ T. Extend this with {s1, ..., sj} to a basis for S, and...
Let V be an n-dimensional vector space. Let W1 and W2 be unequal subspaces of V, each of dimension n - 1. Prove that V = W1 + W2 and dim(W1 ∩ W2) = n - 2.
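The dimension claim in this question can be illustrated on a concrete case via the formula dim(W1 ∩ W2) = dim W1 + dim W2 − dim(W1 + W2). The sketch below (our own example with n = 4 and hand-picked bases; it checks one instance, it does not prove the general statement) computes both dimensions by rank:

```python
import numpy as np

# Concrete instance: V = R^4, and two distinct 3-dimensional subspaces
# given by row-spanning bases (B1, B2 are our illustrative choices).
B1 = np.array([[1, 0, 0, 0],
               [0, 1, 0, 0],
               [0, 0, 1, 0]], dtype=float)   # basis of W1
B2 = np.array([[1, 0, 0, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 1]], dtype=float)   # basis of W2

n = 4
# dim(W1 + W2) = rank of the stacked bases
dim_sum = np.linalg.matrix_rank(np.vstack([B1, B2]))
# dimension formula: dim(W1 ∩ W2) = dim W1 + dim W2 - dim(W1 + W2)
dim_int = 3 + 3 - dim_sum

print(dim_sum)  # 4 -> W1 + W2 = V
print(dim_int)  # 2 -> n - 2
```

Because W1 ≠ W2 and each has dimension n − 1, the sum must be strictly larger than either, forcing dim(W1 + W2) = n, and the dimension formula then gives dim(W1 ∩ W2) = n − 2.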
Let U and W be subspaces of a finite dimensional vector space V such that V = U ⊕ W. For any x ∈ V write x = u + w where u ∈ U and w ∈ W. Let R: U → U and S: W → W be linear transformations and define T: V → V by Tx = Ru + Sw. Show that det T = (det R)(det S).
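In a basis adapted to the splitting V = U ⊕ W, the matrix of T is block diagonal with blocks R and S, and the determinant claim becomes the block-diagonal determinant identity. A quick numerical check (our own 2×2 example matrices, chosen arbitrarily):

```python
import numpy as np

# Concrete splitting with dim U = dim W = 2; R and S are illustrative.
R = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # R : U -> U
S = np.array([[1.0, 4.0],
              [2.0, 1.0]])   # S : W -> W

# In an adapted basis, T(u + w) = Ru + Sw is block diagonal:
T = np.block([[R, np.zeros((2, 2))],
              [np.zeros((2, 2)), S]])

print(np.linalg.det(T))                       # equals det(R) * det(S)
print(np.linalg.det(R) * np.linalg.det(S))
```

This checks one instance; the actual proof expands det T along the adapted basis (or uses the block-triangular determinant lemma) rather than relying on numerics.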