Given vectors v1, ..., vk in F^n, by reducing the matrix M with v1, ..., vk as its rows to its reduced row echelon form M̃, we can obtain a basis B of Span({v1, ..., vk}) consisting of the nonzero rows of M̃. In general, however, the vectors in B are not in {v1, ..., vk}. To find a basis of Span({v1, ..., vk}) from inside the original spanning set {v1, ..., vk}, we can use the result of Problem 2 above. Describe explicitly and precisely such an algorithm for finding a basis of Span({v1, ..., vk}) out of {v1, ..., vk}, and justify your answer.
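The referenced "Problem 2" is not included here, but the standard result of this kind states that the pivot columns of a matrix's RREF identify which of the matrix's original columns form a basis of its column space. Assuming that is the intended result, the algorithm can be sketched as follows: place v1, ..., vk as the *columns* of a matrix, row-reduce, and keep exactly the original vectors sitting in the pivot columns. A minimal sketch using sympy (the function name and example vectors are illustrative, not from the original problem):

```python
from sympy import Matrix

def basis_from_spanning_set(vectors):
    # Place the vectors as COLUMNS of a matrix. Row reduction preserves the
    # linear relations among columns, so the pivot columns of the RREF mark
    # which ORIGINAL vectors form a basis of Span({v1, ..., vk}).
    M = Matrix([list(v) for v in vectors]).T
    _, pivot_cols = M.rref()
    return [vectors[j] for j in pivot_cols]

# Example: v3 = v1 + v2, so the algorithm should keep v1 and v2.
vs = [(1, 0, 1), (0, 1, 1), (1, 1, 2)]
print(basis_from_spanning_set(vs))  # [(1, 0, 1), (0, 1, 1)]
```

Note the contrast with the row-based method in the problem statement: row-reducing with the vi as rows yields a basis of nonzero RREF rows, which are generally new vectors, whereas the column-based method returns a subset of the original spanning set.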