Assume A is an invertible matrix.
1. Prove that 0 is not an eigenvalue of A.
2. Assume λ is an eigenvalue of A. Show that λ^(-1) is an eigenvalue of A^(-1).
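A sketch of the standard argument, using only the definition of an eigenpair (λ, v) with v ≠ 0:

```latex
\textbf{1.} Suppose, for contradiction, that $0$ is an eigenvalue of $A$.
Then there is a vector $v \neq 0$ with
\[
  Av = 0 \cdot v = 0 .
\]
Since $A$ is invertible, applying $A^{-1}$ gives
\[
  v = A^{-1}(Av) = A^{-1} 0 = 0 ,
\]
contradicting $v \neq 0$. Hence $0$ is not an eigenvalue of $A$.

\textbf{2.} Let $\lambda$ be an eigenvalue of $A$ with eigenvector $v \neq 0$,
so $Av = \lambda v$. By part 1, $\lambda \neq 0$. Applying $A^{-1}$ to both
sides,
\[
  v = A^{-1}(\lambda v) = \lambda \, A^{-1} v ,
\]
and dividing by $\lambda$ yields
\[
  A^{-1} v = \lambda^{-1} v .
\]
Since $v \neq 0$, this shows $\lambda^{-1}$ is an eigenvalue of $A^{-1}$,
with the same eigenvector $v$.
```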