I'm not sure how to prove that, and since it doesn't say the matrix has n distinct eigenvalues, I don't know how to show there are n linearly independent eigenvectors.
You're not always guaranteed that the algebraic multiplicity of an eigenvalue (its multiplicity as a root of the characteristic polynomial) equals its geometric multiplicity (the dimension of the corresponding eigenspace). However, each distinct eigenvalue is guaranteed to give you at least one eigenvector, since det(A - λI) = 0 forces A - λI to be singular.
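To spell out that last step, here is a short chain of equivalences (standard linear algebra, not taken from the thread) showing why an eigenvalue always comes with at least one eigenvector:

```latex
\lambda \text{ is an eigenvalue of } A
  \iff \det(A - \lambda I) = 0
  \iff A - \lambda I \text{ is singular}
  \iff \exists\, v \neq 0 :\ (A - \lambda I)v = 0
  \iff \exists\, v \neq 0 :\ Av = \lambda v .
```

So the geometric multiplicity of every eigenvalue is at least 1, even when it falls short of the algebraic multiplicity.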
Question: How does this prove the needed result?
I found a better link! Check this book out on Google books. You're interested in sections 1.3 and 1.4.
The author is making an assumption that A is n x n and has n linearly independent eigenvectors. With that assumption, he appears to get the result you want.
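Under that assumption the result can be checked numerically: if A = P D P⁻¹, then transposing gives Aᵀ = (P⁻¹)ᵀ D Pᵀ, so Aᵀ is diagonalizable with the same eigenvalues and eigenvector matrix (P⁻¹)ᵀ. A minimal sketch with a hypothetical 2×2 example (the matrices A, P, D below are my own illustration, not from the book):

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    """Transpose of a 2x2 matrix."""
    return [[X[j][i] for j in range(2)] for i in range(2)]

def inv(X):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b = X[0]
    c, d = X[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [0, 3]]   # eigenvalues 2 and 3
P = [[1, 1], [0, 1]]   # columns are eigenvectors of A
D = [[2, 0], [0, 3]]   # diagonal matrix of eigenvalues

# Sanity check: A = P D P^{-1}
assert matmul(matmul(P, D), inv(P)) == A

# A^T = Q D Q^{-1} where Q = (P^{-1})^T and Q^{-1} = P^T,
# so A^T is diagonalized by the same D.
Q = transpose(inv(P))
assert matmul(matmul(Q, D), transpose(P)) == transpose(A)
```

The key algebraic fact is that (Pᵀ)⁻¹ = (P⁻¹)ᵀ, so transposing a diagonalization of A immediately produces a diagonalization of Aᵀ.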
Also, check this out for a proof, though a highly unreadable one: you're going to have to wade through an ASCII text proof!
I think between the two of these, you can get what you need.
Thank you for the links; they were helpful, and I understand the concept better now. I was searching earlier about eigenvectors of transposes, and apparently someone has asked the same question before, so I have this link to go off of as well: Linear Algebra: Diagonalization, Transpose, and Distinct Eigenvectors.