Suppose A^T is the transpose of A. Is a basis for Col(A) always a basis for Row(A^T)?
I need to somehow prove or disprove that a basis for Col(A) is always a basis for Row(A^T), so I was wondering if there is a theorem or a definition that could help me here. I know that if a set of vectors is linearly independent, it does not matter if I transpose them; they would still be linearly independent, so the rank of A would also equal the rank of A^T. Since the columns of A are the rows of A^T, finding a basis for Col(A) is equivalent to finding a basis for Row(A^T). Is there any case where this would not hold? I guess that is the question I am asking.
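Not a proof, but here is a quick numerical sanity check of the rank claim in numpy (the matrix A below is just an arbitrary example, not from the problem):

```python
# Sanity check: row rank = column rank, so rank(A) == rank(A^T).
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])  # arbitrary 2x3 example with independent rows

rank_A = np.linalg.matrix_rank(A)
rank_AT = np.linalg.matrix_rank(A.T)
print(rank_A, rank_AT)  # prints: 2 2
```

This only checks one matrix, of course; the general statement is the rank theorem.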
Thanks,
KC
Well I guess the question is a little vague.
What I am saying is: clearly any basis for Col(A) will be a basis for Row(A^T), and the other way around too. These are exactly the same thing, Row(A^T) = Col(A), so if you look at the same space on each side there is no way they are not the same.
But there is one concern. There are lots of different bases you can pick for Col(A), which also means there are lots of different bases that can be picked for Row(A^T). So you could certainly pick a different basis for each of the two spaces, but each of those would still be a basis for the other space. Do you know what I mean?
Just take the matrix A to be the identity matrix on R^2. It is symmetric, so A = A^T.
The vectors <1,0> and <0,1> form a basis for Col(A)
The vectors <2,0> and <0,2> form a basis for Row(A^T)
These two sets are clearly not the same, but each is still a basis for the other space, because the two spaces are the same. Know what I mean?
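The point above can be sketched in numpy: both sets span the same space R^2 = Col(A) = Row(A^T), so any vector has coordinates in either basis (the vector (3, 5) is just an arbitrary choice):

```python
# Two different bases for the same space R^2.
import numpy as np

B1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])  # columns are <1,0>, <0,1>
B2 = np.array([[2.0, 0.0],
               [0.0, 2.0]])  # columns are <2,0>, <0,2>

# Each matrix has full rank, so each column set spans R^2.
assert np.linalg.matrix_rank(B1) == 2
assert np.linalg.matrix_rank(B2) == 2

# Any vector, e.g. (3, 5), has coordinates in either basis:
v = np.array([3.0, 5.0])
print(np.linalg.solve(B1, v))  # [3.  5. ]
print(np.linalg.solve(B2, v))  # [1.5 2.5]
```

Same vector, same space, two different coordinate representations.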
Basically, what I am getting at is that your question is not well posed. You say *the* basis for something, but there are lots of choices of basis.
I don't understand what you mean by being able to pick bases. I thought a basis for Row(A) is the set of non-zero rows of the echelon form (or reduced row echelon form), and a basis for Col(A) is the set of pivot columns of A. I think I am missing something.
Let's just talk about Col(A); I think your trouble is understanding what a basis is.
The number of pivot positions is how you determine the RANK of A. It tells you the dimension of Col(A) as a vector space, which is the NUMBER of elements in your basis.
DEFINITION: A basis is an ordered set of linearly independent elements which span the space.
With this definition come many things. In a vector space, every element can be uniquely represented as a linear combination of the basis. The trouble is that there are uncountably many choices of basis for a vector space (of dimension at least 1).
Take R^3:
Standard basis
(1,0,0)
(0,1,0)
(0,0,1)
A DIFFERENT BASIS FOR R^3
(0,2,0)
(9,0,0)
(0,0,c), where c is any nonzero number
One choice is not THE basis, while most people pick the standard basis there is no reason one should discount any other possible choice of basis.
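To check that the second set really is a basis, stack the vectors and verify the determinant is nonzero (three independent vectors in R^3 form a basis). The 4.0 in the last slot is an arbitrary stand-in for the nonzero entry c:

```python
# Nonzero determinant <=> the three rows are linearly independent
# <=> they form a basis of R^3.
import numpy as np

M = np.array([[0.0, 2.0, 0.0],
              [9.0, 0.0, 0.0],
              [0.0, 0.0, 4.0]])  # 4.0 stands in for any nonzero c

print(np.linalg.det(M))  # nonzero, so this set is a basis
```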
What is a standard matrix? Do you mean the identity matrix? That was just an example, since it is already in RREF. A matrix is a representation of a linear map. Would you talk about a basis for f if f were a polynomial?
A basis is for a vector space: you can find a basis for the domain or the range, but not for a function (unless you are talking about the dual space of a vector space, which is the collection of all linear functionals; then you can in fact pick a basis of functions for the dual space, but only because it is a vector space in its own right).
Counting the pivot positions simply tells you the rank of the matrix, which is the dimension of the image of the map. If you count 4 pivot positions in A, it means Col(A) has dimension 4, and so a basis will need to consist of precisely 4 linearly independent vectors which span Col(A). That is all it tells you.
It is up to you to pick the basis; there are uncountably many choices.
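The "pivot columns give one basis for Col(A)" recipe can be sketched with sympy's `rref`, which reports the pivot column indices; the corresponding columns of the *original* A are one (of uncountably many) basis choices. The matrix below is an arbitrary example:

```python
# One standard basis choice for Col(A): the pivot columns of A itself.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])  # third column = first + second, so rank 2

rref_form, pivot_cols = A.rref()       # rref() returns (RREF, pivot indices)
basis = [A.col(j) for j in pivot_cols]  # columns of the ORIGINAL A

print(pivot_cols)   # pivot column indices
print(len(basis))   # = rank = dim Col(A)
```

Note you take the pivot columns of A, not of the RREF; row reduction changes the column space but not which columns are pivots.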