# Thread: simple way to find basis for Ker(A-transpose) and Im(A-transpose)

1. ## simple way to find basis for Ker(A-transpose) and Im(A-transpose)

I'm pretty sure there have to be easier ways of finding bases for those two spaces than taking the transpose of a given matrix A and row reducing it again (after row reducing the original A to find Im(A) and Ker(A)). Does anyone know one? Help would be appreciated. Thanks!

2. If the matrix isn't square, then you have to find the Ker and Im of the transpose separately each time; there's no shortcut from the reduction of A itself.

3. phoo
and if it is square?

4. If it is square, the same vectors should span them. I'm almost positive about this, but I could be wrong; I can't think of a counterexample yet.

5. I'm pretty sure there have to be easier ways of finding bases for those two spaces than taking the transpose of a given matrix A and row reducing it again (after row reducing the original A to find Im(A) and Ker(A)). Does anyone know one? Help would be appreciated. Thanks!
If you're talking about $A:\mathbb{R}^n\to\mathbb{R}^m$ then $\ker(A)=\left(\text{im}(A^{\top})\right)^{\perp}$ (with the usual inner product) and thus consequently $\ker(A^{\top})=\left(\text{im}(A)\right)^{\perp}$.
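(Editorial aside: this complement relation is easy to convince yourself of numerically. A sketch in numpy, with a made-up rectangular matrix just for illustration: the left-singular vectors of A beyond its rank span ker(Aᵀ), and every one of them is orthogonal to every column of A.)

```python
import numpy as np

# Hypothetical example matrix, chosen only to illustrate ker(A^T) = (im A)^perp.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 3.0]])

# The left-singular vectors of A past rank(A) span ker(A^T).
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
ker_At = U[:, rank:]          # columns span ker(A^T)

# Each such vector is orthogonal to every column of A, i.e. to im(A).
print(np.allclose(A.T @ ker_At, 0))  # True
```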

6. Originally Posted by Drexel28
If you're talking about $A:\mathbb{R}^n\to\mathbb{R}^m$ then $\ker(A)=\left(\text{im}(A^{\top})\right)^{\perp}$ (with the usual inner product) and thus consequently $\ker(A^{\top})=\left(\text{im}(A)\right)^{\perp}$.
Hmm, for my problem I was given a 4x4 matrix A and had to find the orthogonal projections onto Ker(A-transpose) and Im(A-transpose),
so I thought I'd have to find bases for them to compute the projections.
I'm not sure how to use the equality you brought up to do that.

7. Can someone please help? How do you find the orthogonal projections onto the kernel and image of A-transpose?

8. nobody?

9. Post your 4x4 matrix A with what you have done.

10. $\displaystyle
\begin{bmatrix}
1 & 1 & 0 & 1\\
1 & 0 & 3 & -1\\
1 & 0 & 1 & -1\\
1 & 2 & 0 & 3
\end{bmatrix}$
I found a basis for Im(A) to be the first 3 columns of A and a basis for Ker(A) to be the vector [-3 0 2 3].
I now have to find the orthogonal projections onto the kernel and image of A-transpose.
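(Editorial aside: one standard way to build these projections, a sketch rather than necessarily what the course intends, is the formula P = B(BᵀB)⁻¹Bᵀ for a matrix B whose columns are a basis of the subspace, combined with the complement relation from post 5: the projector onto ker(Aᵀ) is I minus the projector onto im(A). For this particular A, rank(A) = 3, the first three rows of A are independent and span im(Aᵀ), and the first three columns span im(A).)

```python
import numpy as np

A = np.array([[1., 1., 0., 1.],
              [1., 0., 3., -1.],
              [1., 0., 1., -1.],
              [1., 2., 0., 3.]])

def proj(B):
    # Orthogonal projector onto the column space of B
    # (assumes the columns of B are linearly independent).
    return B @ np.linalg.inv(B.T @ B) @ B.T

# im(A^T) is the row space of A; the first three rows are independent
# here (rank(A) = 3), so they give a basis.
P_im_At = proj(A.T[:, :3])

# ker(A^T) = (im A)^perp, so its projector is I minus the projector
# onto im(A); the first three columns of A span im(A).
P_ker_At = np.eye(4) - proj(A[:, :3])

print(np.allclose(P_im_At, P_im_At.T))          # True: symmetric
print(np.allclose(P_im_At @ P_im_At, P_im_At))  # True: idempotent
print(np.allclose(P_ker_At @ A, 0))             # True: kills im(A)
```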

Another question, not related to my original one:
it says to find an orthonormal basis for Im(A).
Would you just use the Gram-Schmidt process? I'm given a hint pointing out that two of the vectors in the basis I found for Im(A) are already orthogonal, but I don't know what to do with it.

11. Yes, use the GS process.
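(Editorial aside: a minimal classical Gram-Schmidt sketch in numpy, applied to the first three columns of A. Note that columns 2 and 3 of A have dot product zero, which is presumably what the hint refers to: listing those two first means one projection-subtraction step is a no-op.)

```python
import numpy as np

def gram_schmidt(vectors):
    # Classical Gram-Schmidt: returns an orthonormal list
    # spanning the same space as the input vectors.
    basis = []
    for v in vectors:
        w = v - sum((v @ q) * q for q in basis)  # subtract projections
        if np.linalg.norm(w) > 1e-10:            # skip dependent vectors
            basis.append(w / np.linalg.norm(w))
    return basis

A = np.array([[1., 1., 0., 1.],
              [1., 0., 3., -1.],
              [1., 0., 1., -1.],
              [1., 2., 0., 3.]])

print(A[:, 1] @ A[:, 2])   # 0.0 -- columns 2 and 3 are already orthogonal

cols = [A[:, j] for j in range(3)]   # basis for im(A): first three columns
Q = np.column_stack(gram_schmidt(cols))
print(np.allclose(Q.T @ Q, np.eye(3)))  # True: columns are orthonormal
```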

12. $\displaystyle
\begin{bmatrix}
1 & 1 & 0 & 1\\
1 & 0 & 3 & -1\\
1 & 0 & 1 & -1\\
1 & 2 & 0 & 3
\end{bmatrix}\rightarrow \ \mbox{rref}=
\begin{bmatrix}
1 & 0 & 0 & -1\\
0 & 1 & 0 & 2\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 0
\end{bmatrix}\rightarrow \ x_1=x_4, \ x_2=-2x_4, \ x_3=0, \ x_4 \ \text{free}$

Basis for the Kernel: $\displaystyle \ \ \ x_4\begin{bmatrix}
1\\
-2\\
0 \\
1
\end{bmatrix}$
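(Editorial aside: a quick numerical check that this kernel vector is right, and that the vector proposed in post 10 is not:)

```python
import numpy as np

A = np.array([[1, 1, 0, 1],
              [1, 0, 3, -1],
              [1, 0, 1, -1],
              [1, 2, 0, 3]])

v = np.array([1, -2, 0, 1])   # kernel vector read off from the rref
print(A @ v)                  # [0 0 0 0]

u = np.array([-3, 0, 2, 3])   # the vector proposed in post 10
print(A @ u)                  # [ 0  0 -4  6] -- not in the kernel
```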

13. Is the only way to find the orthogonal projections onto the kernel and image of A-transpose to take A-transpose and row reduce it all over again?

14. Not sure, but try that and see what you get.