How would you guys compute $\displaystyle A^k$ for $\displaystyle A=\left( \begin{matrix}
1 & 0 & 0 \\
1 & 0 & 0 \\
0 & 1 & 0
\end{matrix} \right).$
$\displaystyle A=\left( \begin{matrix}
1 & 0 & 0 \\
1 & 0 & 0 \\
0 & 1 & 0
\end{matrix} \right)= \left( \begin{matrix}
1 & 0 & 0 \\
0& 0 & 0 \\
0 & 0 & 0
\end{matrix} \right)+\left( \begin{matrix}
0& 0 & 0 \\
1 & 0 & 0 \\
0 & 1 & 0
\end{matrix} \right)$
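You can sanity-check the decomposition (and the commuting question) numerically; a quick sketch with numpy, where `D` and `N` are just my names for the two summands:

```python
import numpy as np

A = np.array([[1, 0, 0],
              [1, 0, 0],
              [0, 1, 0]])
D = np.diag([1, 0, 0])   # the diagonal summand
N = A - D                # the strictly lower-triangular summand

assert (A == D + N).all()                          # decomposition holds
assert (np.linalg.matrix_power(N, 3) == 0).all()   # N is nilpotent: N^3 = 0
print((D @ N == N @ D).all())                      # do the pieces commute?
```

(The last line prints False here: $\displaystyle DN = 0$ but $\displaystyle ND \neq 0$, so any expansion of $\displaystyle (D+N)^k$ has to keep the factors in order.)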
Now expand the $\displaystyle k$-th power term by term. (Careful, though: the binomial theorem requires the two summands to commute, and these two don't — the diagonal piece times the nilpotent piece is zero, but the product in the other order is not — so keep track of the order of the factors when expanding.) Notice the first one is diagonal, so raising it to a power just raises each diagonal entry to that power, i.e. it stays the same. The second one is nilpotent and goes away once you raise it to the third power.
This is still a hidden form of induction. One can directly show that $\displaystyle A^3 = A^4$, and in general (here's the hidden induction) $\displaystyle A^n = A^3$ for all $\displaystyle n \geq 3$, since each case is obtained from the previous one by multiplying by $\displaystyle A$.
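The stabilization is easy to check by direct multiplication; a minimal numpy sketch:

```python
import numpy as np

A = np.array([[1, 0, 0],
              [1, 0, 0],
              [0, 1, 0]])

A3 = np.linalg.matrix_power(A, 3)
# once the power stops changing, it stays fixed: A^n = A^3 for all n >= 3
for n in range(4, 10):
    assert (np.linalg.matrix_power(A, n) == A3).all()
print(A3)   # every row is [1, 0, 0]
```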
Nevertheless, I think this is what the OP must do, since otherwise it's hard to see how they expect him/her to do it.
Tonio
When you are asked to prove something for all natural numbers, it is difficult to avoid using induction, especially when you guys are this paranoid about "hidden induction." If I told you to prove that $\displaystyle 1^k=1$ for all positive integers $\displaystyle k$, you couldn't really prove it formally without using induction, even though it is obvious, just like this one.
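For what it's worth, that induction takes two lines in the same notation:

$\displaystyle 1^1 = 1, \qquad 1^{k+1} = 1^k \cdot 1 = 1 \cdot 1 = 1,$

and the matrix version has the same shape: if $\displaystyle A^n = A^3$, then $\displaystyle A^{n+1} = A^n A = A^3 A = A^4 = A^3$.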
My point is that you cannot directly prove something for an infinite number of cases unless you have infinite time to write down each one:
n=1
n=2
n=3
n=4
n=5
...
Your teacher is just going to have to get over it.