Let $\displaystyle T_1, T_2, \dots, T_n$ be linear operators on a vector space $\displaystyle V$.
Let $\displaystyle v\in V$ be such that one of $\displaystyle T_1(v),T_2(v),\dots,T_n(v)$ is $\displaystyle 0$.
Is it ok to say $\displaystyle T_1T_2 \cdots T_n(V)=0$?
i think you meant $\displaystyle T_1T_2 \cdots T_n(v)=0$ not $\displaystyle T_1T_2 \cdots T_n(V)=0$? if so, the answer is no. for example, let $\displaystyle \{v_1,v_2 \}$ be a basis for $\displaystyle V$ and define $\displaystyle T_1(v_1)=0, \ T_1(v_2)=v_1, \ T_2(v_1)=v_2, \ T_2(v_2)=v_1.$
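This counterexample is easy to check in coordinates. A quick sketch in numpy, taking $\displaystyle v_1, v_2$ to be the standard basis vectors (the matrix names are my own):

```python
import numpy as np

# In the basis {v1, v2}: T1(v1) = 0, T1(v2) = v1 and T2(v1) = v2, T2(v2) = v1,
# so the columns of each matrix are the images of v1 and v2.
T1 = np.array([[0, 1],
               [0, 0]])
T2 = np.array([[0, 1],
               [1, 0]])
v1 = np.array([1, 0])

# T1 kills v1 ...
assert np.array_equal(T1 @ v1, np.zeros(2))
# ... yet the composition does not: (T1 T2)(v1) = T1(v2) = v1, which is nonzero.
assert np.array_equal(T1 @ T2 @ v1, v1)
```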
here's a much less trivial question: suppose that for every $\displaystyle v \in V$ there exists $\displaystyle 1 \leq i \leq n$ such that $\displaystyle T_i(v)=0.$ would we necessarily have $\displaystyle T_1T_2 \cdots T_n = 0$?
Yes, I meant $\displaystyle T_1T_2 \cdots T_n(v)=0$.
My book says the following:
If $\displaystyle v$ is an eigenvector, then one of
$\displaystyle (T-c_1I), (T-c_2I), \dots, (T-c_kI)$ sends $\displaystyle v$ to $\displaystyle 0$,
where $\displaystyle c_1,c_2,\dots,c_k$ are the eigenvalues.
I understand up to this point.
Then it says, hence
$\displaystyle [(T-c_1I)(T-c_2I) \cdots (T-c_kI)](v) = 0.$
This is where I got confused and asked the question above.
And let me think about your poser for a while, please. Thanks again!
well, this is very different from your original question! suppose $\displaystyle Tv=c_j v.$ then $\displaystyle (T-c_kI)(v)=(c_j - c_k)v$ and so $\displaystyle (T-c_1I) \cdots (T-c_kI)(v)=(c_j-c_k)(T-c_1I) \cdots (T-c_{k-1}I)(v) = \cdots $
i'm sure you can finish the proof now!
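to illustrate the finished argument numerically: for a diagonalizable $\displaystyle T$ with eigenvalues $\displaystyle c_1, c_2$, the product $\displaystyle (T-c_1I)(T-c_2I)$ annihilates every eigenvector (here in fact the whole space). the example matrix below is my own choice:

```python
import numpy as np

# T is upper triangular, so its eigenvalues c1 = 2 and c2 = 3 sit on the diagonal.
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])
I = np.eye(2)

P = (T - 2*I) @ (T - 3*I)   # (T - c1 I)(T - c2 I)

# The product is the zero operator, so in particular it kills every eigenvector.
assert np.allclose(P, np.zeros((2, 2)))
```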
First, thanks for your question. Here is my attempt (I hope I got it).
Ans: Yes, $\displaystyle T_1T_2 \cdots T_n = 0$, since I claim one of the $\displaystyle T_i = 0$.
Rephrasing your question:
Let $\displaystyle K_i$ be the kernel of $\displaystyle T_i$.
You tell me $\displaystyle \bigcup_{i=1}^n K_i = V$.
I claim this is possible only if one of the $\displaystyle K_i = V$, and hence the corresponding $\displaystyle T_i = 0$.
If I show you that whenever $\displaystyle K_i \neq V$ for all $\displaystyle i$, we have $\displaystyle \bigcup_{i=1}^n K_i \neq V$, I will be done. Correct?
I will prove this by induction on $\displaystyle n$. The cases $\displaystyle n=1$ and $\displaystyle n=2$ are trivial, so I omit them.
Suppose the above is true for $\displaystyle n$ kernels. I claim it is true for $\displaystyle n+1$, for otherwise we have a situation where
$\displaystyle \bigcup_{i=1}^n K_i \neq V$ but $\displaystyle (\bigcup_{i=1}^n K_i)\cup K_{n+1} = V$. Note also that $\displaystyle K_i \neq V$ for all $\displaystyle i$.
Under the above situation, I can find vectors
$\displaystyle a,b \in V$ such that $\displaystyle a \in K_{n+1}$ but $\displaystyle a \notin \bigcup_{i=1}^n K_i$. Similarly, $\displaystyle b \notin K_{n+1}$ but $\displaystyle b \in \bigcup_{i=1}^n K_i$.
Consider $\displaystyle b, a+b, a+2b, \dots, a+(n-1)b$. No two of these can lie in the same $\displaystyle K_i$ for $\displaystyle i \in [1,n]$, as otherwise we would have $\displaystyle a \in K_i$, contrary to our choice of $\displaystyle a$.
Therefore consider $\displaystyle a+nb \in V$. But $\displaystyle a+nb \notin (\bigcup_{i=1}^n K_i)\cup K_{n+1}$.
A contradiction, and hence we are done.
@NonCommAlg: Please let me know if I am correct.
Thanks!
Yes - I meant $\displaystyle b, b+a, b+2a, \dots, b+(n-1)a$. Sorry about that.
Any vector of the form $\displaystyle b+ka$ cannot belong to $\displaystyle K_{n+1}$: if it did, then $\displaystyle (b+ka) - ka = b \in K_{n+1}$, which is a contradiction. (Note: $\displaystyle a \in K_{n+1}$ implies $\displaystyle ka \in K_{n+1}$.)
So any vector of the form $\displaystyle b+ka$ lies in $\displaystyle \bigcup_{i=1}^n K_i$.
Also, $\displaystyle b+k_1a$ and $\displaystyle b+k_2a$ for two different $\displaystyle k_1,k_2$ cannot belong to the same $\displaystyle K_i$, $\displaystyle i \in [1,n]$: if they did, then $\displaystyle (b+k_1a) - (b+k_2a) = (k_1-k_2)a \in K_i$, which again is a contradiction, as we have assumed $\displaystyle a \notin \bigcup_{i=1}^n K_i$.
Now consider $\displaystyle b+na$. (Sorry for the typo here in the earlier post.)
By the pigeonhole principle, $\displaystyle b+na$ cannot belong to any of the $\displaystyle K_i$, $\displaystyle i \in [1,n]$: the $\displaystyle n$ vectors $\displaystyle b, b+a, \dots, b+(n-1)a$ already occupy all $\displaystyle n$ of these kernels, one each.
Also, $\displaystyle b+na \notin K_{n+1}$, as it is of the form $\displaystyle b+ka$.
Thus $\displaystyle b+na \notin (\bigcup_{i=1}^n K_i)\cup K_{n+1}$,
and so $\displaystyle (\bigcup_{i=1}^n K_i)\cup K_{n+1} \neq V$.
it's correct now. good work! what you're saying is that for every $\displaystyle 0 \leq i \leq n-1$ there exists a "unique" $\displaystyle 1 \leq j \leq n$ such that $\displaystyle b + ia \in K_j.$ now $\displaystyle b + na \notin K_{n+1}$ because $\displaystyle a \in K_{n+1}$ and $\displaystyle b \notin K_{n+1}.$
also $\displaystyle b + na \notin K_j,$ for all $\displaystyle 1 \leq j \leq n,$ because we already know that $\displaystyle b + ia \in K_j$ for some $\displaystyle 0 \leq i \leq n-1.$ so we'd have $\displaystyle (n-i)a \in K_j$ and hence $\displaystyle a \in K_j,$ which is false.
Remark: as you might have noticed, your proof works only for vector spaces over fields of characteristic zero (the scalars $\displaystyle 1, \dots, n$ must be distinct and the differences $\displaystyle k_1 - k_2$ nonzero). this proof can be modified to work for all infinite fields. it is an interesting question to see if there's any counterexample for vector spaces over "finite" fields!
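The covering hypothesis can certainly hold over a finite field without any $\displaystyle T_i = 0$: for example, $\displaystyle \mathbb{F}_2^2$ is the union of its three one-dimensional subspaces, so the induction above genuinely breaks there. A small check in numpy, working mod 2 (the operators are my own choice; note this particular example is not a counterexample to the product question, since here $\displaystyle T_1T_2T_3 = 0$ as well):

```python
import numpy as np

# Over F_2, take V = F_2^2 and three operators whose kernels are the
# three distinct lines span{(1,0)}, span{(0,1)}, span{(1,1)}.
T1 = np.array([[0, 0], [0, 1]])  # kernel span{(1,0)}
T2 = np.array([[1, 0], [0, 0]])  # kernel span{(0,1)}
T3 = np.array([[1, 1], [0, 0]])  # kernel span{(1,1)}, arithmetic mod 2

V = [np.array([x, y]) for x in (0, 1) for y in (0, 1)]  # all 4 vectors of F_2^2

# Every v in V is killed by some T_i, yet no T_i is the zero operator:
# the union of proper kernels covers V over this finite field.
for v in V:
    assert any(not ((T @ v) % 2).any() for T in (T1, T2, T3))
for T in (T1, T2, T3):
    assert (T % 2).any()

# In this example, however, the full product still vanishes mod 2.
assert not ((T1 @ T2 @ T3) % 2).any()
```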