# Linear Alg -- Independent Functions?

• May 31st 2009, 05:21 PM
paupsers
Linear Alg -- Independent Functions?
Came across this problem during my studies... Here it is:

Let $f, g \in F(\mathbb{R}, \mathbb{R})$ be the functions defined by $f(t)=e^{rt}$ and $g(t)=e^{st}$, where $r \neq s$. Prove that $f$ and $g$ are linearly independent in $F(\mathbb{R}, \mathbb{R})$.

However, I'm confused about what I can allow $t$ to be, because if $t=0$ then $e^{st}$ and $e^{rt}$ both equal 1.

Thus, I can form the equation $ae^{rt}+be^{st}=0$.
If $t=0$, then letting $a=1$ and $b=-1$ makes the equation true. Thus, aren't the functions linearly dependent? (Thinking)

EDIT: Forgot to include another problem I'm having trouble with!

Let u, v, and w be distinct vectors of a vector space V. Show that if {u, v, w} is a basis for V, then {u+v+w, v+w, w} is also a basis for V.

I have no idea how to even start that one!
• May 31st 2009, 06:00 PM
paupsers
May have a solution to the 2nd question now. Seems kind of backward though...

Let x be in V.

Assume {u+v+w, v+w, w} is a basis for V.
Then x = a(u+v+w) + b(v+w) + cw
Rearranging yields:
(*) x = au + (a+b)v + (a+b+c)w
All of the coefficients above are in F by properties of F.
So then (*) is true by hypothesis, since that is the definition of {u, v, w} being a basis. QED.

Does that work?(Worried)
• May 31st 2009, 06:07 PM
Jose27
Let $F= \{f: \mathbb{R} \longrightarrow \mathbb{R} \}$, and let $f(t)=e^{rt}$ and $g(t)=e^{st}$ with $r \neq s$. Suppose $af + bg = \overline{0}$ is a linear combination of $f$ and $g$ ( $\overline{0}$ denotes the zero function). As you noted, putting $t=0$ gives $a=-b$. If $a=0$ then $b=0$ as well and we are done, so assume $a\neq 0$; then $af+bg = a(f-g) = \overline{0}$ implies $f(t) - g(t) = 0$ $\forall t \in \mathbb{R}$, which is absurd since $r \neq s$ (for instance, $f(1)=e^r \neq e^s = g(1)$).

For the second one, put $a(u +v +w) +b(v +w) + cw= \overline{0}$. Then $au + (a +b)v + (a +b +c)w = \overline{0}$, and since $\{u,v,w \}$ is linearly independent we get $a= a +b= a +b +c=0$, which implies $a=b=c=0$. Thus $\{u +v +w, v +w, w \}$ is linearly independent, and since it consists of three linearly independent vectors in a three-dimensional space, it is also a basis for your original space.
• May 31st 2009, 06:09 PM
Gamma
Those are two functions in a function space; for two functions to be equal, they must agree for all $t$, not just at one value of $t$. For example, any two lines with different slopes still intersect somewhere, but that does not make them the same line.

So suppose they are linearly dependent. WLOG suppose $s > r$; then there should be scalars $a,b\in \mathbb{R}$, not both $0$, such that:

$ae^{rt}+be^{st} \equiv 0$

It is clear that if one of $a$ or $b$ is $0$, then the other must be $0$ as well, so go ahead and assume that neither $a$ nor $b$ is $0$.
$ae^{rt}+be^{st} \equiv 0 \Rightarrow ae^{rt} \equiv -be^{st}\Rightarrow 1 \equiv -\frac{b}{a}e^{(s-r)t}$

Consider $t=0$: then $-\frac{b}{a}=1$, i.e. $b=-a$, so the equation becomes $e^{(s-r)t} \equiv 1$. But the only way this function can be identically 1 is if $s=r$, which is against the hypothesis. So the original equation holds only if $a=b=0$, which means the functions are linearly independent, as desired.
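Not part of the proof, but here is a quick numerical sanity check of the same idea using NumPy (the values $r=1$, $s=2$ and the sample points $t=0,1$ are arbitrary choices for illustration): evaluating $ae^{rt}+be^{st}=0$ at two points gives a $2\times 2$ system whose matrix is invertible when $r \neq s$, so only $a=b=0$ works.

```python
import numpy as np

r, s = 1.0, 2.0                    # any r != s will do
ts = np.array([0.0, 1.0])          # two sample points

# Columns are the values of e^{rt} and e^{st} at the sample points
M = np.column_stack([np.exp(r * ts), np.exp(s * ts)])

# det M = e^s - e^r, nonzero when r != s, so the system
# a*e^{rt} + b*e^{st} = 0 at both points has only the trivial solution
print(np.linalg.det(M))
a, b = np.linalg.solve(M, np.zeros(2))
print(a, b)
```

The determinant $e^s - e^r$ is nonzero precisely because $r \neq s$, which mirrors the contradiction in the proof above.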

By definition of basis, u,v,w are linearly independent, so

$au+bv+cw=0 \Rightarrow a=b=c=0$

In a finite-dimensional vector space of dimension n, any n linearly independent elements of the vector space form a basis, so it suffices to show your second set is linearly independent.

Suppose $a(u+v+w) + b(v+w) + cw=0 \Rightarrow au + (a+b)v + (a+b+c)w = 0$.

But this means
$a=0$
$a+b=0$
$a+b+c=0$
You see this triangular ("pyramid") system tells you that in fact $a=b=c=0$, as desired. This shows your set is linearly independent and thus is a basis for V.
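The triangular system above can also be checked in matrix form (an illustrative sketch, not required for the proof): the coefficient matrix of the system $a=0$, $a+b=0$, $a+b+c=0$ is lower triangular with nonzero diagonal, hence invertible, so only the trivial solution exists.

```python
import numpy as np

# Coefficient matrix for (a, b, c): rows are the coefficients of u, v, w
# after expanding a(u+v+w) + b(v+w) + c*w
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])

# Triangular with 1s on the diagonal => det = 1 => invertible
print(np.linalg.det(A))
sol = np.linalg.solve(A, np.zeros(3))
print(sol)                 # the only solution is a = b = c = 0
```

Since the matrix is invertible, the new vectors are an invertible linear image of the old basis, which is exactly why independence transfers.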
• May 31st 2009, 06:15 PM
paupsers
Thanks for the help, I think I get it!