# Linear Independence

#### npthardcorebmore

In my ODE class our professor has thrown a lot of stuff at us. There are only 2 people in my class at this small uni, and it is hard for us to study or get much help from each other.

This question seems more aligned with linear algebra though so I have posted in this section for some guidance. It has been a struggle.

Let $$\displaystyle f_1$$, $$\displaystyle f_2$$, and $$\displaystyle f_3$$ be continuous functions on $$\displaystyle (a, b)$$.

Set $$\displaystyle c_{ij} = \int_a^b f_i(x)\, f_j(x)\, dx.$$

Prove that these functions are linearly independent on $$\displaystyle (a, b)$$ if and only if the matrix

$$\displaystyle \begin{bmatrix} c_{11} & c_{12} & c_{13} \\ c_{21} & c_{22} & c_{23} \\ c_{31} & c_{32} & c_{33} \end{bmatrix} \ne 0.$$

How can one extend this result to the case of $$\displaystyle n$$ functions on $$\displaystyle (a, b)$$?

#### dwsmith

MHF Hall of Honor

I would use the Wronskian.

$$\displaystyle W(f_1,\dots,f_n)=\begin{vmatrix} f_1 & f_2 & \dots & f_n\\ f_1' & f_2' & \dots & f_n'\\ \vdots & \vdots & & \vdots\\ f_1^{(n-1)} & f_2^{(n-1)} & \dots & f_n^{(n-1)} \end{vmatrix}$$

If $$\displaystyle W \ne 0$$ at even one point of the interval, the functions are linearly independent. *Special note: for the functions to be linearly dependent the Wronskian must vanish on the entire interval, and even $$\displaystyle W \equiv 0$$ does not by itself prove dependence in general.
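As a concrete illustration of the Wronskian test (my own sketch, not from the thread), here is the computation for the sample functions $$\displaystyle 1, x, x^2$$, with the derivatives worked out by hand:

```python
import numpy as np

# Hypothetical example (not from the thread): Wronskian test for
# f1(x) = 1, f2(x) = x, f3(x) = x^2. Rows of the matrix are the
# 0th, 1st, and 2nd derivatives of the three functions.

def wronskian_at(x):
    """Wronskian matrix of {1, x, x^2} evaluated at the point x."""
    return np.array([
        [1.0, x,   x**2],   # f1,   f2,   f3
        [0.0, 1.0, 2*x ],   # f1',  f2',  f3'
        [0.0, 0.0, 2.0 ],   # f1'', f2'', f3''
    ])

w = np.linalg.det(wronskian_at(0.5))
print(w)  # 2.0 at every x: nonzero, so the functions are independent
```

Since the matrix is upper triangular, the determinant is the product of the diagonal, $$\displaystyle 1 \cdot 1 \cdot 2 = 2$$, for every $$\displaystyle x$$, so independence follows from a single point.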

#### HallsofIvy

MHF Helper
The "theorem", as you state it, is NOT true. It is not the matrix that must be non-zero, but its determinant. If, for example, $$\displaystyle f_1(x)= f_2(x)= f_3(x)= \frac{1}{\sqrt{b- a}}$$, then the functions are obviously not independent, but $$\displaystyle \begin{bmatrix}c_{11} & c_{12} & c_{13} \\ c_{12} & c_{22} & c_{23} \\ c_{13} & c_{23} & c_{33}\end{bmatrix}= \begin{bmatrix}1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1\end{bmatrix}$$, which is non-zero. Its determinant is 0, of course.

Suppose $$\displaystyle pf_1(x)+ qf_2(x)+ rf_3(x)= 0$$ for all x. Multiply that equation by $$\displaystyle f_1(x)$$ and integrate from a to b. That gives $$\displaystyle c_{11}p+ c_{12}q+ c_{13}r= 0$$. Similarly, multiplying by $$\displaystyle f_2(x)$$ and integrating from a to b gives $$\displaystyle c_{12}p+ c_{22}q+ c_{23}r= 0$$, and multiplying by $$\displaystyle f_3(x)$$ and integrating from a to b gives $$\displaystyle c_{13}p+ c_{23}q+ c_{33}r= 0$$.

That system of equations is equivalent to the matrix equation $$\displaystyle \begin{bmatrix}c_{11} & c_{12} & c_{13} \\ c_{12} & c_{22} & c_{23} \\ c_{13} & c_{23} & c_{33}\end{bmatrix}\begin{bmatrix}p \\ q\\ r\end{bmatrix}= \begin{bmatrix} 0 \\ 0 \\ 0\end{bmatrix}$$

That system has the unique solution p= q= r= 0 (and so $$\displaystyle f_1(x)$$, $$\displaystyle f_2(x)$$, and $$\displaystyle f_3(x)$$ are independent) if and only if the matrix of coefficients is invertible, which is true if and only if its determinant is non-zero. The same argument applies verbatim to $$\displaystyle n$$ functions: the $$\displaystyle n\times n$$ matrix $$\displaystyle [c_{ij}]$$ is invertible if and only if the functions are independent.
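As a numerical sanity check of this argument (my own sketch, not part of the original proof), one can approximate the matrix $$\displaystyle c_{ij}$$ on a grid and compare its determinant for an independent set and a dependent set of functions:

```python
import numpy as np

# Numerical illustration (an assumption-laden sketch, not the proof itself):
# approximate c_ij = integral_a^b f_i(x) f_j(x) dx by a Riemann sum and
# check whether det[c_ij] separates independent from dependent functions.

def gram_det(funcs, a=0.0, b=1.0, n=200_001):
    """Determinant of the matrix [c_ij] for `funcs` on (a, b)."""
    x = np.linspace(a, b, n)
    dx = (b - a) / (n - 1)
    samples = np.array([f(x) for f in funcs])  # one row of samples per function
    gram = samples @ samples.T * dx            # c_ij ~ sum of f_i(x) f_j(x) dx
    return np.linalg.det(gram)

independent = [lambda x: np.ones_like(x), lambda x: x, lambda x: x**2]
dependent   = [lambda x: np.ones_like(x), lambda x: x, lambda x: 1 + x]

d1 = gram_det(independent)  # about 1/2160: clearly nonzero
d2 = gram_det(dependent)    # about 0: third function = first + second
print(d1, d2)
```

For $$\displaystyle 1, x, x^2$$ on $$\displaystyle (0,1)$$ the exact matrix is the Hilbert matrix $$\displaystyle \begin{bmatrix}1 & 1/2 & 1/3\\ 1/2 & 1/3 & 1/4\\ 1/3 & 1/4 & 1/5\end{bmatrix}$$ with determinant $$\displaystyle 1/2160$$, while for the dependent set the sampled rows are exactly linearly dependent, so the determinant vanishes up to floating-point error.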


#### npthardcorebmore

Thank you both for your responses.

Our prof has gone over the Wronskian, which is what I was originally thinking of using, but I wasn't sure how to go about that since the functions weren't given, and I wasn't sure how to apply it to the matrix given.

The second response seems to lead to an answer for the first portion of the proof, but how would one generalize it for n functions? That is one of my major weak spots: generalizing a proof that was built around a specific problem. (Worried)