Hey cubejunkies.
Recall that a function f is linear if f(x+y) = f(x) + f(y) and f(ax) = a*f(x). If you prove that both of these hold, you have proven linearity.
Basically, I want a general algorithm that, given any function y = f(x) (I'm not too worried about multivariable functions quite yet), lets me come up with a matrix A that maps f to f'. I want this so that I can sleep at night knowing that the solution to a linear ODE is y_{gen} = y_{homogeneous} + y_{particular}, because differentiation is a linear transform. The moral of the story is that I want a solid understanding of why derivatives are linear transformations.
Sorry if this is a really stupid question to ask or pose.
Anthony
the problem is, most (real) function spaces are not finite-dimensional over R, that is, we need an infinite number of "basis functions". so you would need "infinite matrices" (at least).
as chiro points out, if D(f) is defined by: (D(f))(x) = f'(x) (for some suitable class of functions), we have:
(D(f+g))(x) = (f+g)'(x) = f'(x) + g'(x) = (D(f))(x) + (D(g))(x) = (D(f) + D(g))(x), hence D(f+g) = D(f) + D(g).
(D(af))(x) = (af)'(x) = a(f'(x)) = a[(D(f))(x)] = (aD(f))(x), so D(af) = aD(f), thus D is linear.
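none of this needs a computer, but here is a quick numerical sanity check of those two identities, approximating D by a central difference (the sample functions sin, exp, the scalar a = 3, and the point x = 0.7 are just illustrative choices):

```python
# numerical sanity check of D(f+g) = D(f) + D(g) and D(af) = a*D(f),
# with D approximated by a central difference quotient
import math

def D(f, h=1e-6):
    """Return a function approximating f' by a central difference."""
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

f, g = math.sin, math.exp
a, x = 3.0, 0.7

lhs_add = D(lambda t: f(t) + g(t))(x)   # D(f+g) at x
rhs_add = D(f)(x) + D(g)(x)             # D(f) + D(g) at x

lhs_scale = D(lambda t: a * f(t))(x)    # D(af) at x
rhs_scale = a * D(f)(x)                 # a*D(f) at x

print(abs(lhs_add - rhs_add) < 1e-8)      # True (up to roundoff)
print(abs(lhs_scale - rhs_scale) < 1e-8)  # True (up to roundoff)
```

the agreement is exact in real arithmetic (the difference quotient itself is linear in f); only floating-point roundoff keeps the two sides from matching to the last digit.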
if we restrict ourselves to polynomial functions of degree equal to or less than n, we CAN write down an actual matrix (since this space IS finite-dimensional): relative to the basis {1, x, x^2, ..., x^n}, differentiation is the (n+1)x(n+1) matrix with the entries 1, 2, ..., n on the superdiagonal and 0 everywhere else.
this matrix has rank n (its kernel is exactly the constants), which reflects the fact that when you differentiate a polynomial, "the degree goes down by one".
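here is that matrix made concrete, using plain Python lists so a polynomial c_0 + c_1 x + ... + c_n x^n is just its coefficient vector [c_0, c_1, ..., c_n] (the helper names and the sample polynomial are my own choices):

```python
def derivative_matrix(n):
    """Matrix of d/dx on polynomials of degree <= n, monomial basis."""
    size = n + 1
    D = [[0] * size for _ in range(size)]
    for k in range(1, size):
        D[k - 1][k] = k   # since d/dx x^k = k*x^(k-1)
    return D

def mat_vec(M, v):
    """Multiply a matrix (list of rows) by a coefficient vector."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in M]

# p(x) = 5 + 3x + 2x^3, so p'(x) = 3 + 6x^2
p = [5, 3, 0, 2]          # degree <= 3, so n = 3
D = derivative_matrix(3)
print(mat_vec(D, p))      # [3, 0, 6, 0]
```

note the n nonzero superdiagonal entries 1, 2, ..., n are what give the matrix its rank of n, and the all-zero first column is the kernel (the constants).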
for all functions which have convergent Taylor series on some interval [a,b], we can approximate them by polynomials (if the convergence is slow, we may need a large "n" to be within ε).
often, a given linear ODE is of finite order anyway, for example:
y" + ay' + by = g(x) which can be written:
D(D(y)) + a(D(y)) + by = g(x), or:
(D^{2} + aD + b)(y) = g(x).
suppose a given y_{0}(x) satisfies this. then for ANY function y(x) for which: (D^{2} + aD + b)(y) = 0, we have:
(D^{2} + aD + b)(y + y_{0}) = (D^{2} + aD + b)(y) + (D^{2} + aD + b)(y_{0}) = 0 + g(x) = g(x).
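to see this numerically, here is the same computation with concrete choices (a = 3, b = 2, g(x) = e^x are my assumptions, nothing special about them): y_{0}(x) = e^x/6 is a particular solution, e^{-x} solves the homogeneous equation, and applying the operator (approximated by finite differences) to their sum gives back g:

```python
# check that (D^2 + 3D + 2)(y0 + yh) = g when (D^2 + 3D + 2)(y0) = g
# and (D^2 + 3D + 2)(yh) = 0, using central-difference derivatives
import math

def L(f, h=1e-4):
    """Approximate (D^2 + 3D + 2)(f) with central differences."""
    def Lf(x):
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h**2  # f''(x)
        d1 = (f(x + h) - f(x - h)) / (2 * h)          # f'(x)
        return d2 + 3 * d1 + 2 * f(x)
    return Lf

y0 = lambda x: math.exp(x) / 6   # particular: (1 + 3 + 2)/6 * e^x = e^x
yh = lambda x: math.exp(-x)      # homogeneous: (1 - 3 + 2) * e^(-x) = 0
g  = math.exp

x = 0.5
print(abs(L(lambda t: y0(t) + yh(t))(x) - g(x)) < 1e-6)  # True
```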
this is entirely analogous to solving:
Ax = b. given a PARTICULAR x_{0} for which Ax_{0} = b, then for ANY x with Ax = 0:
A(x + x_{0}) = A(x) + A(x_{0}) = 0 + b = b.
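the matrix side of the analogy, with an illustrative singular A (my choice, nothing special about it): x_{0} solves Ax = b, x_{n} solves Ax = 0, and their sum solves Ax = b again, using only linearity:

```python
def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A  = [[1, 2], [2, 4]]   # rank 1, so Ax = 0 has nonzero solutions
b  = [3, 6]
x0 = [1, 1]             # a particular solution: A x0 = b
xn = [2, -1]            # a homogeneous solution: A xn = 0

print(mat_vec(A, x0))                                # [3, 6]
print(mat_vec(A, xn))                                # [0, 0]
print(mat_vec(A, [p + q for p, q in zip(x0, xn)]))   # [3, 6]
```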
note all we used was the LINEARITY of A, not the fact that it was "a matrix".
matrices are "concrete linear transformations", ones we can actually crunch numbers with (they are a glorified form of arithmetic). they are not the ONLY linear transformations. many important vector spaces have infinite bases (a system of equations may have infinitely many degrees of freedom). function spaces are chief amongst these; there are more functions possible than are dreamt of in your philosophy....