You can calculate the determinant: if it's nonzero, the vectors are linearly independent, and if it's zero, they're dependent. The determinant gives us a necessary and sufficient condition for linear dependence.
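As a quick sanity check of this criterion for the n=2 case, here's a sketch in Python with made-up values x = 2 and y = 5 (the numbers are arbitrary, just for illustration):

```python
import numpy as np

# Columns (1, x) and (1, y) with x = 2, y = 5: distinct, so independent.
A = np.array([[1.0, 1.0],
              [2.0, 5.0]])
print(np.linalg.det(A))  # nonzero (y - x = 3), columns independent

# Columns (1, x) and (1, y) with x = y = 2: equal columns, so dependent.
B = np.array([[1.0, 1.0],
              [2.0, 2.0]])
print(np.linalg.det(B))  # zero, columns dependent
```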
Hey, so the idea is to look at what the conditions are for (1,x) and (1,y) to be linearly dependent, then for (1,x,x^2), (1,y,y^2), and (1,z,z^2), and then to generalize to n such vectors in C^n.
The most obvious thing (which is definitely sufficient) is that, calling our n vectors v_i=(1,b_i,b_i^2,...,b_i^{n-1}), if b_i=b_j for some i!=j then they're linearly dependent.
In fact this is verified by looking at their determinants. Up to sign, they are (b1-b2) for n=2, (b1-b2)(b2-b3)(b1-b3) for n=3, (b1-b2)(b1-b3)(b1-b4)(b2-b3)(b2-b4)(b3-b4) for n=4, and so on: the product of the differences over all possible pairs. (I haven't proved this pattern, only tested it for n=2,3,4,5.) Since the determinant of the matrix formed by these column vectors is 0 exactly when the vectors are linearly dependent, we're good.
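For what it's worth, the conjectured pattern can be tested numerically for any small n; here's a sketch in Python (the sample values of b_i are arbitrary, and the sign convention taken is the product of (b_j - b_i) over pairs i < j, which matches det for columns (1, b_i, b_i^2, ...)):

```python
import numpy as np
from itertools import combinations

def vandermonde_det(bs):
    """Determinant of the matrix whose i-th column is (1, b_i, b_i^2, ..., b_i^(n-1))."""
    n = len(bs)
    M = np.array([[b**k for b in bs] for k in range(n)], dtype=float)
    return np.linalg.det(M)

def pairwise_product(bs):
    """Product of (b_j - b_i) over all pairs i < j."""
    p = 1.0
    for bi, bj in combinations(bs, 2):
        p *= (bj - bi)
    return p

bs = [2.0, 3.0, 5.0, 7.0]
# The two values agree up to floating-point error; here both are 240.
print(vandermonde_det(bs), pairwise_product(bs))
```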
The question is how to actually prove that this pattern for the determinants holds, or just to prove that two of the bases being equal is also a necessary condition for these vectors to be linearly dependent.
I guess I would need a little assistance calculating the determinant of a general matrix of this form. Other than taking the product of the eigenvalues, or expanding by cofactors, I know no shortcuts for calculating large determinants. The determinant always seems to fit the pattern described, but I haven't established that rigorously.
The determinant above is called a Vandermonde determinant. There is a nice way to prove that it equals, up to sign, the product of (b_j - b_i) over all pairs i < j.
From this, we see right away that the determinant is nonzero if and only if all of the b_i's are distinct. Use the fact that the determinant does not change when adding a multiple of one column to another column to obtain, in the last column, the values of the unique monic polynomial of degree n-1 whose roots are b_1, ..., b_{n-1}; so all but one of the entries in the last column will vanish. Then expand along that column and use induction!
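If it helps, the resulting factorization can also be checked symbolically for small n, e.g. with sympy (this is just a verification for n=3, not the proof itself):

```python
import sympy as sp

b1, b2, b3 = sp.symbols('b1 b2 b3')
# Columns are (1, b_i, b_i^2), matching the vectors in the question.
M = sp.Matrix([[1, 1, 1],
               [b1, b2, b3],
               [b1**2, b2**2, b3**2]])
det = M.det()
# The determinant equals (b2 - b1)*(b3 - b1)*(b3 - b2): the product of
# pairwise differences, up to the sign convention noted in the question.
print(sp.factor(det))
```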