# Thread: A large question involving base-change matrix (change of basis) material

1. ## A large question involving base-change matrix (change of basis) material

This is pretty difficult stuff, so be sure to read all of it.

Suppose that $V$ and $W$ are vector spaces over a field $F$.

$F[x]$ is the vector space (over $F$) of polynomials in $x$ with coefficients in $F$, i.e.
$F[x]=\{a_0+a_1x+...+a_nx^n:a_0,...,a_n\in F, n\geq 0\}$.
Fix a positive integer $m$. For this, you may use without proof that the subset $W\subset F[x]$ of all polynomials of degree $m$ or less is a subspace of $F[x]$ and that this subspace has two bases
$B=\{1,x,x^2,...,x^m\}$ and $B'=\{1,x,x^2,...,x^{m-1},x^{m-1}+x^m\}$.

(a) Determine the base-change matrix $P\in F^{(m+1)\times(m+1)}$ such that $B'=BP$, and compute the determinant of $P$.
A hint was provided as follows: Determine $P$ by specifying its matrix entries $p_{ij}\in F$.

(b) Suppose here that $F=\mathbb{R}$ and $\{f_1,...,f_{m+1}\}\subset W$ is such that $f_j(1)=0$ for all $1\leq j\leq m+1$. Prove that $\{f_1,...,f_{m+1}\}$ is linearly dependent.
A hint was provided as follows: What is the dimension of $W$?

I'll try to do some of this, but there's a lot of info that I'm unfamiliar with (primarily the change of basis parts).

2. Originally Posted by Runty
This is pretty difficult stuff, so be sure to read all of it.

Suppose that $V$ and $W$ are vector spaces over a field $F$.

$F[x]$ is the vector space (over $F$) of polynomials in $x$ with coefficients in $F$, i.e.
$F[x]=\{a_0+a_1x+...+a_nx^n:a_0,...,a_n\in F, n\geq 0\}$.
Fix a positive integer $m$. For this, you may use without proof that the subset $W\subset F[x]$ of all polynomials of degree $m$ or less is a subspace of $F[x]$ and that this subspace has two bases
$B=\{1,x,x^2,...,x^m\}$ and $B'=\{1,x,x^2,...,x^{m-1},x^{m-1}+x^m\}$.

(a) Determine the base-change matrix $P\in F^{(m+1)\times(m+1)}$ such that $B'=BP$, and compute the determinant of $P$.
A hint was provided as follows: Determine $P$ by specifying its matrix entries $p_{ij}\in F$.
The simplest way to determine the matrix entries of a linear transformation, in a given basis, is to apply the transformation to each basis vector in turn: each gives one column of the matrix. For example, this map sends each of the first $m$ basis vectors to itself, so the first $m$ columns just give the identity matrix with an additional "0" on the bottom. $x^{m}$, however, is mapped to $x^{m-1}+ x^m$: the column vector with all "0"s except a "1" in the last entry is mapped to the column vector with all "0"s except "1"s in the last two entries. The final column of the matrix is all "0"s except that the last two entries are "1".
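To make the column-by-column recipe concrete, here is a small Python sketch for the case $m=3$ (the coefficient-list encoding and the helper name `coords` are my own choices, not part of the problem): each element of $B'$ is written in $B$-coordinates, and those coordinate vectors become the columns of $P$.

```python
# Hypothetical helper: coordinates of a polynomial (stored as a list of
# coefficients [a0, a1, ...]) in the monomial basis B = {1, x, ..., x^m},
# padded with zeros to length m + 1.
def coords(poly, m):
    return poly + [0] * (m + 1 - len(poly))

m = 3
# B' = {1, x, x^2, x^2 + x^3} for m = 3, each as a coefficient list.
B_prime = [[1], [0, 1], [0, 0, 1], [0, 0, 1, 1]]

# Each element of B' gives one COLUMN of P: column j holds the
# B-coordinates of the j-th vector of B'.
columns = [coords(v, m) for v in B_prime]
P = [[columns[j][i] for j in range(m + 1)] for i in range(m + 1)]

for row in P:
    print(row)
```

The first three columns are just the standard basis vectors, and only the last column picks up the extra "1" in the second-to-last entry, matching the description above.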

(b) Suppose here that $F=\mathbb{R}$ and $\{f_1,...,f_{m+1}\}\subset W$ is such that $f_j(1)=0$ for all $1\leq j\leq m+1$. Prove that $\{f_1,...,f_{m+1}\}$ is linearly dependent.
A hint was provided as follows: What is the dimension of $W$?

I'll try to do some of this, but there's a lot of info that I'm unfamiliar with (primarily the change of basis parts).
Okay, what is the dimension of W? It cannot be any larger than the dimension of $F[x]$, can it? And a basis is a maximal linearly independent set of vectors.

3. Originally Posted by HallsofIvy
The simplest way to determine the matrix entries of a linear transformation, in a given basis, is to apply the transformation to each basis vector in turn: each gives one column of the matrix. For example, this map sends each of the first $m$ basis vectors to itself, so the first $m$ columns just give the identity matrix with an additional "0" on the bottom. $x^{m}$, however, is mapped to $x^{m-1}+ x^m$: the column vector with all "0"s except a "1" in the last entry is mapped to the column vector with all "0"s except "1"s in the last two entries. The final column of the matrix is all "0"s except that the last two entries are "1".
Thanks for the tip. I've done a good deal of work on this so far, and though I understand how this works now, my write-up is a bit messy and probably has a piece or two missing. I'll post it up so someone can help me make it more complete and check it for errors.

First, I let $v_1=1, v_2=x, v_3=x^2, ..., v_{m+1}=x^m$
Then I let $v'_j=p_{1j}v_1+p_{2j}v_2+...+p_{(m+1)j}v_{m+1}, j=1,2,...,m+1$ (remember that $p_{ij}$ is the matrix entry)

This presents two cases:
1. $1\leq j\leq m$, which leads to $v'_j=v_j$.
2. $j=m+1$, which leads to $v'_{m+1}=(1\times v_m)+(1\times v_{m+1})$.

This leads to the following matrix $P$: (here's where I might have done something wrong)
$P=\left[ \begin{matrix} 1 & 0 & 0 & \ldots & 0 \\ 0 & x & 0 & \ldots & 0 \\ 0 & 0 & \ddots & \ddots & 0 \\ \vdots & \vdots & \ddots & x^{m-1} & x^{m-1} \\ 0 & 0 &\ldots & 0 & x^m \end{matrix}\right]$
(are the $x$ values meant to be in the matrix, or are they supposed to be 1's?)

This result is an upper triangular matrix (nearly an identity matrix), so its determinant, $\det P$, is the product of the entries on its diagonal.
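Since the hint requires $p_{ij}\in F$, the diagonal entries must be scalars ($1$'s, not powers of $x$); granting that, the triangular-determinant rule can be sanity-checked in a few lines of Python (the function names are mine, used only for illustration):

```python
from math import prod

def change_of_basis_matrix(m):
    # (m+1) x (m+1) identity, except for an extra 1 at row m, column m+1
    # (0-indexed: P[m-1][m]): the last vector of B' is x^(m-1) + x^m,
    # so the last column has 1's in its last two entries.
    P = [[1 if i == j else 0 for j in range(m + 1)] for i in range(m + 1)]
    P[m - 1][m] = 1
    return P

def det_upper_triangular(P):
    # For an upper-triangular matrix, the determinant is the product
    # of the diagonal entries.
    return prod(P[i][i] for i in range(len(P)))

P = change_of_basis_matrix(5)
print(det_upper_triangular(P))  # 1
```

All the diagonal entries are $1$, so $\det P = 1$ for every positive integer $m$.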

-----

Now for the second half question, I was told to try and prove it via contradiction. Here is my work so far.

Suppose that $\{f_1,...,f_{m+1}\}$ is linearly independent.
Since $\dim W=m+1$, $\{f_1,...,f_{m+1}\}$ is a basis.
As such, $\{f_1,...,f_{m+1}\}$ also spans $W$.
Since $f_j(1)=0$ for all $1\leq j\leq m+1$, every linear combination $c_1f_1+...+c_{m+1}f_{m+1}$ also vanishes at $1$.
But the constant polynomial $1\in W$ satisfies $1(1)=1\neq 0$, so it is not a linear combination of $f_1,...,f_{m+1}$, and $\{f_1,...,f_{m+1}\}$ does not span $W$.
This contradicts the assumption that $\{f_1,...,f_{m+1}\}$ is linearly independent, which forced it to be a basis.
Therefore, $\{f_1,...,f_{m+1}\}$ is linearly dependent.
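As a quick sanity check of the conclusion (not part of the proof), here is a tiny Python example for $m=2$ with three polynomials of my own choosing that vanish at $1$; a nontrivial dependence among them is easy to exhibit:

```python
# Coefficient lists [a0, a1, a2] over the monomial basis of W for m = 2.
# Each polynomial vanishes at x = 1, i.e. its coefficients sum to 0.
f1 = [-1, 1, 0]   # x - 1
f2 = [-1, 0, 1]   # x^2 - 1
f3 = [0, -1, 1]   # x^2 - x

# A nontrivial dependence: f1 - f2 + f3 = 0, computed coefficientwise.
combo = [a - b + c for a, b, c in zip(f1, f2, f3)]
print(combo)  # [0, 0, 0]
```

Here $m+1=3$ polynomials all lie inside the set of elements of $W$ vanishing at $1$, and sure enough a dependence appears.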

Let me know if I've made any mistakes in my proofs.