Unique Linear Functionals

I am working on the following...

Let 𝐵 = {$\displaystyle {\bf v}_1,{\bf v}_2,...,{\bf v}_n$} be a basis for a vector space 𝔙.

Show that, for each 𝑖, there is a unique linear functional $\displaystyle v_\imath ^{\ast} $ on 𝔙 such that $\displaystyle v_\imath ^{\ast} ({\bf v}_\jmath) = \delta_{\imath \jmath}$.

Also, show that the set of 𝑛 distinct linear functionals on 𝔙 obtained from 𝐵 is linearly independent.

I have no idea on what to do, could someone help me out?

Thanks

Re: Unique Linear Functionals

Quote:

Originally Posted by

**jnava** I am working on the following...

Let 𝐵 = {$\displaystyle {\bf v}_1,{\bf v}_2,...,{\bf v}_n$} be a basis for a vector space 𝔙.

Show that, for each 𝑖, there is a unique linear functional $\displaystyle v_\imath ^{\ast} $ on 𝔙 such that $\displaystyle v_\imath ^{\ast} ({\bf v}_\jmath) = \delta_{\imath \jmath}$.

Also, show that the set of 𝑛 distinct linear functionals on 𝔙 obtained from 𝐵 is linearly independent.

I have no idea on what to do, could someone help me out?

Thanks

The basic idea is this. Suppose that $\displaystyle V$ is an $\displaystyle F$-space. You can completely specify a linear transformation from $\displaystyle V$ to any other $\displaystyle F$-space $\displaystyle W$ by demanding that $\displaystyle v_i\mapsto w_i$, where $\displaystyle \{v_1,\cdots,v_n\}$ is a basis for $\displaystyle V$ and the $\displaystyle w_i$ are just any vectors in $\displaystyle W$ (not necessarily distinct).

How? Well, once you have made a choice about where the $\displaystyle v_i$ go, you still haven't defined a transformation on all of $\displaystyle V$. That said, if the map which takes $\displaystyle v_i\mapsto w_i$ is to be a linear transformation, it must take each $\displaystyle v=\alpha_1v_1+\cdots+\alpha_n v_n$ to $\displaystyle \alpha_1w_1+\cdots+\alpha_nw_n$. Thus, if $\displaystyle T:V\to W$ is linear and satisfies $\displaystyle T(v_i)=w_i$, then the function must be defined by the rule $\displaystyle T(v)=\alpha_1w_1+\cdots+\alpha_nw_n$, where $\displaystyle \alpha_1v_1+\cdots+\alpha_nv_n$ is the UNIQUE representation of $\displaystyle v$ as a linear combination of the basis $\displaystyle \{v_1,\cdots,v_n\}$; this gives uniqueness. Conversely, you can check that the map defined this way does, in fact, satisfy the condition of being a linear transformation $\displaystyle V\to W$ with $\displaystyle v_i\mapsto w_i$; this gives existence.

So, in your case, you have that $\displaystyle W=F$ and, for $\displaystyle v_i^\ast$, $\displaystyle w_j=\delta_{i,j}\in F$.
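As a concrete numerical illustration (not part of the argument above; the particular basis of $\displaystyle \mathbb{R}^3$ is just an example I picked), here is a quick check that the dual functionals, written as row vectors, satisfy $\displaystyle v_i^\ast(v_j)=\delta_{ij}$:

```python
# The basis vectors v_1, v_2, v_3 of R^3 (coordinates in the standard basis).
basis = [(1, 0, 0), (1, 1, 0), (1, 1, 1)]

# Hand-computed dual functionals: v_i^* is represented by the row vector f
# with v_i^*(x) = f . x.  (These rows form the inverse of the matrix whose
# columns are the basis vectors above.)
dual = [(1, -1, 0), (0, 1, -1), (0, 0, 1)]

def apply(f, v):
    """Evaluate the functional f (a row vector) on the vector v."""
    return sum(a * b for a, b in zip(f, v))

# Check v_i^*(v_j) = delta_{ij} for all i, j.
for i, f in enumerate(dual):
    for j, v in enumerate(basis):
        assert apply(f, v) == (1 if i == j else 0)
```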

For the second part, what would happen if $\displaystyle \sum_{i=1}^{n}\alpha_i v^\ast_i=0(v)$ (where I put $\displaystyle 0(v)$ to emphasize that it's the zero function)? What happens if you plug in $\displaystyle v_j$ for $\displaystyle j=1,\cdots,n$?
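Spelled out (this is just the hint above made explicit; the $\displaystyle \alpha_i$ are the scalars in a supposed dependence relation):

```latex
\left(\sum_{i=1}^{n}\alpha_i v_i^\ast\right)(v_j)
  = \sum_{i=1}^{n}\alpha_i \, v_i^\ast(v_j)
  = \sum_{i=1}^{n}\alpha_i \, \delta_{ij}
  = \alpha_j .
```

So if the sum is the zero functional, evaluating it at each $\displaystyle v_j$ forces $\displaystyle \alpha_j=0$ for every $\displaystyle j$, which is exactly linear independence.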

Re: Unique Linear Functionals

Quote:

Originally Posted by

**Drexel28**

For the second part, what would happen if $\displaystyle \sum_{i=1}^{n}\alpha_i v^\ast_i=0(v)$ (where I put $\displaystyle 0(v)$ to emphasize that it's the zero function)? What happens if you plug in $\displaystyle v_j$ for $\displaystyle j=1,\cdots,n$?

when you plug back in $\displaystyle v_j$ you will get zero, meaning orthogonality, meaning linear independence?

Re: Unique Linear Functionals

Quote:

Originally Posted by

**jnava** when you plug back in $\displaystyle v_j$ you will get zero, meaning orthogonality, meaning linear independence?

Uh, not sure what you mean. You get that $\displaystyle \alpha_j=0$, right?

Re: Unique Linear Functionals

Quote:

Originally Posted by

**Drexel28** Uh, not sure what you mean. You get that $\displaystyle \alpha_j=0$, right?

Yes, all $\displaystyle \alpha_j=0$ due to 0(v), correct? Since it will be zero, the functionals are orthogonal implying linear dependence? This kind of math hurts my head way too much lol

Re: Unique Linear Functionals

Quote:

Originally Posted by

**jnava** Yes, all $\displaystyle \alpha_j=0$ due to 0(v), correct? Since it will be zero, the functionals are orthogonal implying linear dependence? This kind of math hurts my head way too much lol

Haha, you have proven that all the $\displaystyle \alpha$'s are zero, which shows that they are linearly independent.

Re: Unique Linear Functionals

Quote:

Originally Posted by

**Drexel28** Haha, you have proven that all the $\displaystyle \alpha$'s are zero, which shows that they are linearly independent.

Thank you for the help!