# Thread: The cross product as a linear transformation and finding its matrix

1. ## The cross product as a linear transformation and finding its matrix

I had seen this forum in my many passings online but never really delved into it until I just registered a few seconds ago. Looks awesome!!

Anyway, my first question:

Basically, as far as I've gotten, the next step in my problem is to find the matrix of the following:

v2x3 - v3x2
v3x1 - v1x3
v1x2 - v2x1

The multiple variables throw me off. I tried setting one side equal to the other and isolating via the elementary row operations but I'm stuck. Help!

2. ## Re: The cross product as a linear transformation and finding its matrix

I can't speak for others, but I find that impossible to follow. "find the matrix of the following"?? I also don't understand the parts that followed that. Are those vectors with subscripts? I don't see any equations to clue me in. (And without any equations, what on Earth does "I tried setting one side equal to the other" mean?)
You need to remember that readers have no idea of the context of your question. We've no idea what class you're taking, what book you're using, what chapter you're in, etc.
If you could try to clarify the question, I'd be happy to try to help you with it.

3. ## Re: The cross product as a linear transformation and finding its matrix

Ok, sorry. I'm sure there's a way to use the proper notation on the forum, as I've seen it before, but I'm not sure how to use it. Yes lol those are meant to be subscripts.

The question starts by giving the definition of the cross product. The product of the two vectors is what I tried writing.

i.e. vectors (v) x (x) =

v2x3 - v3x2
v3x1 - v1x3
v1x2 - v2x1

It then asks if the transformation is linear. I already know from the definition of a cross product that it is. It then asks us to find the matrix in terms of the components of vector v.

Hope that is a bit clearer.

4. ## Re: The cross product as a linear transformation and finding its matrix

Much clearer. Thanks. Unfortunately, I'm still not sure what "find the matrix" means. Does that mean a matrix whose determinant is the cross product?

Anyway, here's for linearity:
The cross product is linear, in the sense that it's what's called bilinear, meaning linear in each of the TWO factors in the cross product operation. So asking to show that it's linear is a little ambiguous - linear in which, v or x? (Actually, once you establish it for one, you can prove it much more quickly for the other, since (v)x(x) = -(x)x(v).) When you show it's linear in one, you keep the other fixed. I'll show you how to begin. (As always with math, you'll need to know the definitions, and often - not always, but often - it's just a matter of plugging into the definitions, as it will be here.)

Fix $\displaystyle \vec{x} \in \mathbb{R}^3$. Let $\displaystyle L:\mathbb{R}^3 \rightarrow \mathbb{R}^3$ by $\displaystyle L(\vec{v}) = \vec{v} \times \vec{x}$ as defined above.

Problem: Show L is a linear transformation

Must show that $\displaystyle L(a\vec{v} + b\vec{w}) = aL(\vec{v}) + bL(\vec{w}) \ \forall \ \vec{v}, \vec{w} \in \mathbb{R}^3$ and $\displaystyle a, b \in \mathbb{R}$.

Thus must show that $\displaystyle (a\vec{v} + b\vec{w}) \times \vec{x} = a(\vec{v} \times \vec{x}) + b(\vec{w} \times \vec{x}) \ \forall \ \vec{v}, \vec{w} \in \mathbb{R}^3$ and $\displaystyle a, b \in \mathbb{R}$.

(That statement is often what people mean by saying "$\displaystyle \vec{v} \times \vec{x}$ is linear in $\displaystyle \vec{v}$". Whether you use that or the definition of "linear transformation" makes no real difference.)

Representing vectors in $\displaystyle \mathbb{R}^3$ by their coordinates, and using the definition of $\displaystyle \times$ you gave, that becomes:

Show *:
$\displaystyle (a(v_1, v_2, v_3) + b(w_1, w_2, w_3)) \times (x_1, x_2, x_3)$
$\displaystyle = a((v_1, v_2, v_3) \times (x_1, x_2, x_3)) + b((w_1, w_2, w_3) \times (x_1, x_2, x_3))$

Now apply the definition, and work through the algebra. Note the actual demonstration ("proof") will work in reverse, but all of these steps are reversible - they're definitions and algebra - hence my use of the if-and-only-if symbol "$\displaystyle \Leftrightarrow$".

* $\displaystyle \Leftrightarrow (av_1+bw_1, av_2+bw_2, av_3+bw_3) \times (x_1, x_2, x_3)$
$\displaystyle = a( \ (v_2x_3 - v_3x_2), (v_3x_1 - v_1x_3), (v_1x_2 - v_2x_1) \ )+ b ( \ (w_2x_3 - w_3x_2), (w_3x_1 - w_1x_3), (w_1x_2 - w_2x_1) \ )$

$\displaystyle \Leftrightarrow ( \ ((av_2+bw_2)x_3 - (av_3+bw_3)x_2), ((av_3+bw_3)x_1 - (av_1+bw_1)x_3), ((av_1+bw_1)x_2 - (av_2+bw_2)x_1) \ )$
$\displaystyle = ( \ ( a(v_2x_3 - v_3x_2)+ b(w_2x_3 - w_3x_2)), (a(v_3x_1 - v_1x_3)+b(w_3x_1 - w_1x_3)), (a(v_1x_2 - v_2x_1) + b(w_1x_2 - w_2x_1)) \ )$

$\displaystyle \Leftrightarrow ( \ ((av_2x_3+bw_2x_3) - (av_3x_2+bw_3x_2)), ((av_3x_1+bw_3x_1) - (av_1x_3+bw_1x_3)), ((av_1x_2+bw_1x_2) - (av_2x_1+bw_2x_1)) \ )$
$\displaystyle = ( \ ( a(v_2x_3 - v_3x_2)+ b(w_2x_3 - w_3x_2)), (a(v_3x_1 - v_1x_3)+b(w_3x_1 - w_1x_3)), (a(v_1x_2 - v_2x_1) + b(w_1x_2 - w_2x_1)) \ )$

$\displaystyle \Leftrightarrow ( \ ((av_2x_3+bw_2x_3 - av_3x_2 - bw_3x_2)), ((av_3x_1+bw_3x_1 - av_1x_3-bw_1x_3)), ((av_1x_2+bw_1x_2 - av_2x_1 -bw_2x_1)) \ )$
$\displaystyle = ( \ ( a(v_2x_3 - v_3x_2)+ b(w_2x_3 - w_3x_2)), (a(v_3x_1 - v_1x_3)+b(w_3x_1 - w_1x_3)), (a(v_1x_2 - v_2x_1) + b(w_1x_2 - w_2x_1)) \ )$.

But that is obviously true (by factoring out the a & b), so the statement * is also true.

The above is how you figure out the steps, and technically, because the $\displaystyle \Leftrightarrow$ claims were justified, it is a full demonstration. However, the best way to write this up is to rewrite it, starting from one side and arriving at the other, using the above derivation. It's much cleaner and clearer. So it would go like this:

Claim: With the above definitions, the cross product is linear in $\displaystyle \vec{v}$, meaning that for all $\displaystyle \vec{v}, \vec{w} \in \mathbb{R}^3$, and for all $\displaystyle a, b \in \mathbb{R}$, we have:

$\displaystyle (a\vec{v} + b\vec{w}) \times \vec{x} = a(\vec{v} \times \vec{x}) + b(\vec{w} \times \vec{x})$.

Proof:
$\displaystyle (a\vec{v} + b\vec{w}) \times \vec{x}$

$\displaystyle = (a(v_1, v_2, v_3) + b(w_1, w_2, w_3)) \times (x_1, x_2, x_3)$

$\displaystyle = (av_1+bw_1, av_2+bw_2, av_3+bw_3) \times (x_1, x_2, x_3)$

$\displaystyle = ( \ ((av_2+bw_2)x_3 - (av_3+bw_3)x_2), ((av_3+bw_3)x_1 - (av_1+bw_1)x_3), ((av_1+bw_1)x_2 - (av_2+bw_2)x_1) \ )$

$\displaystyle = ( \ ((av_2x_3+bw_2x_3) - (av_3x_2+bw_3x_2)), ((av_3x_1+bw_3x_1) - (av_1x_3+bw_1x_3)), ((av_1x_2+bw_1x_2) - (av_2x_1+bw_2x_1)) \ )$

$\displaystyle = ( \ ((av_2x_3+bw_2x_3 - av_3x_2 - bw_3x_2)), ((av_3x_1+bw_3x_1 - av_1x_3-bw_1x_3)), ((av_1x_2+bw_1x_2 - av_2x_1 -bw_2x_1)) \ )$

$\displaystyle = ( \ ( a(v_2x_3 - v_3x_2)+ b(w_2x_3 - w_3x_2)), (a(v_3x_1 - v_1x_3)+b(w_3x_1 - w_1x_3)), (a(v_1x_2 - v_2x_1) + b(w_1x_2 - w_2x_1)) \ )$

$\displaystyle = a( \ (v_2x_3 - v_3x_2), (v_3x_1 - v_1x_3), (v_1x_2 - v_2x_1) \ ) + b ( \ (w_2x_3 - w_3x_2), (w_3x_1 - w_1x_3), (w_1x_2 - w_2x_1) \ )$

$\displaystyle = a((v_1, v_2, v_3) \times (x_1, x_2, x_3)) + b((w_1, w_2, w_3) \times (x_1, x_2, x_3))$

$\displaystyle = a(\vec{v} \times \vec{x}) + b(\vec{w} \times \vec{x})$.
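A quick numeric spot-check of this linearity identity can be run with numpy (a sketch only; the particular vectors and scalars below are arbitrary choices, not values from the problem):

```python
import numpy as np

# Arbitrary sample vectors and scalars for a spot-check
v = np.array([1.0, 2.0, 3.0])
w = np.array([-4.0, 0.5, 2.0])
x = np.array([0.0, 1.0, -1.0])
a, b = 2.0, -3.0

# (av + bw) x x  should equal  a(v x x) + b(w x x)
lhs = np.cross(a * v + b * w, x)
rhs = a * np.cross(v, x) + b * np.cross(w, x)
assert np.allclose(lhs, rhs)
```

Of course a numeric check on a few samples is no substitute for the proof above, but it's a cheap way to catch a sign error in your algebra.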

5. ## Re: The cross product as a linear transformation and finding its matrix

a little more simply:

let $\displaystyle v_1, v_2, v_3$ be real numbers (so that $\displaystyle v = (v_1,v_2,v_3)$ is a vector in $\displaystyle \mathbb{R}^3$). let $\displaystyle x = (x_1,x_2,x_3)$ be a vector in $\displaystyle \mathbb{R}^3$.

then:

$\displaystyle \begin{bmatrix}0&-v_3&v_2\\v_3&0&-v_1\\-v_2&v_1&0 \end{bmatrix} \begin{bmatrix}x_1\\x_2\\x_3 \end{bmatrix} = v \times x$.

this shows that the mapping $\displaystyle x \mapsto v \times x$ is a linear mapping (multiplication by a matrix is always linear).

it also has the advantage of giving us the matrix right there.
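if you want to check that matrix against numpy's built-in cross product, here's a small sketch (the function name `cross_matrix` is just an illustrative choice):

```python
import numpy as np

def cross_matrix(v):
    """Skew-symmetric matrix [v]_x with [v]_x @ x == v x x."""
    v1, v2, v3 = v
    return np.array([[0.0, -v3,  v2],
                     [ v3, 0.0, -v1],
                     [-v2,  v1, 0.0]])

v = np.array([1.0, 2.0, 3.0])
x = np.array([4.0, -5.0, 6.0])
# matrix-vector product agrees with numpy's cross product
assert np.allclose(cross_matrix(v) @ x, np.cross(v, x))
```

note the matrix is skew-symmetric (its transpose is its negative) - that's the matrix form of (v)x(x) = -(x)x(v).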

6. ## Re: The cross product as a linear transformation and finding its matrix

Thanks guys!!

Deveno - so I guess it's just a question of working backwards? I.e., figuring out what matrix multiplied by (vector x) will equal (v) x (x)? I had to review my matrix multiplication a bit but then it made sense to me.

john - thank you. It is becoming apparent to me that proofs at this level are a weakness of mine. Do you have any suggestions as far as improving my 'proof-proving' ability? Seems like the ability to recognize definitions and apply/substitute them is what's killing me.

7. ## Re: The cross product as a linear transformation and finding its matrix

I can direct you to how to do these "plug & chug" proofs the same way I can direct you to Carnegie Hall... practice, practice, practice.
As an outline, go to the definition to understand exactly what you need to show. Write down what you need to show (and in your final written proof, it never hurts to write down what you're going to show first - to let your reader know what you're intending to do). Then, when you write the proof, start on one side and crank it out until you produce the other side, the final statement.
Finding the proof and writing it aren't always the same. In this problem, I worked backwards, unpacking both sides until I got to where they were equal. If all your steps are reversible, then this is a proof, but it's still not the ideal way to write the proof. Proofs should be as "linear" and straightforward as possible. This issue of the work in finding the answer being different from how you write the answer is sort of like the epsilon-delta proofs of limits in Calculus I (if you did any of those). Your efforts usually go into finding the delta, but in the proof you write, you start (well, you start with a generic positive epsilon) by producing that delta out of the blue, and then show that your delta works to prove the limit.

Some sample problems you might try:
Let $\displaystyle L_1, L_2 : \mathbb{R}^{N_1} \rightarrow \mathbb{R}^{N_2}, M:\mathbb{R}^{N_2} \rightarrow \mathbb{R}^{N_3}, a, b \in \mathbb{R}, \vec{v}, \vec{w}, \vec{z} \in \mathbb{R}^3$,
where the maps $\displaystyle L_1, L_2, M$ are linear maps between vector spaces over $\displaystyle \mathbb{R}$.

Show
1) $\displaystyle aL_1 + bL_2$ is a linear map, where $\displaystyle (aL_1 + bL_2)(\vec{x})$ is defined to equal $\displaystyle aL_1(\vec{x}) + bL_2(\vec{x})$.

2) The composition $\displaystyle M \circ L_2$ is a linear map from $\displaystyle \mathbb{R}^{N_1}$ to $\displaystyle \mathbb{R}^{N_3}$.

3) $\displaystyle \vec{v} \times (\vec{w} \times \vec{z}) = (\vec{v} \cdot \vec{z})\vec{w} - (\vec{v} \cdot \vec{w})\vec{z}$.

Note: #1 and #2 are archetypes for this plug & chug type of proof. They're so straightforward that after you've done a certain number of such problems, it becomes basically effortless - like doing arithmetic. For #3, it's a little crazy to use the original definition of $\displaystyle \times$ - it'll be long and messy, but you can if you want to practice derivations. The mess of showing #3 directly demonstrates some of the super-duper wonderfulness of linearity: since you've proven that $\displaystyle \times$ is linear in each factor, you can just show that the identity is true for all combinations of basis vectors of $\displaystyle \mathbb{R}^3$, and then "by linearity" it's true for all vectors in $\displaystyle \mathbb{R}^3$. This is a very common technique in linear algebra.
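For #3, a numeric spot-check is again a good sanity test before (or after) you wade into the algebra. A sketch with numpy, using arbitrary sample vectors:

```python
import numpy as np

# Spot-check of the triple-product identity in sample problem #3:
#   v x (w x z) = (v . z) w - (v . w) z
v = np.array([1.0, -2.0, 0.5])
w = np.array([3.0, 1.0, -1.0])
z = np.array([0.0, 2.0, 4.0])

lhs = np.cross(v, np.cross(w, z))
rhs = np.dot(v, z) * w - np.dot(v, w) * z
assert np.allclose(lhs, rhs)
```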
