## Norms and vectors

1. The 'T' means transposed.

Consider the four non-zero COLUMN vectors in R^2:
x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

Let xTw=0, yTz=0, A=xyT, B=wzT

Determine:

a) The 2-norm of A
b) The singular value decomposition (SVD) of A
c) The Range space and Nullspace of A
d) The pseudo-inverse of A
e) The singular value decomposition (SVD) of A+B

2. Originally Posted by awaisysf
The 'T' means transposed.

Consider the four non-zero COLUMN vectors in R^2:
x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

Let xTw=0, yTz=0, A=xyT, B=wzT

Determine:

a) The 2-norm of A
b) The singular value decomposition (SVD) of A
c) The Range space and Nullspace of A
d) The pseudo-inverse of A
e) The singular value decomposition (SVD) of A+B

--------------------------------------------------------
$x^Tw =
\begin{bmatrix}x_1 & x_2\end{bmatrix}
\begin{bmatrix}w_1 \\ w_2 \end{bmatrix} =
x_1 w_1 + x_2 w_2 = 0
$

$y^Tz =
\begin{bmatrix}y_1 & y_2 \end{bmatrix}
\begin{bmatrix}z_1 \\ z_2 \end{bmatrix} =
y_1 z_1 + y_2 z_2 = 0
$

$A = xy^T =
\begin{bmatrix}x_1 \\ x_2 \end{bmatrix}
\begin{bmatrix}y_1 & y_2 \end{bmatrix} =
\begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2\end{bmatrix}
$

$B = wz^T =
\begin{bmatrix}w_1 \\ w_2 \end{bmatrix}
\begin{bmatrix}z_1 & z_2 \end{bmatrix} =
\begin{bmatrix}w_1 z_1 & w_1 z_2 \\ w_2 z_1 & w_2 z_2\end{bmatrix}
$

PART A:

$\vert\vert A \vert\vert_2 = \sqrt{\lambda_{max}(A^H A)}$, where $A^H$ is the conjugate transpose of $A$.

$A = \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2\end{bmatrix}$

$A^T = \begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2\end{bmatrix}$

$A^H \equiv \bar{A}^T = A^T =\begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2\end{bmatrix}$

(Note that since each of the entries in $A^T$ is real, $A^H$ can simply be written here as $A^T$.)

-------
EDIT: I feel like I owe more of an explanation for why $\bar{A}^T = A^T$ in this problem. $\bar{A}^T$ is called the conjugate transpose of $A$ because each entry in $\bar{A}^T$ is the complex conjugate (remember from algebra?) of its respective entry (i.e., same location in the matrix) in $A^T$. In our case, since $x_1, x_2, y_1, y_2 \in \mathbb{R}$, the entries in $A^T$ and $\bar{A}^T$ are identical; $\bar{A}^T = A^T$.
--------

So,

$A^H A = A^T A =\begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2\end{bmatrix} \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2\end{bmatrix} =
\begin{bmatrix}x_1^2 y_1^2 + x_2^2 y_1^2 &
x_1^2 y_1 y_2 + x_2^2 y_1 y_2 \\
x_1^2 y_1 y_2 + x_2^2 y_1 y_2 &
x_1^2 y_2^2 + x_2^2 y_2^2 \end{bmatrix} =
$
$
\begin{bmatrix} y_1^2 (x_1^2 + x_2^2) &
y_1 y_2 (x_1^2 + x_2^2) \\
y_1 y_2 (x_1^2 + x_2^2) &
y_2^2 (x_1^2 + x_2^2) \end{bmatrix}$
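As a sanity check on that factorization (my own addition, using SymPy, not part of the original derivation):

```python
import sympy as sp

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2', real=True)
x = sp.Matrix([x1, x2])
y = sp.Matrix([y1, y2])

A = x * y.T                          # A = x y^T
AtA = (A.T * A).expand()             # A^H A = A^T A since the entries are real

s = x1**2 + x2**2                    # the common factor pulled out above
claimed = sp.Matrix([[y1**2 * s, y1*y2 * s],
                     [y1*y2 * s, y2**2 * s]])

assert (AtA - claimed).expand() == sp.zeros(2, 2)
```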

Now we compute the characteristic polynomial; its roots are the eigenvalues we need.

$\det(A^H A - \lambda I) =
\det\begin{bmatrix} y_1^2 (x_1^2 + x_2^2) - \lambda&
y_1 y_2 (x_1^2 + x_2^2) \\ y_1 y_2 (x_1^2 + x_2^2) &
y_2^2 (x_1^2 + x_2^2) - \lambda \end{bmatrix} = 0$

$\implies
\big{(}y_1^2 (x_1^2 + x_2^2) - \lambda \big{)}
\big{(}y_2^2 (x_1^2 + x_2^2) - \lambda \big{)} -
\big{(}y_1 y_2 (x_1^2 + x_2^2) \big{)}
\big{(}y_1 y_2 (x_1^2 + x_2^2) \big{)} = 0
$

$\implies
y_1^2 y_2^2 (x_1^2 + x_2^2)^2 - \lambda y_1^2(x_1^2 + x_2^2) -
\lambda y_2^2(x_1^2 + x_2^2) + \lambda^2 -
y_1^2 y_2^2 (x_1^2 + x_2^2)^2 = 0$

$\implies
-\lambda y_1^2(x_1^2 + x_2^2) - \lambda y_2^2(x_1^2 + x_2^2) + \lambda^2 = 0$

$\implies
\lambda^2 - \lambda y_1^2 (x_1^2 + x_2^2) -
\lambda y_2^2 (x_1^2 + x_2^2) = 0
$

$\implies
\lambda^2 - \lambda \big{(} y_1^2 (x_1^2 + x_2^2) + y_2^2 (x_1^2 + x_2^2) \big{)} = 0
$

$\implies
\big{(}\lambda \big{)} \bigg{(} \lambda - \big{(} y_1^2 (x_1^2 + x_2^2) + y_2^2 (x_1^2 + x_2^2) \big{)} \bigg{)} = 0
$

$\implies
\big{(}\lambda \big{)} \big{(} \lambda - (y_1^2 + y_2^2)
(x_1^2 + x_2^2)\big{)} = 0
$

So,

$\lambda_1 = 0,$
$\lambda_2 = (x_1^2 + x_2^2) (y_1^2 + y_2^2)$

The greater of these two roots is, of course, $\lambda_2 = (x_1^2 + x_2^2) (y_1^2 + y_2^2)$: since $x$ and $y$ are given as non-zero vectors, each factor is a sum of squares that cannot vanish, so $\lambda_2$ is strictly positive.

-------------------

So FINALLY, we compute the 2-norm of $A$:

$\vert\vert A \vert\vert_2 = \sqrt{\lambda_{max}(A^H A)} = \sqrt{(x_1^2 + x_2^2) (y_1^2 + y_2^2)}$
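A quick numerical spot-check of this answer, with example numbers of my own choosing:

```python
import numpy as np

x = np.array([1.0, 2.0])            # my own example values
y = np.array([3.0, 1.0])
A = np.outer(x, y)

lhs = np.linalg.norm(A, 2)          # spectral (2-)norm: largest singular value
rhs = np.sqrt((x @ x) * (y @ y))    # sqrt((x1^2+x2^2)(y1^2+y2^2))
assert np.isclose(lhs, rhs)
```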

This problem is the pinnacle of tedium.
-Andy

3. PART B:

The SVD of an $m$ by $n$ matrix is written $A_{mn} = U_{mm} S_{mn} V_{nn}^T$, where $U^T U = I$, $V^T V = I$, the columns of $U$ are orthonormal eigenvectors of $AA^T$, the columns of $V$ are orthonormal eigenvectors of $A^T A$, and $S$ is a diagonal matrix containing the square roots of the (shared) eigenvalues of $AA^T$ and $A^T A$ in descending order.

$A = \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2 \end{bmatrix}$

$A^T = \begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2 \end{bmatrix}$

$A^T A = \begin{bmatrix} y_1^2 (x_1^2 + x_2^2) &
y_1 y_2 (x_1^2 + x_2^2) \\ y_1 y_2 (x_1^2 + x_2^2) &
y_2^2 (x_1^2 + x_2^2) \end{bmatrix}$

$AA^T =
\begin{bmatrix} x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2 \end{bmatrix}
\begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2 \end{bmatrix} =
\begin{bmatrix}
x_1^2 y_1^2 + x_1^2 y_2^2 &
x_1 x_2 y_1^2 + x_1 x_2 y_2^2 \\
x_1 x_2 y_1^2 + x_1 x_2 y_2^2 &
x_2^2 y_1^2 + x_2^2 y_2^2 \end{bmatrix} =$
$\begin{bmatrix} x_1^2 (y_1^2 + y_2^2) & x_1 x_2 (y_1^2 + y_2^2) \\
x_1 x_2 (y_1^2 + y_2^2) & x_2^2 (y_1^2 + y_2^2) \end{bmatrix}$

EIGENVECTORS of $AA^T$:

$\begin{bmatrix}
x_1^2 (y_1^2 + y_2^2) & x_1 x_2 (y_1^2 + y_2^2) \\
x_1 x_2 (y_1^2 + y_2^2) & x_2^2 (y_1^2 + y_2^2) \end{bmatrix}
\begin{bmatrix} \alpha \\ \beta \end{bmatrix} =
\lambda \begin{bmatrix} \alpha \\ \beta \end{bmatrix}$

(Eq. 1) $x_1^2 (y_1^2 + y_2^2)\alpha +
x_1 x_2 (y_1^2 + y_2^2)\beta = \lambda \alpha$

(Eq. 2) $x_1 x_2 (y_1^2 + y_2^2)\alpha +
x_2^2 (y_1^2 + y_2^2)\beta = \lambda \beta$

Rearranging (Eq. 1):

$x_1^2 (y_1^2 + y_2^2) \alpha +
x_1 x_2 (y_1^2 + y_2^2) \beta - \lambda \alpha = 0$

$\implies \bigg{(}\big{(} x_1^2 (y_1^2 + y_2^2)\big{)}-\lambda\bigg{)}\alpha + x_1 x_2 (y_1^2 + y_2^2)\beta = 0$

Rearranging (Eq. 2):

$x_1 x_2 (y_1^2 + y_2^2)\alpha +
x_2^2 (y_1^2 + y_2^2)\beta - \lambda \beta = 0$

$\implies x_1 x_2 (y_1^2 + y_2^2)\alpha +
\bigg{(}\big{(} x_2^2 (y_1^2 + y_2^2)\big{)}
- \lambda \bigg{)}\beta = 0$

Now we solve for $\lambda$ by setting the determinant of the coefficient matrix equal to 0:

$
\det \begin{bmatrix}
\big{(} x_1^2 (y_1^2 + y_2^2)\big{)}-\lambda &
x_1 x_2 (y_1^2 + y_2^2) \\
x_1 x_2 (y_1^2 + y_2^2) &
\big{(} x_2^2 (y_1^2 + y_2^2)\big{)} - \lambda
\end{bmatrix} = 0$

We now set up the characteristic equation. Solving for $\lambda$ will give us our eigenvalues.

$
\bigg{(} \big{(} x_1^2 (y_1^2 + y_2^2) \big{)} - \lambda \bigg{)}
\bigg{(} \big{(} x_2^2 (y_1^2 + y_2^2)\big{)} - \lambda \bigg{)} -
\big{(} x_1 x_2 (y_1^2 + y_2^2) \big{)}
\big{(} x_1 x_2 (y_1^2 + y_2^2) \big{)}
= 0$

$\implies
x_1^2 x_2^2 (y_1^2 + y_2^2)^2
- \lambda x_1^2 (y_1^2 + y_2^2)
- \lambda x_2^2 (y_1^2 + y_2^2)
+ \lambda^2
- x_1^2 x_2^2 (y_1^2 + y_2^2)^2
= 0$

$\implies
\lambda^2 - \lambda x_1^2 (y_1^2 + y_2^2) -
\lambda x_2^2 (y_1^2 + y_2^2) = 0
$

$\implies
\lambda^2 - \lambda \big{(} x_1^2 (y_1^2 + y_2^2) +
x_2^2 (y_1^2 + y_2^2)\big{)} = 0$

$\implies
\lambda^2 - \lambda \big{(}(x_1^2 + x_2^2)(y_1^2 + y_2^2)\big{)} = 0
$

$\implies \big{(}\lambda\big{)}\big{(}\lambda - (x_1^2 + x_2^2)(y_1^2 + y_2^2)\big{)} = 0$

Therefore, the eigenvalues of $AA^T$ are:

$\lambda_1 = 0$
$\lambda_2 = (x_1^2 + x_2^2)(y_1^2 + y_2^2)$

Now we plug $\lambda$ back into the original equations to get our eigenvectors.

For $\lambda = 0$, we get

$x_1^2 (y_1^2 + y_2^2)\alpha + x_1 x_2 (y_1^2 + y_2^2)\beta = 0$

$x_1 x_2 (y_1^2 + y_2^2)\alpha +
x_2^2 (y_1^2 + y_2^2)\beta =0$

An appropriate choice is $\alpha = x_2$ and $\beta = -x_1$. Thus, we have the eigenvector $[x_2, -x_1]$ corresponding to the eigenvalue $\lambda = 0$.

For $\lambda = (x_1^2 + x_2^2)(y_1^2 + y_2^2)$, we have

$\bigg{(}\big{(} x_1^2 (y_1^2 + y_2^2)\big{)}-(x_1^2 + x_2^2)(y_1^2 + y_2^2)\bigg{)}\alpha + x_1 x_2 (y_1^2 + y_2^2)\beta = 0$

$x_1 x_2 (y_1^2 + y_2^2)\alpha +
\bigg{(}\big{(} x_2^2 (y_1^2 + y_2^2)\big{)}
- (x_1^2 + x_2^2)(y_1^2 + y_2^2) \bigg{)}\beta =0$

In both equations, we can cancel out a common factor, namely, $(y_1^2 + y_2^2)$. The equations become

$\bigg{(}\big{(} x_1^2 \big{)}-(x_1^2 + x_2^2)\bigg{)}\alpha + x_1 x_2 \beta = (-x_2^2)\alpha + x_1 x_2 \beta = 0$

$x_1 x_2 \alpha +
\bigg{(}\big{(} x_2^2 \big{)}
- (x_1^2 + x_2^2) \bigg{)}\beta =
x_1 x_2 \alpha - x_1^2 \beta = 0$

We now want to find appropriate values for $\alpha$ and $\beta$, using the equations.

$(-x_2^2)\alpha + x_1 x_2 \beta = x_1 x_2 \beta - x_2^2 \alpha =
x_2 (x_1 \beta - x_2 \alpha ) = 0 \implies (x_1 \beta - x_2 \alpha ) = 0$

$x_1 x_2 \alpha - x_1^2 \beta = x_1 (x_2 \alpha - x_1 \beta) =0 \implies (x_2 \alpha - x_1 \beta) = 0$

Since both equations reduce to $x_2 \alpha - x_1 \beta = 0$, we can take

$\alpha = x_1$
$\beta = x_2$.

Thus, we have the eigenvector $[x_1, x_2]$ corresponding to the eigenvalue $\lambda = (x_1^2 + x_2^2)(y_1^2 + y_2^2)$.
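A quick numerical check of both eigenpairs found so far (example values are my own):

```python
import numpy as np

x = np.array([1.0, 2.0])            # my own example values
y = np.array([3.0, 1.0])
A = np.outer(x, y)
AAt = A @ A.T

lam = (x @ x) * (y @ y)             # (x1^2+x2^2)(y1^2+y2^2)
assert np.allclose(AAt @ x, lam * x)                      # eigenpair (lam, [x1, x2])
assert np.allclose(AAt @ np.array([x[1], -x[0]]), 0.0)    # eigenpair (0, [x2, -x1])
```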

[more to go]

4. Originally Posted by awaisysf
The 'T' means transposed.

Consider the four non-zero COLUMN vectors in R^2:
x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

Let xTw=0, yTz=0, A=xyT, B=wzT

Determine:

a) The 2-norm of A
There is an easier way, using the definition directly. I will take all norms here to be 2-norms.

Definition: $||A|| = \sup_{z \neq 0} \, \frac{||Az||}{||z||}$, where the supremum is taken over all non-zero vectors $z$.

Proof: Using the definition, $\frac{||xy^Tz||}{||z||} = \left|y^T\left(\frac{z}{||z||}\right)\right| \, ||x||$

We have to maximise the right-hand side, and this is done by choosing $z$ along $y$, i.e. $z = \alpha y$ for some real $\alpha$. Substituting $z$ in,
$||A|| = \sup \, \left|y^T\left(\frac{z}{||z||}\right)\right| \, ||x|| = ||y|| \, ||x||$
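Here is a small numerical illustration of the sup argument (sampling random directions is my own check, not a proof; the example vectors are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0])            # my own example values
y = np.array([3.0, 1.0])
A = np.outer(x, y)

# The ratio ||Az||/||z|| over random directions never exceeds ||x|| ||y||,
# and it is attained when z points along y.
ratios = [np.linalg.norm(A @ v) / np.linalg.norm(v)
          for v in rng.normal(size=(10000, 2))]
bound = np.linalg.norm(x) * np.linalg.norm(y)

assert max(ratios) <= bound + 1e-9
assert np.isclose(np.linalg.norm(A @ y) / np.linalg.norm(y), bound)
```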

5. Originally Posted by awaisysf
The 'T' means transposed.

Consider the four non-zero COLUMN vectors in R^2:
x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

Let xTw=0, yTz=0, A=xyT, B=wzT

Determine:
b) The singular value decomposition (SVD) of A
To obtain the SVD of $A$, it suffices to obtain orthonormal eigenvectors of $AA^T$ and $A^TA$.

$AA^T = (xy^T)(yx^T) = ||y||^2 xx^T$
$A^TA = (yx^T)(xy^T) = ||x||^2 yy^T$

By observation it is clear that $x$ is an eigenvector of $AA^T$, with eigenvalue $||x||^2||y||^2$. Since the rank of $xx^T$ is 1, the other eigenvalue of $AA^T$ is 0, and $w$ is an eigenvector corresponding to the 0 eigenvalue. Also, $x$ and $w$ are orthogonal. So we have obtained our spectral decomposition of $AA^T$.

Let $U = \left[ \begin{array}{cc}\frac{x}{||x||} & \frac{w}{||w||}\end{array}\right]$ and $\Sigma = \left[ \begin{array}{cc}||x||\, ||y|| & 0 \\ 0 & 0\end{array}\right]$

So $AA^T = U\Sigma^2U^T$. Similarly $A^TA = V\Sigma^2V^T$ where V can be computed similarly.

Exercise: Compute V

So $A = U \Sigma V^T$ is the SVD of A.
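A numerical check of this construction (example vectors are my own choice; fair warning, the $V$ line spoils the exercise above):

```python
import numpy as np

x = np.array([1.0, 2.0]); w = np.array([-2.0, 1.0])   # x^T w = 0
y = np.array([3.0, 1.0]); z = np.array([1.0, -3.0])   # y^T z = 0
A = np.outer(x, y)

U = np.column_stack([x / np.linalg.norm(x), w / np.linalg.norm(w)])
V = np.column_stack([y / np.linalg.norm(y), z / np.linalg.norm(z)])  # the exercise
S = np.diag([np.linalg.norm(x) * np.linalg.norm(y), 0.0])

assert np.allclose(U @ S @ V.T, A)               # A = U Sigma V^T
assert np.allclose(A @ A.T, U @ S**2 @ U.T)      # AA^T = U Sigma^2 U^T
```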

6. Originally Posted by awaisysf
The 'T' means transposed.

Consider the four non-zero COLUMN vectors in R^2:
x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

Let xTw=0, yTz=0, A=xyT, B=wzT

Determine:

c) The Range space and Nullspace of A
The nullspace of $A$ is $\text{span}(z)$: since $y^Tz = 0$, we get $Az = x(y^Tz) = 0$, and $A$ has rank 1, so the nullspace is one-dimensional. (The previous post showed that $w$ spans the nullspace of $A^T$.) Compute the range space as an exercise.

d) The pseudo-inverse of A
Since you know the SVD of A, compute the pseudo-inverse the way it is done here.
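For concreteness, a minimal sketch of that recipe, $A^+ = V \Sigma^+ U^T$ with $\Sigma^+$ inverting only the non-zero singular value, checked against NumPy's `pinv` (example vectors are my own):

```python
import numpy as np

x = np.array([1.0, 2.0]); w = np.array([-2.0, 1.0])   # x^T w = 0
y = np.array([3.0, 1.0]); z = np.array([1.0, -3.0])   # y^T z = 0
A = np.outer(x, y)

U = np.column_stack([x / np.linalg.norm(x), w / np.linalg.norm(w)])
V = np.column_stack([y / np.linalg.norm(y), z / np.linalg.norm(z)])
sigma = np.linalg.norm(x) * np.linalg.norm(y)
S_pinv = np.diag([1.0 / sigma, 0.0])    # invert only the non-zero singular value

A_pinv = V @ S_pinv @ U.T               # reduces to y x^T / (||x||^2 ||y||^2)
assert np.allclose(A_pinv, np.linalg.pinv(A))
```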

e) The singular value decomposition (SVD) of A+B
It is an easy exercise to see that $(A+B)(A+B)^T = AA^T + BB^T$ for the given conditions.

Then you can show by a simple manipulation that $A+B = U ( \Sigma + \tilde{\Sigma} ) V^T$, where $\tilde{\Sigma} = \left[ \begin{array}{cc}0 & 0 \\ 0 & ||z||\, ||w||\end{array}\right]$ and the other parameters are the same as those computed in post #5.
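And a quick numerical check of both claims in this post (same example vectors as before, my own choice):

```python
import numpy as np

x = np.array([1.0, 2.0]); w = np.array([-2.0, 1.0])   # x^T w = 0
y = np.array([3.0, 1.0]); z = np.array([1.0, -3.0])   # y^T z = 0
A, B = np.outer(x, y), np.outer(w, z)

U = np.column_stack([x / np.linalg.norm(x), w / np.linalg.norm(w)])
V = np.column_stack([y / np.linalg.norm(y), z / np.linalg.norm(z)])
S = np.diag([np.linalg.norm(x) * np.linalg.norm(y), 0.0])
S_tilde = np.diag([0.0, np.linalg.norm(w) * np.linalg.norm(z)])

assert np.allclose((A + B) @ (A + B).T, A @ A.T + B @ B.T)   # cross terms vanish
assert np.allclose(U @ (S + S_tilde) @ V.T, A + B)           # SVD-style form of A+B
```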