
Thread: norms and vectors

  1. #1
    Newbie
    Joined
    Dec 2009
    Posts
    9

    norms and vectors

    The 'T' means transposed.

    Consider the four non-zero COLUMN vectors in R^2:
    x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

    Let xTw=0, yTz=0, A=xyT, B=wzT

    Determine:

    a) The 2-norm of A
    b) The singular value decomposition (SVD) of A
    c) The Range space and Nullspace of A
    d) The pseudo-inverse of A
    e) The singular value decomposition (SVD) of A+B

  2. #2
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    339
    Thanks
    46
    The 'T' means transposed.

    Consider the four non-zero COLUMN vectors in R^2:
    x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

    Let xTw=0, yTz=0, A=xyT, B=wzT

    Determine:

    a) The 2-norm of A
    b) The singular value decomposition (SVD) of A
    c) The Range space and Nullspace of A
    d) The pseudo-inverse of A
    e) The singular value decomposition (SVD) of A+B

    --------------------------------------------------------
    $\displaystyle x^Tw =
    \begin{bmatrix}x_1 & x_2\end{bmatrix}
    \begin{bmatrix}w_1 \\ w_2 \end{bmatrix} =
    \begin{bmatrix}x_1 w_1 + x_2 w_2\end{bmatrix}=0
    $

    $\displaystyle y^Tz =
    \begin{bmatrix}y_1 & y_2 \end{bmatrix}
    \begin{bmatrix}z_1 \\ z_2 \end{bmatrix} =
    \begin{bmatrix}y_1 z_1 + y_2 z_2 \end{bmatrix} =0
    $

    $\displaystyle A = xy^T =
    \begin{bmatrix}x_1 \\ x_2 \end{bmatrix}
    \begin{bmatrix}y_1 & y_2 \end{bmatrix} =
    \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2\end{bmatrix}
    $

    $\displaystyle B = wz^T =
    \begin{bmatrix}w_1 \\ w_2 \end{bmatrix}
    \begin{bmatrix}z_1 & z_2 \end{bmatrix} =
    \begin{bmatrix}w_1 z_1 & w_1 z_2 \\ w_2 z_1 & w_2 z_2\end{bmatrix}
    $


    PART A:

    $\displaystyle \vert\vert A \vert\vert_2 = \sqrt{\lambda_{max}(A^H A)}$, where $\displaystyle A^H $ is the conjugate transpose of $\displaystyle A $.

    $\displaystyle A = \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2\end{bmatrix} $

    $\displaystyle A^T = \begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2\end{bmatrix} $

    $\displaystyle A^H \equiv \bar{A}^T = A^T =\begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2\end{bmatrix} $

    (Note that since each of the entries in $\displaystyle A^T \text{ is real, } A^H \text{ can simply be written here as }A^T $).

    -------
    EDIT: I feel like I owe more of an explanation for why $\displaystyle \bar{A}^T = A^T $ in this problem. $\displaystyle \bar{A}^T $ is called the conjugate transpose of $\displaystyle A $ because each entry in $\displaystyle \bar{A}^T $ is the complex conjugate (remember from algebra?) of the corresponding entry (i.e., same location in the matrix) of $\displaystyle A^T $. In our case, since $\displaystyle x_1,x_2,y_1,y_2 \in \mathbb{R} $, the entries in $\displaystyle A^T $ and $\displaystyle \bar{A}^T $ are identical; $\displaystyle \bar{A}^T = A^T $.
    --------


    So,

    $\displaystyle A^H A = A^T A =\begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2\end{bmatrix} \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2\end{bmatrix} =
    \begin{bmatrix}x_1^2 y_1^2 + x_2^2 y_1^2 &
    x_1^2 y_1 y_2 + x_2^2 y_1 y_2 \\
    x_1^2 y_1 y_2 + x_2^2 y_1 y_2 &
    x_1^2 y_2^2 + x_2^2 y_2^2 \end{bmatrix} =
    $ $\displaystyle
    \begin{bmatrix} y_1^2 (x_1^2 + x_2^2) &
    y_1 y_2 (x_1^2 + x_2^2) \\
    y_1 y_2 (x_1^2 + x_2^2) &
    y_2^2 (x_1^2 + x_2^2) \end{bmatrix}$
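
    A quick symbolic sanity check of that factorization (my own aside, not part of the original working): the matrix above is exactly $\displaystyle A^H A = A^T A = (x_1^2 + x_2^2)\,yy^T $. The SymPy sketch below treats $\displaystyle x_1, x_2, y_1, y_2 $ as real symbols and confirms the difference is the zero matrix.

    Code:
    import sympy as sp

    x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2', real=True)
    x = sp.Matrix([x1, x2])            # column vector x
    y = sp.Matrix([y1, y2])            # column vector y

    A = x * y.T                        # A = x y^T (rank-one 2x2)
    difference = sp.simplify(A.T * A - (x1**2 + x2**2) * (y * y.T))
    print(difference)                  # prints the 2x2 zero matrix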


    Now we compute the characteristic polynomial of $\displaystyle A^H A $; its roots are the eigenvalues we need.


    $\displaystyle \det(A^H A - \lambda I) =
    \det\begin{bmatrix} y_1^2 (x_1^2 + x_2^2) - \lambda&
    y_1 y_2 (x_1^2 + x_2^2) \\ y_1 y_2 (x_1^2 + x_2^2) &
    y_2^2 (x_1^2 + x_2^2) - \lambda \end{bmatrix} = 0 $

    $\displaystyle \implies
    \big{(}y_1^2 (x_1^2 + x_2^2) - \lambda \big{)}
    \big{(}y_2^2 (x_1^2 + x_2^2) - \lambda \big{)} -
    \big{(}y_1 y_2 (x_1^2 + x_2^2) \big{)}
    \big{(}y_1 y_2 (x_1^2 + x_2^2) \big{)} = 0
    $

    $\displaystyle \implies
    y_1^2 y_2^2 (x_1^2 + x_2^2)^2 - \lambda y_1^2(x_1^2 + x_2^2) -
    \lambda y_2^2(x_1^2 + x_2^2) + \lambda^2 -
    y_1^2 y_2^2 (x_1^2 + x_2^2)^2 = 0 $

    $\displaystyle \implies
    -\lambda y_1^2(x_1^2 + x_2^2) - \lambda y_2^2(x_1^2 + x_2^2) + \lambda^2 = 0 $

    $\displaystyle \implies
    \lambda^2 - \lambda y_1^2 (x_1^2 + x_2^2) -
    \lambda y_2^2 (x_1^2 + x_2^2) = 0
    $


    $\displaystyle \implies
    \lambda^2 - \lambda \big{(} y_1^2 (x_1^2 + x_2^2) + y_2^2 (x_1^2 + x_2^2)\big{)} = 0
    $


    $\displaystyle \implies
    \big{(}\lambda \big{)} \bigg{(} \lambda - \big{(} y_1^2 (x_1^2 + x_2^2) + y_2^2 (x_1^2 + x_2^2)\big{)} \bigg{)} = 0
    $


    $\displaystyle \implies
    \big{(}\lambda \big{)} \big{(} \lambda - (y_1^2 + y_2^2)
    (x_1^2 + x_2^2)\big{)} = 0
    $


    Sooooooooooo,

    $\displaystyle \lambda_1 = 0, $
    $\displaystyle \lambda_2 = (x_1^2 + x_2^2) (y_1^2 + y_2^2) $

    The greater of these two roots is, of course, $\displaystyle \lambda_2 = (x_1^2 + x_2^2) (y_1^2 + y_2^2) $, which must be positive: x and y are given as non-zero column vectors, so each sum of squares is strictly positive, and so is their product.

    -------------------

    So FINALLY, we compute the 2-norm of $\displaystyle A $:

    $\displaystyle \indent \vert\vert A \vert\vert_2 = \sqrt{\lambda_{max}(A^H A)} = \sqrt{(x_1^2 + x_2^2) (y_1^2 + y_2^2)} $
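
    (Aside, my own addition: a quick numerical check of this result. For arbitrary non-zero x and y, NumPy's matrix 2-norm of $\displaystyle A = xy^T $ should agree with $\displaystyle \sqrt{(x_1^2 + x_2^2)(y_1^2 + y_2^2)} $.)

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(2)               # arbitrary non-zero x
    y = rng.standard_normal(2)               # arbitrary non-zero y

    A = np.outer(x, y)                       # A = x y^T
    print(np.linalg.norm(A, 2))              # 2-norm = largest singular value
    print(np.sqrt((x @ x) * (y @ y)))        # formula above; same value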

    This problem is the pinnacle of tedium.
    -Andy
    Last edited by abender; Dec 16th 2009 at 04:39 PM.

  3. #3
    Senior Member
    Joined
    Mar 2008
    From
    Pennsylvania, USA
    Posts
    339
    Thanks
    46
    PART B:

    The SVD of an $\displaystyle m$ by $\displaystyle n$ matrix is written $\displaystyle A_{mn} = U_{mm} S_{mn} V_{nn}^T $, where $\displaystyle U^T U = I, V^T V = I, $ the columns of $\displaystyle U$ are orthogonal eigenvectors of $\displaystyle AA^T $, the columns of $\displaystyle V$ are orthogonal eigenvectors of $\displaystyle A^T A$, and $\displaystyle S$ is a diagonal matrix containing the square roots of the eigenvalues of $\displaystyle A^T A$ (equivalently, of $\displaystyle AA^T $) in descending order.
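
    (Aside, my own addition: this structure is easy to see numerically. With an arbitrary rank-one $\displaystyle A = xy^T $, NumPy returns exactly the factorization described above, with one singular value equal to $\displaystyle ||x||\,||y|| $ and the other (numerically) zero.)

    Code:
    import numpy as np

    x = np.array([1.0, 2.0])
    y = np.array([3.0, -1.0])
    A = np.outer(x, y)                           # A = x y^T

    U, s, Vt = np.linalg.svd(A)                  # A = U @ diag(s) @ Vt
    print(s)                                     # singular values, descending; second ~ 0
    print(np.allclose(A, U @ np.diag(s) @ Vt))   # True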


    $\displaystyle A = \begin{bmatrix}x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2 \end{bmatrix} $

    $\displaystyle A^T = \begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2 \end{bmatrix} $

    $\displaystyle A^T A = \begin{bmatrix} y_1^2 (x_1^2 + x_2^2) &
    y_1 y_2 (x_1^2 + x_2^2) \\ y_1 y_2 (x_1^2 + x_2^2) &
    y_2^2 (x_1^2 + x_2^2) \end{bmatrix} $

    $\displaystyle AA^T =
    \begin{bmatrix} x_1 y_1 & x_1 y_2 \\ x_2 y_1 & x_2 y_2 \end{bmatrix}
    \begin{bmatrix}x_1 y_1 & x_2 y_1 \\ x_1 y_2 & x_2 y_2 \end{bmatrix} =
    \begin{bmatrix}
    x_1^2 y_1^2 + x_1^2 y_2^2 &
    x_1 x_2 y_1^2 + x_1 x_2 y_2^2 \\
    x_1 x_2 y_1^2 + x_1 x_2 y_2^2 &
    x_2^2 y_1^2 + x_2^2 y_2^2 \end{bmatrix} = $ $\displaystyle \begin{bmatrix} x_1^2 (y_1^2 + y_2^2) & x_1 x_2 (y_1^2 + y_2^2) \\
    x_1 x_2 (y_1^2 + y_2^2) & x_2^2 (y_1^2 + y_2^2) \end{bmatrix}$

    EIGENVECTORS of $\displaystyle AA^T $:

    $\displaystyle \indent \begin{bmatrix}
    x_1^2 (y_1^2 + y_2^2) & x_1 x_2 (y_1^2 + y_2^2) \\
    x_1 x_2 (y_1^2 + y_2^2) & x_2^2 (y_1^2 + y_2^2) \end{bmatrix}
    \begin{bmatrix} \alpha \\ \beta \end{bmatrix} =
    \lambda \begin{bmatrix} \alpha \\ \beta \end{bmatrix} $


    (Eq. 1) $\displaystyle \indent x_1^2 (y_1^2 + y_2^2)\alpha +
    x_1 x_2 (y_1^2 + y_2^2)\beta = \lambda \alpha $
    (Eq. 2) $\displaystyle \indent x_1 x_2 (y_1^2 + y_2^2)\alpha +
    x_2^2 (y_1^2 + y_2^2)\beta = \lambda \beta $


    Rearranging (Eq. 1):

    $\displaystyle
    x_1^2 (y_1^2 + y_2^2) \alpha +
    x_1 x_2 (y_1^2 + y_2^2) \beta - \lambda \alpha = 0 $ $\displaystyle
    \implies \bigg{(}\big{(} x_1^2 (y_1^2 + y_2^2)\big{)}-\lambda\bigg{)}\alpha + x_1 x_2 (y_1^2 + y_2^2)\beta = 0 $


    Rearranging (Eq. 2):

    $\displaystyle x_1 x_2 (y_1^2 + y_2^2)\alpha +
    x_2^2 (y_1^2 + y_2^2)\beta - \lambda \beta =0 $ $\displaystyle
    \implies x_1 x_2 (y_1^2 + y_2^2)\alpha +
    \bigg{(}\big{(} x_2^2 (y_1^2 + y_2^2)\big{)}
    - \lambda \bigg{)}\beta =0 $


    Now we solve for $\displaystyle \lambda $ by setting the determinant of the coefficient matrix equal to 0:

    $\displaystyle
    \det \begin{bmatrix}
    \big{(} x_1^2 (y_1^2 + y_2^2)\big{)}-\lambda &
    x_1 x_2 (y_1^2 + y_2^2) \\
    x_1 x_2 (y_1^2 + y_2^2) &
    \big{(} x_2^2 (y_1^2 + y_2^2)\big{)} - \lambda
    \end{bmatrix} = 0 $


    We now set up the characteristic equation. Solving for $\displaystyle \lambda $ will give us our eigenvalues.


    $\displaystyle
    \bigg{(} \big{(} x_1^2 (y_1^2 + y_2^2) \big{)} - \lambda \bigg{)}
    \bigg{(} \big{(} x_2^2 (y_1^2 + y_2^2)\big{)} - \lambda \bigg{)} -
    \big{(} x_1 x_2 (y_1^2 + y_2^2) \big{)} $ $\displaystyle
    \big{(} x_1 x_2 (y_1^2 + y_2^2) \big{)}
    = 0 $


    $\displaystyle \implies
    x_1^2 x_2^2 (y_1^2 + y_2^2)^2
    - \lambda x_1^2 (y_1^2 + y_2^2)
    - \lambda x_2^2 (y_1^2 + y_2^2)
    + \lambda^2
    - x_1^2 x_2^2 (y_1^2 + y_2^2)^2
    = 0 $


    $\displaystyle \implies
    \lambda^2 - \lambda x_1^2 (y_1^2 + y_2^2) -
    \lambda x_2^2 (y_1^2 + y_2^2) = 0
    $


    $\displaystyle \implies
    \lambda^2 - \lambda \big{(} x_1^2 (y_1^2 + y_2^2) +
    x_2^2 (y_1^2 + y_2^2)\big{)} = 0 $


    $\displaystyle \implies
    \lambda^2 - \lambda \big{(}(x_1^2 + x_2^2)(y_1^2 + y_2^2)\big{)} = 0
    $


    $\displaystyle \implies \big{(}\lambda\big{)}\big{(}\lambda - (x_1^2 + x_2^2)(y_1^2 + y_2^2)\big{)} = 0 $

    Therefore, the eigenvalues of $\displaystyle AA^T$ are:

    $\displaystyle \lambda_1 = 0 $
    $\displaystyle \lambda_2 = (x_1^2 + x_2^2)(y_1^2 + y_2^2) $


    Now we plug $\displaystyle \lambda$ back into the original equations to get our eigenvectors.

    For $\displaystyle \lambda = 0 $, we get

    $\displaystyle x_1^2 (y_1^2 + y_2^2)\alpha + x_1 x_2 (y_1^2 + y_2^2)\beta = 0 $

    $\displaystyle x_1 x_2 (y_1^2 + y_2^2)\alpha +
    x_2^2 (y_1^2 + y_2^2)\beta =0 $

    An appropriate choice is $\displaystyle \alpha = x_2 $,
    and $\displaystyle \beta = -x_1 $. Thus, we have the eigenvector $\displaystyle [x_2, -x_1] $ corresponding to the eigenvalue $\displaystyle \lambda = 0 $


    For $\displaystyle \lambda = (x_1^2 + x_2^2)(y_1^2 + y_2^2) $, we have


    $\displaystyle \bigg{(}\big{(} x_1^2 (y_1^2 + y_2^2)\big{)}-(x_1^2 + x_2^2)(y_1^2 + y_2^2)\bigg{)}\alpha + x_1 x_2 (y_1^2 + y_2^2)\beta = 0 $

    $\displaystyle x_1 x_2 (y_1^2 + y_2^2)\alpha +
    \bigg{(}\big{(} x_2^2 (y_1^2 + y_2^2)\big{)}
    - (x_1^2 + x_2^2)(y_1^2 + y_2^2) \bigg{)}\beta =0 $


    In both equations, we can cancel the common factor $\displaystyle (y_1^2 + y_2^2) $, which is non-zero since $\displaystyle y $ is a non-zero vector. The equations become


    $\displaystyle \bigg{(}\big{(} x_1^2 \big{)}-(x_1^2 + x_2^2)\bigg{)}\alpha + x_1 x_2 \beta = (-x_2^2)\alpha + x_1 x_2 \beta = 0 $


    $\displaystyle x_1 x_2 \alpha +
    \bigg{(}\big{(} x_2^2 \big{)}
    - (x_1^2 + x_2^2) \bigg{)}\beta =
    x_1 x_2 \alpha - x_1^2 \beta = 0$


    We now want to find appropriate values for $\displaystyle \alpha $ and $\displaystyle \beta $, using the equations.


    $\displaystyle (-x_2^2)\alpha + x_1 x_2 \beta = x_1 x_2 \beta - x_2^2 \alpha =
    x_2 (x_1 \beta - x_2 \alpha ) = 0 \implies (x_1 \beta - x_2 \alpha ) = 0 $

    $\displaystyle x_1 x_2 \alpha - x_1^2 \beta = x_1 (x_2 \alpha - x_1 \beta) =0 \implies (x_2 \alpha - x_1 \beta) = 0 $

    Since both equations reduce to $\displaystyle x_2 \alpha - x_1 \beta = 0 $, an appropriate choice is

    $\displaystyle \alpha = x_1 $
    $\displaystyle \beta = x_2 $.


    Thus, we have the eigenvector $\displaystyle [x_1, x_2] $ corresponding to the eigenvalue $\displaystyle \lambda = (x_1^2 + x_2^2)(y_1^2 + y_2^2) $.
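
    (Aside, my own addition: a quick numerical check that $\displaystyle [x_1, x_2] $ and $\displaystyle [x_2, -x_1] $ really are eigenvectors of $\displaystyle AA^T $ with the eigenvalues found above, for arbitrary non-zero x and y.)

    Code:
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)

    A = np.outer(x, y)
    AAT = A @ A.T
    lam = (x @ x) * (y @ y)                  # (x1^2 + x2^2)(y1^2 + y2^2)

    v1 = np.array([x[0], x[1]])              # claimed eigenvector for lambda_2
    v0 = np.array([x[1], -x[0]])             # claimed eigenvector for lambda_1 = 0

    print(np.allclose(AAT @ v1, lam * v1))   # True
    print(np.allclose(AAT @ v0, 0))          # True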

    [more to go]
    Last edited by abender; Dec 16th 2009 at 09:20 PM.

  4. #4
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by awaisysf View Post
    The 'T' means transposed.

    Consider the four non-zero COLUMN vectors in R^2:
    x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

    Let xTw=0, yTz=0, A=xyT, B=wzT

    Determine:

    a) The 2-norm of A
    There is an easier way using the definition directly:
    I will assume all norms here as 2 norms.

    Definition: $\displaystyle ||A|| = \text{sup} \, \frac{||Az||}{||z||}$, where the vector $\displaystyle z$ ranges over all non-zero vectors.

    Proof: Using the definition, $\displaystyle \frac{||xy^Tz||}{||z||} = \left|y^T\left(\frac{z}{||z||}\right)\right|||x||$.

    We have to maximise the right-hand side, and this is done by choosing $\displaystyle z$ along $\displaystyle y$, i.e. $\displaystyle z = \alpha y$ for some real $\displaystyle \alpha$. Substituting $\displaystyle z$ in,
    $\displaystyle ||A|| = \text{sup} \, \left|y^T\left(\frac{z}{||z||}\right)\right| ||x|| = ||y|| \, ||x||$
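
    (Aside, my own addition: the sup can also be seen numerically. Sweeping unit vectors z around the circle, the maximum of $\displaystyle ||Az|| $ comes out at $\displaystyle ||x||\,||y|| $, attained when z is parallel to y.)

    Code:
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)
    A = np.outer(x, y)

    theta = np.linspace(0, 2 * np.pi, 100000)
    Z = np.stack([np.cos(theta), np.sin(theta)], axis=1)   # unit vectors z
    ratios = np.linalg.norm(Z @ A.T, axis=1)               # ||A z|| for each unit z

    print(ratios.max())                                    # ~ ||x|| * ||y||
    print(np.linalg.norm(x) * np.linalg.norm(y))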

  5. #5
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by awaisysf View Post
    The 'T' means transposed.

    Consider the four non-zero COLUMN vectors in R^2:
    x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

    Let xTw=0, yTz=0, A=xyT, B=wzT

    Determine:
    b) The singular value decomposition (SVD) of A
    To obtain the SVD of A, it suffices first to obtain orthogonal eigenvectors of $\displaystyle AA^T$ and $\displaystyle A^TA$.

    $\displaystyle AA^T = (xy^T)(yx^T) = ||y||^2 xx^T$
    $\displaystyle A^TA = (yx^T)(xy^T) = ||x||^2 yy^T$

    By observation it is clear that $\displaystyle x$ is an eigenvector of $\displaystyle AA^T$, with eigenvalue $\displaystyle ||x||^2||y||^2$. Since the rank of $\displaystyle xx^T$ is 1, the other eigenvalue of $\displaystyle AA^T$ is 0, and $\displaystyle w$ is an eigenvector corresponding to that 0 eigenvalue. Also, $\displaystyle x$ and $\displaystyle w$ are orthogonal. So we have obtained the spectral decomposition of $\displaystyle AA^T$.

    Let $\displaystyle U = \left[ \begin{array}{cc}\frac{x}{||x||} & \frac{w}{||w||}\end{array}\right]$ and $\displaystyle \Sigma = \left[ \begin{array}{cc}||x||\, ||y|| & 0 \\ 0 & 0\end{array}\right]$

    So $\displaystyle AA^T = U\Sigma^2U^T$. Similarly $\displaystyle A^TA = V\Sigma^2V^T$ where V can be computed similarly.

    Exercise: Compute V

    So $\displaystyle A = U \Sigma V^T$ is the SVD of A.
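
    (Aside, my own addition: assembling the pieces numerically, with $\displaystyle V = \left[ \begin{array}{cc}\frac{y}{||y||} & \frac{z}{||z||}\end{array}\right] $ as one natural answer to the exercise, and checking $\displaystyle A = U\Sigma V^T $. The second columns of U and V are multiplied by the zero singular value, so their sign choices do not matter here.)

    Code:
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)
    w = np.array([x[1], -x[0]])              # some non-zero w with x^T w = 0
    z = np.array([y[1], -y[0]])              # some non-zero z with y^T z = 0

    A = np.outer(x, y)
    U = np.column_stack([x / np.linalg.norm(x), w / np.linalg.norm(w)])
    V = np.column_stack([y / np.linalg.norm(y), z / np.linalg.norm(z)])
    Sigma = np.diag([np.linalg.norm(x) * np.linalg.norm(y), 0.0])

    print(np.allclose(A, U @ Sigma @ V.T))   # True: a valid SVD of A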

  6. #6
    Lord of certain Rings
    Isomorphism's Avatar
    Joined
    Dec 2007
    From
    IISc, Bangalore
    Posts
    1,465
    Thanks
    6
    Quote Originally Posted by awaisysf View Post
    The 'T' means transposed.

    Consider the four non-zero COLUMN vectors in R^2:
    x=[x1, x2]; y=[y1, y2]; w=[w1, w2]; z=[z1, z2]

    Let xTw=0, yTz=0, A=xyT, B=wzT

    Determine:

    c) The Range space and Nullspace of A
    As seen in the previous post, the nullspace of A is span(w). Compute the range space as an exercise.
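
    (Aside, my own addition: a quick numerical spot-check of the nullspace claim, plus a peek at the range — every column of A is a scalar multiple of x. Here w is taken as $\displaystyle [x_2, -x_1] $, one concrete vector orthogonal to x.)

    Code:
    import numpy as np

    rng = np.random.default_rng(4)
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)
    w = np.array([x[1], -x[0]])              # x^T w = 0

    A = np.outer(x, y)
    print(np.allclose(A @ w, 0))             # True: w spans the nullspace

    def parallel(u, v):
        # the 2x2 determinant vanishes iff u and v are linearly dependent
        return np.isclose(u[0] * v[1] - u[1] * v[0], 0.0)

    print(parallel(A[:, 0], x), parallel(A[:, 1], x))   # True True: columns lie in span(x)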

    d) The pseudo-inverse of A
    Since you know the SVD of A, compute the pseudo-inverse the way it is done here.
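
    (Aside, my own addition: a sketch of that computation. With the SVD from post #5, $\displaystyle A^+ = V\Sigma^+U^T $, where $\displaystyle \Sigma^+ $ inverts only the non-zero singular value; for this rank-one A it collapses to $\displaystyle A^+ = \frac{yx^T}{||x||^2\,||y||^2} $.)

    Code:
    import numpy as np

    rng = np.random.default_rng(5)
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)
    w = np.array([x[1], -x[0]])              # x^T w = 0
    z = np.array([y[1], -y[0]])              # y^T z = 0

    A = np.outer(x, y)
    U = np.column_stack([x / np.linalg.norm(x), w / np.linalg.norm(w)])
    V = np.column_stack([y / np.linalg.norm(y), z / np.linalg.norm(z)])
    Sigma_plus = np.diag([1.0 / (np.linalg.norm(x) * np.linalg.norm(y)), 0.0])

    A_pinv = V @ Sigma_plus @ U.T            # pseudo-inverse via the SVD
    print(np.allclose(A_pinv, np.linalg.pinv(A)))                     # True
    print(np.allclose(A_pinv, np.outer(y, x) / ((x @ x) * (y @ y))))  # True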

    e) The singular value decomposition (SVD) of A+B
    It is an easy exercise to see that $\displaystyle (A+B)(A+B)^T = AA^T + BB^T$ for the given conditions.

    Then you can show by a simple manipulation that $\displaystyle A+B = U ( \Sigma + \tilde{\Sigma} ) V^T$ where $\displaystyle \tilde{\Sigma} = \left[ \begin{array}{cc}0 & 0 \\ 0 & ||z||\, ||w||\end{array}\right]$ and the other parameters are the same as those computed in post #5.
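
    (Aside, my own addition: a numerical check of that claim, with w orthogonal to x and z orthogonal to y, scaled arbitrarily. The diagonal of $\displaystyle \Sigma + \tilde{\Sigma} $ may need reordering to put the singular values in descending order, but the factorization itself holds as stated.)

    Code:
    import numpy as np

    rng = np.random.default_rng(6)
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)
    w = rng.standard_normal() * np.array([x[1], -x[0]])   # x^T w = 0
    z = rng.standard_normal() * np.array([y[1], -y[0]])   # y^T z = 0

    A, B = np.outer(x, y), np.outer(w, z)
    U = np.column_stack([x / np.linalg.norm(x), w / np.linalg.norm(w)])
    V = np.column_stack([y / np.linalg.norm(y), z / np.linalg.norm(z)])
    Sigma = np.diag([np.linalg.norm(x) * np.linalg.norm(y), 0.0])
    Sigma_tilde = np.diag([0.0, np.linalg.norm(w) * np.linalg.norm(z)])

    print(np.allclose((A + B) @ (A + B).T, A @ A.T + B @ B.T))   # True
    print(np.allclose(A + B, U @ (Sigma + Sigma_tilde) @ V.T))   # True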
