
Math Help - linear algebra - concept proofs

  1. #1
    Junior Member
    Joined
    Jul 2010
    Posts
    66

    linear algebra - concept proofs

    I attached the 3 problems I'm having trouble with. They're easier to see in the attachment than they would be if I typed them out.

    Can someone please explain these to me step by step or guide me through them? I'm not even sure where to begin.
    Thank you

    A_4 is the matrix
    A_4 = \begin{bmatrix} 0.1 & 0.6 \\ 0.9 & 0.4 \end{bmatrix}
    [Attachment: linear algebra - concept proofs-probs.png]
    Last edited by SpiffyEh; July 29th 2010 at 08:24 AM.

  2. #2
    Banned
    Joined
    Oct 2009
    Posts
    4,261
    Thanks
    2
    Quote Originally Posted by SpiffyEh View Post
    I attached the 3 problems I'm having trouble with. They're easier to see in the attachment than they would be if I typed them out.

    Can someone please explain these to me step by step or guide me through them? I'm not even sure where to begin.
    Thank you

    I think questions 1 and 2 are practically impossible to understand: what's q, A_4^n, \dots? I suppose x_1, x_2 are the components of x?
    And what's A^H, \sigma(A)? Is the last one the signature of the matrix A?

    Tonio

  3. #3
    Junior Member
    Joined
    Jul 2010
    Posts
    66
    They're basically proofs; I'm supposed to show how those things lead to one another, but I don't know how to go about doing that.

  4. #4
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    For the spectrum of self-adjoint matrices, you can proceed as follows:

    Assume A=A^{\dagger}. (This is the more common notation for the Hilbert-space adjoint.) Further assume that Ax=\lambda x for some nonzero vector x. Then \lambda\in\sigma(A).

    Now examine the inner product \langle x|Ax\rangle=\langle x|\lambda x\rangle=\lambda\langle x|x\rangle. However, it's also true that \langle x|Ax\rangle=\langle Ax|x\rangle, by adjointness. Hence,
    \langle x|Ax\rangle=\langle \lambda x|x\rangle=\overline{\lambda}\langle x|x\rangle.

    Can you see where to go from here?
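    If it helps to make the claim concrete, here is a quick numerical sanity check (a minimal sketch, assuming numpy; the matrix H below is an arbitrary example of mine, not one from the problem set):

        import numpy as np

        # An arbitrary self-adjoint example: H equals its conjugate transpose.
        H = np.array([[2.0, 1 - 1j],
                      [1 + 1j, 3.0]])
        assert np.allclose(H, H.conj().T)

        # Its eigenvalues should come out real (zero imaginary part).
        eigenvalues = np.linalg.eigvals(H)
        print(eigenvalues)                        # approximately 1 and 4
        print(np.allclose(eigenvalues.imag, 0))   # True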
    Last edited by Ackbeet; July 29th 2010 at 01:48 AM. Reason: Fixing LaTeX.

  5. #5
    Banned
    Joined
    Oct 2009
    Posts
    4,261
    Thanks
    2
    Quote Originally Posted by SpiffyEh View Post
    They're basically proofs; I'm supposed to show how those things lead to one another, but I don't know how to go about doing that.

    This doesn't answer my question: what are the symbols you're using, anyway? Without that, it's impossible to answer your questions.

    Tonio

  6. #6
    Junior Member
    Joined
    Jul 2010
    Posts
    66
    I really don't see where to go from here. I understand the relationships, but I don't see where they lead.

  7. #7
    Junior Member
    Joined
    Jul 2010
    Posts
    66
    Quote Originally Posted by tonio View Post
    This doesn't answer my question: what are the symbols you're using, anyway? Without that, it's impossible to answer your questions.

    Tonio
    I'm pretty sure q is a steady-state vector.

    Sorry, I completely forgot to include A_4:
    A_4 = \begin{bmatrix} 0.1 & 0.6 \\ 0.9 & 0.4 \end{bmatrix}

    A^H is the Hermitian (conjugate) transpose of A.
    And I'm guessing x_1 and x_2 are the components of x; I'm not sure about that one.

    Lastly, I honestly don't know what \sigma(A) is.

  8. #8
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Ok. I've just shown that \lambda\langle x|x\rangle=\overline{\lambda}\langle x|x\rangle, and we know that x\not=0. So what can you do with this equation?

    By the way, \sigma(A) is a standard notation in functional analysis for the spectrum of the operator A. If A is a finite-dimensional matrix, then \sigma(A)=\sigma_{p}(A)=\{\lambda \mid Ax=\lambda x\ \text{for some}\ x\neq 0\}. The middle term is called the point spectrum, and the last set is just the set of eigenvalues. The eigenvalues, by definition, are equal to the point spectrum, and that's true of any operator.
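    In the finite-dimensional case, the spectrum is exactly what a numerical eigenvalue routine returns. A quick illustration with the A_4 from post #1 (a sketch, assuming numpy):

        import numpy as np

        # sigma(A_4) is just the set of eigenvalues of A_4.
        A4 = np.array([[0.1, 0.6],
                       [0.9, 0.4]])
        print(np.linalg.eigvals(A4))   # approximately 1.0 and -0.5, in some order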

  9. #9
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    For the first problem, I would diagonalize A_{4}. That should allow you to take the limit quite easily.
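    Here is a sketch of what that diagonalization looks like numerically (assuming numpy; your hand-computed P may differ by a scaling of the eigenvector columns):

        import numpy as np

        A4 = np.array([[0.1, 0.6],
                       [0.9, 0.4]])

        # Columns of P are eigenvectors; D holds the eigenvalues.
        eigenvalues, P = np.linalg.eig(A4)
        D = np.diag(eigenvalues)
        assert np.allclose(A4, P @ D @ np.linalg.inv(P))   # A4 = P D P^{-1}

        # Powers are then cheap: A4^n = P D^n P^{-1}.
        n = 50
        A4_n = P @ np.diag(eigenvalues ** n) @ np.linalg.inv(P)
        print(A4_n)   # each column is close to [2/5, 3/5]^T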

  10. #10
    Junior Member
    Joined
    Jul 2010
    Posts
    66
    I honestly still don't understand the spectrum one. I just don't see where I'm supposed to be going with it. We were never told what \sigma(A) means or what it's called, so I couldn't find it in the book; thank you for explaining that.

    As for the first problem, I diagonalized it for another problem and got
    D = \begin{bmatrix} 1 & 0 \\ 0 & -0.5 \end{bmatrix}
    The problem before the first one told me that q = [2/5, 3/5]^{T} for A_4. I don't see the limit going to that.

  11. #11
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    About the spectrum problem:

    You can divide out by the \langle x|x\rangle, and you're left with the equation \lambda=\overline{\lambda}. What does that tell you?

    About the Limits of Time Series problem: when you diagonalized A_{4}, you found an invertible matrix P such that A_{4}=PDP^{-1}, where D is the diagonal matrix.

    So, proving that

    \displaystyle{\lim_{n\to\infty}A_{4}^{n}x=q} is the same as proving that

    \displaystyle{\lim_{n\to\infty}PD^{n}P^{-1}x=q}.
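    To spell out the key step (a sketch, using the D = \begin{bmatrix} 1 & 0 \\ 0 & -0.5 \end{bmatrix} you found in post #10): powering a diagonal matrix just powers its diagonal entries, so

    D^{n}=\begin{bmatrix} 1^{n} & 0 \\ 0 & (-0.5)^{n} \end{bmatrix}\longrightarrow\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}\quad\text{as } n\to\infty,

    and the limit you need is P\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}P^{-1}x.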

    About the Connection to Transposition problem: are there any assumptions about the size of A?

  12. #12
    Junior Member
    Joined
    Jul 2010
    Posts
    66
    Quote Originally Posted by Ackbeet View Post
    About the spectrum problem:

    You can divide out by the \langle x|x\rangle, and you're left with the equation \lambda=\overline{\lambda}. What does that tell you?
    Doesn't that mean that \lambda is a constant, in fact a real constant, since its conjugate is the same?

    Quote Originally Posted by Ackbeet View Post
    About the Limits of Time Series problem: when you diagonalized A_{4}, you found an invertible matrix P such that A_{4}=PDP^{-1}, where D is the diagonal matrix.

    So, proving that

    \displaystyle{\lim_{n\to\infty}A_{4}^{n}x=q} is the same as proving that

    \displaystyle{\lim_{n\to\infty}PD^{n}P^{-1}x=q}.
    I'll try that. So can I just take the diagonal entries of the matrix to the nth power? Also, I don't see that going to the q from the problem before this one.

    Quote Originally Posted by Ackbeet View Post
    About the Connection to Transposition problem: are there any assumptions about the size of A?
    I think it's a square matrix. I thought about this one: since det(A) = det(A^T), you can relate that to finding the eigenvalues, which means the eigenvalues would be the same. So there would be n eigenvectors, correct? Does that logic make sense and prove it?

  13. #13
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    Doesn't that mean that \lambda is a constant, in fact a real constant, since its conjugate is the same?
    Well, eigenvalues are assumed to be constants already. But yes, if a number is equal to its complex conjugate, then it's real. This means you're done with that problem: you can now show that if a number is an eigenvalue, then it's real. That proves the set inclusion property you were asked to show.

    I'll try that. So can I just take the diagonal entries of the matrix to the nth power? Also, I don't see that going to the q from the problem before this one.
    You tell me whether the nth power of a diagonal matrix can be computed by taking the nth power of the numbers on the diagonal. Hint: try squaring a diagonal matrix. You'll see what happens. Once you square it, try cubing it. Etc. Incidentally, I wouldn't recommend taking the limit of the matrix and then computing the LHS. I think the diagonalization will allow you to compute everything on the LHS of the equation that is to the right of the limit sign, and then you'd take the limit. The result should be a column vector. Show me what you have, and we'll see where that goes.
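    A minimal numerical illustration of the hint (a sketch, assuming numpy):

        import numpy as np

        D = np.diag([1.0, -0.5])
        print(D @ D)       # diag(1.0, 0.25): each diagonal entry squared
        print(D @ D @ D)   # diag(1.0, -0.125): each diagonal entry cubed
        # In general D^n = diag(1^n, (-0.5)^n), and (-0.5)^n -> 0 as n grows.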

    I think it's a square matrix. I thought about this one: since det(A) = det(A^T), you can relate that to finding the eigenvalues, which means the eigenvalues would be the same.
    The eigenvalues would be the same, I agree.

    So there would be n eigenvectors, correct? Does that logic make sense and prove it?
    I think it likely. But it needs a proof. The eigenvectors of a matrix would not be, I think, the same as the eigenvectors of its transpose. Therefore, it's not inherently obvious, at least to me, that there would be the same number of linearly independent eigenvectors.

    I asked if you knew what the size of A was. Of course it's square, or the whole eigenvalue process would be undefined. I'm wondering if it's n x n or not. Because if it is, there might be a very nice way of relating the eigenvectors of A to those of A^{T}.

  14. #14
    Junior Member
    Joined
    Jul 2010
    Posts
    66
    Quote Originally Posted by Ackbeet View Post
    Well, eigenvalues are assumed to be constants already. But yes, if a number is equal to its complex conjugate, then it's real. This means you're done with that problem: you can now show that if a number is an eigenvalue, then it's real. That proves the set inclusion property you were asked to show.
    Oh OK, that makes a little more sense. I'm still a little confused about where you said \langle x|Ax\rangle=\langle x|\lambda x\rangle=\lambda\langle x|x\rangle. How do you go from the 2nd part to the 3rd? Is it because \lambda is just a constant?

    Quote Originally Posted by Ackbeet View Post
    You tell me whether the nth power of a diagonal matrix can be computed by taking the nth power of the numbers on the diagonal. Hint: try squaring a diagonal matrix. You'll see what happens. Once you square it, try cubing it. Etc. Incidentally, I wouldn't recommend taking the limit of the matrix and then computing the LHS. I think the diagonalization will allow you to compute everything on the LHS of the equation that is to the right of the limit sign, and then you'd take the limit. The result should be a column vector. Show me what you have, and we'll see where that goes.
    I tried multiplying diagonal matrices, and it is the nth power on the diagonal. So I got
    P = \begin{bmatrix} 2 & -1 \\ 3 & 1 \end{bmatrix}, \quad P^{-1} = \begin{bmatrix} 1 & 1 \\ -3 & 2 \end{bmatrix}
    When I compute P D^{n} P^{-1}, I get
    \begin{bmatrix} 0.5^n & 3^n \\ 4.5^n & 2^n \end{bmatrix}
    Am I on the right track?

    Quote Originally Posted by Ackbeet View Post
    I think it likely. But it needs a proof. The eigenvectors of a matrix would not be, I think, the same as the eigenvectors of its transpose. Therefore, it's not inherently obvious, at least to me, that there would be the same number of linearly independent eigenvectors.

    I asked if you knew what the size of A was. Of course it's square, or the whole eigenvalue process would be undefined. I'm wondering if it's n x n or not. Because if it is, there might be a very nice way of relating the eigenvectors of A to those of A^{T}.
    I think it is n x n. I'm not sure how to relate the eigenvectors, but I understand the eigenvalue part.

  15. #15
    A Plied Mathematician
    Joined
    Jun 2010
    From
    CT, USA
    Posts
    6,318
    Thanks
    4
    Awards
    2
    How do you go from the 2nd part to the 3rd?
    This is one of the axioms of inner products. In physics, at least, we assume that the inner product is linear in the second term. That is, \langle x|ay\rangle=a\langle x|y\rangle and \langle x|(y+z)\rangle=\langle x|y\rangle+\langle x|z\rangle for all scalars (yes, constants) a and vectors x, y, and z. With the inner product having conjugate symmetry, you can show that the inner product is conjugate linear in the first term. That's where the complex conjugate of \lambda came from.
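    Written out, the step you asked about uses conjugate symmetry, \langle x|y\rangle=\overline{\langle y|x\rangle}, together with linearity in the second slot (a sketch):

    \langle \lambda x|x\rangle=\overline{\langle x|\lambda x\rangle}=\overline{\lambda\langle x|x\rangle}=\overline{\lambda}\,\overline{\langle x|x\rangle}=\overline{\lambda}\langle x|x\rangle,

    where the last equality holds because \langle x|x\rangle is real.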

    Moving on to the second problem, I agree with your P, but not with your P^{-1}. You need to multiply the matrix you have by 1/5 to get the correct inverse. I also think your PD^{n}P^{-1} needs a little more careful work. Each entry in the matrix there is going to be the sum of two different elements to the nth power. Incidentally, the q you mentioned in post # 10 is correct.
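    For comparison, here is what the corrected product works out to (a sketch; do check each entry yourself):

    PD^{n}P^{-1}=\begin{bmatrix} 2 & -1 \\ 3 & 1 \end{bmatrix}\begin{bmatrix} 1^{n} & 0 \\ 0 & (-0.5)^{n} \end{bmatrix}\cdot\frac{1}{5}\begin{bmatrix} 1 & 1 \\ -3 & 2 \end{bmatrix}=\frac{1}{5}\begin{bmatrix} 2+3(-0.5)^{n} & 2-2(-0.5)^{n} \\ 3-3(-0.5)^{n} & 3+2(-0.5)^{n} \end{bmatrix}.

    As n\to\infty the (-0.5)^{n} terms vanish, and applying the limit matrix to x=(x_1,x_2)^{T} gives \frac{1}{5}(2(x_1+x_2),\,3(x_1+x_2))^{T}, which equals q=(2/5,3/5)^{T} provided x_1+x_2=1, which the original problem presumably assumes.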

    About the third problem. See here for a very interesting discussion of left and right eigenvectors. They have much to do with the transpose matrix. You might find either what you need, or an idea. You might try playing around with determinants, perhaps of Equation (18).
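    A quick numerical check of the ideas in that discussion (a sketch, assuming numpy):

        import numpy as np

        A = np.array([[0.1, 0.6],
                      [0.9, 0.4]])

        # A and A^T share eigenvalues, since det(A - tI) = det(A^T - tI)...
        print(np.sort(np.linalg.eigvals(A)))     # [-0.5, 1.0]
        print(np.sort(np.linalg.eigvals(A.T)))   # [-0.5, 1.0]

        # ...but the eigenvectors differ in general: the (right) eigenvectors
        # of A^T are the left eigenvectors of A.
        _, V = np.linalg.eig(A)
        _, W = np.linalg.eig(A.T)
        print(V)
        print(W)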
