Have you explored singular value decomposition?
Imagine the following problem: we have a set of n matrix equations of the form:
[b1] = [A] * [b0]
[b2] = [A] * [b1]
etc.
The column vectors [b0], [b1], ... are GIVEN. We try to estimate the matrix [A]. As there are more scalar equations than entries in [A], the system generally has no exact solution.
Is there a method that would let me write a target function concisely to estimate [A], for instance by minimizing a sum of squares or applying some other optimization technique?
||[b1] - [A] * [b0]||^2 + ||[b2] - [A] * [b1]||^2 + ... -> minimize
I have to enter the derivative of the target function into the software explicitly and symbolically.
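For what it's worth, this particular objective has a closed-form minimizer, since it is an ordinary linear least-squares problem in the entries of [A]: stacking the given vectors as columns of two matrices turns the whole sum into one lstsq call. A minimal sketch (the matrix A_true, the starting vector, and the noise level are made-up illustration data, not from the original post):

```python
import numpy as np

# Hypothetical data: a known 2x2 matrix drives the sequence b0, b1, ...
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.2], [-0.1, 0.8]])
b = [np.array([1.0, 1.0])]
for _ in range(10):
    b.append(A_true @ b[-1] + 1e-3 * rng.standard_normal(2))  # small noise

# Stack the equations: columns of B0 are b0..b_{n-1}, columns of B1 are b1..bn.
B0 = np.column_stack(b[:-1])
B1 = np.column_stack(b[1:])

# Minimizing sum_k ||b_{k+1} - A b_k||^2 is linear least squares:
# solve B0^T X ≈ B1^T for X = A^T, then transpose back.
A_hat = np.linalg.lstsq(B0.T, B1.T, rcond=None)[0].T

print(np.round(A_hat, 2))
```

This sidesteps the symbolic derivative entirely; if the software does require one, the gradient of the objective with respect to [A] is -2 * sum_k ([b_{k+1}] - [A][b_k]) [b_k]^T.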
Kind regards and thanks for help
J
This may or may not be helpful, but whenever I see matrix exponentiation, I start to think spectrum and eigenvalues. Now, you've got [bn] = [A]^n * [b0]. If you could diagonalize [A] (say [A] is symmetric; in general you'd need [P]^{-1} in place of [P]^T), then you could find an orthogonal matrix [P] and a diagonal matrix [D] such that [A] = [P] * [D] * [P]^T. Then, by a telescoping product, you'd have [A]^n = [P] * [D]^n * [P]^T, and [D]^n is easy to calculate (just raise the numbers on the main diagonal to the n-th power). Like I said, this idea may or may not be able to help you.
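The telescoping idea above can be sketched in a few lines; the symmetric matrix here is made up purely for illustration, and numpy's eigh returns exactly the orthogonal [P] and the diagonal entries of [D] described:

```python
import numpy as np

# A symmetric (hence orthogonally diagonalizable) example matrix.
A = np.array([[2.0, 1.0], [1.0, 2.0]])

# eigh gives eigenvalues d and an orthogonal P with A = P diag(d) P^T.
d, P = np.linalg.eigh(A)

n = 5
# Telescoping product: A^n = P diag(d)^n P^T, so only scalars get powered.
A_n = P @ np.diag(d**n) @ P.T

print(np.allclose(A_n, np.linalg.matrix_power(A, n)))  # True
```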
I should also point out that just because your system has more equations than unknowns does not imply that the system is not solvable. You might have loads of redundant equations. You might even, if you eliminate enough redundant equations, end up with an under-determined system. Certainly, if the problem was generated by a particular matrix [A] and starting vector [b0], then you've got yourself a consistent system that might well be solvable.
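To illustrate the consistency point: if the vectors really were generated by some matrix with no noise, the overdetermined stacked system is consistent and a least-squares solve recovers that matrix exactly. The particular matrix and starting vector below are invented for the demonstration:

```python
import numpy as np

# Noise-free data generated by a known matrix: 12 scalar equations, 4 unknowns.
A_true = np.array([[0.5, 0.3], [0.2, 0.6]])
b = [np.array([1.0, 2.0])]
for _ in range(6):
    b.append(A_true @ b[-1])

B0 = np.column_stack(b[:-1])  # columns b0..b5
B1 = np.column_stack(b[1:])   # columns b1..b6

# The overdetermined system is consistent, so the residual is ~0 and the
# least-squares solution is the exact generating matrix.
A_hat = np.linalg.lstsq(B0.T, B1.T, rcond=None)[0].T

print(np.allclose(A_hat, A_true))  # True
```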