# Matrix question

• Jan 23rd 2011, 10:26 AM
worc3247
Matrix question
Say you have a 4x3 matrix P, whose entries are all integers.
What is a necessary and sufficient condition on $\displaystyle \mathbf b$ such that $\displaystyle P \mathbf x = \mathbf b$ (where $\displaystyle x = \begin{pmatrix} x_1 \\ x_2 \\ x_3\end{pmatrix}$ and $\displaystyle b = \begin{pmatrix} b_1 \\ b_2 \\ b_3 \\ b_4 \end{pmatrix}$) has a solution?
• Jan 23rd 2011, 10:54 AM
TheEmptySet
Quote:

Originally Posted by worc3247
Say you have a 4x3 matrix P, whose entries are all integers.
What is a necessary and sufficient condition on $\displaystyle \mathbf b$ such that $\displaystyle P \mathbf x = \mathbf b$ (where $\displaystyle x = \begin{pmatrix} x_1 \\ x_2 \\ x_3\end{pmatrix}$ and $\displaystyle b = \begin{pmatrix} b_1 \\ b_2 \\ b_3 \end{pmatrix}$) has a solution?

First, as a side note: the column vector $\displaystyle \vec{b}$ should be $\displaystyle 4 \times 1$.

If you write this out as a linear system, it is overdetermined: you have 4 equations in 3 unknowns. So a necessary condition is that one row of the augmented matrix is a linear combination of the other rows (in the simplest case, two rows are identical). This alone is not sufficient. Now, if you eliminate the redundant 4th row from both $\displaystyle P$ and $\displaystyle \vec{b}$, you will have a system of 3 equations in 3 unknowns. What has to be true for this system to have a solution for the new column vector $\displaystyle \hat{b}$?
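To make the consistency condition concrete: $\displaystyle P\vec{x}=\vec{b}$ is solvable exactly when the augmented matrix $\displaystyle [P\,|\,\vec{b}]$ has the same rank as $\displaystyle P$. A small pure-Python sketch of that test, using exact fractions (the matrix `P` below is a made-up example, not the one from the original post):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # next pivot row
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * p for a, p in zip(m[i], m[r])]
        r += 1
    return r

def is_consistent(P, b):
    """P x = b has a solution iff rank(P) == rank of the augmented matrix."""
    aug = [row + [bi] for row, bi in zip(P, b)]
    return rank(P) == rank(aug)

# Hypothetical 4x3 integer matrix: the 4th row is the sum of the first three,
# so P x = b is solvable exactly when b4 = b1 + b2 + b3.
P = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 1],
     [1, 1, 1]]
print(is_consistent(P, [1, 2, 3, 6]))  # True
print(is_consistent(P, [1, 2, 3, 7]))  # False
```

Here the fourth equation is redundant when $\displaystyle b_4 = b_1 + b_2 + b_3$, which is exactly the "extra row is a combination of the others" condition described above.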
• Jan 23rd 2011, 11:16 AM
worc3247
They all have to be different equations so that you don't actually have two equations for 3 unknowns?
• Jan 23rd 2011, 11:36 AM
TheEmptySet
Quote:

Originally Posted by worc3247
They all have to be different equations so that you don't actually have two equations for 3 unknowns?

If you have a $\displaystyle 3 \times 3$ matrix $\displaystyle A$ and two $\displaystyle 3 \times 1$ column vectors $\displaystyle \vec{x},\vec{b}$, this gives the linear system

$\displaystyle A\vec{x}=\vec{b}$. What property must $\displaystyle A$ have for this system to be consistent for every vector $\displaystyle \vec{b}$?
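That question can be explored with a small exact solver: Gauss-Jordan elimination succeeds for every $\displaystyle \vec{b}$ precisely when a pivot can be found in every column. A sketch in pure Python (the matrix `A` here is a made-up invertible example):

```python
from fractions import Fraction

def solve3(A, b):
    """Solve the 3x3 system A x = b by exact Gauss-Jordan elimination.

    Raises ValueError if A is singular, i.e. some column has no pivot.
    """
    m = [[Fraction(v) for v in row] + [Fraction(bi)] for row, bi in zip(A, b)]
    for c in range(3):
        piv = next((i for i in range(c, 3) if m[i][c] != 0), None)
        if piv is None:
            raise ValueError("A is singular: not solvable for every b")
        m[c], m[piv] = m[piv], m[c]
        m[c] = [v / m[c][c] for v in m[c]]  # scale pivot row to a leading 1
        for i in range(3):
            if i != c:
                f = m[i][c]
                m[i] = [a - f * p for a, p in zip(m[i], m[c])]
    return [m[i][3] for i in range(3)]

A = [[2, 1, 0],
     [0, 1, 1],
     [1, 0, 1]]  # det(A) = 3, so a pivot exists in every column
print(solve3(A, [1, 2, 3]))  # [Fraction(2, 3), Fraction(-1, 3), Fraction(7, 3)]
```

Whatever right-hand side you feed this `A`, the elimination finds a pivot in every column and produces a (unique) solution.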
• Jan 23rd 2011, 03:20 PM
Prove It
You want $\displaystyle \mathbf{Px} = \mathbf{b}$ to have a solution for $\displaystyle \mathbf{x}$.

Using some matrix algebra...

$\displaystyle \mathbf{P}^T\mathbf{Px} = \mathbf{P}^T\mathbf{b}$ now gives you a square matrix on the LHS ($\displaystyle \mathbf{P}^T\mathbf{P}$)

$\displaystyle (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T\mathbf{Px} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T\mathbf{b}$

$\displaystyle \mathbf{Ix} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T\mathbf{b}$

$\displaystyle \mathbf{x} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T\mathbf{b}$.

Of course, this candidate only exists if $\displaystyle |\mathbf{P}^T\mathbf{P}| \neq 0$, i.e. if $\displaystyle \mathbf{P}$ has full column rank. And since multiplying by $\displaystyle \mathbf{P}^T$ is not reversible, you must still substitute the candidate back into $\displaystyle \mathbf{Px} = \mathbf{b}$ to check that it really is a solution.
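A quick numerical check of this recipe in pure Python with exact fractions (the matrix `P` is a made-up example). The candidate $\displaystyle \mathbf{x} = (\mathbf{P}^T\mathbf{P})^{-1}\mathbf{P}^T\mathbf{b}$ always exists when $\displaystyle |\mathbf{P}^T\mathbf{P}| \neq 0$, but it only solves the original system when $\displaystyle \mathbf{b}$ actually lies in the column space of $\displaystyle \mathbf{P}$:

```python
from fractions import Fraction

def matmul(A, B):
    """Matrix product of two lists-of-rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def solve(A, b):
    """Gauss-Jordan solve for a square A (assumed invertible here)."""
    n = len(A)
    m = [[Fraction(v) for v in row] + [Fraction(bi)] for row, bi in zip(A, b)]
    for c in range(n):
        piv = next(i for i in range(c, n) if m[i][c] != 0)  # P^T P is invertible
        m[c], m[piv] = m[piv], m[c]
        m[c] = [v / m[c][c] for v in m[c]]
        for i in range(n):
            if i != c:
                f = m[i][c]
                m[i] = [a - f * p for a, p in zip(m[i], m[c])]
    return [row[n] for row in m]

# Hypothetical 4x3 P with full column rank, so |P^T P| != 0.
P = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]]
Pt = [list(col) for col in zip(*P)]

def candidate(b):
    """x = (P^T P)^{-1} P^T b, the normal-equations candidate."""
    PtP = matmul(Pt, P)
    Ptb = [sum(r * bi for r, bi in zip(row, b)) for row in Pt]
    return solve(PtP, Ptb)

def satisfies(b):
    """Does the candidate actually solve the original system P x = b?"""
    x = candidate(b)
    return all(sum(pij * xj for pij, xj in zip(row, x)) == bi
               for row, bi in zip(P, b))

print(satisfies([1, 2, 3, 6]))  # True: b is in the column space of P
print(satisfies([1, 2, 3, 7]))  # False: candidate exists, but P x != b
```

The second call shows why the determinant condition alone cannot be sufficient: the candidate is well defined (it is the least-squares fit), yet the overdetermined system itself has no solution.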
• Jan 24th 2011, 07:20 AM
worc3247
Sorry, I'm not entirely sure. Does $\displaystyle A$ have to be invertible?