Well, suppose you collected some data and wanted to find the line of best fit, so that you could predict values of the Dependent Variable (DV) for any value of the Independent Variable (IV), provided that value lies within the observed range of the IV. The line that best fits the data is the "Least Squares line": the line that minimises the sum of the squared deviations (the differences between the observed values of the DV and the values of the DV predicted by the line).
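As a quick sketch of that criterion in Python/NumPy (the data values here are made up purely for illustration), we can compare the sum of squared deviations for two candidate lines; the better-fitting line gives the smaller total:

```python
import numpy as np

# Hypothetical example data: observed (IV, DV) pairs
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

def sse(a, b):
    """Sum of squared deviations between observed y and the line a + b*x."""
    predicted = a + b * x
    return np.sum((y - predicted) ** 2)

# Two candidate lines: the least squares line minimises this quantity
print(sse(0.0, 2.0))   # a line close to the data
print(sse(1.0, 1.0))   # a noticeably worse fit
```

Any line other than the least squares line yields a strictly larger sum of squared deviations.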

It can be shown that you can use your set of observed values to create a matrix equation, which you can then solve to evaluate $a$ and $b$ in your Least Squares line $\hat{y} = a + bx$.

Suppose we had collected a set of data, i.e. $(x_1, y_1), (x_2, y_2), \dots, (x_n, y_n)$. Obviously it is very unlikely that they will all be collinear, but we could treat them as such if we want a linear regression model of the form $y = a + bx$. When we substitute these observed values, we get a system of equations:

$$\begin{aligned} a + bx_1 &= y_1 \\ a + bx_2 &= y_2 \\ &\;\;\vdots \\ a + bx_n &= y_n \end{aligned}$$

which we can then write in matrix form as

$$\begin{bmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix}, \quad\text{i.e. } X\boldsymbol{\beta} = \mathbf{y}.$$

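That matrix can be built directly in code. Here is a minimal NumPy sketch (again with made-up example data): the design matrix pairs a column of ones, which multiplies the intercept $a$, with the column of observed $x$ values, which multiplies the slope $b$:

```python
import numpy as np

# Hypothetical example data: n observed (x, y) pairs
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Design matrix: a column of ones (for the intercept a)
# beside the x values (for the slope b)
X = np.column_stack([np.ones_like(x), x])
print(X.shape)   # one row per observation, one column per unknown
```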
Clearly there are infinitely many candidate lines here, since there are really an infinite number of lines that could fit the data reasonably well. However, the least squares line is the one whose coefficients $\boldsymbol{\beta} = \begin{bmatrix} a \\ b \end{bmatrix}$ we evaluate using matrix methods. At the moment we can't solve for $\boldsymbol{\beta}$, since the matrix $X$ is not square (and only square matrices can have inverses), but we can obtain a square system by premultiplying both sides by the transpose $X^{T}$ first:

$$X^{T}X\boldsymbol{\beta} = X^{T}\mathbf{y} \quad\Longrightarrow\quad \boldsymbol{\beta} = \left(X^{T}X\right)^{-1}X^{T}\mathbf{y},$$

which has enabled us to evaluate the values of $a$ and $b$ in our least squares model $\hat{y} = a + bx$.
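The whole derivation can be sketched in a few lines of NumPy (the data values are made up for illustration). Rather than forming the inverse explicitly, the sketch solves the square system from the normal equations, and cross-checks the result against NumPy's built-in least squares solver:

```python
import numpy as np

# Hypothetical example data
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Design matrix [1, x] for the model y = a + b*x
X = np.column_stack([np.ones_like(x), x])

# Premultiplying by X.T gives the square, solvable normal equations:
# (X.T @ X) beta = X.T @ y, with beta = [a, b]
beta = np.linalg.solve(X.T @ X, X.T @ y)
a, b = beta

# Cross-check with NumPy's built-in least squares solver
beta_check, *_ = np.linalg.lstsq(X, y, rcond=None)
print(a, b)
```

Using `np.linalg.solve` instead of computing $(X^{T}X)^{-1}$ directly is numerically safer, though both express the same formula.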

There you go: a real-world example of systems with infinitely many candidate solutions is finding regression models for any real observed data.