I have a problem:
A man walks into a store and spends half of his money. When he comes out, he notices that he now has as many cents as he had dollars before entering, and half as many dollars as he had cents before entering.
So I have four variables and only three equations.
Something like this:
100*X1 + X2 = 200*Y1 + 2*Y2
Y2 = X1
2*Y1 = X2
where X1 is the dollars he had before, X2 the cents he had before, Y1 the dollars he has now, and Y2 the cents he has now. (The first equation says the total in cents before the store is twice the total after, since he spent half.)
But I also know that the cents part of any amount is between 0 and 99, since 100 cents make a dollar.
Can I use that to solve the problem?
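For instance, that 0–99 bound makes the search space small enough to check exhaustively. Here is a brute-force sketch (my own code, assuming all amounts are whole numbers of cents and ruling out the trivial case where he starts with no money) that enumerates the possibilities directly:

```python
# Brute-force search using the constraint 0 <= cents <= 99.
# X1, X2 = dollars/cents before; Y1, Y2 = dollars/cents after.
solutions = []
for x1 in range(100):        # cents after equal dollars before, so X1 <= 99
    for x2 in range(100):    # cents before are at most 99
        y2 = x1              # cents now = dollars before
        if x2 % 2 != 0:      # dollars now = half the cents before, must be whole
            continue
        y1 = x2 // 2
        # He spent half, so the total before (in cents) is twice the total after.
        if 100 * x1 + x2 == 2 * (100 * y1 + y2) and 100 * x1 + x2 > 0:
            solutions.append((x1, x2, y1, y2))
print(solutions)  # → [(99, 98, 49, 99)]
```

So the search confirms the bound is enough: he entered with $99.98 and left with $49.99, which is exactly half.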
Thanks in advance.