first of all, you have to say what you MEAN by 0.99999.....
(yes, we all know your intention is to say "the 9's go on forever"...but what does THAT mean?)
one way to think of this is to consider 0.99999..... as a SEQUENCE of digits:
f(n) = 9, for every n (the n-th digit after the decimal point is always 9).
then we have a "new kind of number":
NEW NUMBER = INTEGER + SEQUENCE.
can you think of how we might define "addition and subtraction" of sequences ? (you might want to think about this for a while: what happens if we have to "carry and/or borrow ones"? it's not that simple).
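one workable approach (a minimal Python sketch; the names `truncations` and `add_sequences` are my own, purely illustrative) is to represent an infinite decimal by its sequence of finite truncations, held as exact rationals, and add the sequences term by term rather than digit by digit:

```python
from fractions import Fraction

def truncations(digit, n):
    """partial sums g(1), ..., g(n) of a decimal 0.d1 d2 d3 ...,
    where digit(k) gives the k-th digit after the decimal point."""
    total, out = Fraction(0), []
    for k in range(1, n + 1):
        total += Fraction(digit(k), 10**k)
        out.append(total)
    return out

def add_sequences(a, b):
    """one natural definition: add the two sequences term by term."""
    return [x + y for x, y in zip(a, b)]

nines  = truncations(lambda k: 9, 5)   # 0.9, 0.99, 0.999, ...
thirds = truncations(lambda k: 3, 5)   # 0.3, 0.33, ...  ("1/3")
sixths = truncations(lambda k: 6, 5)   # 0.6, 0.66, ...  ("2/3")

# here digit-by-digit addition happens to agree (3 + 6 = 9, no carry):
print(add_sequences(thirds, sixths) == nines)  # True

# but consider 0.5 + 0.5: the first digits already sum to 10, and the
# carry spills into the integer part -- so "add corresponding digits"
# is NOT a workable definition, while adding the truncations term by
# term sidesteps the carrying problem entirely.
```

this is why adding the *partial sums* is the natural move: each partial sum is an ordinary rational number, so ordinary arithmetic applies at every step.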
if we could do such a thing (which i leave you to ponder for yourself), we could consider:
1.0 - 0.99999......
which we might do "in steps" (this is a hint as to how to define addition and subtraction of sequences in general):
1.0 - 0.9 = 0.1
1.0 - 0.99 = 0.01
1.0 - 0.999 = 0.001
1.0 - 0.9999 = 0.0001
1.0 - 0.99999...... = 0.00000...... = ?
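the "steps" above can be checked with exact rational arithmetic (a small Python sketch, using the standard library `fractions` module):

```python
from fractions import Fraction

# each step: subtract the n-nines truncation of 0.999... from 1.0
for n in range(1, 7):
    nines = Fraction(10**n - 1, 10**n)   # 0.99...9 with n nines
    diff = 1 - nines
    print(f"1.0 - {nines} = {diff}")     # diff is exactly 1/10**n

# the differences 1/10, 1/100, 1/1000, ... shrink toward 0, which is
# what the suggestive line "0.00000...... = ?" is hinting at.
```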
the thing is, if you want to prove this using "ordinary" algebra rules, you're missing something: sequences of rational numbers need not "converge" to anything rational (and sequences of rational numbers are exactly our "steps" in defining an "infinite decimal"). for example, if we have:
x = 0.12345678901234....
that is f(n) = the remainder of n upon division by 10
then what we actually have is another sequence:
g(1) = 0.1 = 1/10
g(2) = 0.12 = 1/10 + 2/100
g(3) = 0.123 = 1/10 + 2/100 + 3/1000
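the partial sums can be computed exactly; here is a minimal Python sketch (the name `g` is taken from the text above):

```python
from fractions import Fraction

def g(n):
    """n-th partial sum of x = 0.1234567890123...,
    where the k-th digit is the remainder of k upon division by 10."""
    return sum(Fraction(k % 10, 10**k) for k in range(1, n + 1))

print(g(1), g(2), g(3))  # 1/10 3/25 123/1000   (3/25 is 12/100 reduced)
# every g(n) is rational, but nothing guarantees the LIMIT of the g(n)
# is rational -- that is exactly the gap that limits are needed to fill.
```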
the thing is, "infinite sums" of rational numbers need not be rational! so arithmetic alone "isn't enough": we need LIMITS. this is subtle: in point of fact, any "approximation"
0.9999....9 (where we stop after some large but finite number of 9's) is NOT 1, but the approximations keep getting "closer and closer". now, as MarkFL wrote, we could just consider an "infinite sum", but what is THAT?
note that i don't have any quibble with Plato's post; it's certainly the simplest "naive" way to show that:
x = 0.9999....
10x = 9.99999.......
9x = 10x - x = 9.9999.... - 0.99999.... = 9
9x = 9
x = 1.
the trouble is: how do we know that 10x = 9.99999.... is TRUE? (it is, but WHY?). how does one actually carry out this "infinite multiplication"? it's not obvious.
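one way to see the subtlety is to carry out the "multiply by 10" trick on the finite truncations instead (a hedged Python sketch; `x_n` is my own name for the truncation with n nines):

```python
from fractions import Fraction

def x_n(n):
    """the truncation 0.99...9 with n nines, as an exact rational"""
    return Fraction(10**n - 1, 10**n)

for n in (1, 3, 5):
    nine_x = 10 * x_n(n) - x_n(n)    # the "9x = 10x - x" step
    print(n, nine_x)                 # 9 - 9/10**n, NOT exactly 9

# for every finite truncation, 10x - x falls short of 9 by 9/10**n.
# only in the limit does the shortfall vanish: writing 10x = 9.999...
# silently assumes the infinite tail behaves like the finite ones,
# and justifying that assumption is precisely what limits are for.
```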
0.99999.... (also commonly written with a bar over the repeating 9) is a *symbol*. Before you can manipulate it, or prove anything about it, you need to know what number (real, rational, whatever), if any, that symbol is defined to correspond to. There are several different ways of setting that definition up, but when they're done, they'll all amount to this:
0.99999.... is the limit in the real numbers of the sequence 0.9, 0.99, 0.999, 0.9999, ....
That limit is 1, as can be proven directly using the definition of the limit of a sequence of real numbers.
Once that limit is proven (which is easy - try it!), it follows that the real number the symbol names is the number 1.
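for reference, the limit proof alluded to above can be sketched in the standard epsilon-N form (assuming the usual definition of the limit of a real sequence):

```latex
% the n-th partial sum of 0.999... is a finite geometric sum:
\[
  g(n) \;=\; \sum_{k=1}^{n} \frac{9}{10^{k}} \;=\; 1 - 10^{-n}.
\]
% given $\varepsilon > 0$, choose $N$ with $10^{-N} < \varepsilon$
% (possible since $10^{-n} \to 0$).  then for all $n \ge N$,
\[
  \bigl|\,1 - g(n)\,\bigr| \;=\; 10^{-n} \;\le\; 10^{-N} \;<\; \varepsilon,
\]
% so $g(n) \to 1$ by the definition of the limit of a sequence,
% and the symbol $0.999\ldots$ names the real number $1$.
```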