I get the idea that 0 is nothing, null, void, etc. Then we should be able to say that 1 is simply "something," right? So, then, what is 2? Is 2 "something + something"? Of course, something + something = something.

My point is, "something" is just too vague a notion of what a number is.

So, what if we just say that 2 is twice as many as one? I hope you see the obvious flaw in that kind of answer and why we must reject it: "twice" already presupposes the concept of two.

Do we define numbers like this:

1 = x, where 0 < x < 2. (Tempting as it is to tighten this to .999... < x < 1.000...1, that refinement doesn't actually work: .999... is just another name for 1 itself, and "1.000...1" isn't a well-defined number.)

And so, every number is defined by its relation to other numbers.
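As an aside, this relational idea is close in spirit to how modern foundations handle it: you never say what a number *is*; you just posit a starting point and a "successor" step, and each number is defined purely by its position in the chain. A toy sketch in Python (the names `Nat`, `succ`, and `to_int` are my own illustration, not anything standard):

```python
# A toy Peano-style construction: a number is defined only by how many
# successor steps it sits away from zero, not by any intrinsic "something".

class Nat:
    def __init__(self, pred=None):
        self.pred = pred  # the number this one succeeds, or None for zero

def succ(n):
    """Return the successor of n."""
    return Nat(n)

zero = Nat()
one = succ(zero)   # "one" is just "the successor of zero"
two = succ(one)    # "two" is "the successor of the successor of zero"

def to_int(n):
    # count successor steps back down to zero
    count = 0
    while n.pred is not None:
        count += 1
        n = n.pred
    return count

print(to_int(two))  # → 2
```

Nothing here says what 2 "is" beyond its place in the chain, which is exactly the relational move above.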

In this system, we still don't "know" what a 1 is, or any other number for that matter, but we would at least never confuse a 1 for a 5, or a 3 for a 2.

Of course, the system only works if we already know a priori that 0<1<2.

In fact, it reminds me of Aquinas's method of via negativa, where he tried to define God by all the things He is not: "God is not physical, not in time, not... etc." However, this method surrenders all hope of finding what God IS. Unless you're of the opinion that Aquinas only used the method as a way to come up with an idea no one else had thought of, in a way similar to poets using rhyme to come up with lines they might not have thought of otherwise.

So, with that... how about I try using via negativa as a source of inspiration. We already know that 1 is not 0, 2, 3, or any other number. We've already eliminated an infinite number of possibilities, and that was just my first negation. And yet, I know that 1 is the only number you can divide by itself and get itself as an answer. It's the only number that equals itself no matter what power it is raised to. It's also the only value of ...

So, one negation in and I've already found a plethora of attributes unique to 1. Perhaps this would be the best definition of 1, ...?
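For what it's worth, those two uniqueness claims hold up under a quick brute-force check, at least among small nonzero integers (a purely illustrative sketch):

```python
# Check which nonzero integers n satisfy the two properties claimed for 1:
#   n / n == n           (dividing by itself gives itself back)
#   n ** k == n for several powers k (equals itself no matter the power)
candidates = [n for n in range(-10, 11) if n != 0]

self_dividing = [n for n in candidates if n / n == n]
self_powering = [n for n in candidates
                 if all(n ** k == n for k in range(1, 6))]

print(self_dividing)  # → [1]
print(self_powering)  # → [1]
```

In both cases 1 is the lone survivor, so the negative method really did corner something unique.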

So, would it be fair for me to define the rest of the numbers based on the number 1? In other words, 2 = 1 + 1? I think the answer is obvious from the moment I wrote out that equation. I can't use the concept of 2 to define 2!

Maybe I should try via negativa again? Just for old times' sake...

I know that 2 is not 1, 0, 3, etc... I could say that 2 is the common factor all even numbers can be divided by, but of course, this is how we define even numbers, so it's only a more complicated circular argument.

What about this: we already understand 0 and we understand 1. Might we say that 2 is the number that represents how many numbers we have thus far defined? Does that still count as a circular argument? Perhaps it would... If I went to count how many numbers we had, I would say "1... 2... So that's what 2 is!"
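Oddly enough, this "number of numbers defined so far" idea is essentially von Neumann's set-theoretic construction, in which each number simply *is* the set of all the numbers built before it. A rough sketch in Python using frozensets:

```python
# Von Neumann-style construction: each number IS the collection of
# the numbers defined before it, so its size "counts" them.
zero = frozenset()             # 0 = {}       (nothing defined yet)
one = frozenset({zero})        # 1 = {0}      (one number so far)
two = frozenset({zero, one})   # 2 = {0, 1}   (two numbers so far)

print(len(zero), len(one), len(two))  # → 0 1 2
```

So the "count what we have so far" move isn't obviously a dead end; set theorists made it the official definition.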

I guess this definition would be no better than me showing you two rocks and having you count them. This, by the way, is the kind of teaching that got us into this mess to begin with. It's really no different from you asking me "What is piety?" and me showing you a series of things considered to be pious and saying "Piety is what all these things, stripped of their physical forms, still have in common."

Can we really say that is a fair way to define the pious? We would still know nothing of the pious, well, nothing we couldn't learn from a comprehensive list of pious things.

Of course, no one these days believes there is an objective piety; they hold that it is only opinion. Might we one day have a similar belief about 2?

So, we're back to the problem "What is a 2?" Though now I've acquired a new doubt that there even is such a thing as 2. How about this: what would I be denying if I said there was no such thing as 2?

If you showed me 2 rocks and asked me to count them, shouldn't I have to say there are 2 rocks? I might try to weasel out of this by saying "There are 1+1 rocks." To which you would say "1+1 IS 2!"

So, in other words: 1 + 1 = 2. However, even this is still a sly way of using the concept of 2 to define the concept of 2. Allow me to explain...
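Tangentially, formal proof systems take the blunt way out of this worry: they define the numeral 2 as the successor of 1, so 1 + 1 = 2 holds just by unfolding definitions. In Lean, for example, the entire proof is:

```lean
-- Numerals are built from zero and successor, so "1 + 1 = 2"
-- reduces to a definitional identity; `rfl` (reflexivity) closes it.
example : 1 + 1 = 2 := rfl
```

Whether that *answers* the circularity worry or just legislates it away is, I suppose, exactly the question here.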

I'm sleepy now and I'm about to get off. In case you didn't realize, I really just typed this out as I thought it up, please tell me what you think.