Why can't the cardinality of "infinite sets" be defined in terms of limits as infinity is "approached," so that, say, the cardinality of the set of "all" integers is twice the cardinality of the set of "all" even integers, since that 2:1 ratio holds for every finite bound? And in comparing the cardinality of the set of "all" integers to that of the set of "all" even integers, isn't it at least arbitrary to map integers to even integers one to one, rather than map each even integer to the identical member of the other set, leaving the odd members of the set of "all" integers unmapped?
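For concreteness, here is a small sketch of the mapping in question (my own illustration, not something from any textbook or thread). Under Cantor's definition, two sets have the same cardinality when *some* bijection exists between them, regardless of whether *other* maps leave elements unpaired; the map n ↦ 2n pairs every integer with a distinct even integer, with nothing left over on either side:

```python
# Sketch: the standard bijection between integers and even integers.
# Under Cantor's definition, the existence of ANY one-to-one pairing
# with nothing left over is what "same cardinality" means.

def to_even(n):
    """Pair the integer n with the even integer 2n."""
    return 2 * n

def from_even(m):
    """Invert the pairing: the even integer m goes back to m/2."""
    return m // 2  # exact, since m is even

integers = range(-5, 6)
evens = [to_even(n) for n in integers]
print(evens)  # every integer gets a unique even partner

# Round trip: the pairing is one-to-one in both directions.
assert all(from_even(to_even(n)) == n for n in integers)
```

The identity-style map (each even integer to itself, odds unmapped) also exists, but it is an injection that fails to be onto; Cantor's definition only asks whether at least one bijection exists.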

Doesn't Cantor's proof just show that there are more real numbers than any one real number has decimal places, as is true of any set of numbers between 0 and 1 with a finite number of decimal places? And is the debate over whether a purported list of real numbers is "square" moot, since, if the list isn't square, the greater cardinality of the set of "all" real numbers over the set of "all" integers would follow anyway?
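Here is a sketch of the diagonal construction itself (my own illustration, using arbitrary sample digits). The argument does not require the list to be "square": every real number has an i-th decimal digit (padding with 0s if an expansion terminates), so for *any* proposed list, a number can be built that differs from the i-th entry at the i-th digit and therefore appears nowhere on the list:

```python
# Sketch: Cantor's diagonal construction on a (finite, illustrative)
# list of decimal-digit strings. The same construction applies to any
# infinite list: the result differs from entry i at digit position i.

def diagonal(listing):
    """Return a digit string differing from listing[i] at position i."""
    digits = []
    for i, s in enumerate(listing):
        d = s[i] if i < len(s) else '0'     # pad short expansions with 0s
        digits.append('5' if d != '5' else '6')  # pick any digit != d
    return ''.join(digits)

# Arbitrary sample digits standing in for a purported list of reals.
sample = ["1415", "7182", "0000", "9999"]
d = diagonal(sample)
print(d)

# The constructed number disagrees with every entry on the diagonal.
assert all(d[i] != sample[i][i] for i in range(len(sample)))
```

The digits 5 and 6 are an arbitrary choice (any digit other than the diagonal digit works, avoiding 0 and 9 sidesteps the 0.999… = 1.000… ambiguity in the full proof).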

What practical applications does proving that the cardinality of the set of "all" real numbers is greater than the cardinality of the set of "all" integers have?

I just posted a message a few days ago in a thread that was active a few years ago, but I can't find it. Was it deleted? Here is the link:

http://mathhelpforum.com/peer-math-review/214565-cantors-diagonal-argument-wrong-3.html