(Very basic non-standard analysis: I wasn't sure where to post this)
My proof seems clumsy and probably wrong, so I was wondering if someone could help me out.
i. Let a and c be infinitesimal.
ii. Assume b is a hyperreal number that is finite but not infinitesimal and lies between a and c.
iii. Since c is infinitesimal but b is not, b > c.
iv. But this is impossible, since b lies between a and c.
v. Therefore, by contradiction, if a and c are infinitesimal and b lies between a and c, then b is infinitesimal.
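Trying to write out step iii in symbols, it seems to secretly assume b > 0, which may be part of why the argument feels off. Here is my attempt (the real number r below is just something I introduced, not from any book): b not infinitesimal means there is a real r > 0 with |b| > r, while c infinitesimal means |c| < r for that same r, so

```latex
% Assumes b > 0; for negative b the comparison with c fails.
\[
  b \;=\; |b| \;>\; r \;>\; |c| \;\ge\; c ,
\]
% hence b > c, contradicting a < b < c from step ii.
```

so the contradiction only seems to go through for positive b.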
Thanks for any help in clearing this up!
EDIT: Here is a second attempt that I think is more concise and a bit more sound.
i. Assume a and c are infinitesimal.
ii. Let b be a hyperreal number such that a < b < c.
iii. Then b < c + a.
iv. But c + a must be infinitesimal, since the sum of two infinitesimals is infinitesimal.
v. Since b is less than an infinitesimal number, b is infinitesimal.
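In case it helps, here is the bound I was vaguely aiming at, written with absolute values so the sign of b doesn't matter (this is just my guess at the cleanest statement, not something I've seen stated this way):

```latex
\[
  a < b < c \;\Longrightarrow\; |b| \le \max\bigl(|a|,\,|c|\bigr).
\]
% Both |a| and |c| are less than every positive real epsilon,
% so |b| < epsilon for every positive real epsilon,
% which is exactly the definition of b being infinitesimal.
```

If b >= 0 then |b| = b < c <= |c|, and if b < 0 then |b| = -b < -a <= |a|, so the bound holds in both cases.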
Is that better? Thanks again to anyone who reads this and tries to help.