If 0<a<b, prove 0<a^(1/2)<b^(1/2).

I'm not sure where to even start, except to assume 0<a<b.

Where do I go from here?


- Oct 29th 2008, 06:43 PM #1


- Oct 29th 2008, 06:59 PM #2
yes, i am not sure if your professor expects you to use special axioms here or what. but i would just note that the square root function is strictly increasing. we can see this since the derivative is always positive (where defined; it is not defined at zero, but we are not considering zero, so that's ok). thus, for any two positive numbers x and y in the domain of f(x) = x^(1/2), we have f(x) < f(y), provided x < y
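One way to write out that monotonicity argument as a short calculus sketch:

```latex
For $f(x) = \sqrt{x}$ on $(0,\infty)$,
\[
  f'(x) = \frac{1}{2\sqrt{x}} > 0 \quad \text{for all } x > 0,
\]
so $f$ is strictly increasing on $(0,\infty)$. Hence
\[
  0 < a < b \implies \sqrt{a} < \sqrt{b},
\]
and since $a > 0$ gives $\sqrt{a} > 0$, we conclude $0 < \sqrt{a} < \sqrt{b}$.
```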

- Oct 29th 2008, 07:18 PM #3


Ok, that makes sense. Is there another way to do it kinda like I did with a similar problem?

here is my problem and what I did:

If 0<a<b, prove that 0<a^2<b^2

0*a<a*a<a*b

and 0*b<a*b<b*b

0<a^2<ab and 0<ab<b^2

0<a^2<ab<b^2

0<a^2<b^2
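The steps above can be condensed into a single display (multiplying 0<a<b through by the positive numbers a and b):

```latex
Multiplying $0 < a < b$ by $a > 0$ and then by $b > 0$ gives
\[
  0 < a^2 < ab \quad \text{and} \quad 0 < ab < b^2,
\]
so chaining the two inequalities,
\[
  0 < a^2 < ab < b^2 \implies 0 < a^2 < b^2.
\]
```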

Or would doing something your way be better?

- Oct 29th 2008, 08:03 PM #4
you can use the method here. but the only way i see it working is to use contradiction

let x = a^(1/2) and y = b^(1/2)

then, you are asked to prove: 0 < x < y (we are not considering negative numbers here, of course)

thus, assume to the contrary that 0 < a < b but x >= y

then, in much the same way as you did the last problem, you can show that would mean a >= b, but that is contrary to our assumption
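The contradiction step spelled out, reusing the squaring argument from the previous problem (with >= in place of >):

```latex
Suppose $0 < a < b$ but $\sqrt{a} \ge \sqrt{b}$. Both roots are positive,
so squaring preserves the inequality, as in the previous problem:
\[
  \sqrt{a} \ge \sqrt{b} > 0 \implies (\sqrt{a})^2 \ge (\sqrt{b})^2
  \implies a \ge b,
\]
which contradicts $a < b$. Hence $0 < \sqrt{a} < \sqrt{b}$.
```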

- Oct 29th 2008, 08:10 PM #5
