If the standard deviation is a value between 0 and 1, will the variance be less than the standard deviation?
Yes, provided the standard deviation is strictly between 0 and 1.
Using the fact that Variance = (Standard Deviation)^2: multiplying a number strictly between 0 and 1 by itself produces a smaller number, since for 0 < s < 1 we have s^2 = s·s < s·1 = s.
So the variance is strictly less than the standard deviation. (If the standard deviation is exactly 0 or 1, the two are equal.)
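A quick numerical sketch of the claim (plain Python; the sample values are chosen only for illustration):

```python
# For any standard deviation s with 0 < s < 1, the variance s**2
# is strictly smaller than s, since s*s < s*1 = s.
for s in [0.1, 0.25, 0.5, 0.9, 0.999]:
    variance = s ** 2
    assert variance < s  # holds for every s strictly between 0 and 1
    print(f"sd={s}: variance={variance}")

# Boundary cases: at s = 0 or s = 1 the variance equals s exactly.
assert 0.0 ** 2 == 0.0
assert 1.0 ** 2 == 1.0
```

Note that the inequality flips outside the unit interval: for s > 1, the variance s**2 exceeds s.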
Not really conclusive proof at all.