How many correct decimals required ...

Hello,
somehow I'm stuck on a seemingly simple problem.

Problem:
Given that \(\displaystyle n\) is an integer and \(\displaystyle q\) is some rational with \(\displaystyle 0 < q < 1\) such that \(\displaystyle \sqrt{nq}\) is an integer, can you set up a relationship linking the number of correct decimals of the nonrepeating part of \(\displaystyle q\) and the error bound on \(\displaystyle \sqrt{nq}\)? (Hint: use orders of magnitude.)
Attempted proof:
Apart from the particular (and trivial) case \(\displaystyle k = 0\), knowing \(\displaystyle k\) correct decimals of \(\displaystyle q\) gives an approximation \(\displaystyle a\) with \(\displaystyle q - 10^{-k} < a < q + 10^{-k}\), or equivalently \(\displaystyle a - 10^{-k} < q < a + 10^{-k}\), so \(\displaystyle \sqrt{n\left (a - 10^{-k} \right )} < \sqrt{nq} < \sqrt{n\left (a + 10^{-k} \right )}\), which gives the error bound for \(\displaystyle a \approx q\).
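To make the hint about orders of magnitude concrete (this is just my own rough first-order estimate), the width of that interval is \(\displaystyle \sqrt{n\left (a + 10^{-k} \right )} - \sqrt{n\left (a - 10^{-k} \right )} = \frac{2 \cdot 10^{-k} \sqrt{n}}{\sqrt{a + 10^{-k}} + \sqrt{a - 10^{-k}}} \approx 10^{-k}\sqrt{\frac{n}{q}}\), so knowing \(\displaystyle k\) correct decimals of \(\displaystyle q\) should pin down \(\displaystyle \sqrt{nq}\) to within roughly \(\displaystyle 10^{-k}\sqrt{n/q}\).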
Am I right? Empirical tests seem to confirm this, but I'd like to have more people look over my work because one always makes mistakes when working alone.
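For what it's worth, here is the kind of empirical test I ran (a quick Python sketch; the values \(\displaystyle n = 475\) and \(\displaystyle q = 19/25\) are just ones I picked so that \(\displaystyle nq = 361\) and \(\displaystyle \sqrt{nq} = 19\) is an integer):

import math
from fractions import Fraction

# Illustrative values of my own (not from the problem): q = 19/25 = 0.76 and n = 475,
# chosen so that nq = 361 and sqrt(nq) = 19 is an integer.
n = 475
q = Fraction(19, 25)
exact = math.sqrt(n * q)                            # sqrt(nq) = 19.0

for k in range(1, 8):
    eps = Fraction(1, 10 ** k)                      # 10^-k
    a = Fraction(math.floor(q * 10 ** k), 10 ** k)  # q truncated to k correct decimals
    lo = math.sqrt(float(n * (a - eps)))            # sqrt(n(a - 10^-k))
    hi = math.sqrt(float(n * (a + eps)))            # sqrt(n(a + 10^-k))
    width = hi - lo                                 # width of the bracketing interval
    predicted = float(eps) * math.sqrt(n / q)       # first-order estimate 10^-k * sqrt(n/q)
    print(f"k={k}: {lo:.8f} < {exact:.1f} < {hi:.8f}  width={width:.2e}  ~{predicted:.2e}")

For these particular values the measured width comes out close to \(\displaystyle 10^{-k}\sqrt{n/q} = 25 \cdot 10^{-k}\), in line with the first-order estimate above.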
 
Made a little mistake, correcting ...