Please find the attached file.
Here's the first. I want you to try the third question on your own and post your solution.
This is how mathematical induction works.
We begin with a statement that depends on a set of integers, usually the whole set of natural numbers. To prove the statement is true by mathematical induction, we carry out the following procedure.
First we prove that the statement is true for the first integer it is claimed to hold for, usually 1. So we evaluate P(1) and show that the statement is true when we replace n with 1. If it is, we are in good shape. This step is called the basis step.
Next we move on to the inductive step. In this step, we assume that the statement is true for some integer k greater than or equal to the base integer (this assumption is the inductive hypothesis), and then we prove that if it is true for k, it must be true for the next integer, k + 1.
So induction in a nutshell:
given a statement P(n) for n $\displaystyle \geq$ 1
- prove P(1) is true
- assume P(k) is true for some k $\displaystyle \geq$ 1, and derive that P(k + 1) is true
so here is the first:
Let $\displaystyle P(n): " \frac {1}{1 \cdot 3} + \frac {1}{3 \cdot 5} + \frac {1}{5 \cdot 7} + ... + \frac {1}{(2n - 1)(2n + 1)} = \frac {n}{2n + 1}$ for all integers $\displaystyle n \geq 1 "$
Then $\displaystyle P(1): \frac {1}{(2(1) - 1)(2(1) + 1)} = \frac {1}{1 \cdot 3} = \frac {1}{3}$ and $\displaystyle \frac {1}{2(1) + 1} = \frac {1}{3}$, so both sides agree.
So $\displaystyle P(1)$ is true
Assume $\displaystyle P(k)$ is true for some $\displaystyle k \geq 1$; we show that $\displaystyle P(k + 1)$ is true.
So we have:
$\displaystyle P(k): \frac {1}{1 \cdot 3} + \frac {1}{3 \cdot 5} + \frac {1}{5 \cdot 7} + ... + \frac {1}{(2k - 1)(2k + 1)} = \frac {k}{2k + 1}$
Adding the $\displaystyle (k + 1)$st term to both sides, we get:
$\displaystyle \begin{aligned} \frac {1}{1 \cdot 3} + ... + \frac {1}{(2k - 1)(2k + 1)} + \frac {1}{(2(k + 1) - 1)(2(k + 1) + 1)} &= \frac {k}{2k + 1} + \frac {1}{(2(k + 1) - 1)(2(k + 1) + 1)}\\ &= \frac {k}{2k + 1} + \frac {1}{(2k + 1)(2k + 3)}\\ &= \frac {k(2k + 3) + 1}{(2k + 1)(2k + 3)}\\ &= \frac {2k^2 + 3k + 1}{(2k + 1)(2k + 3)}\\ &= \frac {(2k + 1)(k + 1)}{(2k + 1)(2k + 3)}\\ &= \frac {k + 1}{2k + 3}\\ &= \frac {k + 1}{2(k + 1) + 1} \end{aligned}$
This last line is exactly the right-hand side of $\displaystyle P(k + 1)$.
Thus, $\displaystyle P(k + 1)$ is true.
Therefore $\displaystyle P(n)$ is true for all $\displaystyle n \geq 1$ by the method of Mathematical Induction.
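As a sanity check (not part of the proof, of course), the identity can be verified with exact rational arithmetic. A minimal Python sketch using the standard `fractions` module:

```python
from fractions import Fraction

def lhs(n):
    """Exact value of 1/(1*3) + 1/(3*5) + ... + 1/((2n-1)(2n+1))."""
    return sum(Fraction(1, (2 * i - 1) * (2 * i + 1)) for i in range(1, n + 1))

def rhs(n):
    """The closed form n/(2n+1) claimed by P(n)."""
    return Fraction(n, 2 * n + 1)

# The two sides agree exactly for the first 50 values of n.
assert all(lhs(n) == rhs(n) for n in range(1, 51))
```

This only spot-checks finitely many cases; the induction above is what proves it for all $\displaystyle n$.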
If any step confuses you, say so.
Here's the second question:
Prove $\displaystyle \sqrt {3} + \sqrt {7}$ is irrational
Proof: Assume to the contrary that $\displaystyle \sqrt {3} + \sqrt {7}$ is rational
Then $\displaystyle \left( \sqrt {3} + \sqrt {7} \right)^2 = 10 + 2 \sqrt {21}$ is rational as well.
But $\displaystyle 10 + 2 \sqrt {21}$ is rational only if $\displaystyle 2 \sqrt {21}$ is rational, and $\displaystyle 2 \sqrt {21}$ is rational only if $\displaystyle \sqrt {21}$ is rational. So our assumption forces $\displaystyle \sqrt {21}$ to be rational.
Then $\displaystyle \sqrt {21} = \frac {a}{b}$ for some $\displaystyle a,b \in \mathbb {Z} \mbox { , } b \neq 0$, and we may further assume that $\displaystyle \frac {a}{b}$ is in lowest terms.
This means that $\displaystyle 21 = \frac {a^2}{b^2}$
Which means $\displaystyle 21 b^2 = a^2$
But this means 21 divides $\displaystyle a^2$, which means 21 divides $\displaystyle a$ (can you prove this? Note that $\displaystyle 21 = 3 \cdot 7$, a product of distinct primes)
Since 21 divides $\displaystyle a$, $\displaystyle a = 21 m$ for $\displaystyle m \in \mathbb {Z}$
So $\displaystyle 21 b^2 = a^2 \Longleftrightarrow 21 b^2 = \left( 21 m \right)^2$
So $\displaystyle 21 b^2 = 21^2 m^2$
So $\displaystyle b^2 = 21 m^2$
But this means 21 divides $\displaystyle b^2$, so 21 divides $\displaystyle b$
So we can write $\displaystyle b = 21 n$ for some $\displaystyle n \in \mathbb {Z}$
Therefore, $\displaystyle \frac {a}{b} = \frac {21m}{21n} = \frac {m}{n}$
But this is a contradiction, since we assumed $\displaystyle \frac {a}{b}$ is in lowest terms. So $\displaystyle \sqrt {21}$ is NOT rational, and therefore $\displaystyle \sqrt {3} + \sqrt {7}$ is NOT rational either. Thus, $\displaystyle \sqrt {3} + \sqrt {7}$ is irrational.
QED.
What I did with $\displaystyle \sqrt {21}$ is the standard way I was taught to prove something is irrational. I admit it is a beautiful method that makes you wonder how whoever came up with it thought of it, but I never really liked it. Another way to prove something is irrational appears in one of my other texts, and I like that kind of proof better. It goes like this.
Prove $\displaystyle \sqrt {21}$ is irrational.
Proof: Consider the equation $\displaystyle x^2 - 21 = 0$
By the Rational Roots Theorem, the only POSSIBLE rational solutions to this equation are $\displaystyle \pm 1 \mbox { , } \pm 3 \mbox { , } \pm 7 \mbox { , and } \pm 21$
But none of these actually satisfies the equation, and $\displaystyle \sqrt {21}$ IS a solution. Therefore, $\displaystyle \sqrt {21}$ is not a rational number.
QED
Of course, this is not a contradiction proof, but it's a good proof to know if you are not restricted to doing a particular kind of proof. As you can see, it's much shorter.
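A quick machine check of that Rational Roots reasoning: the only possible rational roots of $\displaystyle x^2 - 21 = 0$ are the signed divisors of 21, and exact arithmetic shows none of them works. A minimal Python sketch:

```python
from fractions import Fraction

# Possible rational roots p/q of x^2 - 21 = 0: p divides 21, q divides 1,
# so the candidates are just the signed divisors of 21.
candidates = [Fraction(s * d) for d in (1, 3, 7, 21) for s in (1, -1)]

# None of the candidates is a root, so the equation has no rational
# solution at all; since sqrt(21) is a solution, it must be irrational.
assert all(c**2 - 21 != 0 for c in candidates)
```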
Also, you may be tempted to begin with, "if $\displaystyle \sqrt {3} + \sqrt {7}$ is rational, then $\displaystyle \sqrt {3}$ is rational, and $\displaystyle \sqrt {7}$ is rational." And then you would proceed to show that both are irrational, and come up with a contradiction. This would be wrong, as it is possible to add two irrational numbers and get a rational one (can you prove this?)
Nothing wrong with your proof. You have taken the Pythagorean version and modified it well. But here is another one.
----
Here is another:
$\displaystyle \mbox{ Let }x=\sqrt{3}+\sqrt{7}$
Then,
$\displaystyle x^2 = 10 + 2\sqrt{21}$
$\displaystyle x^2-10 = 2\sqrt{21}$
Squaring both sides: $\displaystyle x^4 - 20x^2+100 = 84$
$\displaystyle x^4 - 20x^2 + 16 = 0$
By the Rational Root Theorem, the only possible rational roots are $\displaystyle \pm 1,\pm 2,\pm 4,\pm 8,\pm 16$, and none of these solves the equation.
So every root of this equation is either irrational or non-real. Since $\displaystyle x = \sqrt{3}+\sqrt{7}$ is a real root, it must be irrational.
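To double-check this numerically (a sketch, not a replacement for the argument): $\displaystyle x = \sqrt{3} + \sqrt{7}$ satisfies the quartic up to floating-point error, while exact arithmetic rules out every rational candidate.

```python
import math
from fractions import Fraction

def p(x):
    """The quartic x^4 - 20x^2 + 16 derived above."""
    return x**4 - 20 * x**2 + 16

# x = sqrt(3) + sqrt(7) is a root, up to floating-point rounding.
x = math.sqrt(3) + math.sqrt(7)
assert abs(p(x)) < 1e-9

# Rational Root Theorem candidates: the signed divisors of 16.
# Exact arithmetic shows none of them is a root.
candidates = [Fraction(s * d) for d in (1, 2, 4, 8, 16) for s in (1, -1)]
assert all(p(c) != 0 for c in candidates)
```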
Hello, aamirpk4!
Had to do some acrobatics for #3 . . .
3) Prove by induction that: .$\displaystyle 3^n < n!$ .for $\displaystyle n > 6.$
The base case is $\displaystyle n = 7$.
. . Is $\displaystyle 3^7 \,<\,7!\:?\quad\Rightarrow\quad 2187 \,<\,5040$ . . . yes!
Assume $\displaystyle S(k)$ is true for some $\displaystyle k \geq 7$: .$\displaystyle 3^k \,< \,k!$
Multiply both sides by 3: . $\displaystyle 3\!\cdot\!3^k \:<\:3\!\cdot\!k!$
. . and we have: .$\displaystyle 3^{k+1}\:<\:3\!\cdot\!k!$ . [1]
And now a short detour . . .
. . Since $\displaystyle k > 6$, we know that: .$\displaystyle 3 \:<\:k+1$
. . Multiply both sides by $\displaystyle k!\!:\;\;3\cdot k! \:<\:(k+1)\cdot k!$
. . Hence, we have: .$\displaystyle 3\!\cdot\!k! \:< \: (k+1)!$
Combined with [1], we have: .$\displaystyle 3^{k+1} \:<\: 3\!\cdot\!k! \:<\: (k+1)!$
We have proved $\displaystyle S(k+1)$ . . . The inductive proof is complete.
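Again just a sanity check in Python: the inequality holds for a range of $\displaystyle n > 6$ and, as expected, fails for every $\displaystyle n \leq 6$, which is why the base case must be $\displaystyle n = 7$.

```python
import math

# 3^n < n! holds for n = 7, 8, ..., 100.
assert all(3**n < math.factorial(n) for n in range(7, 101))

# ...and fails for every n from 1 to 6, e.g. 3^6 = 729 but 6! = 720.
assert all(3**n >= math.factorial(n) for n in range(1, 7))
```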
Hey Jhevon
If you wanna align, use
\begin{aligned}
\end{aligned}
Example:
\begin{aligned}
a&=b\\
a^2&=ab\\
a^2-b^2&=ab-b^2\\
(a+b)(a-b)&=b(a-b)
\end{aligned}
This yields
$\displaystyle
\begin{aligned}
a&=b\\
a^2&=ab\\
a^2-b^2&=ab-b^2\\
(a+b)(a-b)&=b(a-b)
\end{aligned}$
That was the LaTeX contribution.