# Prove some identities!

• Dec 15th 2010, 09:14 PM
simplependulum
To stop this thread from being pushed to the second page of the subforum, I would like to
give my solutions to the three integrals I posted previously. I hope more and more wonderful identities will be created here!

Problem :

Quote:

Let $\displaystyle K = \frac{\ln^2(1+\sqrt{2})}{2}$
Show that :
$\displaystyle \displaystyle \int_1^{\infty} \frac{\ln(x+\sqrt{x^2 - 1} )}{x(x^2 +1 )}~dx = K$
$\displaystyle \displaystyle \int_1^{\infty}\frac{\tan^{-1}(x)}{x\sqrt{x^2 - 1 } }~dx = \frac{\pi^2}{8} + K$
$\displaystyle \displaystyle \int_0^1 \frac{\tan^{-1}(x)}{\sqrt{1-x^2} }~dx = \frac{\pi^2}{8} - K$

Solution to the first integral :
Spoiler:

$\displaystyle I = \int_1^{\infty} \frac{\ln(x + \sqrt{x^2 - 1} )}{x(x^2+1)}~dx$
Integration by parts ,

$\displaystyle I = \left[ \ln{\left( \frac{x}{\sqrt{x^2 + 1 }} \right)} \ln(x + \sqrt{x^2 - 1 } ) \right]_1^{\infty} - \int_1^{\infty} \ln{\left( \frac{x}{\sqrt{x^2 + 1 }} \right)} ~ \frac{dx}{\sqrt{x^2 - 1 } }$

$\displaystyle = - \int_1^{\infty} \ln{\left( \frac{x}{\sqrt{x^2 + 1 }} \right)} ~ \frac{dx}{\sqrt{x^2 - 1 } }$

Using the substitution $\displaystyle x \mapsto 1/x$ , we have

$\displaystyle I = \frac{1}{2} \int_0^1 \frac{\ln(x^2+1)}{x \sqrt{1-x^2}}~dx$

Since $\displaystyle \frac{\ln(x^2+1)}{2} = \int_0^1 \frac{x^2 t~dt}{x^2t^2+1}$ , interchanging the order of integration gives

$\displaystyle = \int_0^1 \left[ \int_0^1 \frac{xt~ dx}{(x^2t^2+1)\sqrt{1-x^2} }\right] ~dt$

Sub. $\displaystyle \sqrt{1-x^2} = y$ , $\displaystyle - \frac{xdx}{\sqrt{1-x^2} } = dy$

$\displaystyle I = \int_0^1 \int_0^1 \frac{tdy}{t^2+1 -t^2y^2} ~dt$

$\displaystyle = \int_0^1 \frac{\ln(\sqrt{1+t^2} + t )}{\sqrt{t^2+1} }~dt$

$\displaystyle = \left[ \frac{\ln^2(\sqrt{1+t^2} + t ) }{2} \right]_0^1 = K$

Solution to the second integral :
Spoiler:
Sub. $\displaystyle x = 1/t$ and it reduces to the third integral ; see the solution of the third integral .

Solution to the third integral :
Spoiler:

Let
$\displaystyle I(t) = \int_0^1 \frac{\tan^{-1}(tx)}{\sqrt{1-x^2}}~dx$
Then

$\displaystyle I'(t) = \int_0^1 \frac{x~dx}{(1+t^2 x^2)\sqrt{1-x^2} }$

Sub. $\displaystyle \sqrt{1-x^2}=y$ , $\displaystyle - \frac{x~dx}{\sqrt{1-x^2}}= dy$

$\displaystyle I'(t) = \int_0^1 \frac{dy}{1+t^2 - t^2y^2 }$
$\displaystyle = \frac{1}{2t\sqrt{1+t^2}} \ln{\left[ \frac{\sqrt{1+t^2}+ t}{\sqrt{1+t^2} - t } \right]}$
$\displaystyle = \frac{1}{t\sqrt{1+t^2}}\ln( \sqrt{1+t^2} + t )$

We have

$\displaystyle I(1) - I(0) = \int_0^1 \frac{dt}{t\sqrt{1+t^2}}\ln( \sqrt{1+t^2} + t )$

Integration by parts ,

$\displaystyle I(1) - I(0) = \left[- \ln(\sqrt{1+t^2} + t ) \ln( \sqrt{1+1/t^2 } +1/t ) \right]_0^1 +\int_0^1 \ln( \sqrt{1+1/t^2 } +1/t ) \frac{dt}{\sqrt{1+t^2}}$

$\displaystyle = -2K + \int_0^1 \ln( \sqrt{1+1/t^2 } +1/t ) \frac{dt}{\sqrt{1+t^2}}$

Sub. $\displaystyle t = 1/x$

$\displaystyle I(1) - I(0) = -2K + \int_1^{\infty} \ln( \sqrt{1+x^2} + x ) \frac{dx}{x\sqrt{1+x^2} } = -2K + I(\infty) - I(1)$

But

$\displaystyle I(1) = \int_0^1 \frac{\tan^{-1}(x)}{\sqrt{1-x^2}}~dx$

$\displaystyle I(0) = 0$

$\displaystyle I(\infty) = \frac{\pi}{2} \int_0^1 \frac{dx}{\sqrt{1-x^2 } } = \frac{\pi^2}{4}$ so we have

$\displaystyle 2 I(1) = -2K + \frac{\pi^2}{4}$

$\displaystyle \int_0^1 \frac{\tan^{-1}(x)}{\sqrt{1-x^2}}~dx = \frac{\pi^2}{8} - K$
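Not part of the original posts: the three identities can be sanity-checked numerically. The substitutions below ($\displaystyle x = 1/\sin\theta$ for the first two integrals, $\displaystyle x = \sin\theta$ for the third) are my own choice, made only to remove the endpoint singularities so that Simpson's rule converges quickly.

```python
import math

def simpson(f, a, b, n=20000):
    # composite Simpson rule on [a, b] with n (even) panels
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

K = math.log(1 + math.sqrt(2)) ** 2 / 2  # K = ln^2(1+sqrt(2))/2 ≈ 0.3884

# First integral after x = 1/sin(t):
#   ln(x+sqrt(x^2-1)) = ln((1+cos t)/sin t),  dx/(x(x^2+1)) = sin(t)cos(t)/(1+sin^2 t) dt
def f1(t):
    s, c = math.sin(t), math.cos(t)
    return 0.0 if s == 0.0 else math.log((1 + c) / s) * s * c / (1 + s * s)

# Second integral after x = 1/sin(t); third after x = sin(t)
f2 = lambda t: math.atan(1 / math.sin(t)) if math.sin(t) else math.pi / 2
f3 = lambda t: math.atan(math.sin(t))

I1 = simpson(f1, 0.0, math.pi / 2)
I2 = simpson(f2, 0.0, math.pi / 2)
I3 = simpson(f3, 0.0, math.pi / 2)
```

All three values agree with $K$, $\frac{\pi^2}{8}+K$ and $\frac{\pi^2}{8}-K$ to several digits; note also the consistency check $I_2 + I_3 = \frac{\pi^2}{4}$, which follows from $\tan^{-1}(x)+\tan^{-1}(1/x)=\frac{\pi}{2}$.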
• Dec 15th 2010, 09:41 PM
Drexel28
This may be too easy, but I find this one surprising.

Compute $\displaystyle \displaystyle \lim_{\lambda\to\infty}\int_0^1|\sin(\lambda x)|\text { }dx$
• Dec 15th 2010, 11:36 PM
simplependulum
I don't know whether my solution is correct, so please take a look.

Let $\displaystyle k = [\frac{t}{\pi}]$ and $\displaystyle a = \frac{\pi}{t}$

$\displaystyle I(t) =\int_0^1 |\sin(tx)|~dx$

$\displaystyle = \left( \int_0^a + \int_a^{2a} + ... + \int_{(k-1)a}^{ka} + \int_{ka}^1 \right) |\sin(tx)|~dx$

$\displaystyle = k(\frac{2}{t} ) + \int_{ka}^1 |\sin(tx)|~dx$

so $\displaystyle \frac{2k}{t} \leq I(t) < \frac{2k}{t } + \frac{2}{t}$

$\displaystyle \displaystyle{ \lim_{t\to\infty} \frac{2k}{t} \leq \lim_{t\to\infty} I(t) < \lim_{t\to\infty} \left[\frac{2k}{t} + \frac{2}{t}\right] }$

But $\displaystyle \frac{1}{\pi} - \frac{1}{t} < \frac{k}{t} \leq \frac{1}{\pi}$ so

$\displaystyle \displaystyle{ \lim_{t\to\infty} \frac{2k}{t} = \frac{2}{\pi} }$ and

$\displaystyle \displaystyle{ \lim_{t\to\infty} I(t) = \frac{2}{\pi}}$
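A sketch of my own (not from the thread) that makes the partition argument concrete: each full half-period of $|\sin|$ contributes $2$, and the leftover piece over $[k\pi, t]$ integrates to $1-\cos(t-k\pi)$, which gives a closed form for $I(t)$ that can be checked against brute-force quadrature.

```python
import math

def I_exact(t):
    # I(t) = (1/t) * ∫_0^t |sin u| du; each half-period contributes 2 and the
    # leftover piece over [k*pi, t] integrates to 1 - cos(t - k*pi)
    k = math.floor(t / math.pi)
    return (2 * k + 1 - math.cos(t - k * math.pi)) / t

def I_brute(t, n=200000):
    # composite midpoint rule for ∫_0^1 |sin(t x)| dx
    h = 1.0 / n
    return sum(abs(math.sin(t * (i + 0.5) * h)) for i in range(n)) * h
```

The post's inequality gives $|I(t)-\frac{2}{\pi}| < \frac{2}{t}$, so taking $t$ large shows the convergence to $\frac{2}{\pi}$ directly.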
• Dec 15th 2010, 11:48 PM
Drexel28
Quote:

Originally Posted by simplependulum
I don't know whether my solution is correct, so please take a look.

Let $\displaystyle k = [\frac{t}{\pi}]$ and $\displaystyle a = \frac{\pi}{t}$

$\displaystyle I(t) =\int_0^1 |\sin(tx)|~dx$

$\displaystyle = \left( \int_0^a + \int_a^{2a} + ... + \int_{(k-1)a}^{ka} + \int_{ka}^1 \right) |\sin(tx)|~dx$

$\displaystyle = k(\frac{2}{t} ) + \int_{ka}^1 |\sin(tx)|~dx$

so $\displaystyle \frac{2k}{t} \leq I(t) < \frac{2k}{t } + \frac{2}{t}$

$\displaystyle \displaystyle{ \lim_{t\to\infty} \frac{2k}{t} \leq \lim_{t\to\infty} I(t) < \lim_{t\to\infty} \left[\frac{2k}{t} + \frac{2}{t}\right] }$

But $\displaystyle \frac{1}{\pi} - \frac{1}{t} < \frac{k}{t} \leq \frac{1}{\pi}$ so

$\displaystyle \displaystyle{ \lim_{t\to\infty} \frac{2k}{t} = \frac{2}{\pi} }$ and

$\displaystyle \displaystyle{ \lim_{t\to\infty} I(t) = \frac{2}{\pi}}$

It looks similar to mine. Could you explain it a bit more? In particular, what does the notation mean, especially in the second step (the sum of integrals in the parentheses)? Anyways, here's mine

Spoiler:

If $\displaystyle L$ is our limit then, assuming that $\displaystyle \lambda>\pi$,

\displaystyle \displaystyle \begin{aligned}L=\lim_{\lambda\to\infty}\frac{1}{\lambda}\int_0^{\lambda}|\sin(x)|\text{ }dx &=\lim_{\lambda\to\infty}\left\{\frac{1}{\lambda}\sum_{k=1}^{\left\lfloor\frac{\lambda}{\pi}\right\rfloor}\int_{\pi(k-1)}^{\pi k}|\sin(x)|\text{ }dx+\frac{1}{\lambda}\int_{\pi\left\lfloor\frac{\lambda}{\pi}\right\rfloor}^{\lambda}|\sin(x)|\text{ }dx\right\}\\ &= \lim_{\lambda\to\infty}\left\{\frac{2\left\lfloor\frac{\lambda}{\pi}\right\rfloor}{\lambda}+\frac{1}{\lambda}\int_{\pi\left\lfloor\frac{\lambda}{\pi}\right\rfloor}^{\lambda}|\sin(x)|\text{ }dx\right\}\\ &=\lim_{\lambda\to\infty}\left\{ \frac{2}{\pi}+\frac{\pi \xi_{\lambda}}{\lambda}\right\}\end{aligned}

where $\displaystyle 0\leqslant \xi_{\lambda}\leqslant 2$, from which it follows that $\displaystyle \displaystyle L=\frac{2}{\pi}$.

• Dec 16th 2010, 12:06 AM
simplependulum
Quote:

Originally Posted by Drexel28
It looks similar to mine. Could you explain it a bit more? In particular, what does the notation mean, especially in the second step (the sum of integrals in the parentheses)? Anyways, here's mine

Sure, my idea is similar to yours:
As the period of the integrand is $\displaystyle a = \frac{\pi}{t}$ ( $\displaystyle t$ here is the same as $\displaystyle \lambda$ ), I first partitioned the interval $\displaystyle (0,1)$ into $\displaystyle (0,a) , (a,2a) , \dots , ((k-1)a,ka) , (ka,1)$ . For $\displaystyle x \in (0,a)$ we have $\displaystyle 0 < tx < ta = \pi$ , which yields $\displaystyle \sin(tx) > 0$ , so the first $\displaystyle k$ summands all have the same value ( $\displaystyle = \int_0^a \sin(tx)~dx = \frac{2}{t}$ ). The last summand, $\displaystyle \int_{ka}^1 |\sin(tx)|~dx$ , is less than $\displaystyle \int_0^a \sin(tx)~dx = \frac{2}{t}$ . Therefore $\displaystyle \frac{2k}{t} \leq I(t) < \frac{2k}{t} + \frac{2}{t}$ , and then I used the squeeze principle to evaluate the limit.
• Dec 16th 2010, 12:09 AM
Drexel28
Quote:

Originally Posted by simplependulum
Sure, my idea is:

First partition the interval $\displaystyle (0,1)$ into $\displaystyle (0,a) , (a,2a) , \dots , ((k-1)a,ka) , (ka,1)$ with $\displaystyle a = \frac{\pi}{t}$ ( $\displaystyle t$ here is the same as $\displaystyle \lambda$ ). We find that the values of the first $\displaystyle k$ summands are the same ( $\displaystyle = \int_0^a \sin(tx)~dx = \frac{2}{t}$ ). The last summand, $\displaystyle \int_{ka}^1 |\sin(tx)|~dx$ , is less than $\displaystyle \int_0^a \sin(tx)~dx = \frac{2}{t}$ . Therefore $\displaystyle \frac{2k}{t} \leq I(t) < \frac{2k}{t} + \frac{2}{t}$ , and then I used the squeeze principle to evaluate the limit.

Ah, I understand better now. Thanks.
• Dec 21st 2010, 01:38 AM
simplependulum
Three Integrals
What about these? The first one is very easy, but as we generalize the integral it becomes harder and harder...

1. $\displaystyle I_n = \int_0^{\pi} \frac{\sin(nx)}{\sin(x)}~dx$

2. $\displaystyle I_n = \int_0^{\pi} \frac{\sin^2(nx)}{\sin^2(x)}~dx$

3. $\displaystyle I_n = \int_0^{\pi} \frac{\sin^3(nx)}{\sin^3(x)}~dx$
• Dec 21st 2010, 02:57 AM
PaulRS
Quote:

Originally Posted by simplependulum
What about these? The first one is very easy, but as we generalize the integral it becomes harder and harder...

1. $\displaystyle I_n = \int_0^{\pi} \frac{\sin(nx)}{\sin(x)}~dx$

2. $\displaystyle I_n = \int_0^{\pi} \frac{\sin^2(nx)}{\sin^2(x)}~dx$

3. $\displaystyle I_n = \int_0^{\pi} \frac{\sin^3(nx)}{\sin^3(x)}~dx$

See here
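The linked solution isn't reproduced in this excerpt, so here is a numerical probe of my own. The closed forms asserted below for the first two families are classical ($\pi$ for odd $n$, $0$ for even $n$; and $n\pi$, the Fejér kernel integral); for the cubes, the values at $n=1,3,5$ come out as $\pi, 7\pi, 19\pi$, consistent with $\frac{(3n^2+1)\pi}{4}$ for odd $n$.

```python
import math

def I(p, n, m=100000):
    # composite midpoint rule for ∫_0^pi (sin(n x)/sin(x))^p dx; midpoints
    # avoid the removable singularities of sin(nx)/sin(x) at x = 0 and x = pi
    h = math.pi / m
    tot = 0.0
    for i in range(m):
        x = (i + 0.5) * h
        tot += (math.sin(n * x) / math.sin(x)) ** p
    return tot * h
```

Since each integrand is a trigonometric polynomial, the midpoint rule here is accurate essentially to machine precision.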
• Dec 21st 2010, 12:49 PM
Drexel28
This also might be too easy, but how about $\displaystyle \displaystyle \lim_{x\to 0}\frac{x^n}{\Gamma(x^n+1)-1}$. Or, how about $\displaystyle \displaystyle \sum_{n=1}^{\infty}\frac{\alpha n+\beta}{n(n+1)\cdots(n+\ell)},\text{ }\ell\in\mathbb{N},\text{ }\ell\geqslant 3$
• Dec 22nd 2010, 01:32 PM
Unbeatable0
Quote:

Originally Posted by Drexel28
This also might be too easy, but how about $\displaystyle \displaystyle \lim_{x\to 0}\frac{x^n}{\Gamma(x^n+1)-1}$. Or, how about $\displaystyle \displaystyle \sum_{n=1}^{\infty}\frac{\alpha n+\beta}{n(n+1)\cdots(n+\ell)},\text{ }\ell\in\mathbb{N},\text{ }\ell\geqslant 3$

For the limit, use L'Hospital's rule and see that this is equal to $\displaystyle \frac{1}{\Gamma'(1)} = -\frac{1}{\gamma}$
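Not from the thread: since $u = x^n \to 0$, the claim reduces to $\displaystyle \lim_{u\to 0} \frac{u}{\Gamma(1+u)-1} = \frac{1}{\Gamma'(1)} = -\frac{1}{\gamma}$, which a quick numerical sketch confirms.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

def ratio(u):
    # u / (Gamma(1+u) - 1); near u = 0, Gamma(1+u) ≈ 1 - EULER_GAMMA * u
    return u / (math.gamma(1 + u) - 1)

approx = ratio(1e-6)  # should be close to -1/EULER_GAMMA ≈ -1.7325
```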

For the sum, we first evaluate:

$\displaystyle \displaystyle{S(\ell) = \sum_{n=1}^\infty \frac{1}{n(n+1)\cdots (n+\ell)}}$

Consider the following polynomial in $\displaystyle n$:

$\displaystyle \displaystyle{P(n) = \sum_{k=0}^\ell (-1)^k\binom{\ell}{k}\prod_{\substack{r=0\\r\ne k}}^\ell (n+r)}$

Then we have:

$\displaystyle \displaystyle{\deg P \le \ell\:\:\:\:;\:\:\:\:\:P(0)=P(-1)=P(-2)=\dots =P(-\ell) = \ell !}$

and therefore

$\displaystyle \displaystyle{\ell ! \equiv P(n) = \sum_{k=0}^\ell (-1)^k\binom{\ell}{k}\prod_{\substack{r=0\\r\ne k}}^\ell (n+r)}$

Divide both sides by

$\displaystyle \displaystyle{\ell ! \prod_{k=0}^\ell (n+k)}$

to get the following identity:

$\displaystyle \displaystyle{ \frac{1}{\displaystyle\prod_{k=0}^\ell (n+k)} = \frac{1}{\ell !}\sum_{k=0}^\ell \frac{(-1)^k \binom{\ell}{k}}{n+k} }$

from which we obtain:

$\displaystyle \displaystyle{ S(\ell) = \sum_{n=1}^\infty \frac{1}{n(n+1)\cdots (n+\ell)} = \frac{1}{\ell !}\sum_{k=0}^\ell\left[ (-1)^k \binom{\ell}{k}\sum_{n=1}^\infty \frac{1}{n+k}\right] }$

$\displaystyle \displaystyle{ = \lim_{m\rightarrow\infty} \frac{1}{\ell !}\sum_{k=0}^\ell\left[ (-1)^k \binom{\ell}{k}(H_{m+k}-H_{k})\right] }$

where $\displaystyle H_n$ is the $\displaystyle n$'th harmonic number (setting $\displaystyle H_0 = 0$).

Substituting $\displaystyle H_{m+k} = (H_{m+k}-H_{m}) + H_{m}$ and using

$\displaystyle \displaystyle{ \lim_{m\rightarrow\infty} (H_{m+k}-H_{m}) = 0 }$

together with

$\displaystyle \displaystyle{ \sum_{k=0}^\ell (-1)^k\binom{\ell}{k} = 0 }$

we get

$\displaystyle \displaystyle{ S(\ell) = -\frac{1}{\ell !}\sum_{k=0}^\ell (-1)^k\binom{\ell}{k}H_{k} = \frac{1}{\ell !}\sum_{k=0}^\ell \left[(-1)^k\binom{\ell}{k}\int_0^1 \frac{t^k-1}{1-t}dt\right] }$

$\displaystyle \displaystyle{ =\frac{1}{\ell !}\int_0^1 \frac{\displaystyle\sum_{k=0}^\ell (-t)^k\binom{\ell}{k}}{1-t}dt = \frac{1}{\ell !}\int_0^1 \frac{(1-t)^\ell}{1-t}dt }$

$\displaystyle \displaystyle{ = \frac{1}{\ell !}\int_0^1 t^{\ell-1}dt = \frac{1}{\ell \cdot \ell !} }$

Therefore in the proposed problem:

$\displaystyle \displaystyle{\sum_{n=1}^{\infty}\frac{\alpha n+\beta}{n(n+1)\cdots(n+\ell)}}$

$\displaystyle \displaystyle{ = \alpha\sum_{n=1}^{\infty}\frac{1}{(n+1)\cdots(n+\ell)}+\beta\sum_{n=1}^{\infty}\frac{1}{n(n+1)\cdots(n+\ell)}}$

$\displaystyle \displaystyle{ =\alpha \left(S(\ell-1)-\frac{1}{\ell!}\right)+\beta S(\ell) = \frac{1}{\ell !}\left(\frac{\alpha}{\ell-1}+\frac{\beta}{\ell}\right) }$
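A numerical check of my own (not from the thread) of both the intermediate result $\displaystyle S(\ell) = \frac{1}{\ell\cdot\ell!}$ and the final closed form; the test values $\alpha=2$, $\beta=5$, $\ell=3$ are arbitrary.

```python
import math

def term_prod(n, l):
    # n (n+1) ... (n+l)
    p = 1.0
    for j in range(l + 1):
        p *= n + j
    return p

def S(l, N=20000):
    # partial sum of sum_{n>=1} 1/(n(n+1)...(n+l)); the tail is O(1/(l N^l))
    return sum(1.0 / term_prod(n, l) for n in range(1, N + 1))

def full_sum(alpha, beta, l, N=20000):
    # partial sum of sum_{n>=1} (alpha n + beta)/(n(n+1)...(n+l))
    return sum((alpha * n + beta) / term_prod(n, l) for n in range(1, N + 1))

l, alpha, beta = 3, 2.0, 5.0
closed = (alpha / (l - 1) + beta / l) / math.factorial(l)  # = 4/9 for these values
```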
• Dec 22nd 2010, 07:31 PM
Drexel28
Quote:

Originally Posted by Unbeatable0
For the limit, use L'Hospital's rule and see that this is equal to $\displaystyle \frac{1}{\Gamma'(1)} = -\frac{1}{\gamma}$

We don't use L'Hopital's around these parts! That said, it was so easy that I might as well accept it. >:)

Also, I like your method! It was totally different from what I did! Which was:

Spoiler:

Note that our sum can be written $\displaystyle \displaystyle K(\ell)+J(\ell)$ where $\displaystyle \displaystyle K(\ell)=\alpha\sum_{n=1}^{\infty}\frac{1}{(n+1)\cdots(n+\ell)}$ and $\displaystyle \displaystyle J(\ell)=\beta\sum_{n=1}^{\infty}\frac{1}{n(n+1)\cdots(n+\ell)}$. Note firstly then that

\displaystyle \displaystyle \begin{aligned}K(\ell) &= \frac{\alpha}{(\ell-1)!}\sum_{n=1}^{\infty}\frac{n!(\ell-1)!}{(n+\ell)!}\\ &= \frac{\alpha}{(\ell-1)!}\sum_{n=1}^{\infty}\frac{\Gamma(n+1)\Gamma(\ell)}{\Gamma(n+\ell+1)}\\ &= \frac{\alpha}{(\ell-1)!}\sum_{n=1}^{\infty}B(n+1,\ell)\\ &= \frac{\alpha}{(\ell-1)!}\sum_{n=1}^{\infty}\int_0^1 t^{n}(1-t)^{\ell-1}\text{ }dt\\ &= \frac{\alpha}{(\ell-1)!}\int_0^1 (1-t)^{\ell-1}\sum_{n=1}^{\infty}t^n\text{ }dt\\ &=\frac{\alpha}{(\ell-1)!}\int_0^1 (1-t)^{\ell-2}-(1-t)^{\ell-1}\text{ }dt\\ &= \frac{\alpha}{(\ell-1)!}\cdot \frac{1}{\ell(\ell-1)}\\ &= \frac{\alpha}{\ell! (\ell-1)}\end{aligned}

and similarly

\displaystyle \displaystyle \begin{aligned}J(\ell) &= \frac{\beta}{\ell!}\sum_{n=1}^{\infty}\frac{(n-1)!\,\ell!}{(n+\ell)!}\\ &= \frac{\beta}{\ell!}\sum_{n=1}^{\infty}\frac{\Gamma(n)\Gamma(\ell+1)}{\Gamma(n+\ell+1)}\\ &= \frac{\beta}{\ell!}\sum_{n=1}^{\infty}B(n,\ell+1)\\ &= \frac{\beta}{\ell!}\sum_{n=1}^{\infty}\int_0^1 t^{n-1}(1-t)^\ell\text{ }dt\\ &= \frac{\beta}{\ell!}\int_0^1 (1-t)^{\ell}\sum_{n=1}^{\infty}t^{n-1}\text{ }dt\\ &= \frac{\beta}{\ell!}\int_0^1 (1-t)^{\ell-1}\text{ }dt\\ &= \frac{\beta}{\ell!\ell}\end{aligned}

Thus,

$\displaystyle \displaystyle \sum_{n=1}^{\infty}\frac{\alpha n+\beta}{n(n+1)\cdots(n+\ell)}=K(\ell)+J(\ell)=\frac{\alpha}{\ell!(\ell-1)}+\frac{\beta}{\ell!\ell}=\frac{1}{\ell!}\left(\frac{\alpha}{\ell-1}+\frac{\beta}{\ell}\right)$

Same as yours!

I propose three more problems in increasing difficulty:

This first one is super-easy but try to do it in a non-obvious way. Anyways, it's an interesting result!

Problem 1: Prove that for $\displaystyle n\in\mathbb{N}$

$\displaystyle \displaystyle \sum_{k=1}^{n}{n\choose k}\frac{(-1)^{k+1}}{k}=\sum_{k=0}^{n-1}\frac{1}{k+1}$

Problem 2: For $\displaystyle k,n\in\mathbb{N}$ such that $\displaystyle n-k-1\geqslant 1$ compute

$\displaystyle \displaystyle \sum_{t=0}^{n-k-1}\frac{(-1)^t}{k+t+1}{{n-k-1}\choose t}$

Have fun with this one (you won't, unless you 'see it')

Problem 3: Compute

$\displaystyle \displaystyle \lim_{n\to\infty}(n+1)\int_0^1 x^n \Gamma^3(x+1)\zeta^3(x+3)\text{ }dx$
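No solution to Problem 3 appears in this excerpt. A numerical experiment of my own: by the standard fact that $(n+1)\int_0^1 x^n g(x)\,dx \to g(1)$ for continuous $g$, the limit should be $\Gamma^3(2)\zeta^3(4) = \zeta^3(4)$. The sketch below checks this with a crude zeta implementation (truncated Dirichlet series plus an Euler–Maclaurin tail) and one Richardson extrapolation step to kill the $O(1/n)$ error.

```python
import math

def zeta(s, N=500):
    # truncated Dirichlet series with Euler–Maclaurin tail corrections
    return (sum(k ** -s for k in range(1, N + 1))
            + N ** (1 - s) / (s - 1) - 0.5 * N ** -s + s * N ** (-s - 1) / 12)

def f(x):
    return (math.gamma(x + 1) * zeta(x + 3)) ** 3

M = 4000                       # Simpson panels (M even)
XS = [i / M for i in range(M + 1)]
FS = [f(x) for x in XS]        # cache f on the grid; reused for every n

def A(n):
    # Simpson approximation of (n+1) * ∫_0^1 x^n f(x) dx
    tot = 0.0
    for i, (x, fx) in enumerate(zip(XS, FS)):
        w = 1 if i in (0, M) else (4 if i % 2 else 2)
        tot += w * x ** n * fx
    return (n + 1) * tot / (3 * M)

limit_guess = zeta(4.0) ** 3   # = (pi^4/90)^3
extrap = 2 * A(200) - A(100)   # Richardson step: cancels the O(1/n) term
```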
• Dec 23rd 2010, 04:44 AM
Unbeatable0
I like your method very much! Although I liked my solution for being elementary, yours is far more elegant.

Quote:

Originally Posted by Drexel28
This first one is super-easy but try to do it in a non-obvious way. Anyways, it's an interesting result!

What is an obvious way for this? The following?

$\displaystyle \displaystyle{ \sum_{k=0}^{n-1}\frac{1}{k+1} = \sum_{k=0}^{n-1}\int_0^1 x^k dx = \int_0^1\frac{x^n-1}{x-1}dx }$

and then substituting $\displaystyle x\mapsto 1-x$ and using the binomial theorem gives the result.
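The identity itself can be verified exactly in rational arithmetic (my own check, not from the thread):

```python
from fractions import Fraction
from math import comb

def lhs(n):
    # alternating binomial sum: sum_{k=1}^n C(n,k) (-1)^(k+1) / k
    return sum(Fraction((-1) ** (k + 1) * comb(n, k), k) for k in range(1, n + 1))

def rhs(n):
    # harmonic number H_n = sum_{k=0}^{n-1} 1/(k+1)
    return sum(Fraction(1, k + 1) for k in range(n))
```

Both sides agree exactly for every $n$ tested.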

For problem 2, it's the same method:

$\displaystyle \displaystyle{ \sum_{t=0}^{n-k-1}\frac{(-1)^t}{k+t+1}\binom{n-k-1}{t} = (-1)^k\sum_{t=0}^{n-k-1}\int_0^1 (-x)^{k+t}\binom{n-k-1}{t}dx }$

$\displaystyle \displaystyle{ = (-1)^k\int_0^1 (-x)^k \sum_{t=0}^{n-k-1}(-x)^t\binom{n-k-1}{t}dx }$

$\displaystyle \displaystyle{ = \int_0^1 x^k(1-x)^{n-k-1}dx }$

$\displaystyle \displaystyle{ = B(k+1,n-k) = \frac{k!(n-k-1)!}{n!} }$

Was this your method? If not, I'm interested to see how you solved it.

I'd be glad to post some problems, but none have come to my mind thus far, and in the last days I haven't encountered any challenge problems which would fit this thread. I will, however, keep looking for nice problems to post in here.
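As an exact sanity check on the Problem 2 computation above (my own code, not from the thread), the Beta-function closed form $B(k+1,n-k) = \frac{k!\,(n-k-1)!}{n!}$ can be verified in rational arithmetic:

```python
from fractions import Fraction
from math import comb, factorial

def lhs(n, k):
    # sum_{t=0}^{n-k-1} (-1)^t / (k+t+1) * C(n-k-1, t)
    return sum(Fraction((-1) ** t * comb(n - k - 1, t), k + t + 1)
               for t in range(n - k))

def rhs(n, k):
    # Beta-function value B(k+1, n-k) = k! (n-k-1)! / n!
    return Fraction(factorial(k) * factorial(n - k - 1), factorial(n))
```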
• Dec 23rd 2010, 11:02 AM
Drexel28
Quote:

Originally Posted by Unbeatable0
I like your method very much! Although I liked my solution for being elementary, yours is far more elegant.

Thank you!

Quote:

Was this your method? If not, I'm interested to see how you solved it.
I had a mental lapse. There is another problem which is super easy to do using analysis/calculus but has other interesting ways to do it. It's (go ahead and try it) compute

$\displaystyle \displaystyle \sum_{j=1}^{n}j{j\choose n}$

Although I just realized you can do this by a little trickery similar to what you did for your proof of the previous problem using Retkes's identities.

Quote:

I'd be glad to post some problems, but none have come to my mind thus far, and in the last days I haven't encoutered with any challenge problems which would fit this thread. I will, however, keep looking for nice problems to post in here.
Here is quite a beautiful one (in my opinion).

Problem: Let $\displaystyle \Omega\subseteq\mathbb{R}^n$ be the set of all $\displaystyle (x_1,\cdots,x_n)$ such that $\displaystyle x_j\geqslant 0,\text{ }j\in[n]$ and $\displaystyle \displaystyle \sum_{j=1}^{n}x_j=1$. Compute

$\displaystyle \displaystyle \int\cdots\int_{\Omega}x_1^{\alpha_1-1}\cdots x_n^{\alpha_n-1}\text{ }dx_1\cdots dx_n$

where $\displaystyle \text{Re }\alpha_k>0,\text{ }k\in[n]$.
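Not from the thread: interpreting the integral over the simplex in the usual way (eliminating $x_n = 1 - x_1 - \cdots - x_{n-1}$), this is the classical Dirichlet integral, whose value should be $\frac{\Gamma(\alpha_1)\cdots\Gamma(\alpha_n)}{\Gamma(\alpha_1+\cdots+\alpha_n)}$. A numerical sketch of the $n=3$ case with integer parameters:

```python
import math

def simpson(f, a, b, m=200):
    # composite Simpson rule on [a, b] with m (even) panels
    h = (b - a) / m
    s = f(a) + f(b)
    for i in range(1, m):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

def dirichlet3(a, b, c):
    # ∫∫_{x,y>=0, x+y<=1} x^(a-1) y^(b-1) (1-x-y)^(c-1) dy dx,
    # i.e. the n = 3 simplex integral with x_3 = 1 - x_1 - x_2
    inner = lambda x: simpson(lambda y: x ** (a - 1) * y ** (b - 1)
                              * (1 - x - y) ** (c - 1), 0.0, 1.0 - x)
    return simpson(inner, 0.0, 1.0)

def gamma_formula(a, b, c):
    return math.gamma(a) * math.gamma(b) * math.gamma(c) / math.gamma(a + b + c)
```

For example, $\alpha = (2,2,2)$ gives $\frac{\Gamma(2)^3}{\Gamma(6)} = \frac{1}{120}$, matching the quadrature.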
• Dec 23rd 2010, 11:31 AM
Unbeatable0
Quote:

Originally Posted by Drexel28
$\displaystyle \displaystyle \sum_{j=1}^{n}j{j\choose n}$

Although I just realized you can do this by a little trickery similar to what you did for your proof of the previous problem using Retkes's identities.

I looked up Retkes's identities, but don't understand how you meant to apply them. Care to clarify?

Also, you had a little typo there - the binomial should have been upside down.

Besides the usual proof for this identity, using the derivative of

$\displaystyle \displaystyle{ \sum_{k=0}^n \binom{n}{k} x^k = (1+x)^n }$

I've just found a much more elegant (in my opinion) method: let $\displaystyle k\mapsto n-k$ in the summation to get:

$\displaystyle \displaystyle{ \sum_{k=1}^n k\binom{n}{k} = \sum_{k=0}^n k\binom{n}{k} = \sum_{k=0}^n (n-k)\binom{n}{k} = n\sum_{k=0}^n \binom{n}{k} - \sum_{k=0}^n k\binom{n}{k} }$

Now using

$\displaystyle \displaystyle \sum_{k=0}^n \binom{n}{k} = 2^n$

gives the result.

Edit: an easy problem - prove the following generalization:
Edit 2: there was a little mistake here. Fixed it now.

$\displaystyle \displaystyle{ \sum_{k=0}^n k^m\binom{n}{k}= 2^nn!\sum_{k=0}^m\frac{\left\{{m\atop k}\right\} }{2^k(n-k)!} }$

where $\displaystyle \left\{{m\atop k}\right\}$ are the Stirling numbers of the second kind, $\displaystyle m\in\mathbb{N}_0$.
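Reading the summand on the left as $k^m\binom{n}{k}$ (the left side must depend on $k$ for the Stirling-number right-hand side to match), the generalization can be checked exactly, along with the base case $\sum_k k\binom{n}{k} = n2^{n-1}$ from above. This is my own check, not from the thread.

```python
from fractions import Fraction
from math import comb, factorial
from functools import lru_cache

@lru_cache(maxsize=None)
def stirling2(m, k):
    # Stirling numbers of the second kind via S(m,k) = k*S(m-1,k) + S(m-1,k-1)
    if m == 0:
        return 1 if k == 0 else 0
    if k == 0:
        return 0
    return k * stirling2(m - 1, k) + stirling2(m - 1, k - 1)

def lhs(n, m):
    # sum_{k=0}^n k^m C(n,k)   (with the convention 0^0 = 1)
    return sum(k ** m * comb(n, k) for k in range(n + 1))

def rhs(n, m):
    # 2^n n! sum_{k=0}^m S(m,k) / (2^k (n-k)!)   (requires m <= n)
    return 2 ** n * factorial(n) * sum(
        Fraction(stirling2(m, k), 2 ** k * factorial(n - k)) for k in range(m + 1))
```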

I'd try the other problem you proposed, but I haven't yet covered multivariable integrals (Giggle).
• Dec 23rd 2010, 06:40 PM
Drexel28
Quote:

Originally Posted by Unbeatable0
I looked up Retkes's identities, but don't understand how you meant to apply them. Care to clarify?

Try applying Retkes' third identity (the one with the sum of the reciprocals) and recalling that $\displaystyle \Pi_j(1,\cdots,n)=(-1)^{n-j}(j-1)!(n-j)!$

Quote:

Also, you had a little typo there - the binomial should have been upside down.
Of course! The first proof is the one I had in mind and the second one is nice indeed! I did it by double counting.

Quote:

I'd try the other problem you proposed, but I haven't yet covered multivariable integrals (Giggle).
Haha, oh well!