To keep this thread from being moved to the second page of the subforum, I would like to give my solutions to the three integrals I posted previously. I hope more and more wonderful identities will be created here!

Problem :

Let $\displaystyle K = \frac{\ln^2(1+\sqrt{2})}{2} $
Show that :
$\displaystyle \int_1^{\infty} \frac{\ln(x+\sqrt{x^2 - 1} )}{x(x^2 +1 )}~dx = K$
$\displaystyle \int_1^{\infty}\frac{\tan^{-1}(x)}{x\sqrt{x^2 - 1 } }~dx = \frac{\pi^2}{8} + K $
$\displaystyle \int_0^1 \frac{\tan^{-1}(x)}{\sqrt{1-x^2} }~dx = \frac{\pi^2}{8} - K$
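Before the solutions, a quick numerical sanity check of the three identities. The substitutions in the comments are my own, chosen to tame the improper endpoints; they are not part of the original problems.

```python
import math

# Check the three claimed identities numerically. Substitutions (my own):
#   I1: x = cosh(u) turns ln(x + sqrt(x^2 - 1)) into u, giving
#       integral_0^inf  u*sinh(u) / (cosh(u)*(cosh(u)^2 + 1)) du
#   I2: x = sec(t), plus arctan(sec t) = pi/2 - arctan(cos t), gives
#       integral_0^{pi/2} (pi/2 - arctan(cos t)) dt
#   I3: x = sin(t) gives  integral_0^{pi/2} arctan(sin t) dt

def simpson(f, a, b, n=20_000):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for k in range(1, n):
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3

K = math.log(1 + math.sqrt(2)) ** 2 / 2
p = math.pi

I1 = simpson(lambda u: u * math.sinh(u) / (math.cosh(u) * (math.cosh(u) ** 2 + 1)), 0, 40)
I2 = simpson(lambda t: p / 2 - math.atan(math.cos(t)), 0, p / 2)
I3 = simpson(lambda t: math.atan(math.sin(t)), 0, p / 2)

print(I1 - K)                 # ~0
print(I2 - (p ** 2 / 8 + K))  # ~0
print(I3 - (p ** 2 / 8 - K))  # ~0
```

All three differences come out at roundoff level, which is reassuring before diving into the proofs.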

Solution to the first integral:

Spoiler:

$\displaystyle I = \int_1^{\infty} \frac{\ln(x + \sqrt{x^2 - 1} )}{x(x^2+1)}~dx $
Integration by parts,

It looks similar to mine. Could you explain it a bit more? In particular, what does the notation mean, especially in the second step (with the sum of integrals in the parentheses)? Anyway, here's mine:

Spoiler:

If $\displaystyle L$ is our limit then, assuming that $\displaystyle \lambda>\pi$,

Sure, my idea is similar to yours:
As the period of the integrand is $\displaystyle a = \frac{\pi}{t} $ ( $\displaystyle t $ here is the same as $\displaystyle \lambda $ ), I first partitioned the interval $\displaystyle (0,1)$ into $\displaystyle (0,a) , (a,2a) ,..., ((k-1)a,ka) , (ka,1) $. For $\displaystyle x \in (0,a) $ we have $\displaystyle 0 < tx < ta = \pi $, which yields $\displaystyle \sin(tx) > 0 $, so the first $\displaystyle k $ summands all have the same value ( $\displaystyle = \int_0^a \sin(tx)~dx = \frac{2}{t} $ ). The last summand, $\displaystyle \int_{ka}^1 |\sin(tx)|~dx $, is less than $\displaystyle \int_0^a \sin(tx)~dx = \frac{2}{t} $. Therefore $\displaystyle \frac{2k}{t} \leq I(t) < \frac{2k}{t} + \frac{2}{t} $, and I used the squeeze principle to evaluate the limit.
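A small numeric illustration of these partition bounds (my own sketch; `I` below just approximates $\displaystyle \int_0^1 |\sin(tx)|~dx$ with the midpoint rule):

```python
import math

# Squeeze bounds for I(t) = integral_0^1 |sin(t x)| dx with a = pi/t and
# k = floor(t/pi): the first k sub-intervals each contribute 2/t and the
# last one contributes less than 2/t, so
#   2k/t <= I(t) < 2(k+1)/t,
# and both bounds tend to 2/pi as t -> infinity.

def I(t, n=200_000):
    # midpoint rule; fine enough to resolve the oscillations for moderate t
    h = 1.0 / n
    return sum(abs(math.sin(t * (i + 0.5) * h)) for i in range(n)) * h

t = 500.0
k = math.floor(t / math.pi)
val = I(t)
print(2 * k / t <= val < 2 * (k + 1) / t)  # True: the partition bounds hold
print(abs(val - 2 / math.pi))              # small: I(t) is already near 2/pi
```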

This also might be too easy, but how about $\displaystyle \lim_{x\to 0}\frac{x^n}{\Gamma(x^n+1)-1}$? Or, how about $\displaystyle \sum_{n=1}^{\infty}\frac{\alpha n+\beta}{n(n+1)\cdots(n+\ell)},\text{ }\ell\in\mathbb{N},\text{ }\ell\geqslant 3$?

For the limit, use L'Hôpital's rule and see that this is equal to $\displaystyle \frac{1}{\Gamma'(1)} = -\frac{1}{\gamma}$.
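A numerical sketch of this (assuming $\displaystyle n > 0$ and $\displaystyle x \to 0^+$, so that $\displaystyle u = x^n \to 0^+$):

```python
import math

# lim_{u -> 0} u / (Gamma(u + 1) - 1) = 1 / Gamma'(1) = -1/gamma,
# since Gamma(1) = 1 and Gamma'(1) = -gamma (the Euler-Mascheroni constant).
gamma = 0.5772156649015329

def ratio(u):
    return u / (math.gamma(u + 1) - 1)

for u in (1e-2, 1e-4, 1e-6):
    print(u, ratio(u))  # tends to -1/gamma, about -1.7324547
```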

We don't use L'Hôpital's around these parts! That said, it was so easy that I might as well accept that.

Also, I like your method! It was totally different from what I did! Which was:

Spoiler:

Note that our sum can be written $\displaystyle K(\ell)+J(\ell)$ where $\displaystyle K(\ell)=\alpha\sum_{n=1}^{\infty}\frac{1}{(n+1)\cdots(n+\ell)}$ and $\displaystyle J(\ell)=\beta\sum_{n=1}^{\infty}\frac{1}{n(n+1)\cdots(n+\ell)}$. Note firstly then that
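Both pieces telescope. As a numerical sanity check, the closed forms in the comments below are my own computation, not quoted from the thread:

```python
from math import factorial

# Telescoping gives (my computation; l >= 2 for the first, l >= 1 for the second):
#   sum_{n>=1} 1/((n+1)(n+2)...(n+l)) = 1/((l-1) * l!)
#   sum_{n>=1} 1/(n(n+1)...(n+l))     = 1/(l * l!)
# so the original sum is alpha/((l-1)*l!) + beta/(l*l!).

def partial_sum(l, include_n, N=100_000):
    total = 0.0
    for n in range(1, N + 1):
        p = n if include_n else 1          # optional leading factor n
        for j in range(1, l + 1):
            p *= n + j                     # factors (n+1)...(n+l)
        total += 1.0 / p
    return total

l = 3
print(abs(partial_sum(l, False) - 1 / ((l - 1) * factorial(l))))  # ~0
print(abs(partial_sum(l, True) - 1 / (l * factorial(l))))         # ~0
```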

Was this your method? If not, I'm interested to see how you solved it.

I'd be glad to post some problems, but none have come to mind thus far, and in the last few days I haven't encountered any challenge problems that would fit this thread. I will, however, keep looking for nice problems to post here.

Last edited by Unbeatable0; Dec 23rd 2010 at 05:22 AM.

I like your method very much! Although I liked my solution for being elementary, yours is far more elegant.

Thank you!

I had a mental lapse. There is another problem which is super easy to do using analysis/calculus but has other interesting ways to do it. It's (go ahead and try it): compute

Although I just realized you can do this by a little trickery similar to what you did for your proof of the previous problem using Retkes's identities.

Here is quite a beautiful one (in my opinion).

Problem: Let $\displaystyle \Omega\subseteq\mathbb{R}^n$ be the set of all $\displaystyle (x_1,\cdots,x_n)$ such that $\displaystyle x_j\geqslant 0,\text{ }j\in[n]$ and $\displaystyle \sum_{j=1}^{n}x_j=1$. Compute

I looked up Retkes's identities, but don't understand how you meant to apply them. Care to clarify?

Also, you had a little typo there - the binomial should have been upside down.

Besides the usual proof for this identity, using the derivative of

Try applying Retkes's third identity (the one with the sum of the reciprocals) and recalling that $\displaystyle \Pi_j(1,\cdots,n)=(-1)^{n-j}(j-1)!(n-j)!$
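For reference, I'm reading $\displaystyle \Pi_j(x_1,\cdots,x_n)=\prod_{i\neq j}(x_j-x_i)$ (Retkes's usual notation; an assumption on my part, since the thread doesn't define it). Under that reading, the stated factorial identity checks out directly:

```python
from math import factorial

# Pi_j(x_1,...,x_n) = product over i != j of (x_j - x_i).  At x_i = i the
# factors with i < j multiply to (j-1)! and the factors with i > j multiply
# to (-1)^(n-j) * (n-j)!.

def Pi(j, n):
    p = 1
    for i in range(1, n + 1):
        if i != j:
            p *= j - i
    return p

n = 6
ok = all(Pi(j, n) == (-1) ** (n - j) * factorial(j - 1) * factorial(n - j)
         for j in range(1, n + 1))
print(ok)  # True
```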

Of course! The first proof is the one I had in mind and the second one is nice indeed! I did it by double counting.

I'd try the other problem you proposed, but I haven't yet covered multivariable integrals.