Finding Zeroes

• Apr 6th 2010, 01:17 PM
EinStone
Finding Zeroes
Let $\displaystyle N \in \mathbb{N}$. Show that for $\displaystyle s \in \mathbb{C}$ the function $\displaystyle f(s) = \sum_{k=0}^N \frac{(-1)^k {N \choose k}}{k+s}$ has no zeroes.
• Apr 6th 2010, 06:00 PM
chiph588@
Quote:

Originally Posted by EinStone
Let $\displaystyle N \in \mathbb{N}$. Show that for $\displaystyle s \in \mathbb{C}$ the function $\displaystyle f(s) = \sum_{k=0}^N \frac{(-1)^k {N \choose k}}{k+s}$ has no zeroes.

Hint: put the sum under a common denominator.
• Apr 7th 2010, 12:55 AM
EinStone
I tried, but it's hard to see what the numerator is in that case, since we have $\displaystyle N+1$ terms. Any other hint?
• Apr 7th 2010, 03:22 AM
Laurent
The previous hint was a good one: as a matter of fact, $\displaystyle f(s)=\frac{N!}{s(s+1)\cdots(s+N)}$.
I can think of at least three different proofs of that identity; you should give it a try.
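As a quick sanity check (not part of the argument), the identity can be verified exactly at a few sample points using rational arithmetic; `f` and `F` below are hypothetical helper names for the two sides:

```python
from fractions import Fraction
from math import comb, factorial

def f(s, N):
    # Left side: the alternating binomial sum, evaluated exactly
    return sum(Fraction((-1)**k * comb(N, k), 1) / (k + s) for k in range(N + 1))

def F(s, N):
    # Right side: N! / (s (s+1) ... (s+N))
    denom = Fraction(1)
    for j in range(N + 1):
        denom *= (s + j)
    return Fraction(factorial(N)) / denom

# Compare at a few rational points s, away from the poles 0, -1, ..., -N
for N in (1, 3, 5):
    for s in (Fraction(1, 2), Fraction(7, 3), Fraction(-9, 4)):
        assert f(s, N) == F(s, N)
```

Since both sides are rational in $s$ with the same poles, agreement at sample points is of course only evidence, not a proof.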
• Apr 7th 2010, 03:33 AM
EinStone
Actually the whole idea of this problem was to prove exactly your identity. My idea was to prove that the zeroes and poles are the same. For the poles I was able to prove that, but for the zeroes I needed help. So I posted this thread, and now you tell me to prove the identity, which I really don't know how to do :).
• Apr 7th 2010, 07:57 AM
Laurent
Quote:

Originally Posted by EinStone
Actually the whole idea of this problem was to prove exactly your identity. My idea was to prove that the zeroes and poles are the same. For the poles I was able to prove that, but for the zeroes I needed help. So I posted this thread, and now you tell me to prove the identity, which I really don't know how to do :).

(Giggle)

a) The function $\displaystyle P(s)=s(s+1)\cdots(s+N)f(s)$ is a polynomial of degree at most $\displaystyle N$, and it is simple to check that it satisfies $\displaystyle P(0)=P(-1)=\cdots=P(-N)=N!$ (each time, only one term is non-zero). Thus $\displaystyle Q(s)=P(s)-N!$ is a polynomial of degree at most $\displaystyle N$ with $\displaystyle N+1$ roots, hence it is identically zero.
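The evaluations $\displaystyle P(0)=P(-1)=\cdots=P(-N)=N!$ can be checked exactly with a short sketch; the hypothetical helper `P` expands the product term by term so the poles of $f$ cancel:

```python
from fractions import Fraction
from math import comb, factorial

def P(s, N):
    # P(s) = s (s+1) ... (s+N) * f(s), expanded term by term:
    # P(s) = sum_k (-1)^k C(N,k) * prod_{j != k} (s + j)
    total = Fraction(0)
    for k in range(N + 1):
        term = Fraction((-1)**k * comb(N, k))
        for j in range(N + 1):
            if j != k:
                term *= (s + j)
        total += term
    return total

# P takes the value N! at the N+1 points 0, -1, ..., -N
N = 4
for j in range(N + 1):
    assert P(Fraction(-j), N) == factorial(N)
```

At $s=-j$ every term with $k\neq j$ contains the factor $(s+j)=0$, so only the $k=j$ term survives, which is exactly the "only one term is non-zero" observation above.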

Then I looked for a more explicit proof. It turns out to have calculus flavour.

b) For $\displaystyle s\in\mathbb{C}$ with $\displaystyle \operatorname{Re}(s)>0$, $\displaystyle \frac{1}{k+s}=\int_0^\infty e^{-(k+s)t}dt$, hence $\displaystyle f(s)=\int_0^\infty (1-e^{-t})^N e^{-st}dt=\int_0^1 (1-u)^N u^{s-1}du=B(N+1,s)$ (Euler Beta function, via the substitution $\displaystyle u=e^{-t}$), hence $\displaystyle f(s)=\frac{\Gamma(N+1)\Gamma(s)}{\Gamma(N+1+s)}=\frac{N!}{(N+s)(N-1+s)\cdots s}$ (functional equation of $\displaystyle \Gamma$, or multiple integrations by parts). Since both sides are rational functions of $\displaystyle s$, the identity extends from $\displaystyle \operatorname{Re}(s)>0$ to all of $\displaystyle \mathbb{C}$ away from the poles.
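Assuming the Beta-integral identity above, a crude midpoint-rule approximation of $\displaystyle \int_0^1 (1-u)^N u^{s-1}du$ (a numerical illustration only, for real $s>0$) agrees with the closed form; `beta_integral` and `F` are hypothetical helper names:

```python
from math import factorial

def beta_integral(s, N, steps=200000):
    # Midpoint-rule approximation of  integral_0^1 (1-u)^N u^(s-1) du,  real s > 0
    h = 1.0 / steps
    return sum((1 - (i + 0.5) * h)**N * ((i + 0.5) * h)**(s - 1) * h
               for i in range(steps))

def F(s, N):
    # Closed form: N! / (s (s+1) ... (s+N))
    denom = 1.0
    for j in range(N + 1):
        denom *= (s + j)
    return factorial(N) / denom

s, N = 2.5, 4
assert abs(beta_integral(s, N) - F(s, N)) < 1e-6
```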

And if you know the answer, you can also do it this way.

c) $\displaystyle F(s)=\frac{N!}{s(s+1)\cdots (s+N)}$ can be decomposed into partial fractions: $\displaystyle F(s)=\frac{a_0}{s}+\cdots+\frac{a_N}{s+N}$ (simple poles). And you note that $\displaystyle a_k=\Big((s+k)F(s)\Big)_{|s=-k}$ from this expression. Using this formula in $\displaystyle F(s)=\frac{N!}{s(s+1)\cdots (s+N)}$ gives $\displaystyle a_k=\frac{N!}{(-k)(-k+1)\cdots (-k+(k-1))(-k+(k+1))\cdots(-k+N)}$ $\displaystyle =\frac{N!}{(-1)^k k!\,(N-k)!}=(-1)^k{N\choose k}$, which are precisely the coefficients of $\displaystyle f(s)$, hence $\displaystyle F(s)=f(s)$.
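The residue computation can be double-checked exactly; `residue` below is a hypothetical helper implementing $\displaystyle a_k=\Big((s+k)F(s)\Big)_{|s=-k}$:

```python
from fractions import Fraction
from math import comb, factorial

def residue(k, N):
    # a_k = ((s+k) F(s)) at s = -k: cancel the factor (s+k) in the
    # denominator of F and evaluate the remaining product at s = -k
    denom = Fraction(1)
    for j in range(N + 1):
        if j != k:
            denom *= (-k + j)
    return Fraction(factorial(N)) / denom

# The residues match the coefficients of the alternating binomial sum
N = 6
for k in range(N + 1):
    assert residue(k, N) == (-1)**k * comb(N, k)
```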

There must be several other proofs...
• Apr 8th 2010, 12:42 AM
EinStone
OK, nice. Now that we have the identity:

$\displaystyle \int_0^N \left(1-\frac{t}{N} \right)^N t^{s-1}dt = \frac{N^s\,N!}{s(s+1)\cdots(s+N)}$

I need to show carefully that the limit $\displaystyle N \rightarrow \infty$ exists and yields the integral representation of the Gamma function.
This seems somewhat trivial to me, since $\displaystyle \left(1-\frac{t}{N} \right)^N \rightarrow e^{-t}$ as $\displaystyle N \rightarrow \infty$, so the left side should tend to $\displaystyle \int_0^\infty e^{-t}t^{s-1}dt$, which is just the integral definition of the Gamma function. Or am I wrong?
• Apr 8th 2010, 01:18 AM
Laurent
Quote:

Originally Posted by EinStone
OK, nice. Now that we have the identity:

$\displaystyle \int_0^N \left(1-\frac{t}{N} \right)^N t^{s-1}dt = \frac{N^s\,N!}{s(s+1)\cdots(s+N)}$

I need to show carefully that the limit $\displaystyle N \rightarrow \infty$ exists and yields the integral representation of the Gamma function.
This seems somewhat trivial to me, since $\displaystyle \left(1-\frac{t}{N} \right)^N \rightarrow e^{-t}$ as $\displaystyle N \rightarrow \infty$, so the left side should tend to $\displaystyle \int_0^\infty e^{-t}t^{s-1}dt$, which is just the integral definition of the Gamma function. Or am I wrong?

You also have to justify why the left hand side converges to $\displaystyle \int_0^\infty e^{-t}t^{s-1}dt$. You'll probably need the dominated convergence theorem for that. Use $\displaystyle 1-u\leq e^{-u}$, and write $\displaystyle \int_0^N (\cdots)dt= \int_0^\infty {\bf 1}_{[0,N]}(\cdots)dt$ to be in the setting of the dominated convergence theorem.
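For intuition (this illustrates, but does not replace, the dominated convergence argument), one can watch $\displaystyle \frac{N^s\,N!}{s(s+1)\cdots(s+N)}$ approach $\displaystyle \Gamma(s)$ numerically for real $s>0$; `gauss_term` is a hypothetical helper:

```python
from math import gamma

def gauss_term(s, N):
    # N^s * N! / (s (s+1) ... (s+N)), accumulated factor by factor
    # so we never form the huge integer N! as a float
    result = N**s / s
    for j in range(1, N + 1):
        result *= j / (s + j)
    return result

# Gauss's formula: the product tends to Gamma(s) as N grows (real s > 0 here)
s = 1.7
errors = [abs(gauss_term(s, N) - gamma(s)) for N in (10, 100, 1000)]
assert errors[0] > errors[1] > errors[2] and errors[2] < 1e-2
```

The error shrinks roughly like $1/N$, which is consistent with the dominated-convergence limit being correct.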
• Apr 8th 2010, 01:22 AM
EinStone
I don't know this theorem, and I'm not too familiar with Lebesgue integration either. Do you have any other approach?

EDIT: Actually, is the Lebesgue convergence theorem the same as the dominated convergence theorem? Because that one I know, and then it should be possible.