Originally Posted by **utopiaNow**

Prove that if $\displaystyle a,b$ are positive real numbers then $\displaystyle \lim_{n\rightarrow \infty} (a^n +b^n)^{1/n} = \max(a,b)$.

My attempt: without loss of generality, assume $\displaystyle a > b$. Then for sufficiently large $\displaystyle n$ we have $\displaystyle a^n + b^n \approx a^n$, hence $\displaystyle \lim_{n\rightarrow \infty} (a^n +b^n)^{1/n} = \lim_{n\rightarrow \infty}(a^n)^{1/n} = a$. However, the step "for sufficiently large $\displaystyle n$, we have $\displaystyle a^n + b^n \approx a^n$" does not seem rigorous to me, and I can't think of a way to prove this result more rigorously.
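For reference, one standard way to replace the "$\displaystyle \approx$" step with something rigorous is a two-sided (squeeze) bound; a sketch, using the same WLOG assumption $\displaystyle a \geq b > 0$:

```latex
% With a >= b > 0, bound the sum below and above by multiples of a^n:
a^n \le a^n + b^n \le 2a^n
\quad\Longrightarrow\quad
a \le (a^n + b^n)^{1/n} \le 2^{1/n}\, a.
% Since 2^{1/n} \to 1 as n \to \infty, the squeeze theorem gives
% \lim_{n \to \infty} (a^n + b^n)^{1/n} = a = \max(a, b).
```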

Any suggestions would be greatly appreciated.