
Why not Arzelà-Ascoli?
I was wondering: a big part of the calculus of variations is deciding when a functional $\displaystyle \varphi:\mathcal{C}[a,b]\to\mathbb{R}$ attains a max/min. Though I have never actually tried it, if we are supplied with a family of functions $\displaystyle \mathcal{F}$ satisfying certain conditions, couldn't we use the Arzelà-Ascoli theorem to prove that $\displaystyle \mathcal{F}$ is compact? Then, for well-behaved functionals, such as continuous ones of the form $\displaystyle f\mapsto\int_a^b F(x,f(x),f'(x))\,dx$, wouldn't the compactness of $\displaystyle \mathcal{F}$ guarantee that a max and a min are attained on $\displaystyle \mathcal{F}$? I'm guessing that either (a) it is virtually impossible to prove that such a family of functions is compact, or (b) compactness almost never happens with the $\displaystyle \|\cdot\|_{\infty}$ norm.
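To spell out the argument I have in mind (the extra assumption that $\mathcal{F}$ is closed in the $\|\cdot\|_{\infty}$ norm is mine, added so the argument goes through):

```latex
% Assumptions (mine): F is a closed, uniformly bounded, equicontinuous
% subset of (C[a,b], ||.||_\infty).
\begin{enumerate}
  \item By Arzel\`a-Ascoli, $\mathcal{F}$ is relatively compact in
        $\bigl(\mathcal{C}[a,b],\|\cdot\|_{\infty}\bigr)$; being closed,
        it is compact.
  \item If $\varphi:\mathcal{C}[a,b]\to\mathbb{R}$ is continuous, then
        $\varphi(\mathcal{F})\subseteq\mathbb{R}$ is compact, hence
        closed and bounded.
  \item Therefore $\sup_{f\in\mathcal{F}}\varphi(f)$ and
        $\inf_{f\in\mathcal{F}}\varphi(f)$ are attained at some
        $f^{*}\in\mathcal{F}$ (extreme value theorem).
\end{enumerate}
```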
Anyone have any enlightening comments? It just amazes me that I've never seen this mentioned in my book.