I was wondering about something. A big part of the calculus of variations is deciding when a functional \varphi:\mathcal{C}[a,b]\to\mathbb{R} attains a max/min. I have never actually tried this, but if we are supplied with a family of functions \mathcal{F} satisfying certain conditions, couldn't we use the Arzelà-Ascoli theorem to prove that \mathcal{F} is compact? Then, for well-behaved functionals (say, continuous ones of the usual form f\mapsto\int_a^b F(x,f(x),f'(x))\,dx), wouldn't the compactness of \mathcal{F} guarantee that a max or min is attained on \mathcal{F}? I'm guessing it's either (A) virtually impossible in practice to prove that such a family of functions is compact, or (B) this almost never happens in the \|\cdot\|_{\infty} norm.
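To make the idea concrete, here is the kind of family I have in mind (my own toy example, not taken from any book):

```latex
\mathcal{F} = \bigl\{\, f \in \mathcal{C}[a,b] \;:\; \|f\|_\infty \le M,\quad
    |f(x)-f(y)| \le L\,|x-y| \ \text{for all } x,y \in [a,b] \,\bigr\}.
```

The uniform bound M gives uniform boundedness and the Lipschitz constant L gives equicontinuity, so by Arzelà-Ascoli every sequence in \mathcal{F} has a uniformly convergent subsequence; since both conditions survive uniform limits, \mathcal{F} is closed, hence compact in (\mathcal{C}[a,b],\|\cdot\|_\infty). A continuous functional would then attain its extrema on \mathcal{F} by the usual extreme value theorem. (Whether functionals involving f' are actually \|\cdot\|_\infty-continuous on such a family is exactly the part I'm unsure about.)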

Does anyone have any enlightening comments? It just amazes me that I've never seen this mentioned in my book.