I have a problem that I cannot solve. Consider the heat equation $\displaystyle u_t=\Delta u$, $x\in R^d$, $t>0$, with initial data $\displaystyle u(0)=\phi$, $x\in R^{d}$. Let $\displaystyle \phi\in H^r(R^d)$ with $r\in R$, and let $s\ge 0$. Show that there exists a constant $C(s)$ such that for every $t>0$ we have the inequality $\displaystyle \|u(t)\|_{H^{r+s}(R^d)}\le C(s)\,(1+t^{-s/2})\,\|\phi\|_{H^r(R^d)}$.
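In case it helps to frame the question, here is the standard Fourier-side reduction as I understand it (only a sketch, and I may be missing details). Taking the spatial Fourier transform, the solution is $\displaystyle \hat{u}(t,\xi)=e^{-t|\xi|^2}\hat{\phi}(\xi)$, so

$\displaystyle \|u(t)\|_{H^{r+s}(R^d)}^2=\int_{R^d}(1+|\xi|^2)^{r+s}\,e^{-2t|\xi|^2}\,|\hat{\phi}(\xi)|^2\,d\xi\le\Big(\sup_{\xi\in R^d}(1+|\xi|^2)^{s/2}e^{-t|\xi|^2}\Big)^2\,\|\phi\|_{H^r(R^d)}^2.$

This reduces the claim to the pointwise bound $\displaystyle \sup_{\xi\in R^d}(1+|\xi|^2)^{s/2}e^{-t|\xi|^2}\le C(s)(1+t^{-s/2})$, which I believe should follow from $(1+x)^{s/2}\le 2^{s/2}(1+x^{s/2})$ together with the elementary maximization $\displaystyle \sup_{x\ge 0}\,x^{s/2}e^{-tx}=\Big(\frac{s}{2e}\Big)^{s/2}t^{-s/2}$, but I cannot complete the argument.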

I would appreciate any help.

Thank you.