Solutions to the ODE y'(t) = t^2 + y(t)^2
The problem assigned to me was to show that the equation y'(t) = t^2 + y(t)^2 with initial condition y(0) = 0 has a solution on the interval 0 <= t <= min(a, b/(a^2 + b^2)), where we consider the rectangle R: 0 <= t <= a, -b <= y <= b.
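If I am reading the setup correctly, that interval is the one from the Picard–Lindelöf theorem: bound the right-hand side on R and shrink the interval accordingly. A sketch of where the bound comes from:

```latex
|f(t, y)| = t^2 + y^2 \le a^2 + b^2 =: M \quad \text{on } R,
\qquad
h = \min\!\left(a, \frac{b}{M}\right)
  = \min\!\left(a, \frac{b}{a^2 + b^2}\right).
```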
What puzzles me is that the existence and uniqueness theorem says that if the DE is of the form y'(t) = f(t, y), where f and df/dy (the partial derivative) are continuous on some rectangle containing the initial point (t0, y0), then the solution exists and is unique in that rectangle. In this case f(t, y) is simply t^2 + y^2, and df/dy = 2y. Both are clearly continuous for all values of t and y, so what is up with this question?
Also, when I try to solve it both numerically and analytically in Mathematica, y(0) = 0 turns out to be a funky point (the error states that a singularity or stiff system is suspected).
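To see the same behavior outside Mathematica, here is a small check I sketched in Python with SciPy (my own experiment, not part of the assignment): integrating y' = t^2 + y^2 from y(0) = 0 shows the solution growing very rapidly as t approaches roughly 2, which seems consistent with the singularity/stiffness warning.

```python
import numpy as np
from scipy.integrate import solve_ivp

# y'(t) = t^2 + y(t)^2 with y(0) = 0.
def f(t, y):
    return t**2 + y**2

# Integrate on [0, 1.9]; the solution appears to blow up shortly after
# t = 2, so values grow quickly near the right endpoint -- consistent
# with the "singularity or stiff system suspected" message.
sol = solve_ivp(f, (0.0, 1.9), [0.0], rtol=1e-10, atol=1e-12)

for t, y in zip(sol.t[::10], sol.y[0][::10]):
    print(f"t = {t:.3f}, y = {y:.6f}")
```

The solution stays perfectly tame near t = 0; the trouble is that it escapes to infinity in finite time, so no single solution can cover an arbitrarily long t-interval.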
Thanks in advance,