Sequence Convergence Proofs vs. Function Continuity Proofs

Definition of convergence:

For each epsilon > 0 there exists a number N such that n > N implies

|s_n - s| < epsilon.

In this case the N you choose is usually of the form max{constant, g(epsilon)}, where g(epsilon) is some expression in epsilon.
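For concreteness, here is a typical example of that pattern (my own illustration, not taken from any particular text): the max shows up because n > N has to guarantee two lower bounds on n at the same time.

```latex
\begin{align*}
&\text{Claim: } s_n = \tfrac{n+1}{n-1} \to 1.\\
&\text{For } n \ge 2:\quad |s_n - 1| = \tfrac{2}{n-1} \le \tfrac{4}{n}
  \quad (\text{since } 2n \le 4(n-1) \iff n \ge 2).\\
&\text{Given } \varepsilon > 0,\ \text{take } N = \max\{2,\ 4/\varepsilon\}.\\
&\text{Then } n > N \Rightarrow n \ge 2 \text{ and } \tfrac{4}{n} < \varepsilon,
  \text{ so } |s_n - 1| < \varepsilon.
\end{align*}
```

Here n ≥ 2 is needed for the simplifying bound to be valid, and n > 4/epsilon is needed to make the bound small; taking the max of the two satisfies both at once.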

Definition of continuity (at a point x0):

For each epsilon > 0 there exists a delta > 0 such that x in dom(f) and

|x - x0| < delta implies |f(x) - f(x0)| < epsilon.

In this case the delta you choose is usually of the form min{constant, g(epsilon)}, where g(epsilon) is some expression in epsilon.
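And here is a typical example of the min pattern (again my own illustration): the min shows up because |x - x0| < delta has to guarantee two upper bounds on |x - x0| at the same time.

```latex
\begin{align*}
&\text{Claim: } f(x) = x^2 \text{ is continuous at } x_0.\\
&|f(x) - f(x_0)| = |x - x_0|\,|x + x_0|.\\
&\text{If } |x - x_0| < 1,\ \text{then } |x + x_0| \le |x - x_0| + 2|x_0| < 2|x_0| + 1.\\
&\text{Given } \varepsilon > 0,\ \text{take }
  \delta = \min\Bigl\{1,\ \tfrac{\varepsilon}{2|x_0| + 1}\Bigr\}.\\
&\text{Then } |x - x_0| < \delta \Rightarrow
  |f(x) - f(x_0)| < (2|x_0| + 1)\,\delta \le \varepsilon.
\end{align*}
```

Here |x - x0| < 1 is needed for the bound on |x + x0| to be valid, and |x - x0| < epsilon/(2|x0| + 1) is needed to make the product small; taking the min of the two satisfies both at once.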

I was just wondering how to think about why one of these needs a max and the other a min.

Thanks