I was going through the book "Matrix Analysis and Applied Linear Algebra" by Carl D. Meyer and was having a good time.
Suddenly I came across vector norms and equivalent norms, and my brain shut down.
So, I did some searching and found what looks like a nice article explaining this madness:
Proof That All Norms On Finite Vector Space Are Equivalent

It first shows that if two norms are equivalent on the unit sphere, then they are equivalent everywhere. I understand that to some extent.
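To state my partial understanding: I read the unit-sphere step as an application of homogeneity. Assuming the equivalence on the sphere means c||\textbf{u}||_a \le ||\textbf{u}||_b \le C||\textbf{u}||_a whenever ||\textbf{u}||_2=1 (the names a, b, c, C are mine, not the article's), then for any nonzero \textbf{x}:

```latex
% My reading of the unit-sphere reduction, using homogeneity of norms.
% Here c, C are the constants from the equivalence on the unit sphere.
\|\textbf{x}\|_b
  = \|\textbf{x}\|_2 \left\| \frac{\textbf{x}}{\|\textbf{x}\|_2} \right\|_b
  \le \|\textbf{x}\|_2 \cdot C \left\| \frac{\textbf{x}}{\|\textbf{x}\|_2} \right\|_a
  = C\,\|\textbf{x}\|_a
```

and the lower bound c||\textbf{x}||_a \le ||\textbf{x}||_b follows the same way. Please correct me if the article means something different by "equivalent on the unit sphere".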

Then it shows that, if we are working with the 2-norm, any other norm is a continuous function with respect to the 2-norm.
I have a question regarding this.
Take an arbitrary finite-dimensional space X and an arbitrary norm ||\cdot||.
Also suppose that \{\textbf{b}_i\}^n_{i=1} is a basis of X, so that an element \textbf{x} \in X may be written as \textbf{x}=\sum^n_{i=1}x_i\textbf{b}_i.
Now given an \epsilon>0, choose \delta>0 such that ||\textbf{x}-\textbf{y}||_2<\delta implies that
\max_i |x_i-y_i| < \frac{\epsilon}{\sum^n_{i=1}||\textbf{b}_i||}

What I don't see is why such a \delta exists: how does ||\textbf{x}-\textbf{y}||_2<\delta force the maximum difference between the components of \textbf{x} and \textbf{y}, \max_i |x_i-y_i|, to be smaller than \frac{\epsilon}{\sum^n_{i=1}||\textbf{b}_i||}, the ratio of a positive number \epsilon to the sum of the norms of the basis vectors?
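For what it's worth, here is a quick numerical check of the inequality I suspect is being used, namely \max_i |x_i-y_i| \le ||\textbf{x}-\textbf{y}||_2 when the 2-norm is taken on the coordinate vectors (that interpretation, and all the names in the code, are my own assumptions, not the article's):

```python
# Sanity check of: max_i |x_i - y_i| <= ||x - y||_2, where the 2-norm
# is the Euclidean norm of the coordinate vectors. If this holds, then
# delta = epsilon / sum_i ||b_i|| would make the claimed implication work.
import math
import random

random.seed(0)

def two_norm(v):
    # Euclidean norm of a coordinate vector
    return math.sqrt(sum(c * c for c in v))

for _ in range(1000):
    n = random.randint(1, 6)
    x = [random.uniform(-10, 10) for _ in range(n)]
    y = [random.uniform(-10, 10) for _ in range(n)]
    diff = [a - b for a, b in zip(x, y)]
    max_coord = max(abs(c) for c in diff)
    # no single coordinate difference can exceed the Euclidean length
    assert max_coord <= two_norm(diff) + 1e-12
print("max_i |x_i - y_i| <= ||x - y||_2 held in all trials")
```

The check passed in every random trial, which makes me think this coordinate-wise bound is the missing step, but I'd like to see why it justifies the choice of \delta.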

Thanks guys!