I was going through the book "Matrix Analysis and Applied Linear Algebra" by Carl D. Meyer and was having a good time.
Suddenly I came across vector norms and equivalent norms, and my brain shut down.
So, I did some searching and found what looks like a nice article explaining this madness:
Proof That All Norms On Finite Vector Space Are Equivalent

It first shows that if two norms are equivalent on the unit sphere, then they are equivalent everywhere. I understand that to some extent.

Then it goes on to show that any other norm is a continuous function with respect to the 2-norm.
I have a question regarding this.
Take an arbitrary finite-dimensional space $\displaystyle X$ and an arbitrary norm $\displaystyle ||\cdot||$.
Also suppose that $\displaystyle \{\textbf{b}_i\}^n_{i=1}$ is a basis of $\displaystyle X$, so that an element $\displaystyle \textbf{x} \in X$ may be written as $\displaystyle \textbf{x}=\sum^n_{i=1}x_i\textbf{b}_i$.
Now given an $\displaystyle \epsilon>0$, choose $\displaystyle \delta>0$ such that $\displaystyle ||\textbf{x}-\textbf{y}||_2<\delta$ implies that
$\displaystyle \max_i{|x_i-y_i|}<\frac{\epsilon}{\sum^n_{i=1}||\textbf{b}_i||}$
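If I'm reading the article right, that particular bound is chosen so the following chain goes through (this is my own reconstruction of the step, not spelled out above):

$\displaystyle ||\textbf{x}-\textbf{y}|| = \Big|\Big|\sum^n_{i=1}(x_i-y_i)\textbf{b}_i\Big|\Big| \le \sum^n_{i=1}|x_i-y_i|\,||\textbf{b}_i|| \le \max_i{|x_i-y_i|}\sum^n_{i=1}||\textbf{b}_i|| < \epsilon$

which would give continuity of $\displaystyle ||\cdot||$ with respect to $\displaystyle ||\cdot||_2$.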

How does choosing such a $\displaystyle \delta$ guarantee that the maximum difference between corresponding components of $\displaystyle \textbf{x}$ and $\displaystyle \textbf{y}$ is less than the ratio of $\displaystyle \epsilon$ to the sum of the norms of the basis vectors, $\displaystyle \sum^n_{i=1}||\textbf{b}_i||$?
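For what it's worth, here's a quick numerical sanity check I put together (my own sketch, not from the article) of the coordinate bound that seems to be lurking here: each component difference $\displaystyle |x_i-y_i|$ is at most the 2-norm of the difference of the coordinate vectors, since $\displaystyle (x_i-y_i)^2 \le \sum^n_{j=1}(x_j-y_j)^2$.

```python
# Sanity check: max_i |x_i - y_i| <= ||x - y||_2 for coordinate vectors.
# This is my own illustration, not code from the article.
import math
import random

def two_norm(v):
    """Euclidean 2-norm of a list of coordinates."""
    return math.sqrt(sum(c * c for c in v))

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 6)
    x = [random.uniform(-10, 10) for _ in range(n)]
    y = [random.uniform(-10, 10) for _ in range(n)]
    diff = [a - b for a, b in zip(x, y)]
    # The largest single coordinate gap never exceeds the 2-norm gap.
    assert max(abs(d) for d in diff) <= two_norm(diff) + 1e-12

print("max coordinate difference never exceeded the 2-norm difference")
```

So if $\displaystyle ||\textbf{x}-\textbf{y}||_2 < \delta$, every coordinate gap is below $\displaystyle \delta$ as well, which is presumably why a small enough $\displaystyle \delta$ can force the max below $\displaystyle \frac{\epsilon}{\sum^n_{i=1}||\textbf{b}_i||}$.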

Thanks guys!