Proving the asymptotes of a hyperbola

This was an assignment given to me by my calculus teacher during our conics section, but it seems to me that the math involved is really precalculus-level, since it only involves limits. If this thread belongs in another forum, I'm sorry.

The problem is as follows:

Given that the equation of a hyperbola is

(x^2/a^2)-(y^2/b^2)=1 and that b^2=c^2-a^2

Prove that the asymptotes of the hyperbola are y = (b/a)x and y = -(b/a)x.

I think I'm pretty close to the answer, but there's a step that I hesitate to make because I'm not sure it's allowed. Here's what I've done so far:

(abs = absolute value, sqrt = square root)

(x^2/a^2)-(y^2/b^2)=1

(y^2/b^2)=(x^2/a^2)-1

y^2=b^2[(x^2/a^2)-1]

y=±abs(b)(sqrt[(x^2/a^2)-1])   (sqrt(y^2) = abs(y), so y can be either sign, hence the ±)

y=±abs(b)(sqrt[(x^2-a^2)/a^2])

y=±abs(b/a)(sqrt(x^2-a^2))

Now if I take the limit of both sides as x->infinity, can I just use sqrt(x^2) in place of sqrt(x^2-a^2), since sqrt(x^2) is the end-behavior model of sqrt(x^2-a^2)? If so, then

y=±abs(b/a)x = ±(b/a)x   (taking a, b > 0)

QED
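In case it clarifies the question: here is the limit I believe my substitution is standing in for, done with the usual conjugate-multiplication trick (sketched in LaTeX; maybe this is the justification my step needs):

```latex
\begin{align*}
\lim_{x\to\infty}\left[\tfrac{b}{a}x-\tfrac{b}{a}\sqrt{x^2-a^2}\right]
 &= \tfrac{b}{a}\lim_{x\to\infty}
    \frac{\bigl(x-\sqrt{x^2-a^2}\bigr)\bigl(x+\sqrt{x^2-a^2}\bigr)}{x+\sqrt{x^2-a^2}} \\
 &= \tfrac{b}{a}\lim_{x\to\infty}\frac{a^2}{x+\sqrt{x^2-a^2}} = 0
\end{align*}
```

Since the vertical gap between the branch and the line y = (b/a)x goes to 0, that line would be an asymptote, and y = -(b/a)x follows by symmetry.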

Is there anything wrong with my logic?
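For what it's worth, here's a quick numerical sanity check I ran (my own, with made-up values a = 3, b = 2, not part of the assignment): the vertical gap between the upper branch and the candidate asymptote should shrink toward 0 as x grows.

```python
import math

# Quick numerical sanity check (made-up sample values, not from the assignment):
# for a = 3, b = 2, the vertical gap between the upper branch of the hyperbola
# and the candidate asymptote y = (b/a)x should shrink toward 0 as x grows.
a, b = 3.0, 2.0

def branch(x):
    """Upper branch of x^2/a^2 - y^2/b^2 = 1, solved for y (requires x >= a)."""
    return (b / a) * math.sqrt(x * x - a * a)

def asymptote(x):
    """The line y = (b/a)x that the branch should approach."""
    return (b / a) * x

for x in (10.0, 100.0, 1000.0, 10000.0):
    gap = asymptote(x) - branch(x)
    print(f"x = {x:8.0f}: gap = {gap:.6f}")
```

The gap keeps shrinking, which at least numerically matches the end-behavior argument.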