I've asked several of my friends about this question, but nobody knows where to start with it, and unfortunately, neither do I. Here it is:
Given the point A(x_1, y_1) and the line ax + by + c = 0, show that the distance from the point A to the line is d = |ax_1 + by_1 + c| / sqrt(a^2 + b^2). Note that ax + by + c = 0 is a vertical line if b = 0 and a horizontal line if a = 0.
Any help would be appreciated. Thank you
Sep 21st 2008, 10:03 PM
It depends on what you can use as tools. (For example, what is the definition of a "distance"? Can we use vectors or the complex plane? etc.)
The easiest way would be to use the Cauchy-Schwarz inequality: (a^2 + b^2)((x - x_1)^2 + (y - y_1)^2) >= (a(x - x_1) + b(y - y_1))^2.
We have to minimize (x - x_1)^2 + (y - y_1)^2 subject to ax + by + c = 0.
By the above inequality, (x - x_1)^2 + (y - y_1)^2 >= (a(x - x_1) + b(y - y_1))^2 / (a^2 + b^2) = (ax + by - ax_1 - by_1)^2 / (a^2 + b^2). Substitute ax + by = -c into this and you get the formula; the bound is attained because equality holds in Cauchy-Schwarz when (x - x_1, y - y_1) is parallel to (a, b).
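To see the Cauchy-Schwarz bound in action, here is a small numeric sketch: it compares the closed-form distance against a brute-force minimization of (x - x_1)^2 + (y - y_1)^2 over points on the line. The coefficient values are arbitrary sample choices, not from the problem.

```python
import math

def line_distance(a, b, c, x1, y1):
    # The closed-form distance d = |a*x1 + b*y1 + c| / sqrt(a^2 + b^2).
    return abs(a * x1 + b * y1 + c) / math.sqrt(a * a + b * b)

def brute_force_distance(a, b, c, x1, y1, n=200001, span=100.0):
    # Sample points on the line (parametrized by x, so this assumes b != 0)
    # and take the smallest distance to (x1, y1).
    best = float("inf")
    for i in range(n):
        x = x1 - span / 2 + span * i / (n - 1)
        y = -(a * x + c) / b
        best = min(best, math.hypot(x - x1, y - y1))
    return best

# Arbitrary sample line 3x + 4y - 5 = 0 and point (2, 1).
a, b, c, x1, y1 = 3.0, 4.0, -5.0, 2.0, 1.0
print(line_distance(a, b, c, x1, y1))        # exact formula
print(brute_force_distance(a, b, c, x1, y1)) # numeric minimum, should agree closely
```

For 3x + 4y - 5 = 0 and A(2, 1) the formula gives |6 + 4 - 5| / 5 = 1, and the brute-force search converges to the same value.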
Let B(x_0, y_0) be the point on the line realizing the distance, so d = AB. If you know that AB is perpendicular to the line, there are many more proofs. For example, the line AB is -b(x - x_1) + a(y - y_1) = 0, so you can determine B from the additional condition ax_0 + by_0 + c = 0. (You can rephrase this using vectors by saying that (x_0 - x_1, y_0 - y_1) is parallel to (a, b).)
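The foot-of-perpendicular computation above can be sketched numerically as follows. Writing (x_0, y_0) = (x_1 - a*t, y_1 - b*t), as the vector condition suggests, and plugging into the line equation gives t = (ax_1 + by_1 + c) / (a^2 + b^2). The sample coefficients are arbitrary, not from the post.

```python
import math

def foot_of_perpendicular(a, b, c, x1, y1):
    # (x0 - x1, y0 - y1) is parallel to (a, b): (x0, y0) = (x1 - a*t, y1 - b*t).
    # Substituting into a*x0 + b*y0 + c = 0 gives the value of t below.
    t = (a * x1 + b * y1 + c) / (a * a + b * b)
    return x1 - a * t, y1 - b * t

# Arbitrary sample line 3x + 4y - 5 = 0 and point A(2, 1).
a, b, c, x1, y1 = 3.0, 4.0, -5.0, 2.0, 1.0
x0, y0 = foot_of_perpendicular(a, b, c, x1, y1)
print(a * x0 + b * y0 + c)           # ~0, so B lies on the line
print(math.hypot(x0 - x1, y0 - y1))  # matches |a*x1 + b*y1 + c| / sqrt(a^2 + b^2)
```

Here AB = sqrt(a^2 + b^2) * |t| = |ax_1 + by_1 + c| / sqrt(a^2 + b^2), which is the claimed formula.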
If you know trigonometry, represent any point at distance r from A as (x_1 + r cos(theta), y_1 + r sin(theta)). As it is on the line, a(x_1 + r cos(theta)) + b(y_1 + r sin(theta)) + c = 0, which gives r(a cos(theta) + b sin(theta)) = -(ax_1 + by_1 + c). Since a cos(theta) + b sin(theta) = sqrt(a^2 + b^2) cos(theta - phi) for some phi, the minimum of |r| is attained when cos(theta - phi) is either 1 or -1, and then |r| = |ax_1 + by_1 + c| / sqrt(a^2 + b^2).
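The trigonometric argument can also be checked numerically: for each direction theta, solve for the (signed) distance r to the line, and observe that the smallest |r| over all directions matches the formula. The sample coefficients below are arbitrary, not from the post.

```python
import math

def r_for_direction(a, b, c, x1, y1, theta):
    # The point (x1 + r*cos(theta), y1 + r*sin(theta)) lies on ax + by + c = 0
    # exactly when r * (a*cos(theta) + b*sin(theta)) = -(a*x1 + b*y1 + c).
    return -(a * x1 + b * y1 + c) / (a * math.cos(theta) + b * math.sin(theta))

# Arbitrary sample line 3x + 4y - 5 = 0 and point A(2, 1).
a, b, c, x1, y1 = 3.0, 4.0, -5.0, 2.0, 1.0
best = float("inf")
for k in range(100000):
    theta = 2 * math.pi * k / 100000
    denom = a * math.cos(theta) + b * math.sin(theta)
    if abs(denom) > 1e-9:  # skip directions parallel to the line
        best = min(best, abs(r_for_direction(a, b, c, x1, y1, theta)))
print(best)  # approaches |a*x1 + b*y1 + c| / sqrt(a^2 + b^2) = 1.0 here
```

The minimum occurs near theta = phi (the direction of the normal vector (a, b)), where a cos(theta) + b sin(theta) reaches its extreme value sqrt(a^2 + b^2).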