Originally Posted by **s_ingram**

Hi folks,

I am trying to prove that the minimum distance $d$ between a line $L_{1}$: $ax + by + c = 0$ and a point $A(x_{1}, y_{1})$ is given by the expression:

$d = \dfrac{|ax_{1} + by_{1} + c|}{\sqrt{a^2 + b^2}} \qquad (1)$
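As a quick numerical sanity check of (1), here is a small Python sketch (the function name is mine; note the absolute value, since a distance must be non-negative):

```python
import math

def point_line_distance(a, b, c, x1, y1):
    # Distance from A(x1, y1) to the line ax + by + c = 0, per equation (1).
    return abs(a * x1 + b * y1 + c) / math.sqrt(a**2 + b**2)

# Line 3x + 4y - 10 = 0 and the origin: |-10| / 5 = 2
print(point_line_distance(3, 4, -10, 0, 0))  # 2.0
```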

I have already worked out that the distance $d = |\vec{AB}|$ where

$\vec{AB} = (b \lambda - x_{1})i - \left(\dfrac{c}{b} + a \lambda + y_{1}\right)j$

and

$\lambda = \dfrac{b^2 x_{1} - aby_{1} - ac}{b(a^2 + b^2)}$
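One way to tame the algebra (worth checking against your own working) is to substitute $\lambda$ into the $i$-component first and factor *before* squaring:

$b\lambda - x_{1} = \dfrac{b^{2}x_{1} - aby_{1} - ac}{a^{2} + b^{2}} - x_{1} = \dfrac{-a(ax_{1} + by_{1} + c)}{a^{2} + b^{2}}$

The $j$-component reduces the same way to $\pm\dfrac{b(ax_{1} + by_{1} + c)}{a^{2} + b^{2}}$, so that $|\vec{AB}|^{2}$ collapses to $\dfrac{(ax_{1} + by_{1} + c)^{2}}{a^{2} + b^{2}}$, which is the square of (1).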

The solution should be straightforward: substitute $\lambda$ into $\vec{AB}$ and compute $|\vec{AB}|$.

The problem is the algebra! I end up with so many terms, and they don't cancel or resolve to equation (1).

It's a big ask to expect anyone to type out the full answer, but a few hints would be great.
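For anyone who wants to confirm the cancellation mechanically, here is a small SymPy sketch (assuming SymPy is available; the symbol names are my own, and $B = (b\lambda,\ -c/b - a\lambda)$ is taken as the general point on the line):

```python
import sympy as sp

a, b, c, x1, y1 = sp.symbols('a b c x1 y1', real=True)

# lambda as derived in the post
lam = (b**2 * x1 - a*b*y1 - a*c) / (b * (a**2 + b**2))

# Components of AB = B - A, with B = (b*lam, -c/b - a*lam) on the line
dx = b*lam - x1
dy = (-c/b - a*lam) - y1

# |AB|^2 should equal the square of equation (1)
d2 = sp.simplify(dx**2 + dy**2)
target = (a*x1 + b*y1 + c)**2 / (a**2 + b**2)
print(sp.simplify(d2 - target))  # 0
```

If the difference simplifies to zero, the substitution does indeed resolve to (1).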