Hi all,

I have a camera modelled as a simple standard pinhole (origin of the rays: $\displaystyle (x_C, y_C, z_C)$; optical axis = z axis of the camera coordinates) and I am trying to analytically compute its focal length (distance between the origin of the rays and the projection plane) from two 3D point correspondences.

That is, I know the 3D world coordinates $\displaystyle (x_1,y_1,z_1)$ and $\displaystyle (x_2, y_2, z_2)$ of two points, as well as their 2D coordinates $\displaystyle (x_{1s}, y_{1s})$ and $\displaystyle (x_{2s}, y_{2s})$ on the image plane, expressed in camera coordinates (so that in the full 3D camera coordinate system the image points are $\displaystyle (x_{1s}, y_{1s}, f)$ and $\displaystyle (x_{2s}, y_{2s}, f)$, $\displaystyle f$ being the focal length I want).

The camera position is known, but its orientation is not.

So far, I have computed the angle between the two rays using the points' world coordinates: $\displaystyle \cos(\theta) = \frac{\vec{v}_1 \cdot \vec{v}_2} {\left| \vec{v}_1 \right| \left| \vec{v}_2\right|}$, where $\displaystyle \vec{v}_i = (x_i,y_i,z_i) - (x_C, y_C, z_C)$ is the vector from the camera center to the $i$-th 3D point in world coordinates.
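As a numeric sanity check, that world-coordinate angle is easy to compute; the camera center and 3D point coordinates below are made-up values, not from the actual problem:

```python
import math

# Hypothetical example values (assumptions, just for illustration).
camera = (0.0, 0.0, 0.0)
p1 = (1.0, 0.0, 5.0)
p2 = (0.0, 1.0, 4.0)

# Rays from the camera center to each 3D point, in world coordinates.
v1 = tuple(p - c for p, c in zip(p1, camera))
v2 = tuple(p - c for p, c in zip(p2, camera))

# cos(theta) = (v1 . v2) / (|v1| |v2|)
dot = sum(a * b for a, b in zip(v1, v2))
cos_theta = dot / (math.hypot(*v1) * math.hypot(*v2))
```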

Then I tried using the same formula to solve for $\displaystyle f$, since the angle is the same regardless of the coordinate system used:

$\displaystyle \frac{x_{1s} x_{2s} + y_{1s} y_{2s} + f^2} {\sqrt{x_{1s}^2 + y_{1s}^2 + f^2}\, \sqrt{x_{2s}^2 + y_{2s}^2 + f^2} } = \cos(\theta) $
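Failing an analytic solution, the equation can at least be solved numerically, e.g. by bisection. This is a minimal sketch, assuming the denominator is the product of the two norms (matching the world-coordinate formula above) and using made-up image coordinates generated from a known $\displaystyle f$:

```python
import math

def cos_angle_on_image(f, p1, p2):
    """cos(theta) implied by focal length f and image points p1=(x1s,y1s), p2=(x2s,y2s)."""
    x1, y1 = p1
    x2, y2 = p2
    num = x1 * x2 + y1 * y2 + f * f
    den = math.sqrt(x1**2 + y1**2 + f * f) * math.sqrt(x2**2 + y2**2 + f * f)
    return num / den

def solve_f(p1, p2, cos_theta, lo=1e-3, hi=1e3, iters=200):
    """Bisection on g(f) = cos_angle_on_image(f) - cos_theta.

    Assumes g changes sign exactly once on [lo, hi].
    """
    g = lambda f: cos_angle_on_image(f, p1, p2) - cos_theta
    assert g(lo) * g(hi) <= 0, "no sign change on [lo, hi]"
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Synthetic check: generate cos(theta) from a known focal length, then recover it.
p1, p2 = (0.4, 0.0), (0.0, 0.5)          # made-up image-plane coordinates
cos_theta = cos_angle_on_image(2.0, p1, p2)
f_est = solve_f(p1, p2, cos_theta)        # ~2.0
```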

Here's where I am stuck: if I try to get $\displaystyle f$ out of that equation by squaring both sides, I end up with a quartic in $\displaystyle f$ (the equation is even in $\displaystyle f$, so it is really a quadratic in $\displaystyle f^2$), which I really don't like.

Any tips?

Thank you very much!