Star photography, camera image scaling problem
I want to find out what angle each pixel in the images my camera produces represents.
So I take pictures of two points (stars in the sky) whose true angular separation is known beforehand.
But the camera produces images where the pixels are rectangular, i.e. non-square. They are more elongated horizontally than vertically (by about 5%, it seems). So the two stars are separated by a larger number of pixels if I turn the camera 90 degrees.
If I could take a picture with the two stars on the same pixel row once, and on the same pixel column once, the problem would be as trivial as dividing the pixel counts separating them in those two pictures. But this is unfortunately impossible for practical reasons. I can only take pictures where the line between the two stars runs diagonally across the image.
How do I calculate the "angular rectangularity" of the pixels given a number of such shots?
I can count the delta-x and delta-y number of pixels between the stars in each image. And the camera is of course rotated differently for each shot.
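One way to set this up (a sketch, not necessarily the only approach): if the horizontal and vertical angular scales are sx and sy (angle per pixel), then each shot i satisfies theta^2 = (sx*dx_i)^2 + (sy*dy_i)^2, which is linear in the unknowns a = sx^2 and b = sy^2. Two shots at different rotations already determine a and b; more shots give an overdetermined system you can solve by least squares. A minimal sketch in Python/numpy, where `solve_pixel_scales` and its argument names are mine, not from any particular library:

```python
import numpy as np

def solve_pixel_scales(theta, deltas):
    """Fit horizontal/vertical angular pixel scales from diagonal shots.

    theta  : known angular separation of the two stars (any angle unit)
    deltas : list of (dx, dy) pixel separations, one pair per shot,
             taken at different camera rotations

    Model per shot i: theta^2 = a*dx_i^2 + b*dy_i^2,
    with a = sx^2 and b = sy^2 (angle-per-pixel, squared).
    Returns (sx, sy, sx/sy); sx/sy is the pixel "rectangularity".
    """
    d = np.asarray(deltas, dtype=float)
    A = d ** 2                         # columns: dx_i^2, dy_i^2
    rhs = np.full(len(d), theta ** 2)  # same theta for every shot
    (a, b), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    sx, sy = np.sqrt(a), np.sqrt(b)
    return sx, sy, sx / sy
```

With exactly two shots this reduces to the closed form a = (dy2^2 - dy1^2) * theta^2 / (dx1^2 * dy2^2 - dx2^2 * dy1^2) and similarly for b; using more shots and least squares just averages out the pixel-counting noise.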