## Reconstruction of a map of transformations given noisy data

Let's assume I have a map of points in x, y coordinate space like the one below:

The arrows represent the transformations needed to get from one point to another; the values above the arrows give each transformation as a 2D vector.

The image above shows the real-world values.

My data set contains multiple readings for each transformation; however, each reading has some non-systematic error, assumed to follow a Gaussian distribution whose mean is the real value. The system does not know anything about the real-world values.

For example, I have obtained 100 readings for the transform between points 1 and 2. One of these readings might be x: 9.8, y: 0.09, and so forth.
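To make the setup concrete, here is a small sketch (assuming Python/NumPy; the true transform and noise level are made up for illustration) of what the readings for a single edge look like, and the obvious per-edge estimate, their mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true transform from point 1 to point 2 (the system never
# sees this; it is used here only to simulate readings).
true_transform = np.array([10.0, 0.0])

# 100 noisy readings: the true value plus zero-mean Gaussian noise.
readings = true_transform + rng.normal(0.0, 0.1, size=(100, 2))

# With Gaussian noise, the sample mean is the natural estimate for one edge.
estimate = readings.mean(axis=0)
print(estimate)  # close to [10, 0], but not exact
```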

My question is: what technique can I use to obtain the optimal estimate of the map, given my inaccurate readings of the transforms between points?

The ideal result would be as close to the real values as possible.

My naive approach would be to take one point as the base, for instance point 1. From 1, work out the average of the readings for each outgoing transform, then keep chaining these averages until the whole map is generated. However, this approach has a problem: the accuracy of a point's estimate drops the further it is from point 1. If my averaged transform from 1->2 comes out as x:10, y:2 due to some bad readings, and my readings from 2->3 are completely accurate (x:10, y:0), it leads to the result below:
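The chaining approach above can be sketched as follows (a minimal illustration, assuming Python/NumPy and using the example edge values from the text):

```python
import numpy as np

# Averaged edge estimates, with the hypothetical bad average from the text:
# 1->2 came out as (10, 2) instead of the true (10, 0); 2->3 is accurate.
edges = {(1, 2): np.array([10.0, 2.0]),
         (2, 3): np.array([10.0, 0.0])}

# Chain outward from the base point 1 at the origin.
positions = {1: np.array([0.0, 0.0])}
positions[2] = positions[1] + edges[(1, 2)]
positions[3] = positions[2] + edges[(2, 3)]

# The error in 1->2 propagates: point 3 inherits point 2's y-offset,
# even though the 2->3 readings were perfect.
print(positions[3])  # [20, 2] instead of the true [20, 0]
```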

Green shows the real-world values, black the calculated ones.

The final solution should handle any number of points with any combination of transforms between them. In graph terms, the points are connected but do not necessarily form a complete graph.
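To show the kind of joint estimate I am hoping for, here is a sketch (assuming Python/NumPy, with made-up averaged edge values, points indexed from 0, and point 0 anchored at the origin) that encodes every edge as a constraint and solves for all positions at once via linear least squares, instead of chaining from a base point:

```python
import numpy as np

# Edges of the (not necessarily complete) graph, each with an averaged
# measured transform. Values are illustrative: edge 0->1 is corrupted,
# and the extra edge 0->2 provides a redundant constraint.
edges = [((0, 1), np.array([10.0, 2.0])),
         ((1, 2), np.array([10.0, 0.0])),
         ((0, 2), np.array([20.0, 0.0]))]
n = 3  # number of points

# Each edge (i, j) with measurement t contributes the constraint p_j - p_i = t.
rows, rhs = [], []
for (i, j), t in edges:
    row = np.zeros(n)
    row[i], row[j] = -1.0, 1.0
    rows.append(row)
    rhs.append(t)

# Anchor point 0 at the origin (positions are only defined up to translation).
anchor = np.zeros(n)
anchor[0] = 1.0
rows.append(anchor)
rhs.append(np.zeros(2))

A = np.array(rows)  # incidence-style constraint matrix
b = np.array(rhs)   # measured transforms, x and y columns solved together
positions, *_ = np.linalg.lstsq(A, b, rcond=None)
print(positions)  # the redundant edge pulls point 2's y-error below 2
```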

I am not sure if this is the most ideal place to post this; hopefully someone can move it if it is not the right place.