Hi,
I am collecting a set of Gaussian-distributed measurements drawn from a probability density function f(x). The measurements are passed through a known function h(x), and the resulting probability density function g(x) is skewed. I'd like to use samples from g(x) in a least-squares estimate. The Gauss-Markov theorem seems to imply that because g(x) is skewed, and so its samples no longer have zero mean, a bias will be present in the result if they are used directly in a least-squares estimator.
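For concreteness, here's a toy version of the setup. The choices h(x) = exp(x), mu = 0, and sigma = 0.5 below are just stand-ins for my actual measurement model:

```python
# Toy illustration of the setup; h(x) = exp(x) is a stand-in
# for my actual (known) transformation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

mu, sigma = 0.0, 0.5                       # parameters of the Gaussian f(x)
x = rng.normal(mu, sigma, size=100_000)    # samples from f(x)

def h(x):
    return np.exp(x)                       # known nonlinear transformation

y = h(x)                                   # samples whose density is g(x)

print("mean of g:", y.mean())              # != h(mu) in general
print("skewness of g:", stats.skew(y))     # clearly non-zero
```

With this h, the transformed samples are visibly skewed and their mean sits above h(mu) = 1.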

Since I know the transforming function h(x), it seems I should be able to estimate the mean of g(x) and simply subtract that constant to make it zero-mean. Will this effectively remove the bias that would result from using samples from the unmodified g(x)? I also need to estimate confidence intervals for the LSE result. If subtracting the mean from g(x) does in fact yield an unbiased estimator, do the usual equations for estimating the confidence intervals need to be modified?
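To make the question concrete, here's a sketch of the mean-subtraction idea on the same toy model. The linear model z = a*t + b and all the parameter values are made up purely for illustration:

```python
# Sketch of the mean-subtraction idea on a made-up linear model
# z_i = a*t_i + b + e_i, where e_i = h(n_i) with n_i ~ N(mu, sigma).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu, sigma = 0.0, 0.5
h = np.exp                          # stand-in for the known h(x)

# Monte Carlo estimate of the mean of g(x), using the known f and h.
mu_g = h(rng.normal(mu, sigma, 1_000_000)).mean()

# Simulated measurements with the skewed noise.
a_true, b_true = 2.0, -1.0
t = np.linspace(0, 10, 200)
z = a_true * t + b_true + h(rng.normal(mu, sigma, t.size))

# Subtract the estimated noise mean so the errors are ~zero-mean.
z_c = z - mu_g

# Ordinary least squares fit.
X = np.column_stack([t, np.ones_like(t)])
theta, *_ = np.linalg.lstsq(X, z_c, rcond=None)

# Standard CI formula for zero-mean iid errors; with skewed
# (non-Gaussian) errors it is only an asymptotic approximation.
resid = z_c - X @ theta
dof = t.size - X.shape[1]
s2 = resid @ resid / dof
cov = s2 * np.linalg.inv(X.T @ X)
ci = stats.t.ppf(0.975, dof) * np.sqrt(np.diag(cov))
print("estimates:", theta, "+/-", ci)
```

My worry is the last step: is the t-based interval still legitimate once the errors are zero-mean but skewed rather than Gaussian?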

If the above idea sounds bad, is there another way to use an LSE for this case? I've also considered a maximum likelihood estimator, as sketched below, but I'd prefer an LSE approach.
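In case it helps, this is roughly what I had in mind for the MLE alternative, again on the toy model where h(x) = exp(x) makes the noise lognormal (all of these modelling choices are placeholders for my real problem):

```python
# MLE alternative on the same toy model: with h(x) = exp(x) the noise
# e_i is lognormal, so the likelihood can be written down directly.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

rng = np.random.default_rng(1)
mu, sigma = 0.0, 0.5
a_true, b_true = 2.0, -1.0
t = np.linspace(0, 10, 200)
z = a_true * t + b_true + np.exp(rng.normal(mu, sigma, t.size))

def nll(params):
    """Negative log-likelihood of z = a*t + b + e with e ~ lognormal."""
    a, b = params
    e = z - (a * t + b)
    if np.any(e <= 0):
        return np.inf               # lognormal support is e > 0
    return -lognorm.logpdf(e, s=sigma, scale=np.exp(mu)).sum()

res = minimize(nll, x0=[2.0, -1.5], method="Nelder-Mead")
print("MLE estimates:", res.x)      # compare against a_true, b_true
```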

Thank you for any help!