Estimate X and Y that best fit the following set of equations:
X: n-by-1 column vector (unknown)
Y: 1-by-1 unknown constant
Z: n-by-1 measured column vector (known)
n is a constant and can be arbitrarily large.
Could you please tell me what method (numerical or algebraic) I can use to solve the above problem, or offer any suggestions?
Thanks a lot.
In other words, your data is assumed to follow an equation of the form:
z = y*x + b
where b and y are constants. Your task is then to find the values of b and y that yield the line that best describes the data.
A first step might be to use a linear least-squares approach, which minimizes the vertical distance between each measured data point and the resulting interpolating line. Other approaches are also available, depending on what you mean by "best fit". MATLAB has several tools for this. There is also an online tool available:
Linear Least-Squares Data-Fitting Utility
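To make that least-squares step concrete, here is a minimal NumPy sketch. It assumes, hypothetically, that paired (x, z) samples are available, and uses illustrative true values y = 2.5 and b = 1.0 (both are assumptions for the demo, not part of the original problem):

```python
import numpy as np

# Hypothetical paired data (x_i, z_i); in the thread's notation,
# we fit z = y*x + b by ordinary linear least squares.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
z = 2.5 * x + 1.0 + 0.01 * rng.standard_normal(x.size)  # true y = 2.5, b = 1.0

# Design matrix [x, 1]; lstsq minimizes ||A @ [y, b] - z||^2.
A = np.column_stack([x, np.ones_like(x)])
(y_hat, b_hat), *_ = np.linalg.lstsq(A, z, rcond=None)
print(y_hat, b_hat)  # estimates close to 2.5 and 1.0
```

The same fit can be done in MATLAB with the backslash operator, `[x ones(n,1)] \ z`. Note this sketch presumes x is known; as the follow-up explains, that assumption does not hold in the original question.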
Thanks a lot for your time and consideration.
My problem is not estimating the two unknown constants Y and b, because X is also unknown. I therefore do not have a set of paired measurements (X, Z) with which to estimate Y and b; only Z is measured and available.
I have thought of the need for an iteration within each step. I have tried the recursive least-squares estimation method as well as the extended Kalman filter method, but still have not found the answer.
If you have any other comments or suggestions, please let me know. Your time and kindness are highly appreciated.