Hi Math Experts,
I have a network of 22 rain gauges. From correlation analysis I have built 22 nominal models, one per gauge, of the form:
RG1 = a1*RG2+a2*RG7+a3*RG15
RG2 = b1*RG1+b2*RG9+b3*RG19
...
RG22 = v1*RG3+v2*RG5+v3*RG9
where RGn is the rainfall recorded in the last 5 minutes by rain gauge number n.
Now I have to generate a "simulated consistent rain" with a computer program that I have to build. That is, I fix the rainfall at RG1, and I have to obtain the rainfall at the other 21 rain gauges so that all the relations above are satisfied.
To me this problem looks like solving a linear system Ax=b, but my system has the form Ax=x, and I don't know how to deal with that. I've read about Gaussian elimination and LU decomposition, but I suspect that's the wrong way.
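One way to look at it: Ax = x is the same as (I - A)x = 0, a homogeneous system. Once one gauge is fixed, its column can be moved to the right-hand side, leaving an ordinary square system for the remaining unknowns. Here is a minimal sketch in Python/NumPy with a hypothetical 4-gauge network and made-up coefficients (the real network has 22 gauges, and the real coefficients come from the correlation models):

```python
import numpy as np

# Hypothetical coefficients for a 4-gauge network; row i holds the
# nominal model of gauge i, e.g. RG1 = 0.5*RG2 + 0.3*RG3 + 0.2*RG4
# gives A[0] = [0, 0.5, 0.3, 0.2].
A = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.4, 0.0, 0.4, 0.2],
    [0.3, 0.3, 0.0, 0.4],
    [0.2, 0.5, 0.3, 0.0],
])

n = A.shape[0]
fixed = 0        # index of the gauge whose rainfall we impose (RG1)
x_fixed = 10.0   # rainfall imposed at that gauge

# Consistency condition x = A x, i.e. (I - A) x = 0.
# Fixing x[fixed] leaves n-1 unknowns; dropping the equation for the
# fixed gauge leaves n-1 equations, so the reduced system is square.
M = np.eye(n) - A
free = [i for i in range(n) if i != fixed]
M_uu = M[np.ix_(free, free)]        # coefficients of the unknowns
rhs = -M[free, fixed] * x_fixed     # known column moved to the RHS

x_free = np.linalg.solve(M_uu, rhs)

x = np.empty(n)
x[fixed] = x_fixed
x[free] = x_free
print(x)
```

Note that the dropped equation (the model for the fixed gauge itself) will only be satisfied if the 22 models are mutually consistent; otherwise its residual can serve as a consistency check, or the full over-determined system can be solved in the least-squares sense (e.g. `np.linalg.lstsq`) instead.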
Can anyone shed some light on the correct path to solve this?
Thank you in advance,