I'm not sure if I'm posting this in the correct category but here goes:
I have two sets of numbers, milliwatts (mW) and dBm. The mW values are on a linear scale while dBm is a logarithmic scale. The formula to get from mW to dBm is dBm = 10*log10(mW). This means, of course, that to get from dBm to mW the equation is mW = 10^(dBm/10).
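For concreteness, here are the two conversion formulas as a quick Python sketch (the function names are just mine, for illustration):

```python
import math

def mw_to_dbm(mw):
    # dBm = 10 * log10(mW)
    return 10 * math.log10(mw)

def dbm_to_mw(dbm):
    # mW = 10^(dBm / 10)
    return 10 ** (dbm / 10)

# The two functions are inverses of each other:
# mw_to_dbm(10) -> 10 dBm, mw_to_dbm(1000) -> 30 dBm
# dbm_to_mw(20) -> 100 mW
```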
Now, starting with the mW values 10 and 1000, their corresponding dBm values are 10 and 30. Averaging the mW values gives 505, and averaging the dBm values the same way gives 20 (using the arithmetic mean, if I'm understanding the term correctly). The problem is that when 505 mW is converted to dBm the result is 27.03, while converting 20 dBm to mW gives a value of 100.
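Here's the mismatch reproduced in Python, so the numbers are easy to check:

```python
import math

mw_vals = [10, 1000]
dbm_vals = [10 * math.log10(x) for x in mw_vals]  # roughly [10.0, 30.0]

mw_mean = sum(mw_vals) / len(mw_vals)      # 505.0
dbm_mean = sum(dbm_vals) / len(dbm_vals)   # roughly 20.0

# Converting each mean to the other scale shows the two means disagree:
mw_mean_in_dbm = 10 * math.log10(mw_mean)  # about 27.03 dBm, not 20
dbm_mean_in_mw = 10 ** (dbm_mean / 10)     # 100 mW, not 505
```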
Given that I am fairly sure the arithmetic mean always works for linear values, since that is what it seems made for, I am assuming that some other method must be used to get the mean of a set of values on a logarithmic scale. I have read about a method called the geometric mean (which for two values is sqrt(x*y)). When it is used on the mW values of 10 and 1000 the result is 100, but when it is used on the dBm values the result is 17.32.
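And here is the geometric mean applied to both sets, again as a quick Python check:

```python
import math

def geometric_mean(x, y):
    # geometric mean of two values: sqrt(x * y)
    return math.sqrt(x * y)

gm_mw = geometric_mean(10, 1000)  # 100.0, matching 20 dBm converted to mW
gm_dbm = geometric_mean(10, 30)   # about 17.32, which matches neither mean
```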
To summarize: I have a set of dBm values that changes over time and that I need to average together. In another case I need to average two such sets and then subtract one from the other to see if the values are declining over time. So I need to be able to correctly average values on a logarithmic scale, and at this point I'm not sure what the right method is. My two questions are:
1) How do I correctly average together dBm values, which are on a logarithmic scale?
2) How should that average correspond to the mW values?
My current suspicion is that 20 dBm and 100 mW are the correct averages, but I'm not sure if I'm right or not. Please advise.