I have several (electronic) voltage analog-to-digital converters. The A/Ds are also opto-isolated, which puts a bias on the signal (i.e. digital 0 does not equal analog 0).
What I have done is put known voltages (+10.0V and -10.0V) on the device and recorded the corresponding digital values. Now, with these fixed points, I should be able to translate a digital value d into a voltage anywhere within the +/- full scale of the instrument (+/- 15.0V).
I have fiddled with some stuff, but can't quite see how to come up with the right way of doing this.
I know (calib_high - calib_low) gives me how many digital points there are between the +10V and -10V states. I know the digital range divided by the analog range gives me the number of digital points per analog point. But then what?
It is probably obvious, but I seem to be having a bad day.
PS. I calibrate with +/- 10V because I do not have access to a sufficiently accurate reference source at +/- 15V.
Assume the output of the ADC is a signed int of appropriate type.
Originally Posted by in2deep
The gain is G = (calib_high - calib_low)/20 units per volt.
The DC bias is b = (calib_high + calib_low)/2 units.
Then a voltage V gives a reading: z = G*V + b,
and a reading of z units corresponds to a voltage: V = (z - b)/G.
Yes, of course. Sorry, was having a blonde day.