The purpose of the analog-to-digital converter (ADC) is to quantize the input signal from the sample and
hold circuit to 2^B discrete levels, where B is the number of bits of the ADC. The input voltage can range
from 0 to Vref (or -Vref to +Vref for a bipolar ADC). What this means is that the voltage reference of the
ADC sets the range of conversion of the ADC.
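As a quick illustration, the sketch below computes the number of levels and the voltage width of one code step (1 LSB) for an assumed 12-bit converter with an assumed 3.3V reference; both values are examples, not values taken from the text.

```python
# Sketch with assumed values: resolution of a B-bit ADC with reference Vref.
B = 12          # number of bits (assumed example value)
VREF = 3.3      # reference voltage in volts (assumed example value)

levels = 2 ** B             # number of discrete output levels
lsb = VREF / levels         # voltage spanned by one output code (1 LSB)

print(f"{levels} levels, 1 LSB = {lsb * 1e3:.3f} mV")
```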
For a monopolar ADC, a 0V input will cause the converter to output all zeros. If the
input to the ADC is equal to or larger than Vref, the converter will output all
ones. For inputs between these two voltage levels, the ADC will output binary numbers
corresponding to the signal level. For a bipolar ADC, the minimum input is -Vref,
not 0V.
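A minimal sketch of this ideal transfer function, covering both the monopolar and bipolar cases described above, might look like the following. The 10-bit, 5V-reference values in the usage example are assumptions for illustration only.

```python
def adc_code(vin, vref, bits, bipolar=False):
    """Ideal ADC transfer function: map an input voltage to an output code.

    Monopolar: 0V -> all zeros, inputs >= vref -> all ones.
    Bipolar:  -vref -> all zeros, inputs >= +vref -> all ones.
    """
    full_scale = 2 ** bits - 1                    # all-ones output code
    lo, hi = (-vref, vref) if bipolar else (0.0, vref)
    if vin <= lo:
        return 0
    if vin >= hi:
        return full_scale
    # Linear mapping of an in-range voltage onto the available codes.
    return int((vin - lo) / (hi - lo) * (2 ** bits))

# Example: 10-bit monopolar ADC with a 5V reference (assumed values).
print(adc_code(0.0, 5.0, 10))   # 0    (all zeros)
print(adc_code(2.5, 5.0, 10))   # 512  (mid-scale)
print(adc_code(5.0, 5.0, 10))   # 1023 (all ones)
```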
Because the ADC outputs only 2^B discrete levels, there is inherent noise in the quantized
output signal. The ratio of the signal to this quantization noise is called the signal-to-quantization-noise
ratio (SQNR). The SQNR in dB is approximately equal to 6 times the number of bits of the ADC:

SQNR (dB) ≈ 6 × B
So for a 16-bit ADC this means that the SQNR is approximately equal to 96dB. There are, of course, other sources of noise that corrupt the output of the ADC. These include noise from the sensor, from the signal conditioning circuitry, and from the surrounding digital circuitry.
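The rule of thumb can be checked numerically by quantizing a full-scale sine wave and measuring the ratio of signal power to quantization-noise power, as in the sketch below. The measured figure comes out a couple of dB above 6 × B (the commonly cited value for a full-scale sine is about 6.02B + 1.76 dB), consistent with the approximation in the text; the sine frequency and sample count are arbitrary assumptions.

```python
import numpy as np

def measured_sqnr_db(bits, n_samples=100_000):
    """Quantize a full-scale sine wave to 'bits' bits and measure the SQNR."""
    t = np.linspace(0, 1, n_samples, endpoint=False)
    signal = np.sin(2 * np.pi * 7 * t)            # full-scale sine (7 cycles)
    levels = 2 ** bits
    # Quantize to 'levels' steps spanning -1..+1, then reconstruct.
    quantized = np.round((signal + 1) / 2 * (levels - 1))
    reconstructed = quantized / (levels - 1) * 2 - 1
    noise = signal - reconstructed
    return 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))

for b in (8, 12, 16):
    print(f"{b:2d} bits: ~{6 * b:3d} dB rule of thumb, "
          f"measured {measured_sqnr_db(b):.1f} dB")
```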
The key to reducing the effects of the noise is to maximize the input signal level. What
this means is that the HCI designer should increase the gain of the signal conditioning
circuitry until the maximum sensor output is equal to the Vref of the ADC. It is also
possible to reduce Vref down to the maximum output level of the sensor. The problem
with this is that the noise will corrupt the small signals. A good rule of thumb is to keep
Vref at least as large as the maximum digital signal level, usually 5V. An example of sizing the gain this way is sketched below.
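The following rough sketch computes the gain needed to bring the maximum sensor output up to Vref, and shows how few of the ADC codes a small, unamplified signal would otherwise use. The 5V reference matches the rule of thumb above; the 120mV maximum sensor output and 12-bit resolution are assumed example values.

```python
# Sketch with assumed values: choose the signal-conditioning gain so the
# largest sensor output uses the ADC's full input range.
VREF = 5.0              # ADC reference voltage in volts
SENSOR_MAX = 0.120      # maximum sensor output in volts (assumed example)
B = 12                  # ADC resolution in bits (assumed example)

gain = VREF / SENSOR_MAX
print(f"Required signal-conditioning gain: {gain:.1f} V/V")

# Without amplification, the sensor signal would exercise only a small
# fraction of the 2**B available codes.
codes_without_gain = int(SENSOR_MAX / VREF * 2 ** B)
print(f"Codes used without gain: {codes_without_gain} of {2 ** B}")
```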