ADC Input Translator
Many high-accuracy analog-to-digital converters require input levels between 0V and 5V. The MAX1402 (an 18-bit, multichannel sigma-delta ADC) is one example.
The circuit of Figure 1 converts an input signal in the range ±10.5V to the input range of the MAX1402 ADC (0V to 5V). Two of the ADC channels (IN1 and IN2 in this case) are configured for either full differential or precision single-ended measurements. Resistor dividers R1 and R2 scale the inputs, and a stable source of 3.28V offsets the inputs. As a result, the ADC input is centered at 2.50V when the measurement inputs are grounded. (That is, the ADC digital output is zero when VIN = 0V.) Precision component values maintain the ADC's 16-bit accuracy.
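The level translation above can be sketched numerically. The snippet below assumes the summing-divider topology implied by the description (the measurement input drives the ADC node through R1 and the 3.28V offset source drives it through R2); the specific resistor values are hypothetical, chosen only to give the roughly 3.2:1 ratio that maps ±10.5V onto 0V to 5V.

```python
def adc_node_voltage(v_in, r1=32e3, r2=10e3, v_offset=3.28):
    """Voltage at the ADC input pin for a two-resistor summing divider.

    v_in drives the node through r1; the offset source drives it
    through r2.  The node voltage is the superposition of the two
    dividers.  Resistor values are illustrative, not from Figure 1.
    """
    return v_in * r2 / (r1 + r2) + v_offset * r1 / (r1 + r2)

# Grounded input lands at mid-scale; the rail extremes map to 0V and 5V.
print(adc_node_voltage(0.0))    # ~2.5V
print(adc_node_voltage(10.5))   # ~5.0V
print(adc_node_voltage(-10.5))  # ~0.0V
```

In practice the divider's absolute values trade input impedance against noise; only the ratio (and the offset source's accuracy) sets the transfer function.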
Figure 1. This circuit enables an ADC with an input range of 0V to 5V (single-ended or differential) to accept inputs in the range ±10.5V.
Configuring the MAX1402 for differential measurements allows it to measure the voltage difference between IN1 and IN2. These inputs accept voltages anywhere in the ±10.5V range, and the internal programmable-gain amplifier (PGA) is available to increase the resolution for low-level signals. A gain of four, for example, enables the ADC to resolve a ±2.625V input signal with 16-bit resolution.
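The gain/resolution trade-off can be checked with a short calculation. This sketch assumes a ±10.5V full-scale input after the translator and a 16-bit output, as stated above; the function name is ours, not part of any MAX1402 API.

```python
def resolvable_input_range(pga_gain, full_scale=10.5, bits=16):
    """Return the bipolar input span (volts) that fills the ADC's
    full scale at a given PGA gain, and the resulting volts per LSB.

    full_scale is the translator's +/- input limit; bits is the
    converter resolution.  Both are taken from the article's figures.
    """
    span = full_scale / pga_gain        # +/- span seen at the inputs
    lsb = 2 * span / 2**bits            # total span / number of codes
    return span, lsb

span, lsb = resolvable_input_range(4)
print(span)  # 2.625 (+/- volts), matching the example in the text
```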
To make single-ended measurements, you can configure the inputs as two separate channels and compare them to a 2.50V reference voltage attached to IN6. Or, for higher precision, you can configure the ADC for differential inputs, with one channel acting as a ground-sense input.
The resistor divider ratio can be altered to accommodate different input ranges, but the same ratio is needed in the circuit that generates the offset voltage. A ratio of 5:1, for example, would yield an input range of ±15.0V and an offset voltage of 3.00V. To calibrate the system, simply record the output value with inputs grounded and with a known input voltage. Those two values let you calculate the offset and gain factors for each input range.
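The two-point calibration described above reduces to simple arithmetic. The sketch below uses hypothetical output codes to illustrate the procedure; the code values and function names are illustrative only.

```python
def two_point_calibration(code_zero, code_ref, v_ref):
    """Derive conversion constants from two recorded readings:
    code_zero  - ADC output code with the input grounded
    code_ref   - ADC output code with a known input v_ref applied
    Returns (gain in volts per code, offset in codes).
    """
    gain = v_ref / (code_ref - code_zero)
    return gain, code_zero

def code_to_volts(code, gain, offset):
    """Apply the calibration to convert a raw ADC code to volts."""
    return (code - offset) * gain

# Example with made-up readings: a small residual offset of 100 codes
# and a 2.5V reference that lands 32768 codes above the zero reading.
gain, offset = two_point_calibration(code_zero=100, code_ref=32868, v_ref=2.5)
print(code_to_volts(100, gain, offset))    # 0.0
print(code_to_volts(32868, gain, offset))  # 2.5
```

Repeating the procedure for each input range (i.e., each divider ratio and PGA setting) gives a gain/offset pair per range, so resistor tolerances drop out of the final measurement.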