## Definition

An analog-to-digital converter (also known as an ADC or an A/D converter) is an electronic circuit that takes an analog signal, typically a voltage produced by a sensor measuring a real-world quantity such as temperature, pressure, acceleration, or speed, and converts it to a digital representation of that signal.

## How does an ADC work?

An ADC compares samples of the analog input voltage (captured by a sample-and-hold circuit) against a known reference voltage and produces a digital binary code representing the input. By its nature, an ADC introduces quantization error, which is simply the information lost in the conversion. This error occurs because a continuous analog signal can take an infinite number of voltage values, while the ADC has only a finite number of digital codes. Therefore, the more digital codes the ADC can resolve, the higher its resolution and the less information is lost to quantization error.

In A/D conversion, the Nyquist principle (derived from the Nyquist-Shannon sampling theorem) states that the sampling rate must be at least twice the maximum bandwidth of the analog signal being converted in order for the signal to be reproduced accurately. Half the sampling rate, the highest signal frequency that can be captured, is commonly called the Nyquist frequency. In practice, the sampling rate must be somewhat higher than this theoretical minimum, because the filters used to reconstruct the original signal are not ideal. As an example, the usable bandwidth of a standard audio CD is a bit less than the theoretical maximum of 22.05 kHz implied by its 44.1 kHz sample rate.
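The quantization behavior described above can be sketched with an idealized model. This is an illustration only, not the behavior of any particular ADC; the function names, the 3-bit width, and the 8 V reference are assumptions chosen to keep the arithmetic easy to follow.

```python
# Idealized (truncating) ADC quantizer -- a sketch, not a real device model.

def adc_code(v_in, v_ref, n_bits):
    """Map an input voltage to an N-bit digital code in 0 .. 2**n_bits - 1."""
    levels = 2 ** n_bits
    lsb = v_ref / levels              # voltage step per code (1 LSB)
    code = int(v_in / lsb)            # truncate to the code below the input
    return max(0, min(levels - 1, code))  # clamp to the valid code range

def quantization_error(v_in, v_ref, n_bits):
    """Difference between the input and the voltage its code represents."""
    lsb = v_ref / (2 ** n_bits)
    return v_in - adc_code(v_in, v_ref, n_bits) * lsb

# A hypothetical 3-bit ADC with an 8 V reference has 8 codes, 1 V per LSB.
print(adc_code(2.4, 8.0, 3))            # -> 2
print(quantization_error(2.4, 8.0, 3))  # ~0.4 V lost to quantization
```

Doubling the bit count halves the LSB size, which is why more codes mean less quantization error: with `n_bits=4` the same 2.4 V input quantizes with only ~0.4 LSB of a now 0.5 V step.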
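The consequence of violating the Nyquist criterion can also be shown numerically: a tone above half the sample rate produces exactly the same samples as a lower-frequency "alias". The sample rate and frequencies below are illustrative values, not drawn from the text.

```python
import math

# Aliasing sketch: at a 1000 Hz sample rate, the Nyquist frequency is 500 Hz.
fs = 1000.0        # sample rate, Hz (illustrative)
f_high = 900.0     # tone above the 500 Hz Nyquist frequency
f_alias = fs - f_high   # its 100 Hz alias

# Sample both tones at the same instants n / fs.
samples_high  = [math.cos(2 * math.pi * f_high  * n / fs) for n in range(8)]
samples_alias = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(8)]

# The two sample sequences are identical: content above fs/2 is unrecoverable.
assert all(abs(a - b) < 1e-9 for a, b in zip(samples_high, samples_alias))
print("900 Hz sampled at 1 kHz is indistinguishable from 100 Hz")
```

This is why real systems place an anti-aliasing filter before the ADC and sample somewhat faster than twice the signal bandwidth, as noted above for the audio CD.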


## Synonyms
• Analog to Digital
• A/D
• A-D
• A to D
