Glossary Definition for Difference Amplifier

Definition

A difference amplifier is a circuit that takes two input voltages and outputs the difference between them. It is a special case of the differential amplifier with a gain of 1, and is also referred to as a voltage subtractor.

What does a difference amplifier do?

A difference amplifier outputs the difference between its inputs.

Vout = V2 - V1

In an ideal difference or differential amp, the output depends only on the difference between the two inputs. In any real differential amp, the output also depends on the average of the two inputs, known as the common-mode voltage:

Vcm = (V2 + V1) / 2

The common-mode rejection ratio (CMRR) is a measure of a device's ability to reject this common-mode signal. Differential amps are very common in analog circuit design and are particularly useful in electrically noisy environments, because interference that appears equally on both inputs cancels at the output.
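The relationship between differential gain, common-mode gain, and CMRR can be sketched numerically. This is a behavioral model only, not a circuit simulation; the function names and the example gain values are illustrative assumptions.

```python
import math

def diff_amp_output(v1, v2, a_d, a_cm):
    """Output of a real differential amplifier: the differential gain a_d
    acts on (v2 - v1), and the common-mode gain a_cm acts on the average
    of the two inputs (the common-mode voltage)."""
    v_diff = v2 - v1
    v_cm = (v2 + v1) / 2
    return a_d * v_diff + a_cm * v_cm

def cmrr_db(a_d, a_cm):
    """Common-mode rejection ratio expressed in decibels."""
    return 20 * math.log10(a_d / a_cm)

# An ideal amp (a_cm = 0) responds only to the difference:
ideal = diff_amp_output(2.0, 3.0, a_d=1.0, a_cm=0.0)   # -> 1.0
# A real amp adds a small common-mode error term (0.001 * 2.5 here):
real = diff_amp_output(2.0, 3.0, a_d=1.0, a_cm=0.001)  # -> 1.0025
print(ideal, real, cmrr_db(1.0, 0.001))  # CMRR = 60.0 dB
```

Raising CMRR (driving a_cm toward zero) is what lets the amplifier ignore noise that couples equally onto both inputs.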

What is the difference between a difference amplifier and a differential amplifier?

The term “difference amplifier” is often used for a differential amplifier with a gain of 1, which is why it is also called a unity-gain differential amplifier. Strictly speaking, a differential amp has an output proportional to the difference between its inputs, while a difference amp has an output equal to that difference. In practice, the two terms are often used interchangeably.
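The gain distinction can be illustrated with the classic four-resistor op-amp subtractor, whose output (assuming an ideal op amp) is Vout = (R4/(R3+R4))·((R1+R2)/R1)·V2 − (R2/R1)·V1. This is one common topology, not the only way to build a difference amplifier; the resistor values below are illustrative.

```python
def subtractor_output(v1, v2, r1, r2, r3, r4):
    """Ideal output of the four-resistor op-amp subtractor.
    V1 drives the inverting input through r1 (with feedback resistor r2);
    V2 drives the non-inverting input through the r3/r4 divider."""
    return (r4 / (r3 + r4)) * ((r1 + r2) / r1) * v2 - (r2 / r1) * v1

# With all four resistors equal, the gain is 1 and Vout is simply V2 - V1:
print(subtractor_output(1.0, 4.0, 10e3, 10e3, 10e3, 10e3))  # -> 3.0
# With R2/R1 = R4/R3 = 10, the same difference is amplified by 10:
print(subtractor_output(1.0, 4.0, 1e3, 10e3, 1e3, 10e3))    # -> 30.0
```

When the ratios R2/R1 and R4/R3 match, the output reduces to (R2/R1)·(V2 − V1); the unity-gain case is just the special case where all four resistors are equal.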

What is the difference between a difference amplifier and an instrumentation amplifier?

Difference amplifiers and instrumentation amplifiers are both types of differential amplifier circuits. An instrumentation amplifier is a differential amplifier with buffer amplifiers on its inputs, which eliminate the need for impedance matching, and its gain can be adjusted by varying a single resistor. Instrumentation amplifiers are also available as ICs that provide very high CMRR.
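The single-resistor gain adjustment can be seen in the classic three-op-amp instrumentation amplifier, where the gain of the input stage is 1 + 2R/Rg. The formula is standard for that topology, but the resistor values below are illustrative assumptions.

```python
def in_amp_gain(r, r_g):
    """Gain of the classic three-op-amp instrumentation amplifier input
    stage: two matched feedback resistors r straddle a single
    gain-setting resistor r_g between the input buffers."""
    return 1 + 2 * r / r_g

# Changing only r_g sweeps the gain; the matched resistors stay fixed:
for r_g in (100e3, 10e3, 1e3):
    print(in_amp_gain(50e3, r_g))  # -> 2.0, 11.0, 101.0
```

This is why commercial in-amp ICs can expose gain selection through one external resistor (or pin-strapped internal resistors) while keeping the precision-matched resistor network on-chip.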

Learn More: Amplifiers

