- Aug 10, 2019
- 2,556
- 5
- 0
This post explains how to quickly and reliably work out the volume setting at which your amplifier reaches full power, beyond which you will introduce distortion.
To work it out, you need to know the input sensitivity of your amplifier's line-level inputs and the output level of your line-level source. Both should be measured in millivolts (mV).
Let us assume we have an amplifier with an input sensitivity of 150 mV at the aux input, and a CD player with an output level of 2000 mV (which is 2 V). Then use the formula:
20 * log(V1 / V2), where V1 is the source output voltage and V2 is the input sensitivity.
Substituting in,
20 * log(2000 / 150) ≈ +22.5
This answer is in dB, so for this source output voltage and this input sensitivity, the amplifier will reach maximum power at a volume setting of about -22.5 dB (when referenced to maximum = 0 dB).
So the general formula is:
dB = 20 * log(V1 / V2).
If your amplifier displays its volume level in -dB (as Arcam and Cambridge Audio amplifiers do), you can work out the setting at which it reaches full power, beyond which distortion kicks in. The speaker load plays a small part in this (as does how clean the mains power is), but on the whole this holds.
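As a quick illustration, here is a minimal Python sketch of the calculation above. The 2000 mV source level and 150 mV input sensitivity are just the example figures from this post; substitute the specifications of your own source and amplifier.

```python
import math

def full_power_volume_db(source_mv: float, sensitivity_mv: float) -> float:
    """Return the volume setting, in dB below maximum, at which the
    amplifier reaches full power, given the source output level and
    the amplifier's input sensitivity (both in millivolts)."""
    return -20 * math.log10(source_mv / sensitivity_mv)

# Example figures from this post: a 2000 mV (2 V) CD player output
# driving an aux input with 150 mV sensitivity.
print(round(full_power_volume_db(2000, 150), 1))  # -22.5
```

Any volume setting above that figure (closer to 0 dB) drives the input stage beyond its sensitivity, so distortion becomes likely.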
If you have any questions, do not hesitate to ask!
J Hughes.