Amplifier Input Sensitivity - What is it? (Hi-Fi World Article)

Vladimir

[Image: sensitivity.png]


WHAT IT TELLS US

Sensitivity is a measure of the magnitude of input signal needed for the amplifier to produce full output, at maximum volume. This tells us what signal sources the amplifier can handle and whether it can produce full output from them if required. But it also tells us where the volume control is likely to be set in use and how quickly volume rises when it is turned up. This affects a user’s perception of power.
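
As a worked example of what a sensitivity figure implies about gain: for a hypothetical 100W-into-8-ohm amplifier (an illustrative assumption, not a figure from the article), full output voltage is √(P·R), and each sensitivity figure then implies a voltage gain. A minimal Python sketch:

```python
import math

def gain_db(v_out_full, v_sens):
    """Voltage gain (dB) implied by a given input sensitivity."""
    return 20 * math.log10(v_out_full / v_sens)

# Hypothetical 100 W into 8 ohm amplifier: full output voltage is
# sqrt(P * R) = sqrt(100 * 8) ~= 28.3 V RMS.
v_full = math.sqrt(100 * 8)

for sens in (0.4, 0.2, 0.09):  # 400 mV, 200 mV and 90 mV sensitivities
    print(f"{sens * 1000:.0f} mV sensitivity -> {gain_db(v_full, sens):.1f} dB gain")
    # prints roughly 37.0, 43.0 and 49.9 dB respectively
```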

All Compact Disc players produce 2V output maximum, because that’s the standard set for the format. Successors such as DVD and Blu-ray also produce 2V. Amplifiers designed for these sources typically have an input sensitivity of 400mV on all line inputs, such as Tuner, CD, Aux and Tape (this does not include a Phono input, if fitted). That is just enough to cope with tuners and the like having 500mV output, but not so high as to limit the useful travel, and therefore resolution, of the volume control. However, sensitivity is rising to cope with legacy sources such as old tuners and cassette decks, which deliver 100mV-300mV, and especially external phono stages that may barely produce 100mV out. Naim amplifiers use a buffer input stage before the volume control and have an input sensitivity as high as 90mV.
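
A back-of-envelope sketch of why high sensitivity eats into volume control travel: this is how far down the control must sit before a 2V CD source merely reaches full output on a 90mV-sensitivity amplifier (both figures from the article):

```python
import math

# How far down must the volume control sit so a 2 V CD signal
# just reaches full output on a 90 mV-sensitivity amplifier?
v_source = 2.0   # CD player output, volts RMS (the 2 V standard)
v_sens = 0.09    # Naim-style 90 mV input sensitivity

atten_db = 20 * math.log10(v_sens / v_source)
print(f"Volume control sits about {atten_db:.1f} dB down")  # ~ -26.9 dB
```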

With very high input sensitivity an amplifier will jump up to full volume very quickly as the volume is turned up, giving a perception of being powerful, irrespective of true power output. For this reason, as well as to broaden compatibility with sources, sensitivity is increasing in modern amplifiers. Having surplus gain in an amplifier increases complexity though, as well as limiting volume control resolution, and in theory at least it isn’t a good idea if only silver disc players are used. Adjustable input sensitivity (gain trim) is provided on some amplifiers (e.g. Arcam) so volume doesn’t have to be readjusted when switching from high output to low output sources.

An amplifier with high input sensitivity will also deliver a worse noise figure under measurement, because output noise in most amplifiers is determined by noise from the first stage, multiplied up by subsequent gain. When gain is turned down, however, this noise falls accordingly, and in practice it isn’t audible.
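
A numerical sketch of this point: holding the first stage’s input-referred noise fixed (the 5µV figure below is an assumption for illustration, not a measured value), output noise scales directly with the gain that follows it:

```python
# Output noise is first-stage noise multiplied by subsequent gain.
en_first_stage = 5e-6  # assumed 5 uV RMS of input-referred noise

# Gains matching the 400 mV / 200 mV / 90 mV sensitivities in the earlier sketch
for gain_db in (37.0, 43.0, 49.9):
    gain = 10 ** (gain_db / 20)
    print(f"{gain_db:.1f} dB gain -> {en_first_stage * gain * 1e3:.2f} mV output noise")
    # roughly 0.35, 0.71 and 1.56 mV: higher sensitivity measures noisier
```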

A broadly useful input sensitivity is 200mV. It will cope with most sources. A lower sensitivity of 400mV suits silver disc players and modern tuners, which typically give 1V output; it is too low for many external phono stages, however. An input sensitivity of 90mV, such as that of Naim amplifiers, is very high, meaning volume will have to be kept low with CD.

It’s possible to reduce input sensitivity using Rothwell in-line attenuators. There is no simple way to increase sensitivity, though, if a source is too quiet.
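
For the curious, an in-line attenuator is essentially a two-resistor potential divider. The sketch below computes generic values for a given attenuation; it assumes a 10kΩ total impedance, ignores source and load impedances, and is not Rothwell’s actual circuit:

```python
def inline_pad(atten_db, z_total=10_000):
    """Two-resistor potential divider giving atten_db of attenuation.
    Generic sketch only; ignores source and load impedance effects."""
    k = 10 ** (-atten_db / 20)   # output/input voltage ratio
    r_shunt = k * z_total
    r_series = z_total - r_shunt
    return r_series, r_shunt

r1, r2 = inline_pad(10)  # a 10 dB pad, a common in-line value
print(f"series {r1:.0f} ohm, shunt {r2:.0f} ohm")  # ~6838 / ~3162 ohm
```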

Phono input sensitivity is much higher than line sensitivity: typically 3mV (at 1kHz) for Moving Magnet cartridges and ten times more sensitive again (0.3mV) for Moving Coil cartridges. A volume control does not precede this input and it overloads at quite low levels, so no other sources should be plugged in here. Also, it is equalised by an RIAA network (bass boost / treble cut), so it cannot be used with other low output sources either.
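
To put those cartridge figures in context, here is the rough gain a phono stage must add at 1kHz to lift them to a nominal 300mV line level (the 300mV reference is an assumption for illustration):

```python
import math

# Gain needed at 1 kHz to bring cartridge output up to ~300 mV line level
line_mv = 300
for name, cart_mv in (("MM", 3.0), ("MC", 0.3)):
    print(f"{name}: {20 * math.log10(line_mv / cart_mv):.0f} dB")  # 40 / 60 dB
```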

Source: http://www.hi-fiworld.co.uk/amplifiers/75-amp-tests/150-sensitivity.html
 

andyjm

This whole 'line level' thing is a mess. Specs are generally inconsistent and misleading, and the inclusion of digital systems made matters worse.

In the 'good old days', pro systems were referenced to a nominal level of +4dBu (about 1.25V RMS) and consumer systems to -10dBV (about 300mV RMS). Apart from the confusing choice of different references ('u' vs 'V', which goes back to telephone standards), the argument was that pro signals generally had further to travel around a studio and needed higher voltage levels because of cable noise and attenuation.
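
For anyone wanting to check those figures, the two references convert to volts as follows (0dBu is 1mW into 600Ω, the telephone legacy mentioned above; 0dBV is simply 1V). A small Python sketch:

```python
import math

DBU_REF = math.sqrt(0.001 * 600)  # 0 dBu = 1 mW into 600 ohm = 0.7746 V
DBV_REF = 1.0                     # 0 dBV = 1 V

def dbu_to_v(dbu): return DBU_REF * 10 ** (dbu / 20)
def dbv_to_v(dbv): return DBV_REF * 10 ** (dbv / 20)

print(f"+4 dBu  = {dbu_to_v(4):.3f} V RMS")    # ~1.228 V, pro nominal
print(f"-10 dBV = {dbv_to_v(-10):.3f} V RMS")  # ~0.316 V, consumer nominal
```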

The CD standard is 2V RMS for 0dBFS. This is different again: it means that the highest possible digital value on a CD (0dB Full Scale, or FS) will produce a 2V RMS signal.

The interesting thing here is that the analogue reference levels are 'nominal' while the digital reference is 'full scale'. There is no headroom at all above the digital level, whereas analogue systems are built with many dB of headroom. Digital recording engineers will regularly aim for -16dBFS when mastering to give themselves headroom, whereas analogue engineers assume a certain amount of headroom inherent in the system and aim for the 'nominal' value.

The point of this is that the 2V output of a CD is not directly comparable to the 300mV standard for consumer equipment. CD players are undeniably higher in output than many other line level devices, but not by the 2V vs 300mV ratio that the figures imply.
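
A quick calculation makes the point: taking the -16dBFS mastering headroom mentioned above off a CD player's 2V full-scale output lands almost exactly on the -10dBV consumer nominal:

```python
# Nominal level of a CD mastered with ~16 dB of headroom below 0 dBFS
v_fs = 2.0
v_nominal_cd = v_fs * 10 ** (-16 / 20)
print(f"-16 dBFS on a CD player ~= {v_nominal_cd:.3f} V RMS")  # ~0.317 V,
# essentially the same as the ~0.316 V consumer -10 dBV nominal
```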

An interesting question is how the industry got itself into this mess. The old quote, 'we really like standards in audio, that's why we have so many of them', probably sums it up.
 

chebby

Is this why my Naim NAT05 tuner (700mV output) sounded better than my CD5i (2V output)?

I heard many a live Radio 3 lunchtime concert that had uncanny realism (so 'real' it could sometimes distract from the music momentarily), whereas the CD5i ended up languishing in its box for 6 months before I sold the system. (It engaged me that much!)

On paper a BBC FM broadcast (even a live one) should be 'trampled' by a good live recording on CD, but that was never my experience.

Was that 2V output from CD (and the Nait's 150mV input sensitivity) responsible? (Both the tuner and the CD player were connected via 5-pin DIN to my Nait 5i.)

Was my CD5i / Nait 5i combination actually distorting the music before I even turned up the volume?
 

Native_bon

Vladimir said:
Adjustable input sensitivity (gain trim) is provided on some amplifiers (e.g. Arcam) so volume doesn’t have to be readjusted when switching from high output to low output sources. [...]
This is one of the very reasons why I bought my Arcam amp. A very hot input signal can sometimes make things sound horrible. The adjustable input sensitivity on Arcam amps is, I think, a very clever function. I always keep my input setting on low sensitivity, especially on the CD input.
 

andyjm

chebby said:
Is this why my Naim NAT05 tuner (700mV output) sounded better than my CD5i (2V output)? [...]

Naim are not fools, and notwithstanding the apparent level mismatch, I would find it hard to believe that their equipment was not compatible.

My guess is that, like most of us, you have identified that it is the quality of the recording that matters, not the delivery mechanism.

The BBC went to extraordinary lengths to do a decent job of recording, mixing and transmitting its material. Having been fortunate enough to participate in BBC training in the early 80s, I can testify to the quality of the approach.

In spite of BBC FM being 'only' 13 bits, and CD being 16 bits, many on this forum have made similar comments to yours about the realism and quality of BBC FM vs CD.
 

chebby

Vladimir said:
I think everyone can draw their own rational conclusions from the article. I don't want to get accused of brand hate.

It's only 'sospri' who will accuse you of being a 'Naim hater' (he is always calling me that, even when I have praised them), so don't worry.
 
