Distortion Residue is a Lindos measurement, provided on both the LA100 and the MS20, which measures distortion at 1kHz, ITU-R 468 weighted, at alignment level (AL). More specifically, a 1kHz tone is fed into the equipment under test; the fundamental (1kHz) is then nulled out and whatever remains is measured just as if it were noise, using the ITU-R 468 weighted measurement method. This weighting emphasises high-order harmonics around 6kHz by 12dB but attenuates those above 10kHz, and ignores those above 20kHz. The ITU-R 468 measurement also uses a quasi-peak rectifier, which gives proper temporal weighting to brief bursts that would be largely ignored by the averaging inherent in an RMS measurement, as used in a THD+N measurement. The purpose of this is to give a clearer indication of how intrusive the distortion is on the listening experience. Manufacturers are keen to quote very low distortion figures measured at MOL (maximum output level) or FS (full scale), but in practice these are of little relevance compared to performance at –20dB AL (typically –46dB on CD), where low relative levels are harder to achieve and defects are more likely to be heard.
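The null-and-measure principle can be sketched in a few lines of numpy. This is only an illustration of the core idea (null the fundamental, measure what remains relative to the whole signal): the function name and the 50Hz notch width are illustrative, and the ITU-R 468 weighting and quasi-peak detection used by the real instruments are omitted here, so this amounts to an unweighted RMS residue.

```python
import numpy as np

def distortion_residue_db(signal, fs, fundamental=1000.0, notch_width=50.0):
    """Null the fundamental and return the residue level in dB relative
    to the full signal. Unweighted RMS sketch only: the real measurement
    applies ITU-R 468 weighting and a quasi-peak rectifier to the residue."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    total_rms = np.sqrt(np.mean(signal ** 2))
    # Null a narrow band around the fundamental, leaving harmonics and noise
    spectrum[np.abs(freqs - fundamental) < notch_width] = 0.0
    residue = np.fft.irfft(spectrum, n=len(signal))
    residue_rms = np.sqrt(np.mean(residue ** 2))
    return 20.0 * np.log10(residue_rms / total_rms)

# Example: a 1kHz tone with a 1% (-40dB) third harmonic at 3kHz
fs = 48000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t) + 0.01 * np.sin(2 * np.pi * 3000 * t)
print(round(distortion_residue_db(tone, fs)))  # -40
```

With one second of signal the FFT bins fall on exact 1Hz boundaries, so the notch removes only the fundamental and the 1% harmonic survives intact, reading 40dB below the tone as expected.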
The bar chart representation of Distortion Residue available on the MS20 is particularly interesting, as it contains a lot of information designed to allow at-a-glance assessment of performance. The first bar represents the noise level, followed by bars for relative Distortion Residue at up to six levels. The first of these (at –20dB) is often the most revealing. In the absence of significant distortion it will indicate a figure 20dB worse than the noise bar (because it shows Distortion Residue relative to a tone at –20dB AL), but crossover distortion (in amplifiers), quantising errors (in digital convertors), and modulation noise (on tape) will all show as a further worsening. In the case above none of these issues is apparent, as the bar indicates a level exactly 20dB worse than the noise level.
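The 20dB relationship follows directly from the change of reference level, as this small sketch with hypothetical figures shows:

```python
# Hypothetical figures, for illustration only: the noise bar is measured
# relative to alignment level (AL), while the -20dB residue bar is
# measured relative to a test tone at -20dB AL.
noise_db = -70.0        # noise floor relative to AL (assumed value)
tone_level_db = -20.0   # test tone level relative to AL

# With no distortion mechanisms present, the residue is just the noise,
# so re-expressing it relative to the quieter tone shifts it up by 20dB:
residue_relative_db = noise_db - tone_level_db
print(residue_relative_db)  # -50.0, i.e. 20dB worse than the noise bar
```

Any reading worse than this 20dB offset therefore points to a genuine low-level defect such as crossover distortion, quantising error, or modulation noise.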
The 0dB figure shows Distortion Residue measured relative to alignment level; compared with the noise bar, it gives a clear indication of how far the harmonic distortion rises above the noise at typical level (AL). In the case above this is approximately 2dB.
The bars at +8dB, +12dB, +15dB and +18dB then show how the amount of distortion changes as the level of the 1kHz test tone is increased. These are measured relative to the level of the test tone, so on a system whose harmonic distortion does not increase with input level the +8dB result will read 8dB lower than the 0dB result – as is the case in the example chart here.
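The relative-level arithmetic behind those bars can be sketched as follows, using hypothetical absolute readings (the –68dB figure is an assumption chosen purely to illustrate the subtraction):

```python
# Residue on the chart is expressed relative to the test-tone level,
# so an absolute reading is shifted down by the tone's level in dB.
def relative_residue(tone_level_db, absolute_residue_db):
    return absolute_residue_db - tone_level_db

# Hypothetical system whose absolute residue stays constant at -68dB
# as the tone rises from 0dB to +8dB AL:
print(relative_residue(0, -68.0))  # -68.0 at the 0dB bar
print(relative_residue(8, -68.0))  # -76.0 at the +8dB bar, i.e. 8dB lower
```

So a constant absolute residue reads 8dB better on the +8dB bar, and any bar that fails to drop by the full level increase reveals distortion growing with signal level.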
The +12dB bar shows a 3dB worsening compared to the 0dB level, the +15dB bar a 6dB worsening, and at +18dB severe clipping can be seen. Note that this is shown as worse on the left channel than the right, as the blue bar is higher than the pink.