I think they take the maximum theoretical resolution of 576x711 = 409536 for a 4:3 PAL CRT and compare it to the maximum theoretical resolution of 1080x1920 = 2073600 for a 16:9 fixed-pixel modern display. 2073600 / 409536 = 5.06, so up to 5x the resolution of standard definition.
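As a rough Python sketch of that arithmetic (the pixel counts are just the figures above):

```python
# Pixel-count comparison behind the "up to 5x" marketing claim.
pal_crt = 576 * 711        # theoretical maximum quoted for a 4:3 PAL CRT
full_hd = 1080 * 1920      # 16:9 fixed-pixel high definition display

print(pal_crt)             # 409536
print(full_hd)             # 2073600
print(full_hd / pal_crt)   # 5.0633..., quoted as "up to 5x"
```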
Visible difference between Blu-ray and DVD: perception of image quality is determined by MTF (modulation transfer function), the contrast ratio between details. The MTF of large details determines how well objects stand out from the background, how sharp the image is, and how much depth it has. The MTF of middle to small details determines how well textures stand out within objects and how lifelike clothing textures and skin tones look; textures are also a primary source of depth cues for the eye/brain, so again this helps with image depth. The MTF of tiny details only defines the maximum resolvable resolution, and is usually the size at which most image noise is present. Viewer perception of sharpness is usually measured by the MTF at 3 to 12 line pairs per degree of the viewer's visual arc, with the middle 6 line pairs being the most important. By my figuring, at a viewing distance of 1.5x screen width a 1-pixel-white, 1-pixel-black line pair on a 1080x1920 high definition display is >25 line pairs per degree, and at 1x screen width >16 line pairs per degree. So even at close viewing distances with a front projector it is not the increased resolution that makes Blu-ray look better; it is the higher MTF in the middle to small details, the textures and skin tone details. The MTF of DVD rolls off sooner and the whole curve is lower; the Blu-ray curve stays high longer and rolls off sharply at the end.
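Here is a rough Python sketch of how I get those line-pairs-per-degree figures; it assumes the viewing distance is given in screen widths and uses simple flat-screen geometry, so take the exact numbers loosely:

```python
import math

def line_pairs_per_degree(h_pixels, distance_in_screen_widths):
    """Finest line pairs per degree of visual arc for a display with
    h_pixels of horizontal resolution, viewed from a distance given
    as a multiple of the screen width."""
    # Total horizontal angle subtended by the screen, in degrees.
    screen_angle = 2 * math.degrees(math.atan(0.5 / distance_in_screen_widths))
    # One line pair = one white pixel + one black pixel.
    return (h_pixels / 2) / screen_angle

print(line_pairs_per_degree(1920, 1.5))  # ~26 lp/deg, i.e. >25
print(line_pairs_per_degree(1920, 1.0))  # ~18 lp/deg, i.e. >16
```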
PAL DVD resolution is often quoted as 576 lines by 720 sample points because MPEG uses multiples of 16 in its encoding, but the actual resolution is lower. DVDs are sampled at 13.5MHz, so the Nyquist limit is 6.75MHz, for a resolution of about 540x720. It is lower still because, to prevent aliasing artifacts, nothing above 6.75MHz can be allowed onto the recording, so the analogue-to-digital converter is preceded by a low-pass (anti-aliasing) filter. A brick-wall 6.75MHz filter is not used, as the steep roll-off toward the cut-off point would cause overshoot; instead a filter at 5.75MHz (about 460x613) is used, and this filter has typically removed all information above 6MHz, a resolution of 480x640. It gets lower still, as many DVDs only have full luminance resolution up to 5 or 5.5MHz, a resolution of about 400x533 to 440x587. Since standard definition production is limited to about 6MHz anyway, anything above 6MHz is removed by many DVD players' video DACs because it is most likely noise. The exception is when material has been down-converted from a high definition source for mastering onto standard definition DVD and the DVD player is upscaling it to high definition; in that case any information present on the disc up to 6.75MHz might be used.
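A rough Python sketch of how those resolution figures fall out of the filter bandwidths; it simply scales the 720 samples available at the 6.75MHz Nyquist limit linearly with bandwidth, and the first number is the 4:3 equivalent height, which is the convention I am using above:

```python
def pal_dvd_resolution(bandwidth_mhz):
    """Approximate PAL DVD resolution for a given luma bandwidth,
    scaling the 720 horizontal samples available at the 6.75 MHz
    Nyquist limit linearly with bandwidth. The 'height' figure is
    just the 4:3 equivalent of the width."""
    width = round(720 * bandwidth_mhz / 6.75)
    height = round(width * 3 / 4)
    return height, width

for mhz in (6.75, 5.75, 6.0, 5.5, 5.0):
    print(mhz, pal_dvd_resolution(mhz))
# 6.75 -> (540, 720), 5.75 -> (460, 613), 6.0 -> (480, 640),
# 5.5  -> (440, 587), 5.0 -> (400, 533)
```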
Actual visible resolution of high definition 1080x1920: I have read that a good Super 35mm film only has a high MTF (high contrast details) out to about 500 line pairs, with some out to 700 line pairs. So old films transferred to Blu-ray may have little contrast at 960 line pairs (1920 pixels). High definition cameras do not usually supersample; they have filters in front of the photoreceptors to spread details over more than one receptor and, according to SMPTE standards, a 30MHz filter to roll off contrast above 872 line pairs. These filters prevent the aliasing banding that would otherwise result from exceeding the Nyquist sampling rate. Some high definition TV stations, at least in the UK, are actually 1440 resolution at the encoder, then upsampled to 1920.
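To make the reasoning step explicit, a line pair needs two pixels, so in rough Python (assuming those film figures are line pairs across the picture width):

```python
# Converting film line-pair figures into pixel equivalents.
def pixels_needed(line_pairs):
    return line_pairs * 2   # one line pair = one black + one white pixel

print(pixels_needed(500))   # 1000 px of real detail, well under 1920
print(pixels_needed(700))   # 1400 px, still under 1920
print(1920 / 2)             # 960 line pairs is the Full HD ceiling
```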
Maybe with downsampled 2K and 4K digital cinema films you get higher resolution, but visible resolution is also limited by the Kell factor of the display. With a Kell factor of 1, each pixel of the image is represented by one pixel on the display; as a 1-pixel black detail on a white image moved across the display, it would alternate between being represented by one black pixel and being represented by two grey pixels as it went in and out of alignment with the display pixels, causing flickering noise. Since an image pixel is unlikely to line up exactly with a display pixel, it usually loses resolution and contrast by becoming two grey pixels. An image of a maximum-resolution test pattern of alternating black and white lines would be displayed as a solid grey block if it did not line up exactly with the display. With a Kell factor of 0.5 you get best case two black - two white, worst case one black - one grey - one white. With a Kell factor of 0.666, on an image of two black - two white alternating lines, you get best case one black - one grey - one white, worst case one black - two grey - one black, or one white - two grey - one white. Kell factor is typically 0.7-0.8, meaning 7 or 8 pixels of image are represented by 10 pixels of display. The loss of contrast in fine details caused by the Kell factor is partially offset by boosting the contrast of fine details (peaking), which can make noise more visible and, with analogue methods, cause ringing. Besides digitally mastered test cards and CGI, I do not think any sources have 1-pixel-sized high contrast details to represent. So high definition's quoted 1080x1920 resolution is not usually visible resolution.
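Here is a toy Python simulation of the misalignment effect; it assumes simple area-weighted (box filter) resampling of a 1-pixel black detail onto the display grid, which is enough to show the detail turning into two grey pixels:

```python
def resample(detail_pos, width=8):
    """Render a 1-pixel-wide black detail (value 0.0) on a white row
    (value 1.0) using area-weighted (box filter) resampling, so a
    detail straddling two display pixels darkens both partially."""
    row = [1.0] * width
    for px in range(width):
        # Overlap between display pixel [px, px+1) and the
        # detail [detail_pos, detail_pos+1).
        overlap = max(0.0, min(px + 1, detail_pos + 1) - max(px, detail_pos))
        row[px] -= overlap  # darken in proportion to coverage
    return row

print(resample(3.0))  # aligned: one fully black pixel
print(resample(3.5))  # misaligned: two half-grey (0.5) pixels
```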