I get how film and video differ: whole frames at 24fps versus interlaced (half) frames, at 50 fields per second for PAL and 60 for NTSC. Since modern fixed-pixel displays are inherently progressive, I would like to ask: is an interlaced 1080i TV signal generated from a progressive 1080p film source still pre-filtered vertically, to reduce line twitter on interlaced displays? Does the transmission carry a film flag, or do set top boxes rely on smart de-interlacers to detect film mode? And how many set top boxes are still doing video-style de-interlacing, bobbing or weaving 540-line fields up to 1080, even with film sources?
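Since I mention bob and weave, here is a toy sketch (in Python, with frames as plain lists of rows; not any real box's algorithm) of what the two strategies actually do. Weave interleaves the two fields back into one frame, which is correct for film sources where both fields came from the same moment in time; bob line-doubles a single field, which is safe for video sources but halves vertical resolution:

```python
def weave(top_field, bottom_field):
    """Interleave two fields back into one progressive frame.
    Correct for film sources, combs on motion with video sources."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    """Line-double one field to full height.
    No combing, but only half the vertical resolution survives."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line[:])  # simple repeat; real boxes interpolate
    return frame
```

A smart de-interlacer is essentially choosing between these (or blending them per pixel) based on whether it detects a film cadence.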
"digital anamorphic 16:9, which has the full 576 horizontal lines but 'unstretches' the image from 720 horizontal pixels to 1024"
My understanding is that it is 576 lines of 720 sample points each, on an analogue waveform; a holdover from analogue CRT, designed for a DAC to convert into a waveform for the CRT to display. Fixed-pixel displays create pixels between the reference sample points by assuming a smooth curve between the samples. I would not describe the sample points as pixels, or the process as unstretching: the samples are points, and they do not contain the fine detail (in analogue terms, the higher-frequency content) of 1024 true samples. The image is inherently softer than, say, high definition downscaled to 1024.
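To illustrate the point above, here is a minimal sketch (assuming simple linear interpolation; real scalers use better reconstruction filters) of a display generating 1024 output pixels from 720 reference samples per line. Every new pixel is just an estimate between existing samples, so no detail finer than the 720 samples can appear:

```python
def resample_line(samples, out_width):
    """Linearly interpolate one line of samples to out_width pixels."""
    n = len(samples)
    out = []
    for i in range(out_width):
        # Position of this output pixel in input-sample coordinates
        pos = i * (n - 1) / (out_width - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        # New pixel is a blend of its two neighbouring samples
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out
```

Downscaling a true high-definition line to 1024 pixels, by contrast, starts from more samples than it outputs, which is why it can look sharper.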
I think PALplus was also used with some version of D-MAC, so it did exist in a digital form, and it was more popular in Europe. PALplus used the full 576-line resolution for widescreen, like anamorphic digital. I am not sure if you mean PALplus could use S-Video and RGB, or if you meant that as an advantage of anamorphic digital over PALplus, but if my memory is correct PALplus receivers could output RGB.
I agree LCD is a somewhat inherently flawed technology, due to its limited simultaneous contrast, pixel lag and sample-and-hold presentation. It is much better suited to displaying text on a monitor than films. Things like frame interpolation and LED backlighting are attempts to make the image more watchable, not the improvements to the image that the marketing would have you believe.
So as not to just pick on one display technology: plasma also has its flaws, with limited greyscale bit depth and phosphor decay times sometimes causing banding and trails. It also has the oddity that the more white there is on screen, the less bright that white will be.
Like you, I still use a CRT in my living room. But CRT also has weaknesses: it is usually lower resolution, the geometry is usually not perfect, it has overscan, the black level tends to wander a bit so there is less shadow detail, and ANSI contrast and perceived sharpness are also worse. It is also bulky and uses more electricity. One thing that irritates me about modern displays is gamma. CRT was assumed to be 2.35 in PAL regions, and maybe as high as 2.5 by others, but many modern digital displays seem to treat a gamma of 2.2 as the target. To me gamma 2.2 looks worse.
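A quick worked example of why the gamma target matters: the same mid-grey code value comes out noticeably brighter at display gamma 2.2 than at 2.35, lifting mid-tones and shadows that were mastered with a higher gamma in mind (a simplified power-law model, ignoring the transfer function's linear toe):

```python
def display_luminance(code, gamma):
    """Relative luminance of a normalised code value (0.0 to 1.0)
    through a pure power-law display gamma."""
    return code ** gamma

mid_grey = 0.5
lum_22 = display_luminance(mid_grey, 2.2)    # about 0.218
lum_235 = display_luminance(mid_grey, 2.35)  # about 0.196
```

So a display assuming 2.2 renders mid-grey roughly 10% brighter than one at 2.35, which reads as a slightly washed-out, flatter picture.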
For my home cinema I have settled on DLP front projection. DLP has limited greyscale bit depth and relies on dithering in dark greys, shows the DLP rainbow effect if run too bright, has too much green and not enough red when uncalibrated (like all lamp-based projectors), and can suffer short periods of lamp instability, i.e. flickering. It also has poor placement flexibility for a front projector and ideally needs a dedicated room.
We are still waiting for the perfect display, maybe OLED will be the one.