nads:knightout thanks for that, so if i have this right, if you are using HDMI the players DACs are of no importance?
Yes. My understanding is that the video DACs (digital-to-analogue converters) are only used to generate an analogue signal when you are outputting analogue. Three DACs are used for component, and an additional DAC is used to simultaneously output composite video. HDMI output does not use any DACs. The player could in theory convert the DVD's signal into analogue, manipulate it in the analogue domain, then convert it back to digital, but I doubt it does.
This is what I believe the DAC/ADC do: DVDs are typically pre-smoothed to reduce the screen flicker and combing caused by interlaced displays, and to reduce the amount of data that needs to be encoded on the disc. The video is then encoded onto the disc using an analogue-to-digital converter that samples each display line 720 times and records the luminance level of each sample point, describing it using 8 bits (2^8 = 256 possible levels).

To convert this digital information back into analogue, a DVD player needs at minimum a digital-to-analogue converter running at 13.5MHz to generate 720 samples per line, with 8-bit resolution to create the levels. With progressive scan it needs to generate twice as many lines, extrapolating the extra lines from the interlaced signal recorded on the DVD, either by correct cadence detection for film-based material or by using algorithms to generate extra lines for video material, so the DAC needs to be twice as fast: 13.5MHz x 2 = 27MHz.

Most DACs will oversample, so they generate more than 720 samples per line. Each time the number of samples per line is doubled, I believe the bit depth must also be increased by 1 so the number of values/levels that can be described is doubled; that way a new sample point extrapolated halfway between two original sample points that were 1 point apart in value at the lower bit depth can be described as having a value halfway between the two. So a Panasonic DAC at 54MHz/10-bit can generate 1440 samples per line with 9 bits' worth of levels; the extra 10th bit may be used to generate the analogue signal, as the waveform has to accommodate reference signals like colour burst, sync, etc. A Sony DAC at 108MHz/12-bit can generate 2880 samples per line with 10 bits' worth of levels; again, one extra 11th bit is probably being used to generate the analogue waveform.
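The clock-rate arithmetic above can be sketched in a few lines. This is purely my own illustration of the reasoning, not anything from a spec sheet; the clock figures are just the ones discussed above:

```python
# Illustrative arithmetic only: a progressive-scan DAC clocked at a
# multiple of the 27MHz base rate generates a matching multiple of
# 720 samples per line, and each bit of depth doubles the level count.

PROG_CLOCK_MHZ = 27.0   # progressive-scan base rate (2 x 13.5MHz)
PROG_SAMPLES = 720      # samples per line at the base rate

def levels(bits):
    """Number of luminance levels a given bit depth can describe."""
    return 2 ** bits

def samples_per_line(dac_clock_mhz):
    """Samples per line, scaling linearly with the DAC clock."""
    return int(PROG_SAMPLES * dac_clock_mhz / PROG_CLOCK_MHZ)

print(levels(8))    # 256 possible levels at 8-bit
for label, clock in [("54MHz", 54), ("108MHz", 108), ("216MHz", 216)]:
    print(label, samples_per_line(clock))
# 54MHz -> 1440, 108MHz -> 2880, 216MHz -> 5760 samples per line
```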
The additional 12th bit still left over is probably being used to enable the Sony to do processing in digital, i.e. manipulation at a higher bit depth than it outputs, so it can manipulate contrast, brightness, etc. without causing compound rounding errors in the output values. The Denon DAC at 216MHz/12-bit can generate 5760 samples per line with 11 bits' worth of levels, again with one extra bit probably being used to generate the analogue waveform. This would make the Denon somewhat overkill, but then it was top of the range; it might function more like the Sony but with a lot more processing power for more image manipulation, I do not know. (Note: I have read that some budget players allegedly state their DACs' full spec but actually run the DAC under-spec, like a PC claiming a good CPU but lacking the other necessary hardware to use the CPU to its full potential.)
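The compound-rounding-error point can be shown with a toy example. This is my own sketch, assuming a simple gain adjustment that is later reversed; it is nothing player-specific:

```python
# Toy demonstration: apply a 0.7x gain and then undo it. Rounding to
# integer output precision after every step (a low-bit-depth pipeline)
# accumulates error; keeping full precision internally and rounding
# once at the output does not.

def adjust_then_restore_lowdepth(value):
    v = round(value * 0.7)   # round after the first adjustment
    v = round(v / 0.7)       # round again after undoing it
    return v

def adjust_then_restore_highdepth(value):
    v = value * 0.7          # keep full internal precision
    v = v / 0.7
    return round(v)          # round only once, at the output

low_error = sum(abs(adjust_then_restore_lowdepth(v) - v) for v in range(256))
high_error = sum(abs(adjust_then_restore_highdepth(v) - v) for v in range(256))
print(low_error, high_error)  # only the low-depth pipeline accumulates error
```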
The analogue waveform generated by the DAC and output by the player's analogue outputs is, on a CRT, converted directly to picture information in the analogue domain (signal voltage = luminance, etc.). For modern non-CRT displays it is read by the display's ADCs (analogue-to-digital converters). These again have a sampling rate/frequency and a bit resolution. I am unsure how you would guess the display's ADC spec, but I will assume the number of samples per line is determined by the display's resolution: 720p = 1280x720, so 1280 samples per line; 1080p = 1920x1080, so 1920 samples per line. The bit depth will probably be 10-bit but could be more. I believe you ideally want the analogue waveform, if generated by a DAC, to have been generated using more samples and a higher bit depth than the ADC that will be reading it. I think this has something to do with sampling theory, or with reducing compound rounding errors. The high-definition display will know what resolution the analogue input is in by how many horizontal lines it has per image (576 for PAL), and it will then upscale the image to the display's resolution by generating extra lines.
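As a rough sketch of that re-reading step, here is my own illustration using simple linear interpolation; a real display ADC and scaler are far more sophisticated than this:

```python
def resample_line(line, new_count):
    """Linearly interpolate one line of samples to a new sample count,
    a crude stand-in for the display re-reading the DAC's waveform."""
    old = len(line)
    out = []
    for i in range(new_count):
        pos = i * (old - 1) / (new_count - 1)
        lo = int(pos)
        hi = min(lo + 1, old - 1)
        frac = pos - lo
        out.append(line[lo] * (1 - frac) + line[hi] * frac)
    return out

# A 720-sample line re-read as 1920 samples: the display gains no new
# detail, it only interpolates between the points the player produced.
ramp = list(range(720))
reread = resample_line(ramp, 1920)
print(len(reread))  # 1920
```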
HDMI bypasses all this complexity, since it does not need any digital-to-analogue or analogue-to-digital conversion to transfer and read the information. The information should be transferred bit-perfect in digital. Most players do not output interlaced signals over HDMI, so the player still handles de-interlacing, generating the extra lines by correct cadence detection for film-based material or by algorithms for video-based material. The Denon has an excellent de-interlacer. Modern players often have an upscaler that uses algorithms to generate a higher-definition signal over HDMI. This upscaler may or may not be better than the one in the display; since over HDMI the display knows what resolution (vertical and horizontal) the input signal is in, it can use its own upscaler if it needs to.
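The two de-interlacing paths mentioned (cadence detection for film, interpolation for video) can be caricatured like this. It is a minimal sketch with made-up field data; real de-interlacers are vastly more complex:

```python
# "Weave": recombine the two fields of one film frame, which is what
# correct cadence detection makes possible. "Bob": rebuild a frame
# from a single field by interpolating the missing lines, the fallback
# for video material. Each field is a list of lines (lists of samples).

def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def bob(field):
    frame = []
    for i, line in enumerate(field):
        nxt = field[i + 1] if i + 1 < len(field) else line
        frame.append(line)
        frame.append([(a + b) / 2 for a, b in zip(line, nxt)])
    return frame

top = [[10, 10], [30, 30]]      # lines 0 and 2 of a frame
bottom = [[20, 20], [40, 40]]   # lines 1 and 3 of the same frame
print(weave(top, bottom))       # the original 4-line frame, restored
print(bob(top))                 # 4 lines, two of them interpolated
```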
So an analogue waveform generated by a budget progressive-scan DVD player could have been generated with 720 samples per line, each described using 8 bits. The 1080p high-definition display samples and displays this using 1920 samples per line, described using 10 bits, then generates 1080 horizontal lines out of the lines it received. The image will probably not be good. If the same player has HDMI output, HDMI should be a lot better, because the display knows what resolution the source is (how many samples horizontally and vertically) and will then upscale it horizontally and vertically; the player may also have its own upscaler over HDMI, which may be better or worse than the display's inbuilt scaler.