HDMI or Component connection?

admin_exported

New member
Aug 10, 2019
I still use an ''old'' Denon 3910 for playing DVDs and SACDs.

My TV is a Full HD screen with 3 HDMI inputs.

Which is best for picture quality, an HDMI or a component connection?

Either way the cable is a good-quality Monster cable.

Thanks for any reply!
 

Big Aura

Well-known member
Oct 13, 2008
Not familiar with that player, but HDMI will be better, particularly if your player or screen has an upscaling function. It won't make any difference to the sound (assuming you're using the TV's speakers).
 

Gerrardasnails

Well-known member
Sep 6, 2007
hanno: Which is best for picture quality, an HDMI or a component connection?

HDMI.
 

Anonymous

Guest
Usually HDMI. HDMI has the following advantages over component.

HDMI keeps the signal digital, so it uses neither the player's video DAC nor the display's ADC. With component, video DACs (digital-to-analogue converters) are used at the player; if their sampling frequency is not higher than that of the display's ADCs (analogue-to-digital converters), it is possible for the image to suffer. HDMI also has perfect video frequency response, since it is not analogue. With component, because of the filters used after the video DACs, plus high-frequency signal loss if you use a very long cable, it is common for fine detail to have less contrast and possibly for colours to be less vibrant.

HDMI has separate sync signals. Component relies on sync being extracted from the video signal; this usually works fine, but it can cause problems, particularly on long cable runs, and if the display loses sync the image can flicker. Component cables will, however, usually be cheaper for long runs. Some displays also automatically overscan (magnify) the image from analogue inputs, including component. When the source outputs the display's native resolution this causes slight blurring of fine detail; when the display is upscaling from a lower-resolution input, as will most likely be the case with the Denon 3910 and a modern high-definition display, the effect depends on the upscaling, and could reduce blurring of fine detail or increase it.
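To put that DAC/ADC rule of thumb in concrete terms, here is a quick Python sketch. It is only my illustration of the comparison above; the display ADC figure is an assumption (74.25MHz is a common HD pixel clock), not a measured spec.

```python
# Sketch of the rule of thumb above (illustration only, not measured specs):
# a component chain risks losing fine detail when the player's DAC does not
# sample faster than the display's ADC that reads the analogue signal.

def component_detail_at_risk(player_dac_mhz: float, display_adc_mhz: float) -> bool:
    """True if the player's DAC samples no faster than the display's ADC."""
    return player_dac_mhz <= display_adc_mhz

DISPLAY_ADC_MHZ = 74.25  # assumed figure; a common HD pixel clock

print(component_detail_at_risk(216, DISPLAY_ADC_MHZ))  # Denon 3910 -> False, plenty of headroom
print(component_detail_at_risk(54, DISPLAY_ADC_MHZ))   # a 54MHz player DAC -> True, image may suffer
```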

Having said that, the Denon 3910 was a state-of-the-art, top-of-the-range DVD player. Its component video DACs run at 216MHz, plenty high enough; by comparison, Sony DVD players often use 108MHz and Panasonic DVD players 54MHz. On its component outputs the video frequency response does not curve down; in fact it has a very slight rise, so more contrast in fine detail. Picture quality from component should therefore be excellent, and you might prefer what the Denon does to the analogue component output over what the display does with its HDMI digital input. Using HDMI may also lose you picture-tweaking options at the player or display, as these are often applied to the analogue video signal. Compare the HDMI and component images; which is best in practice depends on the player and display combination.

By the way, with the Denon 3910 use the AUTO 2 setting, as it will lock on to the 2:2 cadence used for UK PAL DVDs; the AUTO 1 setting is better at variations of the 3:2 cadence used in the USA but fails on 2:2 cadence. Due to the high-quality video DAC, the improvement in image quality from correct cadence detection may not be as noticeable as with other players.
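For anyone unfamiliar with cadences, this small Python sketch (my simplification, not how any real detector is implemented) shows the field patterns a de-interlacer has to lock on to:

```python
# Illustrative sketch of pulldown cadences (a simplification).
# 2:2 (PAL): every film frame becomes exactly 2 interlaced fields.
# 3:2 (NTSC): film frames alternately become 3 fields, then 2.

def fields_for(frame_count: int, cadence: str) -> list[str]:
    """List the interlaced fields produced from numbered film frames."""
    fields = []
    for i in range(frame_count):
        repeats = 2 if cadence == "2:2" else (3 if i % 2 == 0 else 2)
        fields += [f"frame{i}"] * repeats
    return fields

print(fields_for(4, "2:2"))  # 8 fields, 2 per frame
print(fields_for(4, "3:2"))  # 10 fields in a 3-2-3-2 pattern
```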

A possible problem with the Denon 3910 is that it uses a Faroudja de-interlacer, which can cause visible macro-blocking on some displays. This is only a problem with certain displays, due to the combination of how they and the player process the image. Also, I believe the HDMI output on some Denon players suffered from an early firmware fault that gave the screen a green tint; I do not know if the 3910 was one of the affected players. It was fixed by a firmware update, which can be downloaded, burnt to CD and installed, or the firmware can be updated by a dealer. If your Denon suffers from this and you fix it yourself, make sure you download the UK Region 2 firmware update, or you will alter the region code of your player and no longer be able to play Region 2 PAL discs.
 

jase fox

Well-known member
Apr 24, 2008
I used to own the 3910 and yes, it was a cracking DVD player. I used the component connection, as my TV at the time didn't have an HDMI socket, but I must say the component feed was absolutely stunning; I think it would give HDMI a run for its money.
 

Anonymous

Guest
knightout: Usually HDMI. HDMI has the following advantages over component. [snip]

Post of the week, crystalline in its clarity. You should write for home cinema magazines.
 

jase fox

Well-known member
Apr 24, 2008
Tarquinh: Post of the week, crystalline in its clarity. You should write for home cinema magazines.
Mmm, Knightout could have just copied all that from a magazine, ha. But I'm sure he hasn't.......... have you?
 

Anonymous

Guest
If I wanted to quote from a magazine, I would just give a subjective opinion about how it all depends on how much you spent on the magic cables. If any of the information I provided is incorrect, I dare say What Hi-Fi's experts will point it out and correct it.
 

Anonymous

Guest
nads: knightout, thanks for that. So if I have this right, if you are using HDMI the player's DACs are of no importance?

Yes. My understanding is that the video DACs (digital-to-analogue converters) are only used to generate an analogue signal when you are outputting analogue. Three DACs are used for component, and an additional DAC is used to simultaneously output composite video. HDMI output does not use any DACs. The player could in theory convert the DVD's signal to analogue, manipulate it in analogue, then convert it back to digital, but I doubt it does.

This is what I believe the DAC and ADC do. DVDs are typically pre-smoothed to reduce the screen flicker and combing caused by interlaced displays, and to reduce the amount of data that needs to be encoded on the disc. The DVD is then encoded using an analogue-to-digital converter that samples each display line 720 times and records the luminance level of each sample point using 8 bits (2^8 = 256 possible levels). To convert this digital information back into analogue, a DVD player needs a DAC of at least 13.5MHz to generate 720 samples per line, with 8-bit resolution to create the levels. With progressive scan it needs to generate twice as many lines, extrapolating the extra lines from the recorded interlaced signal either by correct cadence detection for film-based material or by algorithms for video material, so it needs to be twice as fast: 13.5MHz x 2 = 27MHz.

Most DACs will oversample, generating more than 720 samples per line. Each time the number of samples per line is doubled, I believe the bit depth must also increase by 1, so the number of levels that can be described is doubled; a new sample point extrapolated halfway between two original sample points that were one step apart in the lower bit depth can then be given a value halfway between the two originals. So a Panasonic DAC at 54MHz/10-bit can generate 1440 samples per line with 9 bits' worth of levels; the 10th bit may be used to generate the analogue waveform, which has to accommodate reference signals like colour burst, sync and so on. A Sony DAC at 108MHz/12-bit can generate 2880 samples per line with 10 bits' worth of levels, with an 11th bit probably used to generate the analogue waveform. The remaining 12th bit is probably used to let the Sony do its digital processing at a higher bit depth than it outputs, so it can manipulate contrast, brightness and so on without compound rounding errors in the output values. The Denon DAC at 216MHz/12-bit can generate 5760 samples per line with 11 bits' worth of levels, again with one extra bit probably used to generate the analogue waveform. This would make the Denon somewhat overkill, but then it was top of the range; it might function more like the Sony but with a lot more processing power for image manipulation, I do not know. (Note: I have read that some budget players allegedly state their DAC's full spec but actually run the DAC under-spec, like a PC claiming a good CPU but lacking the other hardware needed to use it to its full potential.)
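As a sanity check on those numbers, here is a small Python sketch of the arithmetic, following the rule of thumb above that each doubling of samples per line consumes one extra bit. It is only my illustration; the spare bits are the ones the post guesses go to waveform generation and processing headroom.

```python
# Sketch of the oversampling arithmetic above (illustration only).
# Base DVD rate: 13.5MHz interlaced = 720 samples per line at 8 bits;
# progressive scan doubles that to 27MHz.

import math

BASE_MHZ, BASE_SAMPLES, BASE_BITS = 13.5, 720, 8

def oversampling(dac_mhz: float, dac_bits: int, progressive: bool = True):
    """Samples per line, bits used for levels, and spare bits, per the
    rule that each doubling of samples needs one extra bit."""
    base = BASE_MHZ * (2 if progressive else 1)   # 27MHz for progressive
    doublings = int(math.log2(dac_mhz / base))
    samples = BASE_SAMPLES * (2 ** doublings)
    level_bits = BASE_BITS + doublings
    return samples, level_bits, dac_bits - level_bits

for name, mhz, bits in [("Panasonic", 54, 10), ("Sony", 108, 12), ("Denon 3910", 216, 12)]:
    samples, level_bits, spare = oversampling(mhz, bits)
    print(f"{name}: {samples} samples/line, {level_bits}-bit levels, {spare} spare bit(s)")
# Panasonic: 1440 samples/line, 9-bit levels, 1 spare bit (waveform)
# Sony: 2880 samples/line, 10-bit levels, 2 spare bits (waveform + processing)
# Denon 3910: 5760 samples/line, 11-bit levels, 1 spare bit (waveform)
```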

The analogue waveform the DAC generates, output from the player's analogue outputs, is converted directly to picture information in the analogue domain on a CRT (signal voltage = luminance, and so on). On modern non-CRT displays it is read by the display's ADCs (analogue-to-digital converters), which again have a sampling rate and a bit resolution. I am unsure how you would guess a display's ADC, but I will assume the number of samples per line is determined by the display's resolution: 720p is 1280x720, so 1280 samples per line; 1080p is 1920x1080, so 1920 samples per line. The bit depth will probably be 10-bit but could be more. I believe you ideally want the analogue waveform, if generated by a DAC, to have been generated using more samples and a higher bit depth than the ADC that will read it; I think this has to do with sampling theory, or with reducing compound rounding errors. The high-definition display knows what resolution the analogue input is from how many horizontal lines it has per image (576 for PAL), and will then upscale the image to the display's resolution by generating extra lines.

HDMI bypasses all this complexity, since it needs no digital-to-analogue or analogue-to-digital conversion to transfer and read the information; it should be transferred bit-perfect in digital. Most players do not output interlaced signals over HDMI, so the player still handles de-interlacing, generating the extra lines by correct cadence detection for film-based material or by algorithms for video-based material. The Denon has an excellent de-interlacer. Modern players often have an upscaler that uses algorithms to generate a higher-definition signal over HDMI; this upscaler may or may not be better than the one in the display. Over HDMI the display knows the input signal's horizontal and vertical resolution, so it can use its own upscaler if it needs to.
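To illustrate what "generating extra lines" means in the simplest possible case, here is a toy Python sketch that interpolates 576 PAL source lines up to 1080. Real scalers use far more sophisticated algorithms; this is only a sketch of the basic idea.

```python
# Toy vertical upscaler (illustration only; real scalers are far smarter).
# Maps one column of 576 PAL luminance values onto 1080 output lines
# by linear interpolation between neighbouring source lines.

def upscale_lines(src: list[float], out_count: int) -> list[float]:
    """Linearly interpolate a column of luminance values to out_count lines."""
    out = []
    for i in range(out_count):
        pos = i * (len(src) - 1) / (out_count - 1)  # fractional source position
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

column = [float(v % 256) for v in range(576)]  # one column of a PAL frame
print(len(upscale_lines(column, 1080)))        # 1080 interpolated lines
```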

So an analogue waveform from a budget progressive-scan DVD player could have been generated with 720 samples per line, each described using 8 bits. A 1080p high-definition display samples and displays this using 1920 samples per line described using 10 bits, then generates 1080 horizontal lines from the lines it received. The image will probably not be good. If the same player has HDMI output, HDMI should be a lot better, because the display knows the source's horizontal and vertical resolution and will upscale it in both directions; the player may also have an upscaler over HDMI, which may be better or worse than the display's built-in scaler.
 

Anonymous

Guest
You could at least qualify your answers, knightout! Very vague. ;)
 

Anonymous

Guest
I use a Samsung 30-inch CRT TV (something like this: http://www.practical-home-theater-guide.com/samsung-slimfit.html). It is HD Ready with an HDMI input.

I also bought an LG DVD player with upscaling capability and HDMI output.

I decided to test component (using a reasonable-quality cable, worth around 8 quid) against HDMI (using the supplied, free HDMI cable).

What I found: component produces a good picture with saturated colours and deep blacks. HDMI produces a picture that is too white, with very poor colour (like a haze or fog over it).

I assumed HDMI would produce the better result, so why is my experience different? Is it because of the free cable? Or because the DAC in the TV is poor? (2006 production.)

Currently I use the free cable, as I assume the cable is only needed to carry the 0s and 1s. And reading WHF (the August edition, if I am not mistaken), the results of tests between different HDMI cables are mixed.
 
