SHARPNESS SETTING ON TV

A thread has been started elsewhere about what the sharpness should be set to on your TV.

An argument is that it should be set to zero on Blu-ray, as it is an 'enhancement' that is not needed, or desired.

Is there a technical viewpoint that corroborates this view?
 
If watching via HDMI (which you sure as hell should be) then sharpness shouldn't make any difference anyway. If it does then yes ~ it should be set to zero, as what you say is completely true (same with DVDs)
 
While watching Blu-ray or DVDs, adjusting the sharpness does make a big difference.

As you adjust up from zero you can see artefacts forming.

On the CRTs that I have owned there was no option to change the sharpness.

Why is this needed now?
 
IMHO if watching an HD source then no sharpening should be necessary, so I set sharpness to 0.
If I use sharpness at all on SD sources it will only be very slight, as too much will give hard edges and accentuate any artefacts.
 
buffalo bill:
While watching Blu-ray or DVDs, adjusting the sharpness does make a big difference.

As you adjust up from zero you can see artefacts forming.

On the CRTs that I have owned there was no option to change the sharpness.

Why is this needed now?

Because some people have bought into the (wrong) idea that higher sharpness = a better picture. Same reason most TVs now come with an unbelievable number of 'processing modes' (contrast enhance etc.). Nearly all of them are to the detriment of TRUE picture quality
 
I guess it depends on what you mean by "sharpness". In more recent products the old sharpness control has been replaced or augmented by "detail" and/or "edge/contour" enhancement functions. Depending on the product, these operate in a different way to the old 2D luma sharpening process, which typically results in halo-type artefacts when used. With native HD material the newer enhancement processing works at a pixel level around zone-plate boundaries. It can create the illusion of "sharper" focus or an increase in detail and dimensionality. It's really just a processing trick, but some people like the result, and with DVD it can still lead to noticeable halo artefacts, particularly as the image size increases.
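
To make that concrete, here's a rough Python sketch of the older style of luma sharpening (a simple unsharp mask, shown in 1D for brevity). The 3-tap kernel and the gain of 2 are illustrative values, not any particular TV's processing:

```python
# A crude luma sharpen: add back a scaled copy of the high-frequency detail.
# Kernel size and gain are illustrative, not taken from any real product.
import numpy as np

def sharpen_luma(luma: np.ndarray, gain: float) -> np.ndarray:
    """Unsharp mask: output = input + gain * (input - blurred input)."""
    padded = np.pad(luma, 1, mode="edge")
    blurred = np.convolve(padded, np.ones(3) / 3, mode="valid")  # 3-tap box blur
    detail = luma - blurred                                      # high-frequency residue
    return np.clip(luma + gain * detail, 0, 255)

# A soft edge: dark (40) rising to light (200) over a few pixels.
edge = np.array([40, 40, 40, 80, 160, 200, 200, 200], dtype=float)
print(sharpen_luma(edge, gain=2.0))
# The result undershoots below 40 and overshoots above 200 either side of
# the transition - that ringing is the "halo" artefact around edges.
```

Running it, the values dip to about 13 and peak at about 227 around the transition, even though the source never leaves the 40-200 range - which is exactly the halo effect described above.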

Dasp
 
I was hoping someone would come up with something technical.

Thanks.

This magazine gave the Panasonic V10 (or was it the G10?) a poor review on Blu-ray performance.

Could this be because it was on the default sharpness setting? Would it have fared better with zero sharpness?
 
Difficult to say without first-hand observation.

Some people prefer the more processed image look, and thus it may be a case of "preference" vs "reference". I've seen similar reviews in this mag on other displays that have poor colour accuracy relative to the reference standards, yet the reviewer claimed the colour performance was better on the inaccurate product.

Dasp
 
buffalo bill:
While watching Blu-ray or DVDs, adjusting the sharpness does make a big difference.

As you adjust up from zero you can see artefacts forming.

On the CRTs that I have owned there was no option to change the sharpness.

Why is this needed now?

It was normally preset in the design, but now we have so many different video sources that it's desirable to make it adjustable.
 
On both LCDs in our house the Sharpness settings are turned to nil on all sources.

In my (longer) experience of digital cameras, turning up the Sharpness setting almost always increases noise. I've found exactly the same thing happens with digital TVs.
 
I was very interested to read the various comments on TV sharpness settings - I have always been careful not to have this set too high. However, I did an experiment yesterday while watching Wimbledon on BBC HD (via Sky and an HDMI lead). I paused the picture when the camera shot was from behind the server and then scrolled the sharpness setting from minimum to maximum. At max, each square of the net was clearly visible whilst, at minimum, the definition was significantly poorer (though still better than the SD equivalent). I have left the setting at max now and am not aware of any noticeable picture deterioration. I wonder if anyone else could try this test and give their views? Thanks.
 
When I got my Samsung LE32A456 I immediately went searching for professionally calibrated settings for it; alas, there aren't any posted. But what I got from reading settings posted on various sites for lots of different TVs is that those in the know generally have sharpness set between 0 and 20 (max) out of 100.

In my own experience this works quite well for SD sources and does no harm to HD. The edge enhancement feature on my set helped my poor old V box no end (the picture's so bad that processing actually benefits it!) but it makes no difference on Sky HD. You could use a movie with a THX calibration feature (like Star Wars) to set the sharpness and see how that goes.
 
I agree with heros ~ if the TV's sharpness DOES make a difference then set it using a THX disc so that you don't get 'convergence errors'
 
aliEnRIK: I agree with heros ~ if the TV's sharpness DOES make a difference then set it using a THX disc so that you don't get 'convergence errors'

I don't think you know what convergence errors are. If you did, you'd also know they can't occur on a plasma or LCD screen.
 
Red Dwarf:
aliEnRIK: I agree with heros ~ if the TV's sharpness DOES make a difference then set it using a THX disc so that you don't get 'convergence errors'

I don't think you know what convergence errors are. If you did, you'd also know they can't occur on a plasma or LCD screen.

Pedantic much?

I think so
 
Quote: "I paused the picture when the camera shot was from behind the server and then scrolled the sharpness setting from minimum to maximum. At max, each square of the net was clearly visible whilst, at minimum, the definition was significantly poorer (though still better than the SD equivalent)."

Yes, this will have this effect, but is it real? In real life, would you see the net that clearly?
 
Tone down the Sharpness

The Sharpness control in your TV probably doesn't do what you think it does. Rather than giving you more detail, the Sharpness control actually gives you LESS detail and a harder look, by increasing the contrast around the boundaries where different tones meet, thereby sharpening edges in the picture. This is bad enough on its own, but remember also that when the sharpening stage in the TV's processor applies this crude process to the video, it will be emphasising the flaws in the picture as well - not just the good bits. So, if you're watching Digital TV, those ugly compression blocks will be made more pronounced by this process.
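
That "emphasising the flaws" point is easy to demonstrate numerically. Here's a small Python sketch along the same unsharp-mask lines as the example earlier in the thread - the noise level and gain are made-up values for illustration:

```python
# Sharpening a flat patch that carries faint noise: the "detail" being
# amplified is just the noise. Values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
flat = np.full(64, 128.0)                      # a flat mid-grey strip
noisy = flat + rng.normal(0, 2.0, flat.size)   # faint compression-style noise

padded = np.pad(noisy, 1, mode="edge")
blurred = np.convolve(padded, np.ones(3) / 3, mode="valid")
sharpened = noisy + 2.0 * (noisy - blurred)    # gain of 2 again

print(f"deviation before: {noisy.std():.2f}")      # about 2
print(f"deviation after:  {sharpened.std():.2f}")  # roughly 2-3x larger
```

There's no real detail in a flat patch, so everything the sharpener "enhances" here is a flaw.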

Often, the video that you're watching might already have had its own Sharpening applied. Sadly, this process is very common, and once video has been Sharpened, there's almost nothing you can do about it. But if you have your TV set up properly, you can make sure that it isn't making things any worse!

Knowing where to set the Sharpness control depends on your TV. What you're ideally aiming for is a picture that has no additional Sharpening applied whatsoever, as well as one that isn't being blurred either - in other words, we want the TV to reproduce what it's being fed as closely as possible. For some TVs, the "0" Sharpness level is the correct choice, because it means "0 extra Sharpness". But on other TVs, "0" Sharpness actually instructs the TV's processor to blur the picture.
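
One plausible way that can happen (a hypothetical sketch - the 0-100 scale and the neutral point of 25 are assumptions, not any manufacturer's spec) is a slider that maps to a signed gain whose neutral point sits above zero:

```python
# Hypothetical slider mapping: neutral at 25, so 0 applies NEGATIVE gain.
def slider_to_gain(setting: int, neutral: int = 25) -> float:
    """Map a 0-100 sharpness slider to a signed enhancement gain."""
    return (setting - neutral) / neutral

for s in (0, 25, 50, 100):
    print(f"slider {s:3d} -> gain {slider_to_gain(s):+.1f}")
# slider 0 -> gain -1.0: plugged into the unsharp mask shown earlier,
# luma - detail is exactly the blurred copy, so "0" softens the picture
# rather than leaving it alone.
```

On a set like that, the neutral "straight through" position is partway up the scale, which is why the text-menu test below is worth doing rather than just assuming zero is correct.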

A good way of making sure you're getting the right results is to bring up a screen with text on it - for example, your DVD player's System Menus. (You can use a satellite or cable set-top box for the same effect, as well.) Why is this a good on-the-cheap example? Because the System Menus are usually very basic looking, and typically have text against a solid background, with no additional fancy graphics. That means that if any Sharpening is going on, it'll be pretty obvious.

If you have an Upscaling DVD player, then it's a good idea to turn off the Upscaling mode when you're setting the Sharpness. The simplest way to do this is to connect the player using a SCART or Component cable (since most players can't upscale using these outputs). The reason for this is that most upscaling players put the On Screen Graphics through the upscaling process as well (so that they don't look tiny at high resolutions), which can skew the results and make setting the control harder.

Once you have a menu on the screen, the chances are it won't look completely natural. Look closely at the letters - they'll probably have big outlines around them that shouldn't be there. These are called "halos" and are a result of the Sharpening process. Start lowering the Sharpness and keep going until the halos are gone. Some TVs are set up so that these halos won't ever go away, which is a pain, but not the end of the world - in this case, just make them go away as much as possible.

On most TVs, the minimum Sharpness setting will be the best one. As I said before, on other TVs, this will actually be softening the picture. Set the Sharpness control so that the letters are clear and distinct - almost like text on a computer screen is - but without being blurred or sharpened.
 
I know that on my Panasonic LCD the sharpness control is actually non-responsive when accepting a signal via HDMI. It does its normal tricks when using any of the analogue inputs or the tuners, but when viewing one of the HDMI inputs the control is still present in the menu - moving it from 0 all the way to 100 and back makes no visible difference. This is the way any traditional sharpness control should work on an HDTV.
 
