ellisdj said:
I assume you must have a mega bright room with light shining directly on the screen for a 2.0 calibration for daytime. 2.2 for 3D, and I thought you would have 2.4 for movies, so I am quite surprised. I wouldn't panic though; unless you see them side by side, or have the option to select between the two, you'll never know the difference.
I have vertical blinds up at the window nearest the TV. I won't allow direct sunlight to shine onto my speakers in case of sun bleaching, and as the room is south facing we get a lot of sunlight; without the blinds it would be horrible and the TV unwatchable, I feel.
In saying that, I have seriously never once found myself looking to change from 2.4 gamma since I calibrated to it.
That was not my initial intention. I was initially going to use it for dark-room movie viewing only, but after seeing it, it's a deep, lush image and I haven't found it to crush any detail at all. Maybe it's because of the size of the screen / where I sit, I don't know.
I tried 2.2 this morning. It appears to have a bit more pop/brightness, as you would expect, but I still much prefer the deeper, richer 2.4; 2.2 looks washed out by comparison.
That was one of the selling points for me to buy the Panny over my LX5090; on that I could only really cal the movie mode to 2.2 gamma. Someone told me to cal the normal mode to 2.4 and that I would love it, but I never did. That was silly of me, well, lazy.
Getting a good 2.4 calibration took me ages.
The first 2.4 gamma cal I did was too dark. Then I realised that you have to cal all the way through to 90% white balance and gamma first; only once you're there does changing the 100% white balance and gamma align the rest of the calibration, and then you have to go back and tweak all of them again. It's very hard to explain, and it took me a while to figure out. I am sure it's the same with all 10-point systems.
Maybe you're supposed to do 100% first, I'm not sure, but I always start with 10%.
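For anyone wondering what the 10-point system is actually chasing at each step, here's a rough sketch (my own illustration only, assuming a plain power-law gamma and an arbitrary 100 cd/m² peak white; real calibration software and meters are more involved than this):

```python
# Hypothetical sketch of 10-point gamma targets, assuming a simple
# power law and an assumed peak white of 100 cd/m^2.

PEAK_NITS = 100.0  # assumed peak white luminance (illustrative)

def target_luminance(stimulus_pct: float, gamma: float) -> float:
    """Target luminance (cd/m^2) for a given stimulus percentage."""
    return PEAK_NITS * (stimulus_pct / 100.0) ** gamma

# The ten measurement points a typical 10-point system exposes.
for pct in range(10, 101, 10):
    t22 = target_luminance(pct, 2.2)
    t24 = target_luminance(pct, 2.4)
    print(f"{pct:3d}%  gamma 2.2: {t22:6.2f}  gamma 2.4: {t24:6.2f}")
```

Below 100% the 2.4 targets are darker at every point, which fits the experience above: a half-finished 2.4 cal ends up looking too dark until the upper points are brought into line.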
Apologies if I offended anyone with my earlier comments. I was really shocked, that is all.
This guy is the man when it comes to TV reviews, I find. He targeted 2.4, so that's what I did; I think he explains why:
http://www.hdtvtest.co.uk/news/panasonic-txp60zt65b-201305062961.htm
Back to the point on IFC and the OP: you can honestly mess around with the settings until you are blue in the face, you will never get it to sit right.
I think you're having the calibrator in, so just wait until you do, mate, and then make a judgement from there on the other settings such as IFC / 24fps smooth etc.
Hi Ellis, no offence taken here so no worries. It's actually a really interesting subject we've stumbled onto here. I have been doing further reading since my earlier posts, and this is a subject around which there is very little consensus; indeed, "2.2 vs 2.4" is a debate that seems to have been raging with occasional ferocity on some forums for many years now!
Key points to highlight based on what I've read so far are:
A) there is no current official industry standard for what gamma level should be targeted by studios when mastering content, or by manufacturers when building hardware.
B) 2.2 is the 'unofficial' benchmark, but this is purely a result of the legacy of how gamma came about, i.e. the inherent flaw in CRT technology that produced a non-linear brightness curve (which TV studios had to compensate for by artificially introducing an equal but opposite gamma curve into their content).
C) as a result, and due to the relatively large number of CRT sets still in circulation, 2.2 is still the level that studios assume people will be watching on their home TVs.
D) US TV studios apparently use 2.22 on their mastering monitors, whereas in the UK 2.35 is used (another unhelpful example of inconsistent standards, as per 50Hz vs 60Hz).
E) most film studios seem reluctant to confirm what gamma level they target when mastering Blu-ray content (which will be different to cinema, which again has no universal standard but is generally higher, circa 2.8).
F) the EBU are advocating the adoption of 2.35 as a universal standard, to address the consistency issues.
G) the Japanese are apparently advocating 2.4, and for the first time this year some Sony TVs (e.g. the W905) target gamma at 2.4 in their out-of-the-box settings.
H) opponents point out that, due to the legacy of CRT-driven 2.2, adopting a higher level as standard will mean new content is not mastered at the optimum level for viewing on legacy hardware, and vice versa, during what would be a roughly five-year transition period as people go through the upgrade cycle.
I) 2.2 seems generally accepted as the ideal 'compromise' setting for mixed viewing conditions, whereas 2.35-2.4 is the purist setting for "black-out" viewing conditions. Some therefore argue for targeting 2.3, which our sets don't offer as an option... :wall:
J) ironically, the whole debate is arguably pointless, as gamma is an anachronism left over from CRT days; it's an artificially introduced factor designed to compensate for an inherent flaw in a panel technology most people no longer use. Modern displays have a linear brightness response, so in an all-digital media chain there is technically no need for 'gamma': the studios introduce it to compensate for a flaw in CRT sets that most of us aren't using, and then our plasma/LED sets try to recreate that flaw to compensate for what the studios have introduced into the image. Madness when you stop to think about it!?!
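To make the round trip in the list above concrete, here's a toy sketch (my own illustration, not anything a studio actually runs): the mastering side applies an inverse-gamma curve to linear light, and the display applies gamma to undo it; a mismatch between the two is exactly the 2.2 vs 2.4 difference being debated.

```python
# Toy illustration of the gamma round trip: encode with 1/gamma at
# the studio, decode with gamma at the display.

GAMMA = 2.4  # assumed display gamma target (illustrative)

def studio_encode(linear: float, gamma: float = GAMMA) -> float:
    """Mastering side: apply the inverse gamma curve."""
    return linear ** (1.0 / gamma)

def display_decode(signal: float, gamma: float = GAMMA) -> float:
    """Display side: apply gamma, undoing the encode."""
    return signal ** gamma

scene_light = 0.18                    # mid-grey in linear light
signal = studio_encode(scene_light)   # value stored on the disc
shown = display_decode(signal)        # light the panel emits
# when encode and decode gammas match, shown ~= 0.18 again

# Mismatch case: content encoded for 2.2 but displayed at 2.4
darker = display_decode(studio_encode(0.18, 2.2), 2.4)
# darker < 0.18, which is one reason 2.4 looks "deeper" and
# "richer" on content mastered assuming 2.2
```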
I agree re 2.0 on the daytime setting: it lacks punch, so for the limited viewing time I've had since having it done, I've had it set to "nighttime" the whole time.
Sorry for banging on; I find the whole subject fascinating but appreciate I'm probably boring the pants off everyone else by now! :silenced: