I presume by the number of nits you mean the difference between the darkest (0 nits) and brightest (4K nits, or more correctly 4,000 cd/m^2), which is about the same order of magnitude as the brightness of a full moon on a clear night. FWIW, 12k nits is like looking at a fluorescent bulb close up. Not recommended.
HOWEVER, I don't care how many nits it's been mastered at: if it's a crap film, it's a crap film.
Plenty of really excellent films that have stood the test of time were made in black and white, remember.
People still seek out old Bogart and Bacall movies made in B&W, but I wonder whether, in fifty or sixty years' time, our grandkids will be seeking out any of the gazillion extremely dire CGI "masterpieces" plaguing us now. I very much doubt it. Superhero vs super-baddie CGI movies are the equivalent of the comics they're based on: read once, keep if you're that sort of anorak, discard with the trash if not.
But then the role of CGI in movies (I personally think there's too much of it and I am beginning to hate it as a result, FWIW) isn't really what this thread is about, is it?