So... correct me if I am wrong. This is my understanding right now of what is wrong with HDR. HDR material is graded to a target luminance? It's not an object model where you can render the material on the fly into the luminance space of the display device. The metadata in HDR10+ and Dolby Vision lacks the power to truly solve the issue; that would require way too much CPU power. So a target is picked and the material is graded to it. The target many people seem to be using is for a display that physics has yet to enable. This allows the material to be rendered correctly at some point in the future, when displays can cover a supernova-to-0.0001 fL range. So the material we see today is graded, at the time the file is created, for a device no one has, and when displayed on current displays it looks awful because it ends up with almost no dynamic range on the actual imager chip in use. Stretching it out to cover the actual display range is almost not possible and results in all sorts of issues. While it's possible to make it look "OK", sorta, it's not making full use of the bit depth of the imager chip and not using the full dynamic range of the display device UNLESS it's graded to a luminance target that is close to the display. Do I have that right? There is no way to fix this or adjust out an incorrect grade.

Further, Dolby Vision seems to use this same general model, and more metadata won't resolve the issue. The only way this will work correctly is a new standard that renders the video into the dynamic range of the display device on the fly. That seems way too mathematically intense to be implemented any time soon. Sorta Atmos for video.

It IS possible to grade into different luminance spaces and offer these up as a choice, like the S&M disc does. So it's possible for an Apple TV, for example, to know what your choice of luminance target is and then pick the file to play based on that. It's also possible the consumer could pick it. That would be a good temporary solution. Of course the source-material people would need to render a number of files, grade each one, and store and offer all of them, taking up something like 7 times more space. So the business end of this might be the real problem.

HDR is abhorrent.
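To put rough numbers on the bit-depth point above, here is a minimal Python sketch (my own illustration, not anyone's grading pipeline) of the SMPTE ST 2084 (PQ) encode that HDR10 and Dolby Vision are built on. PQ assigns code values against an absolute 10,000-nit ceiling, so a display with a much lower peak only ever exercises part of the 10-bit range unless something remaps the signal. The peak values in the loop are arbitrary illustrative targets, and full-range codes are used for simplicity.

```python
# Minimal sketch (my own illustration, not anyone's grading pipeline): the
# SMPTE ST 2084 (PQ) encode used by HDR10 and Dolby Vision maps absolute nits
# onto a normalized signal against a fixed 10,000-nit ceiling. It shows how
# much of a 10-bit code range a given peak luminance actually reaches.

def pq_encode(nits: float) -> float:
    """Return the normalized PQ signal (0..1) for an absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(nits, 0.0) / 10000.0          # PQ is referenced to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# The peaks below are illustrative display/mastering targets, not any specific device.
for peak in (100, 200, 700, 1000, 4000, 10000):
    code = round(pq_encode(peak) * 1023)   # nearest full-range 10-bit code, for simplicity
    print(f"{peak:>5} nits -> code {code:>4}/1023 ({code / 1023:.0%} of the range)")
```

Under those constants, a display that tops out around 200 nits never sees codes much above the high 500s out of 1023, which is the "not making full use of the bit depth of the imager chip" problem described above.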
I am currently playing with a Sony VPL-GTZ380, but I have had the same experience with Sony OLED, LG, and a JVC projector, to name a few. In fact EVERY display I have encountered looks worse with HDR, under every calibration I have tried.

As far as adding more math (Lumagen) to fix HDR... Hmmm... That's like trying to fix DirecTV: sure, you can make it look better, but the underlying issue still exists. Plus you get to fool with settings for each bit of source material. Also, you then add more processing to the HDMI signal, which mathematically MUST degrade everything that goes through it. That is just a mathematical reality. The question is whether the harm outweighs the good. The math to remap this is lossy, and you are starting from fewer bits, which is why once it is mastered the damage is done. The picture from EVERY other source looks great; 4K SDR is stunning. It's ONLY HDR that looks like ****. I SHOULD NOT HAVE TO FIX IT. And even then, 4K SDR still beats the result of more processing.

My view of HDR being abhorrent is about the standard. The result of HDR when everything is right, the material was acquired in HDR, and the moon is full with Jupiter aligned with Mars, like the Spears & Munsil UHD demo material, is wonderful. What has me annoyed is this Consumer Technology Association standard created just to sell product. It's a terrible "standard", driven to market in order to sell more devices, without proper vetting and with a deceptive certification regime. I am SMPTE/SID/SPIE, so I do understand the standards process, and HDR10/HDR10+ along with DV are horrid, misleading, and serve one function: to get people to buy new gear.

With a bitstream compressed down to 20 Mbps, a 1080p24 4:2:0 stream will look better than 4K60 4:4:4 RGB HDR/DV, because you end up crushing even more data down a fixed pipe. While it is technically 4K in its pixels, you are only getting maybe 1K of effective resolution after compression, and less with moving objects. Bit depth can also get crushed. 4K only looks good if nothing moves. Things in fast motion have no resolution because of compression; 1080p has way more resolution with things in motion because it does not need the compression that 4K would under the same bitrate cap. We don't want to push MORE data down a fixed pipe. I sure don't want material that was not acquired in HDR rendered into 4K HDR and then stuffed down a pipe. All this does is degrade the material. People are adding HDR to material that was not acquired in it and just destroying it because of the later compression.

HDR is a plague. Its spread has gone unchecked. HDR is the Emperor's New Clothes.
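To make the "fixed pipe" argument concrete, here is a rough back-of-the-envelope sketch: the raw data rate of each format in the comparison above and how hard it has to be squeezed to fit 20 Mbps. The formats and the 20 Mbps cap come from the post; the 8-bit depth for the SDR case is my assumption.

```python
# A rough back-of-the-envelope check of the "fixed pipe" argument: how hard
# does the encoder have to squeeze each format to fit roughly 20 Mbps?
# Formats and the 20 Mbps cap come from the post; 8-bit SDR is my assumption.

def raw_mbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Uncompressed video rate in Mbps (luma + chroma samples per pixel)."""
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e6

PIPE_MBPS = 20  # the fixed delivery bitrate assumed above

formats = {
    # 4:2:0 carries 1.5 samples per pixel, 4:4:4 RGB carries 3
    "1080p24 4:2:0 8-bit (SDR)":   raw_mbps(1920, 1080, 24, 8, 1.5),
    "4K60 4:4:4 10-bit (HDR/DV)":  raw_mbps(3840, 2160, 60, 10, 3.0),
}

for name, raw in formats.items():
    print(f"{name}: raw ≈ {raw:,.0f} Mbps -> needs ≈ {raw / PIPE_MBPS:,.0f}:1 compression")
```

Codec efficiency differences aside, the 4K60 4:4:4 HDR stream needs on the order of 25 times more compression than the 1080p24 SDR stream to fit the same pipe, which is the crush being described.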
I suspect the JVC you had was one of the earlier e-shift models? Because HDR on the current JVCs looks quite good. I also assume you have not seen a current version of Lumagen DTM. What size and gain screen are you using? That could be a factor in why you did not think HDR was very good on a JVC. A GTZ380 does not do DTM, but it is not bad with HDR. The GTZ380 is not any better with HDR than any of the other Sony projectors; it is just a whole lot brighter.
Actually, everything going through the Lumagen looks better on my 4K projector. That includes AT&T U-verse 1080p sources (the worst source I have), Blu-rays, and 4K Blu-rays. No fiddling required; I haven't changed any settings in a year. Watched both Deadpool 1 & 2 on 4K UHD Blu-ray tonight, and I generally watch 3-4 movies a week. HDR looks great.
HDR isn't broken. Displays aren't capable (yet) of properly displaying the bits for it to be a benefit. That is not without a fix: the Lumagen tone maps to the available nits, and the results are outstanding.
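For anyone wondering what "tone maps to the available nits" means in practice, here is a minimal sketch of a static tone map. To be clear, this is NOT Lumagen's actual algorithm (that is proprietary, and their DTM is dynamic); it just shows the general idea of leaving the lower range alone and rolling the mastered highlights off into the display's real peak. The 4000-nit source peak, 200-nit display peak, and knee point are arbitrary illustrative numbers.

```python
# A minimal sketch of what "tone maps to the available nits" means, NOT
# Lumagen's actual (proprietary) algorithm: leave the lower range alone and
# roll the mastered highlights off smoothly into the display's real peak.
# The 4000-nit source peak, 200-nit display peak, and knee are illustrative.

def tone_map(nits: float, source_peak: float = 4000.0,
             display_peak: float = 200.0, knee: float = 0.75) -> float:
    """Map a mastered luminance (nits) onto a display that tops out at display_peak."""
    nits = min(max(nits, 0.0), source_peak)
    start = knee * display_peak            # below this, values pass through unchanged
    if nits <= start:
        return nits
    # Compress [start, source_peak] into [start, display_peak] with a soft
    # rolloff so highlights keep some gradation instead of hard clipping.
    x = (nits - start) / (source_peak - start)      # 0..1 over the compressed range
    return start + (display_peak - start) * (2 * x / (x + 1))

for graded in (50, 150, 500, 1000, 4000):           # example mastered values
    print(f"{graded:>4} nits graded -> {tone_map(graded):6.1f} nits on the display")
```

The point of dynamic tone mapping is that a curve like this is recomputed per scene or per frame from the content's actual brightness, instead of being fixed like this static example.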
So why do I need a Lumagen to remap? The standard is broken. The remap has a mathematical cost you MUST pay.
HDR from the S&M demo material is jaw dropping on the 380. The HDR from pretty much everything else looks like ****.

I have respect for a Lumagen, but it's more math. It's mathematically impossible not to degrade the material to some measurable degree. I have played with the Lumagen devices over the years. My A/B is with it physically in line, and bypassed by skipping it and plugging directly into the source. This A/B has always shown that a direct connection is the better picture on material that is well mastered. Of course my Oppo has a big list of mods and I use Wireworld Platinum Eclipse cables, all short lengths. It has been 2 years since I dropped in a Lumagen. I will check with Jim and do a bit of A/B once again. Plus Jim is an expert on what is wrong with HDR, so I will pick that conversation back up where I left off with him.

MY POINT is that I should not have to FIX HDR. It's a terrible standard and most of it is broken. Setting an Apple TV to HDR is a big mistake, as it then renders all sorts of stuff into HDR, making everything look bad.
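On the "any remap has a measurable cost" point, here is a toy illustration under my own assumptions (10-bit codes and an arbitrary gamma-style curve); it is not a measurement of a Lumagen or any other real processor. Re-quantizing after a nonlinear remap merges some neighbouring codes, and that lost gradation cannot be recovered downstream.

```python
# A toy illustration of the "any remap has a cost" point, under my own
# assumptions (10-bit codes, an arbitrary gamma-style curve); it is not a
# measurement of a Lumagen or any other real processor.

BITS = 10
LEVELS = 2 ** BITS

def remap(code: int) -> int:
    """Apply a nonlinear curve and re-quantize back to 10 bits."""
    x = code / (LEVELS - 1)
    y = x ** 0.8                     # stand-in for any tone/brightness remap
    return round(y * (LEVELS - 1))

# Count how many distinct output codes survive the remap: wherever the curve's
# slope drops below 1, neighbouring input codes land on the same output code.
unique_outputs = len({remap(c) for c in range(LEVELS)})
print(f"{LEVELS} input codes -> {unique_outputs} distinct output codes")
```

With this particular curve some of the 1024 input codes collapse onto the same outputs; any remap whose slope drops below 1 somewhere will merge codes in that region, though whether that loss is visible against the gains of tone mapping is the real-world question.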
Are you doing a direct A/B, bypassing the Lumagen by unhooking the HDMI and plugging directly into the source, and comparing 4K SDR vs HDR? I have an Oppo with a linear power supply, disc stabilizer, and femto clock mods, plus other lesser mods. I use Wireworld Platinum Eclipse cables, all short, 3 m direct to the projector. No processor in the path. Audio is pulled from the second HDMI out. The screen is a Screen Research ClearPix Ultimate. The room has been reviewed: https://www.theaudiobeat.com/visits/paradise_valley_system.htm

There is a recent vid showing the current equipment. The review shows the room from 6 years ago and a lot of gear has changed, but it has a better view of the room.