I have an RTX 2070 and can use pretty much max settings within madVR and still have render times around 35ms with UHD BD content. I don't think a 2080 is required.
12-bit output, I have black bar detection on, no 3D LUT, no copyback as I use software decoding, but I'm using NGU for both chroma and general upscaling. SSIM is used for downscaling.
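For context on why ~35ms feels comfortable with UHD BD (24p) content: the render-time budget is simply the frame interval, 1000/fps milliseconds. A minimal sketch in plain Python (nothing madVR-specific; the threshold is approximate, since madVR's queues absorb occasional spikes):

```python
# Per-frame render budget: frames start dropping once the average
# render time exceeds the frame interval (1000 / fps milliseconds).
FRAME_RATES = [23.976, 24.0, 25.0, 29.97, 50.0, 59.94]

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render each frame."""
    return 1000.0 / fps

for fps in FRAME_RATES:
    print(f"{fps:>6.3f} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

# 23.976 fps leaves ~41.71 ms per frame, so ~35 ms render times keep
# some headroom; 59.94 fps leaves only ~16.68 ms, which is why 60p
# content needs much lighter settings.
```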
I am considering building an HTPC with these parts:

- Grandia Series SST-GD09B, No PSU, ATX, Black, HTPC Case
- TUF B360M-E GAMING, Intel B360 Chipset, LGA 1151, HDMI, microATX Motherboard
- Core™ i5-8400 6-Core 2.8 - 4.0GHz Turbo, LGA 1151, 65W TDP, Retail Processor
- GeForce RTX™ 2070 XC BLACK EDITION GAMING, 1410 - 1620MHz, 8GB GDDR6, Graphics Card
- Overclocking, Single GPU, Optimal and Stable Performance
- 850 G3, 80 PLUS Gold 850W, ECO Mode, Fully Modular, ATX Power Supply
- 16GB HyperX Fury DDR4 2400MHz, CL15, Black, DIMM Memory
- 250GB MX500 2280, 560 / 510 MB/s, 3D NAND, SATA 6Gb/s, M.2 SSD
- 2TB BarraCuda ST2000DM006, 7200 RPM, SATA 6Gb/s NCQ, 64MB cache, 3.5-Inch HDD
- BDR-211UBK, BD 16x / DVD 16x / CD 40x, Ultra HD Blu-ray Burner, 5.25-Inch, Optical Drive
- Windows 10 Home 64-bit DVD OEM
- AntiVirus 2018 - 1 PC / 1 Year
- MK320, Wireless 2.4GHz USB, Black, Keyboard & Mouse

Any suggestions? I plan to use the HTPC for madVR (ripping UHD BDs, dynamic tone mapping, and a 3D LUT).
A 2070 is okay, but you will have to make some compromises, especially with HDR tone mapping. I would recommend pushing to a 2080 if you can, especially if you have any interest in 60p content.
You are using copyback: that is software decoding, and you wouldn't get black bars detection otherwise. Depending on the level of NGU upscaling (well, chroma only for UHD content), and if you are not using a 3D LUT for calibration, then sure, you can get by with a 2070. I don't think you'll be able to use NGU High and error diffusion 1 or 2 though; you'll probably have to reduce the NGU quality to Medium, which I agree isn't a big compromise.

By the way, do you know that there is a bug in the new JVCs where RGB Full 12-bit is converted internally to YCC 4:2:2, even for 30p and below? That messes up levels and possibly hurts chroma upscaling, as some dithering and chroma downscaling happens behind madVR's back. I reported it to them, so hopefully they will fix it in a future firmware update. In the meantime, I recommend using 8-bit, especially if you use the 3D LUT feature; otherwise you have to force BT.2020 to get the most linear gamut when calibrating. That should give you a bit of headroom, and it won't cause any banding with the right settings.

You need to use Bicubic150 downscaling with the latest test builds for HDR tone mapping if you're using the live algo, as that's what's needed for the best dynamic targets. These latest builds need a lot more power than those from even just a few weeks ago, so you're either behind or not using the right downscaling algos. If you're not using the live algo because you run measurement files, then you are also saving some precious ms of rendering, but not everyone is happy to go through the hassle of measuring files before playing a film...

madVR is moving very fast at the moment, and the latest test builds give a good idea of the power requirements for the next public build. This is why I am advising the OP, if he can stretch his budget, to get a 2080. I'm not suggesting you're slumming it with a 2070, just that you have to make some compromises (like not being able to use madVR's 3D LUT capability without reducing quality elsewhere, which for some is a big compromise) and have little headroom as madVR keeps growing.

I have a 1080 Ti (the equivalent of a 2080) and in about a year I went from having all settings where I wanted them to having to make some (small) compromises, especially in 12-bit. And I'm not even talking about 60p content, which is out of the question at the same quality settings. I need a special profile to reduce quality significantly when playing 60p, or it's a slideshow.
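To illustrate the 60p profile point: madVR's profile groups can switch settings automatically based on source properties. A minimal sketch of the kind of rule I mean, in madVR's profile rule syntax (the profile names here are placeholders for profiles you've created yourself; deintFps is the frame rate after deinterlacing, so double-check the variables available in your build):

```
if (deintFps > 30) "60p - reduced quality"
else "movies - full quality"
```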
Then I wish Onkyoman would have updated his Build thread. I just purchased a 2060.

- Budget: GTX 1050 3GB
- Minimum: GTX 1050 Ti
- Recommended: GTX 1660 6GB / RTX 2060
- Performance: RTX 2080
- HDR Tone Mapping: GTX 1660 Ti / RTX 2060
If you are using software decoding, why are we even talking about the GPU you use, and why are you considering overclocking it, given that decoding happens on the CPU? This is a highly unusual configuration, which is why I assumed you were using copyback, my bad. I'm sure you have a valid reason not to use GPU acceleration. Did you look at GPU-Z or the GPU performance tab in the Task Manager? What is the GPU and video engine load?

Your rendering times have gone from 35ms to 41ms with these settings, which is not usable. Your present and rendering queues are empty; you are clearly dropping frames with these settings. 35ms would be a safer threshold with widescreen content, which is what you are showing. You would be dropping frames like mad with 16:9 / 1.78:1 content, as that requires more power, and you're already more than borderline with widescreen content, without a 3D LUT.

If you do have a 2070, I highly recommend you try using D3D11 copyback (if you want the software functions to work, such as black bars detection and UHD Blu-ray menus) or D3D11 native for even better performance if needed. Both use hardware acceleration, but some software functions are disabled with native, which is why copyback can be needed. At least that way you will be using the GPU's hardware acceleration and should have more headroom. But you might have your own reasons for not using your GPU's hardware acceleration, of course.

I did say that NGU chroma Medium was fine and not a big compromise, so I'm not saying anything different from Madshi. You will have to use it though, because NGU High for chroma is more than you can chew with your current settings in UHD. The difference it makes depends on content, screen size and eyes. I don't consider it a significant factor, especially with UHD content, as the chroma layer is already full HD, but I prefer to use High if I can. I would never use Very High for chroma, as it's a waste of power that's better used elsewhere.

Anyway, you say a 2070 is enough, but you are not using its hardware acceleration, and your screenshots show unusable render times and empty queues even with 2.40 content, which requires the least power. You're also conceding, without a 3D LUT enabled, that you would need to overclock it to get more headroom, so I'm confused. Are you still not agreeing that it's not powerful enough to use all the features at the best (not crazy) quality settings, which was my point to start with?

I say a 2070 is OK, but if the OP has the budget and wants to use all the features (he did mention his intention to use a 3D LUT, which isn't enabled in your settings, already too demanding for that GPU) at the best quality madVR has to offer, he might want to consider stretching to a 2080. In the reviews I've read, the 2080 was fairly close to my 1080 Ti and wouldn't be an upgrade for me, at least until tensor cores are put to use in madVR. The only upgrade I would consider would be a 2080 Ti, but as it doesn't have HDMI 2.1 and I can still get by with my 1080 Ti, I passed on this gen.

Maybe we could agree to leave it there without spending days debating it? I think we are taking the thread on a tangent, discussing the specifics of your settings rather than providing useful advice to the OP.
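If you'd rather have numbers than the Task Manager graphs, here's a quick sketch using nvidia-smi (this assumes an NVIDIA card with a reasonably recent driver, as nvidia-smi ships with it). A 'dec' column stuck at 0% during playback would confirm the video engine is idle, i.e. pure software decoding:

```python
import subprocess

# Overall GPU and memory-controller load (assumes an NVIDIA card;
# nvidia-smi is installed with the driver on both Windows and Linux).
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=utilization.gpu,utilization.memory",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print("GPU / memory load:", out.stdout.strip())

# The video engine (decoder) load isn't exposed by --query-gpu;
# 'nvidia-smi dmon -s u' prints per-second utilization, including
# a 'dec' column. Sample it five times during playback: 'dec' at 0%
# means the hardware decoder is doing nothing.
subprocess.run(["nvidia-smi", "dmon", "-s", "u", "-c", "5"], check=True)
```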
Of course black bars detection will work: you are using software decoding and are not using the GPU hardware acceleration that you would get with D3D11 copyback or native. If you were, black bars detection would only work with copyback, not with native.

I'm not saying you're inexperienced; you are just making claims about a GPU that go against my own experience (I haven't exactly just discovered madVR either) and directly contradict a statement I've made, without providing anything that backs up your claims, that's all. On the contrary. And now you're suggesting that I don't know what I'm talking about...

Anyway, you're happy with your settings, and I'm happy that you're happy with your settings, so for the sake of the thread, I'll yield.