Does this mean it is a good time to buy used 1080 Ti graphics cards on eBay? https://www.ebay.com/itm/MSI-GeForce-GTX-1080-Ti-DirectX-12-GTX-1080-Ti-SEA-HAWK-EK-X-11GB-352-Bit-GDDR5X/173705847028?epid=3012716392&hash=item2871ace0f4:g:Z74AAOSwlINcIGSl
So they are running around $550.00. Is that the move to make?
Back to GeForce RTX 2060
Total mess.
The cheaper card will be called the GeForce GTX 1660 Ti.
The RTX 2060's price is rumored to be $400.
In short: go and buy an ex-mining RX 580 8GB.
The RTX 2060 will be announced during the second week of January.
RTX 2060 news
Performance is expected to be around 1070 Ti or slightly below.
DaVinci by design (node based, constantly adding features with slow code) is the worst possible candidate for such tests.
True. However, node-based compositors can (and some do) use GPU acceleration. Fusion, however, does not; it uses mainly CPU code (or OpenCL, which mostly falls back to the CPU).
Most image transforms should be calculated with shaders. Shaders can handle high-bit-depth color and are super simple to implement.
Up until now, the main issue with shaders was that raw shader code was not compiled, so any user could easily read the source. Vulkan accepts precompiled shaders (SPIR-V), so I suspect that more proprietary software will embrace this technology now.
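To illustrate the point about per-pixel transforms: here is a minimal sketch of a color-transform kernel. I'm using CUDA rather than an actual GLSL/Vulkan shader just for brevity, and the kernel and parameter names are made up for the example; the same math maps one-to-one onto a fragment or compute shader, and staying in float preserves high-bit-depth color.

```
// Minimal sketch of a per-pixel color transform on the GPU (hypothetical example).
// Each thread handles one pixel of a float RGBA frame.
__global__ void gain_saturation(float4 *frame, int width, int height,
                                float gain, float saturation)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    float4 p = frame[y * width + x];

    // Apply gain, then blend each channel toward luma to control saturation.
    float r = p.x * gain, g = p.y * gain, b = p.z * gain;
    float luma = 0.2126f * r + 0.7152f * g + 0.0722f * b;
    p.x = luma + (r - luma) * saturation;
    p.y = luma + (g - luma) * saturation;
    p.z = luma + (b - luma) * saturation;

    frame[y * width + x] = p;
}
```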
Well, these guys focus on 4K with artificial setups, like adding 3 slow plugins and 4 tracking areas.
And they have a direct interest in a certain test outcome (with a different outcome they'd go bust), hence all the parameter picking.
DaVinci by design (node based, constantly adding features with slow code) is the worst possible candidate for such tests.
In reality, if you write an NLE properly, a 1050 Ti works perfectly with 4K at 60 fps, with one tracking area plus color grading, sharpening and other stuff.
The RTX 2080 Ti gives performance like a Titan V. I'm going to buy it.
On DLSS thingy
Nvidia promised another 10% price hike, blaming it on the new US tariffs.
Nice news.
A 35% boost, not bad. And you also get amazing ray-tracing performance and DLSS AI-powered anti-aliasing. In the future, when games/applications support these, the performance boost will be much bigger.
Well, bad
The most interesting thing happened to the H.264 decoder: current drivers limit the decoding block frequency.
Ray tracing is amazing.
Actually, that is the whole point of Turing: they can do both at once. So for mixed workloads they are much better.
The only real advantage may be that some video filters can be implemented with an FP16 path.
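As a rough sketch of what such an FP16 filter path could look like (the kernel is hypothetical, not from any real NLE): packing pixels as half2 lets Turing-class GPUs process two half-precision values per instruction, which is where the potential speed-up comes from.

```
#include <cuda_fp16.h>

// Hypothetical sketch of a brightness filter running in half precision.
// Pixels are packed two per __half2, so each instruction operates on a pair
// of FP16 values at once.
__global__ void brightness_fp16(__half2 *pixels, int n_pairs, float offset)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n_pairs) return;

    __half2 off = __float2half2_rn(offset);  // broadcast the offset to both lanes
    pixels[i] = __hadd2(pixels[i], off);     // two FP16 additions in one instruction
}
```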
Funny numbers: Those INT32 numbers are total BS! (*)
Looks like they just feed 32-bit integers into the 32-bit floating-point unit, which is no problem (FP is more complicated), just to get a new entry into this "feature list" and "show" that the new cards are sooo much better. I bet the old ones can do that as well.
(* If the new cards can do INT32 and FP32 in parallel, each running at 13-16 TFLOPS at the same time, then I'm sorry and my post is BS... but I highly doubt it.)
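For reference, the kind of code where a separate INT32 datapath would even matter is a kernel that mixes integer index/address math with float math in the same loop. A toy illustration (all names are mine, and it doesn't prove anything about the hardware either way):

```
// Toy kernel mixing integer and floating-point work, purely illustrative.
// The integer hash/indexing math and the float multiply-add are independent,
// which is the kind of mix where a separate INT32 pipe could help.
// Assumes n is a power of two so the mask keeps idx in range.
__global__ void mixed_int_fp(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    unsigned idx = ((unsigned)i * 2654435761u) & (unsigned)(n - 1);  // INT32 work
    out[i] = in[idx] * 1.5f + 0.25f;                                 // FP32 work
}
```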
Architecture in depth
For video applications the advantages seem to be minimal, except for better encoders and HDR.
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/