Nvidia’s new GTX 1650 is based on a fresh TU117 GPU, which promises all the benefits of the Turing architecture. That means concurrent integer and floating point operations, a unified cache architecture, and adaptive shading, all in aid of better gaming performance. Good, eh?
Along with the other new Turing GPUs came a new version of the NVENC video encoder, a bit of dedicated logic that offers 15% greater encoding efficiency and adds features to avoid artifacting when recording or streaming.
But, while the GTX 1650 does get all the rest of the Turing shader goodness, it turns out its video encoder is actually based on the last-gen Volta version of NVENC, without the Turing improvements. So it’s missing some of the Turing goodness, costs more than AMD’s RX 570, and can’t perform at the same level as that top budget card. I don’t want to keep kicking a new graphics card when it’s down, but it’s getting harder to see the point of the GTX 1650 existing at all.
An eagle-eyed Twitter user spotted the discrepancy on Nvidia’s own product page and questioned the company about the different versions of NVENC used on the TU116 and TU117 GPUs.
Nvidia’s response is that the Volta version of NVENC actually performs more like the Pascal version that accompanied the GTX 10-series GPUs than the silicon used by the rest of the Turing generation of graphics cards. In other words, the lowest-end 16-series GPU gets a weaker encoding block than all the others.
Volta's NVENC performs like Pascal's NVENC. So quality is similar to that of GTX 10 Series. Turing's NVENC is 15% more efficient and has the new features to avoid artifacting.
— NVIDIA Streaming (@NVIDIAStreaming) April 24, 2019
I get that the GTX 1650 is the cheapest of Nvidia’s current generation of pixel pushers, but slicing out the encoder seems like a change that can’t have really saved that much cash during manufacturing. Nvidia does make the point that the Volta encoder is still pretty decent, still offloading the encoding burden from the CPU and allowing you to play and stream both at 4K… despite the GTX 1650 demonstrably not being capable of gaming at 4K.
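If you want to see what that offloading looks like in practice, the NVENC block is exposed through ffmpeg as the h264_nvenc encoder, assuming your ffmpeg build was compiled with NVENC support (the file names and bitrate here are purely illustrative):

    ffmpeg -i gameplay.mkv -c:v h264_nvenc -preset hq -b:v 8M -c:a copy stream.mp4

The GPU’s dedicated encoder chews through the video while the shader cores stay free for the game itself, which is the whole appeal of NVENC for streamers, whatever generation of the silicon you get.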
It does, however, mean that if you were looking for a cheap streaming GPU for your PC, Nvidia’s $150 GTX 1650 isn’t going to be as effective as it might have been with the standard Turing NVENC. It also means GTX 1650 laptops aren’t going to be the sort of thing low-budget media creators will be happy to spend their money on.
It’s not a huge issue: the missing Turing NVENC silicon isn’t exactly a deal-breaker for a graphics card that is both more expensive and slower than the competition’s two-year-old budget hero. No-one was realistically going to buy a GTX 1650 for the Turing version of Nvidia’s encoding logic anyway, but it adds yet more weight to the feeling that this is a rather half-hearted GPU release.