For all the good that’s come from 2019, we’ve had our fair share of disappointments too. From Stadia to PC HDR there have been products and services that have not lived up to the hype, the expectations, and the promises made about them. And this here is our 2019 roll call of tech shame.
But that’s not to say that it hasn’t been a strong year for PC gaming hardware, with AMD probably the company to come out of 2019 in the strongest position both in terms of product and perception. It delivered on the promise of the Zen 2 CPU architecture, shipping both Ryzen 3000 and 3rd Gen Threadripper chips that tore up the rulebook on processor core counts and affordability. Though it does still pop up on this year’s list…
The flip side of that is the company that’s had it worst, and that’s got to be Intel. Quite apart from the staggering resurgence of its closest rival in the processor industry, it’s had its own issues rolling on from last year… issues it promised to have sorted out.
And so that’s going to be where we start this timely little tale of technological woe, with a giant of a company that’s struggling to pull itself out of a tailspin of its own creation.
The promise of stable Intel chip production
‘It’s all going to be fine,’ said Intel CEO, Bob ‘The Legendary Swan’ Swan, last year. ‘We’re investing a billion dollars more into our 14nm manufacturing sites, don’t worry. CPU supply will be fine.’
In September 2018, then as acting CEO, he took the unusual step of publishing an open letter acknowledging the supply constraints Intel was still suffering, and set out the money and effort it was putting into righting those 14nm manufacturing issues. Surely pumping another billion dollars into the coffers of the fabs would help yield results, even if every slice of Intel silicon outside of the few 10nm chips it was shipping was built from the same 14nm node.
But the issues carried on throughout 2019 despite the promised manufacturing solutions set out by The Swan. And in November Intel’s GM of marketing had to fall on her sword and issue another open letter, this time apologising to its customers for the fact that it was experiencing more “production variability” in chip manufacturing. It turned out Intel had to make the announcement first to try and mitigate the fact that HP and Dell were about to blame Intel’s supply problems for their own sales constraints in information given to investors.
The promise of a year free of Intel security flaws
Intel started 2018 on the back foot, with a pair of security flaws colloquially known as Spectre and Meltdown striking fear into the hearts of chip designers and anyone with an Intel processor lurking inside their PCs. Realistically it wasn’t anything to worry anyone not running a hypersensitive datacentre, but it was still a nightmare situation for the CPU manufacturer.
It all stemmed from Intel’s own processor layout, which meant AMD chips weren’t materially affected. Intel had to create software mitigations to deal with existing chips and completed designs of new chips, as well as baking hardware mitigations into future designs.
That was all dealt with, though it did mean there was some impact on performance because of those mitigations. But Intel CPUs were once again secure, and 2019 was going to be plain sailing from a security point of view. Then Zombieload turned up, was apparently also dealt with, and then the mitigations for that had their own security vulnerabilities too. Suffice to say the promise of a secure 2019 for Intel has gone up in smoke.
The promise of Google Stadia’s 4K game streaming
Arguably one of the biggest disappointments of 2019 has got to be the game streaming service Google Stadia. And it all started out so well, with the announcement at GDC in March making me question the very future of dedicated PC hardware. After all, if Stadia could deliver games over the intermawebs at 4K resolutions, with HDR and 5.1 surround sound, and look and feel as good as a locally played experience then what hope was there?
Turns out I needn’t have worried. As we edged closer and closer to the November launch date the necessary details of Stadia began to trickle out, and both the tech and the business model started to look flawed. The paid-for Pro subscription was the only chance to get in on the service this year, which seemed to actually net you very little for the money. You get a handful of free games to play (four at the current count, one of them being the already free-to-play Destiny 2) and discounts off a further four games. Everything else is full price to Pro and the eventual free users alike.
Oh, but you get 4K game streaming, right? Except nobody told the devs they would be expected to figure out how to get native 4K gaming on a networked AMD Vega 56-equivalent GPU. Destiny 2, for example, seems to be rendering at 1080p, while Red Dead Redemption 2 is running at 1440p and at a lowly 30fps. Even if Stadia’s outputting an upscaled 4K signal to your TV.
“Stadia streams at 4K and 60 FPS,” says the official statement from Google, “and that includes all aspects of our graphics pipeline from game to screen: GPU, encoder and Chromecast Ultra all outputting at 4K to 4K TVs, with the appropriate internet connection.
“Developers making Stadia games work hard to deliver the best streaming experience for every game. Like you see on all platforms, this includes a variety of techniques to achieve the best overall quality. We give developers the freedom of how to achieve the best image quality and framerate on Stadia and we are impressed with what they have been able to achieve for day one.”
The promise of the 7nm Radeon VII
At the start of 2019 AMD got all excited and released its first 7nm product, the Vega-powered Radeon VII. At $699 it was the first ultra-enthusiast Radeon graphics card released out of the AMD skunkworks in an age, and while it had fewer actual Vega cores inside it compared with the old RX Vega 64, it was clocked faster and ran quicker.
Still not quick enough to beat the Nvidia RTX 2080, however, but it was as close as AMD had got in a long while. And then six months later AMD released the Radeon RX 5700 XT for $300 less than the Radeon VII and matched it in practically every benchmark, effectively rendering the expensive 7nm Vega card completely obsolete overnight.
Anyone spending that much cash on a graphics card surely has the right to expect that it’s not going to be end-of-life’d in less than a year. But now it will hang like an albatross from the PCIe slot of those AMD-ers’ PCs, or else sit on eBay for a fraction of its initial value.
The promise of the Big Fucking Gaming Displays
You know it and I know it. That’s really the hilarious name Nvidia came up with to market its partnership with HP, Acer, and Asus to create 55- and 65-inch TVs with Shield and G-Sync hardware built in, making them the perfect 4K HDR screens for big-screen PC gaming. There will have been much back-slapping and high-fiving following the meeting that produced the BFGD brand name; it screams PC gaming, it’s like the gun in Doom… it’s adolescent.
The back-slapping will have stopped now after Asus gave up on the initial spec it had shown off to Jacob and me at multiple events over the years, shipping a Shield-less version for about $4K this year. To be fair the HP Omen X Emperium is available, with the full spec on sale for $1,200 off on its own site at the moment.
Seems like these Big Format Gaming Displays were real popular.
The promise of HDR on PC
I mean honestly, this is surely going to sit on every broken promise list from here to the end of time, because I simply have no faith that there will ever come a time when HDR gaming on PC isn’t a confusing mess. Every year the potential for the PC to finally take advantage of the high-performance, high-fidelity, high-luminance displays at its disposal is dangled before us, and yet there still seems to be no consensus on how to get Windows, your graphics card, the display, and finally the game to all work together and give us the full high dynamic range experience.
And yet in console land HDR gaming has become a feature so simple and elegantly applied that it needs no thought on the part of the user and just requires a compatible display. Why is it still so tough to match that on PC, coming up to the end of 2019? And yet it seemingly is.