Nvidia's plans to become a "full-service" video performance company | PCGamesN


The die on an Nvidia Maxwell chipset.

Nvidia aren’t just selling video cards anymore. They’re trying to sell a package of services and performance-related products that attempts to make “The Way It’s Meant to Be Played” into something more than a slogan on a splash screen.

Nvidia has been heading in this direction for a few years, but at Red Bull Battle Grounds Washington, D.C., they were bringing all the threads of their strategy together to show an eSports-focused audience that Nvidia has more to offer than higher benchmarks.

The problem facing Nvidia is that we are, and have been, in an era of marginal gains. New cards might be much more efficient and powerful than their predecessors, but those improvements are becoming harder and harder to discern for most players. And even developers and publishers seem to be slinking away from better and faster graphics, quietly lowering the bar for performance or somehow sending PC gaming back in time.

Put another way, when I went from my trusty old Radeon 4850 to a GeForce 560 in 2011, it was a night-and-day difference. Games went from looking good to looking absolutely tremendous.

But when I upgraded again this spring to a 770 and a 2560 x 1440 monitor, I was underwhelmed. My games went from looking great to looking slightly sharper and... still tremendous. A little smoother, with slightly more bells and whistles. But it was no longer that take-your-breath-away moment that video cards used to promise. Ironically, the biggest difference it made for me was in cutting my electric bill by $20 a month. Important, but not exactly the sexiest thing you can say about a video card.

Don’t get me wrong: high draw distances, improved anti-aliasing, and better fog and water effects are all appreciated. But even in an era of new consoles, the graphics arms race is largely about going from “this looks amazing” to “no, really guys, check out that tessellation. Here, let me blow up this screenshot for you. What do you mean you don’t see it?”

I explained this to Andrew Coonrad, Nvidia’s Technical Marketing Manager of Gaming Technologies. He was demoing Nvidia’s G-Sync monitors for the crowd at Battle Grounds, and evangelizing for Nvidia’s vision. But he knows that most PC gamers are simply not as invested in performance and graphics as they used to be.

Downscaled version of a DSR picture from Assassin's Creed.

“We’re making [games] look better, and we’re pushing the boundaries of the technology all at the same time. But what are we getting?” he said. “A lot of gamers don’t play those kinds of games anyway. They play League of Legends.”

He laughed. “That game was originally designed on the Adobe AIR engine. Which is like the most non-exciting engine ever. It was great because it runs on a potato, right? That’s awesome. But it doesn’t make you want to go to school to learn how to make games beautiful.”

But that problem is also what gives rise to Nvidia’s overall vision. As Coonrad put it, “So we’re trying to inspire people to want those graphics. We’re trying to inspire developers to want to add those [graphics] in. We’re trying to make it easy to add it in. And then we’re trying to give you the graphics horsepower to achieve that.”

Cutting the cards

“It is truly getting to the point where we’re really adding a lot of value to just buying a graphics card,” he said. “It’s not just a card and some drivers. It’s a whole host of other technologies and other things that you get in our ecosystem.”

Nvidia’s ecosystem, ideally, has players running an Nvidia graphics card on a G-Sync monitor, calibrating their games via GeForce Experience, and playing games built with development tools that Nvidia promote to developers. But each part of that requires buy-in from someone else. So Nvidia have to make a case.

As always, the cornerstone of their pitch is new hardware.

“We just released our new graphics cards, the GTX 980 and GTX 970. And that’s based on our Maxwell architecture... We’re really proud of Maxwell because it increased the performance a lot... but it’s very cool and quiet. The 980 is two times the performance of our last generation.”

The Maxwell cards, Coonrad says, are vastly more energy- and heat-efficient than the last generation of Nvidia cards. But perhaps in recognition of the changing marketplace, they’re also relatively inexpensive compared to a lot of earlier graphics-card rollouts. The days of the $1,000 graphics card are over, at least for now.

“The GTX 980 is $550 at launch. And that price may come down later, with bundles and such. And the 970 is launching at $329. ...We got the least trolled about the 970 ever in the history of our video cards,” Coonrad said. “That’s when you know you have an excellent product. When even the worst, stinkiest trolls are actually like, ‘Oh, wow, that’s a really good deal.’”

The GeForce 980 with some improbable green lighting effects.

But Coonrad and Nvidia know that they have to add more and more value beyond just efficiency and improved performance if they want to convince gamers that it’s time to upgrade, much less switch from Radeon to GeForce. So they’re offering more options not just to improve the latest games (which used to drive hardware sales) but also to turn older games into higher-fidelity experiences.

“We added Dynamic Super Resolution, where basically you’re able to render your game at a higher resolution than your monitor, and then downscale it to the resolution of your monitor,” he explained. “It’s similar to supersampling, where basically you’re using a higher resolution render to get that AA-like quality. And sometimes with older games you have a ton of perf on the table, and you can use that performance to increase the quality of your game. Whereas previously, you could only turn up the settings so much, and that’s all you get.”
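What Coonrad describes is, at its simplest, rendering more pixels than the display has and averaging them down. Here is a minimal sketch of that downscaling step in Python, assuming a grayscale image stored as a nested list of brightness values; this is only an illustration of the principle, not a reflection of how Nvidia's actual DSR filter works on the GPU:

```python
def downsample_2x(hi_res):
    """Average each 2x2 block of a high-resolution grayscale image
    down to one native-resolution pixel (a simple box filter --
    the same idea behind supersampled anti-aliasing)."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at double resolution comes out
# softened at native resolution: the jaggies average into greys.
hi = [[0, 0, 255, 255],
      [0, 255, 255, 255],
      [0, 0, 255, 255],
      [0, 0, 0, 255]]
print(downsample_2x(hi))
```

The averaging is why a 4K render looks smoother, not just smaller, on a 1080p screen: edge pixels pick up intermediate shades instead of hard stair-steps.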

Experience Points

This is not a new idea, and the mod community has been doing stuff like this for ages. But what Nvidia has tried to do is improve the accessibility of those features by putting them inside GeForce Experience.

If you’re anything like me, GeForce Experience is something you accidentally installed when you updated your Nvidia drivers. It’s been sitting in my system tray, ignored, for about six months.

“You know, it’s funny, I did the same thing at first,” Coonrad admitted. “I wasn’t used to it, and I don’t like bloatware. But it’s actually a really robust, lightweight application that has an incredible amount of stuff you can do with it. ...Because it involves all those other technologies.”

So if you have an older game, like Half-Life, you should be able to just go into GeForce Experience and render it at 4K, then down-sample it to your 1080p or 1440p monitor, just by using a dropdown menu.

And you won’t necessarily need a Maxwell-powered Nvidia card, either. “We weren’t saying this before, but Dynamic Super Resolution will be available to older generation cards as well,” Coonrad said. “So if you have like a 780 or a Titan, with plenty of performance, you might ask, ‘Do I really need to buy a new card to get this?’ The answer is no.”

Watch Dogs running Nvidia DSR

Coonrad also touted another feature that I, and a lot of other longtime PC gamers, tend to view very skeptically: automatic game optimization.

GeForce Experience will automatically scale each game’s settings to your hardware profile at the click of a button. But it always struck me as a pointless feature, because I could just as easily do the same thing inside the game (albeit without the aid of DSR) to tailor the game’s appearance and performance to my own preferences. In fact, a lot of players consider this kind of tinkering to be part of the fun of hardware upgrades. Just how far can you push your rig? How many dynamic shadows do you want to see?

Coonrad, however, thinks Nvidia can offer something better than just eyeballing a video settings screen and tweaking values.

People can access most of the settings they need, Coonrad said, “but even then, they don’t know how much or what to do. So we actually have a huge database of all the settings, and all the combinations of performance hardware that you have in your computer that allows it to optimize for your specific configuration.

“This is the magic of our technology. This particular thing took a lot of time to build. We have a huge combination of machines in our GTL, our Global Test Laboratory. It’s machines all over the world that belong to Nvidia, and that are part of our engineering systems or our cloud-based systems to test the different combinations of our hardware.”

In other words, GeForce Experience isn’t just kicking out the video settings equivalent of a Windows Experience Index score. It’s using well-tested settings for each game on your exact hardware configuration. You just tell it where you want to strike the balance between performance and image quality, and it handles the rest.
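To picture what that database amounts to, here is a toy sketch in Python. Every name in it (the game, the GPU, the settings) is hypothetical, and Nvidia's real system is a far larger, server-backed catalogue; the point is only the shape of the idea: a lookup keyed by exact configuration plus a quality/performance slider, not a guess derived from a benchmark score.

```python
# Hypothetical pre-tested settings database: for each (game, GPU) pair,
# profiles keyed by where the user sets the quality/performance slider.
OPTIMAL_SETTINGS = {
    ("ExampleGame", "GTX 970"): {
        "performance": {"resolution": "1920x1080", "shadows": "medium", "aa": "FXAA"},
        "balanced":    {"resolution": "2560x1440", "shadows": "high",   "aa": "2x MSAA"},
        "quality":     {"resolution": "2560x1440", "shadows": "ultra",  "aa": "4x MSAA"},
    },
}

def optimize(game, gpu, slider="balanced"):
    """Look up lab-tested settings for this exact configuration."""
    profiles = OPTIMAL_SETTINGS.get((game, gpu))
    if profiles is None:
        return None  # configuration not yet profiled in the test lab
    return profiles[slider]

print(optimize("ExampleGame", "GTX 970", "quality"))
```

Returning `None` for an unprofiled configuration mirrors why the man-hours matter: the value comes entirely from settings someone actually tested, not from an extrapolation.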

“There are guys who go through every game, and go to the parts that we know to be the most difficult parts of the game [from a performance standpoint]. Then we run them through these tests and determine how they affect performance. There’s a lot of man-hours that goes into making GeForce Experience work.”

I find this kind of enticing. There are a lot of settings I simply don’t have much investment in, and it’s hard to know exactly what “ultra” versus “high” translates to in practice. For someone like me, who doesn’t want to lose his mind tweaking things like “shadow quality” and “terrain noise”, the idea of GeForce Experience knowing exactly the right settings has real appeal.


Nice and smooth

Even more tempting, however, is a G-Sync monitor. The demo unit at Battle Grounds was, fittingly enough, running StarCraft 2. It’s a technology that’s become a bit more urgent for me ever since I went to a 27-inch 1440p monitor and, for the first time, saw levels of screen-tearing that I simply couldn’t deal with.

G-Sync really does offer some tangible gains over that experience. Scrolling around a busy StarCraft map, I was taken aback by how smooth it was. It felt unnatural because I’m so used to turning off V-Sync and simply putting up with the tearing. Playing a game where the video is always as smooth as glass is a bit disorienting at first.

The reality isn’t always as ideal as it was in StarCraft 2, and Digital Foundry did a lovely breakdown of the good and bad of G-Sync over on Eurogamer. But if you were in the market for a new monitor and had the GeForce GTX 650 Ti Boost or newer video card that G-Sync requires, G-Sync would be an easy recommendation.

But G-Sync is an all-or-nothing proposition. It doesn’t work with AMD cards, and it also doesn’t play nicely with other monitors. That last part, Coonrad admits, is a problem.

“We’re working on that. Right now we cannot have more than one monitor running with G-Sync. That’s the glorious problem with G-Sync. It’s such a different way of sending frames to the monitor, it needs that other chip inside the monitor,” he said. “You can have dual-monitor G-Sync, but not G-Sync and non-G-Sync.

“Because basically you have to have one display head giving the variable refresh rate signal, and another giving that standard locked signal. And since the GPU does the work together, it essentially has to have two frame buffer outputs. One that’s locked, and one that’s unlocked,” he said.

Not everyone will find this a “glorious” problem. I told Coonrad that having to give up my Ultrasharp monitor for a G-Sync device would be a deal-breaker, and he acknowledged that a lot of people will probably be in the same boat. Nvidia are trying to find a solution.

“There are theories on how to do it. Like, you could have a virtualized environment with the locked one. Because that’s easier to do. So let’s say you had the desktop with your chat open, and you wanted to game on G-Sync. That would be easier to do. We just haven’t figured it out. We’re working on it.”

Hard sell on the walled garden

The thing is, like anyone else trying to lure people into an ecosystem, Nvidia have to contend with the fact that a lot of people have already made choices that make it hard to bring them aboard. A Radeon owner can’t run G-Sync, and someone running dual monitors is unlikely to run out and buy two new G-Sync devices. Nor can Nvidia go back in time and get their cards into the PS4 or Xbox One.

But bit by bit, Nvidia are making the case for their technology as something more than just a way to get higher framerates and slightly nicer effects. They’re leaving Coke vs. Pepsi territory and getting closer to Apple vs. Windows. Whether PC gamers will want to follow is an open question.

It’s an interesting idea. Nvidia aren’t just selling the idea that they can push technology harder than anybody else, but that they can make the day-to-day experience of using it better than it’s ever been before.

The catch is that you have to become an Nvidia gamer, not just a PC gamer. But with the technology and services Nvidia are developing, that no longer seems like such a leap.

Comments
Belimawr · 3 years ago

I only got a GTX 980 because it was relatively cheap for what it was and I needed to move up from my pair of 470s.

Really, while GeForce Experience is there, other than ShadowPlay I can't think of a single reason to actually use it.

At the end of the day it's all well and good pushing tech with the cards, but the truth of the matter is that, as with PhysX, if you have a fragmented market most developers will ignore the tech. Why make something for a technology that only half of people can use?
Nick Wilson · 3 years ago

I actually upgraded from a 23-inch IPS Dell Ultrasharp to the new 27-inch ASUS ROG Swift (2560x1440, 144Hz, G-Sync). It's a purchase I absolutely do not regret.

It may be a TN panel, but I don't notice the bad viewing angles because I'm either right in front of it or just alter the angle if I want to play in bed. The colour reproduction is actually a lot better than on other TN panels. The high resolution and refresh rate are a match made in heaven for my pair of 780 Tis, and G-Sync makes sure I don't experience any tearing or stutters.

The only downside is the price. It was £680, which was rather costly; more than I paid for a single GTX 780 Ti Classified from EVGA. If you've got the cash spare and the hardware to drive it, I fully recommend it. Otherwise there are smaller screens available.
boniek83 · 3 years ago

Got the same monitor with just a single 670, and I recommend it as well, even for a mid-range card like mine.
Sathlin · 3 years ago

Does not being able to have two monitors, one with and one without G-Sync, only apply if you are gaming on both?

Just bought a monitor, and I'm planning on buying a G-Sync one in the future. I only game on one.
Mountain_Man · 3 years ago

"We’re making [games] look better, and we’re pushing the boundaries of the technology all at the same time. But what are we getting? A lot of gamers don’t play those kinds of games anyway. They play League of Legends."

And why do you suppose that is? Could it be that gamers value fun gameplay over state-of-the-art "uncanny valley" visuals bolted onto a mediocre game?