40 days 4 hours
25 days 9 hours
Galactic Civilizations III: 4 days 8 hours
Automation - The Car Company Tycoon Game: 4 days 4 hours
Kerbal Space Program: 14 days 12 hours
Grand Theft Auto V: 10 days 23 hours
Intel doesn't have the edge; Nvidia does. Ryzen has been shown to dominate Atom and Skylake at super low power draw (sub-20W), and there is no low-draw Kaby Lake part.
Pascal is so far ahead that I think it will take AMD more time to catch up. However, the reason AMD's architecture is less efficient may also be the reason it is so good at coin mining, so closing that gap may be against AMD's interests.
It's just a matter of time before Ryzen starts coming out in mobile form though.
The reason they "appear" power efficient is that Nvidia chopped a great deal out of Pascal to make it sip power (who knows, really; power meters read the digital footprint, Nvidia uses fancy digital circuitry, and it would not be the first time they played BS games). Of course it uses less power: there is not as much to power. They chose to cut a large chunk of hardware away to open up the power budget and clock the remaining transistors higher.
Clock for clock, density for density, Polaris and Vega are by far the more complete package, but it takes more power to drive it all. If AMD could clock Polaris or Vega up to similar speeds they would royally own Pascal, but they cannot.
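The clocks-versus-power trade-off above follows from the usual first-order model of dynamic switching power, P ~ C * V^2 * f: higher clocks typically need higher voltage, so power grows superlinearly. A rough sketch with purely illustrative numbers (not measured data for any real chip):

```python
# Illustrative only: dynamic switching power scales roughly as P = C * V^2 * f.
# Raising the clock usually also requires raising voltage, so a modest clock
# bump costs disproportionately more power. Numbers below are hypothetical.

def dynamic_power(capacitance, voltage, frequency_ghz):
    """First-order dynamic power estimate: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency_ghz

# Same hypothetical chip at two operating points.
base = dynamic_power(capacitance=1.0, voltage=1.00, frequency_ghz=1.2)
fast = dynamic_power(capacitance=1.0, voltage=1.15, frequency_ghz=1.6)

print(f"relative power at the higher clock: {fast / base:.2f}x")
# A ~33% clock bump with a voltage bump costs ~76% more power here.
```

This is why a wide, dense design clocked low can match a lean design clocked high on performance but not on efficiency, and vice versa.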
At least Polaris/Vega are not built as "cheap as possible" while still demanding a "branding tax", the way Nvidia does with products where they cheap out on VRM design, use lower-quality capacitors, and end up that much more prone to a shorter life (lower thermal threshold).
Anyway, I would really love to see prices stabilize some. AMD and Nvidia both need to stomp on their customers (Asus, MSI, etc.) so that their customers (you and me) get cards closer to the price they should be. Putting an extra $50-$120 on top of an already expensive product is not good for anyone except the caviar-eating execs ^.^
I very much doubt AMD included the price of the displays, as the two differed massively in price and AMD had said initially that they were near-identically specced. We know that G-Sync is more expensive because of the Nvidia tax and greater complexity, but even if there is a $600 difference in the price of the two displays, I don't think it was factored in.
I'd love it if the Vega machine was $300 cheaper, just counting the graphics card, that would put Vega at $200. But yeah, probably unlikely.
The screens are near-identically specced (both 100Hz, same screen size and resolution) but very differently priced. AMD wanted to demonstrate the difference, or lack thereof, between FreeSync and G-Sync.
Absolutely, because there is no known way you could have lost your account due to this exploit.
Just because an attacker can compromise your computer doesn't mean he automagically compromises Valve's servers. If he hijacks your account, you just call Valve and get it recovered.
deep learning is more important than rendering some random game.
Which is why Vega can be a flop as a gaming card but still incredibly profitable for AMD. That and Altcoins.
We all know that if not for the coin-mining craze there might not be an AMD today... but remember also how long you couldn't buy a 480/580 at a sane price because of it.
Yes, but the Xbox has hUMA and can do GPU memory transfers in hardware without memory protection. Also, it doesn't have Windows running underneath with its 1.5GB of libraries and processes.
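The practical difference hUMA makes can be sketched in a toy model: on a discrete-GPU PC, handing a buffer to the GPU generally means copying it across the bus, while on a unified-memory design the CPU and GPU address the same memory, so a "transfer" is just sharing a pointer. The function names below are hypothetical, with plain host memory standing in for both sides:

```python
# Conceptual sketch only (hypothetical names, host memory standing in for a GPU):
# contrast copy-based transfer with zero-copy sharing of one allocation.

def transfer_by_copy(buf: bytearray) -> bytearray:
    """Discrete-GPU model: the device gets its own copy of the data."""
    return bytearray(buf)        # separate allocation, bytes duplicated

def transfer_by_sharing(buf: bytearray) -> memoryview:
    """Unified-memory (hUMA-like) model: the device sees the same memory."""
    return memoryview(buf)       # no copy; later writes are visible to both sides

frame = bytearray(b"\x00" * 16)

gpu_copy = transfer_by_copy(frame)
gpu_view = transfer_by_sharing(frame)

frame[0] = 0xFF                  # CPU writes after the "transfer"
print(gpu_copy[0])               # 0   -> the copy is stale
print(gpu_view[0])               # 255 -> the shared view sees the write
```

The zero-copy path is what saves both the transfer time and the duplicate allocation on a unified-memory console.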
If you're talking about the ESRAM in the Xbox One, most developers are actually blaming it for holding the console back, as it is too small to be utilised effectively. It has even been given as the reason there is such a massive gap between the PS4 and Xbox One in the new Metal Gear: the PS4 runs at 1080p with higher-detail textures and lighting, while the Xbox One runs at 720p with graphics that aren't that far ahead of the 360/PS3, due to the restrictions caused by the ESRAM.
But the problem is most likely that no developer has really had to deal with ESRAM before, so they are still working out how to get around its restrictions.
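Some back-of-envelope arithmetic shows why the 32MB of ESRAM squeezes 1080p so hard. Assuming 4 bytes per pixel per render target and a five-target deferred setup (both illustrative; real engines vary):

```python
# Rough arithmetic on ESRAM capacity vs render-target sizes.
# Assumes 4 bytes/pixel and 5 render targets (e.g. G-buffer + depth); illustrative.

ESRAM_BYTES = 32 * 1024 * 1024            # Xbox One ESRAM capacity: 32 MiB

def render_target_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

rt_1080p = render_target_bytes(1920, 1080)   # ~7.9 MiB each
rt_720p = render_target_bytes(1280, 720)     # ~3.5 MiB each

print(f"5x 1080p targets: {5 * rt_1080p / 2**20:.1f} MiB (ESRAM holds 32 MiB)")
print(f"5x 720p targets:  {5 * rt_720p / 2**20:.1f} MiB")
```

At 1080p a full deferred setup overflows the ESRAM, while at 720p it fits with room to spare, which lines up with the resolution gap described above. Working around this means splitting or tiling targets between ESRAM and main memory, which is exactly the kind of thing developers had not had to do before.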