AMD Vega reviews, news, performance, and availability

The AMD Vega GPU is the top Radeon graphics card technology available right now and, as the video card nerds we are, we’ve got all the latest Vega info right here.

The Vega architecture has been around in consumer form for close to a year now, and it's fair to say it's had a bumpy start to life. But with its sterling showing in the Raven Ridge APUs, new mobile cards on the way, and Vega set to appear as AMD's first 7nm product, the architecture's certainly getting around a bit. At launch, however, it was rather a silicon contradiction - the first cards managed to be simultaneously disappointing and completely sold out.

Want to know what the best GPU is right now? Check out our guide to the best graphics cards available today.

It's that classic AMD 'fine wine' approach, where the hardware only gets better over time. And Vega genuinely has - with the most recent game launches it has shown enough of a performance improvement that it's almost competitive. Damning with faint praise there...

The Vega launch felt classically AMD: the company worked hard to create some interesting hardware, with some unique selling points, but it turned out a little too forward-thinking, and not competitive enough right now, to be worth the outlay. There's also the potential problem that, by the time the forward-thinking side of the Vega tech does become useful, this iteration of it will be outdated.

Unfortunately, right now, the AMD RX Vega cards suffer by comparison to their smaller, more efficient, and more powerful Nvidia competition.

Vital stats

AMD Vega release date
The latest AMD Radeon cards were finally launched on August 14, 2017

AMD Vega pricing and availability
The RX Vega 64 and RX Vega 56 were both hard to track down for a decent price long after launch, but just when it looked like stock was coming back so did the mining boom. Now an RX Vega 64 is $677 (£549) and the RX Vega 56 $529 (£463)... at best.

AMD Vega architecture
AMD called it the biggest architectural change in years, with the advanced Infinity Fabric interconnect, memory, and caching system.

AMD Vega performance
The biggest disappointment was that, after all the hype, the top AMD RX Vega 64 couldn't really keep up with the Nvidia GTX 1080 it had set its sights on.

AMD Vega news

A couple of slides have come out of the latest AMD investor presentation highlighting the role, or non-role, of the AMD Vega GPU architecture throughout 2018.

Naively, we were hoping for a broader roll-out of the AMD Vega tech, but this year it's sticking with the RX Vega 64 and RX Vega 56 cards at the top of the stack, with the only new GPUs arriving in the mobile and machine learning spaces.

AMD Vega 2018

That means our Radeon gaming graphics cards will only have a pair of Vega chips available to them, with the rest of 2018's AMD cards still running the old Polaris GPU tech of RX 480 and RX 580 fame. Given the sizeable lead Nvidia already has in the gaming graphics card market, that means AMD has essentially ceded 2018 to Team GeForce. Fingers crossed Navi turns out as well as we hope, and not as badly as is feared...

AMD GPU stack 2018

You will, however, be getting Vega graphics in processor form, and not just from the AMD Raven Ridge APUs either. Intel has licensed Vega GPU silicon to form the graphics portion of the Intel Kaby Lake G processors, mixing Radeon and Core architectures together for the first time. It's an alliance predicted by Nostradamus himself as a portent of the end times.


There will also be mobile Vega cards, as mentioned by Dr. Lisa Su herself at the pre-CES event in Las Vegas this January. But any hopes we might have had for a Vega GPU refresh, along the same lines as the AMD Ryzen 2 Pinnacle Ridge update the inaugural Ryzen CPU range is getting this year, have seemingly been dashed.

Originally we had been hoping that AMD would create a 12nm refresh of the Vega 56 and Vega 64 cards, sporting the same sort of efficiency improvements and higher clockspeeds the Ryzen 2 refresh is offering. Unfortunately, the Vega refresh has been removed from the new AMD roadmaps we were shown at its CES Tech Day in January.

Instead AMD is bringing a 7nm version of the Vega architecture to market at the end of the year - likely unveiling at SIGGRAPH in August - in the guise of a new Radeon Instinct card designed specifically for the rigours of machine learning. There is a slight chance that release, like the original Vega Instinct cards, will eventually bleed into a consumer-facing product launch, but maybe not this year.

So it looks like the next gaming graphics card we get from AMD is going to be based on the new Navi design, arriving sometime in 2019.

 

AMD Vega release date

AMD Radeon RX Vega 64 Liquid

The gaming variants of the AMD Vega architecture released on August 14, 2017, and there was much rejoicing. Mostly because it was my mum's birthday, but also because it represented the first big Radeon GPU redesign in an age.

AMD released the Radeon RX Vega 64 and Radeon RX Vega 56 cards, and we've not seen anything else fill out the stack in discrete graphics card form since then. Realistically, it seems unlikely that we're going to: the costly HBM2 VRAM is potentially complicated to switch out for GDDR5 or GDDR6, and too expensive to use in more mainstream cards.

There was an even more costly liquid-cooled version of the Vega 64, but that doesn't seem to have survived past launch. We've also not been treated to an official Vega Nano, though a PowerColor RX Vega 56 'Nano Edition' is expected to appear at Computex this year.

 

AMD Vega pricing and availability

AMD Radeon RX Vega graphics cards

We're starting to see more RX Vega cards become available now, but thanks to the cryptomining boom prices are still well over the original MSRP, even though it's now actually possible to find the cards on sale. That's despite the crypto market slowing down, with ASICs now more able to cater to the previously GPU-only currencies.

Still, prices are better than they were...

AMD Radeon RX Vega 64 (MSRP $499 | £450) - $677 | £549

AMD Radeon RX Vega 56 (MSRP $399 | £350) - $529 | £463

 

AMD Vega architecture


The new AMD Vega architecture represents what AMD called the most sweeping architectural change its engineers had made to the GPU design in five years. The last shake-up on that scale came when the first Graphics Core Next (GCN) chips hit the market, and this fifth generation of the GCN architecture marks the start of a new GPU era for the Radeon team.

Fundamental to the Vega architecture, represented here by the inaugural Vega 10 GPU, is the hunt for higher graphics card clockspeeds. The very building blocks of the Vega 10, the compute units, have been redesigned from the ground up, almost literally. These next-generation compute units (NCU) have had their floorplans completely reworked to optimise and shorten the physical wiring of the connections inside them.

They also include high-speed, mini memory SRAMs, stolen from the Zen CPUs and optimised for use on a GPU. But that’s not the only way the graphics engineers have benefitted from a resurgent CPU design team; they’ve also nabbed the high-performance Infinity Fabric interconnect, which enables the discrete quad-core modules, used in Ryzen and Ryzen Threadripper processors, to talk to each other. 

AMD Radeon RX Vega GPU layout

Vega uses the Infinity Fabric to connect the GPU core itself to the rest of the graphics logic in the package. The video acceleration blocks, the PCIe controller, and the advanced memory controller, among others, are all connected via this high-speed interface. The fabric also runs at its own clock frequency, which means it isn't affected by the dynamic scaling and high frequency of the GPU clock itself.

This introduction of Infinity Fabric support for all the different logic blocks makes for a very modular approach to the Vega architecture and that in turn means it will, in theory, be easy for AMD to make a host of different Vega configurations. It also means future GPU and APU designs (think the Ryzen/Vega-powered Raven Ridge) can incorporate pretty much any element of Vega they want to with minimal effort.

The NCUs still contain the same 64 individual GCN cores as the original Graphics Core Next design, with the Vega 10 GPU then capable of housing up to 4,096 of these li'l stream processors. But, with the higher core clockspeeds and other architectural improvements of Vega, they're able to offer far greater performance than any previous GCN-based chip. The peak-throughput arithmetic is simple enough to sanity-check, as the sketch below shows.
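Here's a quick back-of-the-envelope sketch of that throughput maths in Python. The shader counts and boost clock are the published RX Vega 64 figures; the two-ops-per-clock factor is just the usual convention of counting a fused multiply-add as two operations, not an AMD formula.

```python
# Peak FP32 throughput = shaders x 2 ops per clock (an FMA counts as
# a multiply plus an add) x clockspeed. Published RX Vega 64 figures.
ncus = 64
cores_per_ncu = 64
shaders = ncus * cores_per_ncu            # 4,096 stream processors
boost_clock_ghz = 1.546

fp32_tflops = shaders * 2 * boost_clock_ghz / 1000
print(f"FP32: {fp32_tflops:.1f} TFLOPS")  # ~12.7 TFLOPS
# Rapid Packed Math (explained below) doubles this for FP16 workloads.
print(f"FP16: {fp32_tflops * 2:.1f} TFLOPS")
```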

The new NCUs are also capable of utilising a feature AMD is calling Rapid Packed Math, and which I'm calling Rapid Packed Maths, or RPM, to avoid any trouble with our US cousins. RPM essentially allows you to do two mathematical instructions for the price of one, at the cost of some precision. Given that many of today's calculations, especially in the gaming space, don't actually need 32-bit floating point precision (FP32), you can often get away with 16-bit data types. Game features such as lighting and HDR can use FP16 calculations, and with RPM that means Vega can support both FP16 and FP32 calculations as and when they're necessary.
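To make that precision trade-off concrete, here's a minimal Python/numpy sketch - an illustration of the FP16-vs-FP32 idea only, since numpy obviously doesn't model RPM's doubled throughput - showing that a typical lighting-style calculation loses almost nothing when dropped to half precision:

```python
import numpy as np

# Toy lighting calculation: a Lambertian diffuse term, dot(N, L), for a
# batch of surface normals - values in [0, 1], exactly the kind of data
# that doesn't need full FP32 precision.
rng = np.random.default_rng(42)
normals = rng.normal(size=(100_000, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
light = np.array([0.0, 0.707, 0.707])

def diffuse(dtype):
    n, l = normals.astype(dtype), light.astype(dtype)
    return np.clip(n @ l, 0.0, None)   # clamp negatives, as a shader would

err = np.abs(diffuse(np.float32) - diffuse(np.float16).astype(np.float32))
# Worst-case FP16 error lands far below the ~0.0039 step of an 8-bit
# display channel, so the halved precision is invisible on screen.
print("max abs error:", err.max())
```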

AMD Radeon RX Vega NCU

The Far Cry 5 developers came out in support of RPM, and made FC5 very Vega-friendly. 3D technical lead Steve McAuley went on record stating: “there’s been many occasions recently where I’ve been optimising shaders thinking that I really wish I had rapid packed math available to me right now. [It] means the game will run at a faster, higher frame rate, and a more stable frame rate as well, which will be great for gamers.”

The Vega architecture also incorporates a new geometry engine, capable of supporting standard DirectX-based rendering as well as newer, more efficient rendering pipelines through primitive shader support. The revised pixel engine has been updated to cope with today's high-resolution, high refresh rate displays, and AMD has doubled the on-die L2 cache available to the GPU. It has also opened up the entire cache to all the different logic blocks of the Vega 10 chip, and that's because of the brand new memory setup of Vega.

AMD’s Vega architecture uses the second generation of high-bandwidth memory (HBM2) from Hynix. HBM2 has higher data rates, and larger capacities, than the first generation used in AMD’s R9 Fury X cards. It can now come in stacks of up to 8GB, with a pair of them sitting alongside the GPU die on the same package, making the memory more efficient and giving it a far smaller footprint than standard graphics card designs. And that could make it a far more tantalising option for notebook GPUs.
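The appeal is in the bandwidth maths: HBM2 runs at a modest data rate per pin but over an enormously wide bus. A back-of-the-envelope sketch in Python, using the published RX Vega 64 and GTX 1080 memory specs:

```python
# Memory bandwidth = bus width (bits) x data rate (Gbps per pin) / 8.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

# RX Vega 64: two HBM2 stacks, each on a 1024-bit interface,
# running at 1.89 Gbps per pin.
vega64 = bandwidth_gb_s(2 * 1024, 1.89)    # ~484 GB/s
# GTX 1080: a narrow 256-bit bus, but much faster 10 Gbps GDDR5X.
gtx1080 = bandwidth_gb_s(256, 10.0)        # 320 GB/s

print(f"RX Vega 64: {vega64:.0f} GB/s | GTX 1080: {gtx1080:.0f} GB/s")
```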

AMD Vega High Bandwidth Cache and Controller

Directly connected with the HBM2 is Vega’s new high-bandwidth cache and high-bandwidth cache controller (HBCC). Ostensibly this is likely to be of greater use, at least in the short term, on the professional side of the graphics industry, but the HBCC’s ability to use a portion of the PC’s system memory as video memory should bear gaming fruit in the future. The idea is that games will see the extended pool as one large chunk of video memory, so if tomorrow’s open-world games start to require more than the Vega 64’s 8GB you can chuck it some of your PC’s own memory to compensate for any shortfall.

"You are no longer limited by the amount of graphics memory you have on the chip," AMD’s Scott Wasson explains. "It's only limited by the amount of memory or storage you attach to your system."

The Vega architecture is capable of scaling right up to a maximum of 512TB of virtual address space available to the graphics silicon. Nobody tell Chris Roberts or we won’t see Star Citizen this side of the 22nd century.
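In effect, the HBCC treats the card's local HBM2 as a last-level page cache in front of that huge virtual pool. AMD hasn't published the actual replacement policy, so the Python sketch below uses a generic least-recently-used cache purely to illustrate the concept; the page size is an assumption for the demo, and 512TB is simply the 2^49 bytes of a 49-bit address space.

```python
from collections import OrderedDict

VIRTUAL_SPACE = 2 ** 49       # 512TB of virtual address space
PAGE = 64 * 1024              # assumed page granularity, demo only

class HighBandwidthCache:
    """Generic LRU page cache standing in for Vega's HBCC (illustrative)."""
    def __init__(self, local_bytes):
        self.capacity = local_bytes // PAGE
        self.resident = OrderedDict()          # page number -> present

    def access(self, address):
        assert 0 <= address < VIRTUAL_SPACE
        page = address // PAGE
        if page in self.resident:              # hit: already in HBM2
            self.resident.move_to_end(page)
            return "hit"
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)  # evict the coldest page
        self.resident[page] = True             # fault in from system RAM
        return "miss"

hbcc = HighBandwidthCache(local_bytes=8 * 1024 ** 3)   # Vega 64's 8GB
print(hbcc.access(0), hbcc.access(0))                  # miss hit
```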

AMD Vega performance

AMD Radeon RX Vega performance

The thinking behind Vega seems to have been to put the RX Vega 64 up against the GTX 1080 with the RX Vega 56 going head-to-head with the GTX 1070. Unfortunately, with most games on the market today, the AMD cards are always that little bit behind the Nvidia GPUs. It’s only when you start looking at the more modern DirectX 12 and Vulkan APIs that the Vega architecture starts to show its worth.

It’s this bifurcated performance - poor in legacy games and impressive with modern software - that makes the Vega cards difficult to recommend right now. AMD’s classic ‘fine wine’ approach may mean that, when the architecture matures and devs start to use the impressive feature set to its fullest, the AMD cards might be able to push past their Nvidia rivals.

But that’s scant comfort to anyone wanting class-leading performance for every game in their Steam library, or even just the games they’re playing at the moment. There is a little light at the end of the overclocking tunnel, however, with tweakers unlocking extra performance from the cards by undervolting the GPU. But that’s a whole other story...
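The reason undervolting works is a rough rule of thumb rather than anything Vega-specific: dynamic power scales with frequency times voltage squared, so shaving millivolts frees up board power the GPU can spend holding higher boost clocks. A quick illustrative Python sketch - the voltages and the overclocked figure here are hypothetical, not measured Vega results:

```python
# Rough CMOS rule of thumb: dynamic power ~ frequency x voltage^2.
# All numbers below are illustrative, not measured Vega figures.
def relative_power(freq_mhz, volts, base_freq=1546, base_volts=1.20):
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

undervolted = relative_power(1546, 1.05)   # same clock, 150mV less
uv_plus_oc  = relative_power(1650, 1.05)   # spend the savings on clocks

print(f"undervolted: {undervolted:.0%} of stock power")          # ~77%
print(f"undervolt + overclock: {uv_plus_oc:.0%} of stock power")  # ~82%
```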

Comments
SkankwOn · 1 year ago

I've pretty much tied myself to Radeon GPUs after buying a Freesync monitor, so I've been awaiting Vega with hope and interest.

Could even be time to upgrade CPU/Mobo/RAM, as I'm guessing my trusty old i5 2500K (at 4Ghz) will likely bottleneck the GPU.

Ryzen perhaps. We'll see on both fronts.

Dave James · 1 year ago

Genuinely really excited about what AMD could be like in 2017. A serious Vega-based RX Fury and 16-thread Ryzen CPU could make for a stunning rig - and will make your Freesync monitor shine :)

qepsilonp · 5 months ago

If the performance you're getting isn't satisfactory with your current CPU then go for it, but if it's OK and you just wanna upgrade anyway, I would wait for Zen 2 / Ryzen 3 if you can hold out for another year. And tbh you can overclock a 2500K to like 4.2 - 4.3GHz without even increasing the voltage... ah no, that was a 3570K, but anyway the 2500K was known to be an excellent overclocker, so if you can at all manage to hold off I think Zen 2 will be worth the wait.

I believe - and obviously this part isn't confirmed - that they will go for 12 cores, but the more important part is that there will be an IPC increase and a clockspeed increase.

The clockspeed increase can somewhat be inferred from GlobalFoundries' claims about the 7LP node, which is a 40% performance increase. I would take those claims with a massive grain of salt, so let's say a 20% clockspeed advantage over Ryzen.

That would mean something like a 4.9GHz XFR on the replacement for the 1800X and 1600X, and a 4.4GHz all-core turbo.

Add on top of that a 10% IPC increase and you are already looking at something like a 32% single-threaded performance increase, and that's before you add 50% more cores, which would be a 98% multithreaded performance increase.

It won't add up that way in the vast majority of cases, but figure like 50% in most highly multithreaded workloads and 28% in most single-threaded workloads.

I would wait personally if you can; obviously, as said, if you can't wait you can't wait. I couldn't deal with a 3570K overclocked to 4.7GHz because I play EVE Online with 5 clients and like to do other stuff while I'm doing that. Ryzen was perfect for that.

But at least wait for Ryzen 2, it's only like 3 months and a bit away. Then I would also advise waiting for the die shrink of Vega, which is confirmed directly by AMD. They could cut power consumption by 60% - which I find hard to believe from GF's claims, probably 40% max, and that's taking it as a power consumption improvement only - but I think they will take a 10 - 15% power cut and put 10 - 15% into performance. That still isn't great on power consumption or performance: yay, 15% extra performance on a card that usually performs nearer a 1070 than a 1080, so you are talking most likely between the 1080 and 1080 Ti, and decidedly towards the 1080 rather than the 1080 Ti.

Although GPUs, due to their low clockrates compared to CPUs, usually benefit more from die shrinks, so maybe they could keep power the same and get the full 40%. That would only put them a bit ahead of the 1080 Ti, pulling 300-odd watts, by which time Nvidia will have launched their next-gen architecture, so AMD might be a little bit faster than the 1180...

oooo it sucks...

b¤cek · 1 year ago

I'm also tied in due to uncertainty. I'm still using an i5 3570K with a GTX 660 on a 1080p display. Wanna change them all, and the performance and prices of the Vega cards will be the decision maker for the trio.

NandorHUN · 1 year ago

I see no reason to upgrade from the 2500K. If you are a 1080p player, just OC it to 4.5GHz and you are good to go, or change it to a 2600K - today's games will benefit from the 2600K, and it is still a much better buy than any other new CPU.

If you are a 1440p or 4K player, then there is no reason to change the CPU, because a bottleneck only occurs if you play on a high-end GPU at 1080p, where the CPU can't keep up with the fps. At 4K and 1440p there is more stress on the GPU and, because of the lower fps, no bottleneck.

That's why I bought a new 1440p monitor instead of an upgrade from my 2600K.

You should only consider an upgrade if you are playing on a 1080p AND 144Hz FreeSync monitor.

Other than that, a new CPU is just a waste of money.

Salty Mac · 1 year ago

You explain it very oddly, but you are right for the most part - all except the very beginning.

If he is playing at 1080p and upgrades to high-end Vega, the 2500K will 100% bottleneck the GPU. I would guess even a 2600K would bottleneck if he went high enough.

Also, a new CPU is not a waste of money if you are on the 2500K. A new CPU/motherboard/RAM would bring many new features, plus speed, reliability, cooler running, and energy efficiency.

SkankwOn · 1 year ago

Interesting replies from both 'Salty Mac' & 'NandorHUN'.

For the record, I'm gaming on a 1440p/144Hz monitor. Right now, the only game that I've actually struggled to run is PUBG (which is still in Early Access anyway).

Concerning the news (or lack thereof) today of Vega's two-month wait: well, it's a bit disappointing that AMD is still lagging behind NVIDIA when it comes to the higher-end cards. VOLTA could well trump VEGA straight off the bat ... :(

vajjala1986 · 1 year ago

FreeSync monitors are much cheaper than Nvidia G-Sync ones. We unnecessarily have to pay $200 on average to get a G-Sync over a FreeSync monitor. If AMD puts out a Vega card with performance between the GTX 1070 and 1080, but with the price tag of a GTX 1070, then I am buying it. Otherwise, I will wait for a GTX 1070/80 price drop.

Salty Mac · 1 year ago

hey you got your 1080 price drop, you getting one??

xGhostFace0621x · 10 months ago

Yeah, good luck with the prices. I bet you anything they'll be overpriced, thanks to cryptocurrency miners out there. AMD cards are usually their go to cards when it comes to that.

sixsixtysix · 5 months ago

I'm looking to buy an un-gougey-priced Vega 56 in the next month or so. If AMD doesn't have their shit together by then, they'll never win me back. Granted, the last time I bought one of theirs was when ATI stopped their All-in-Wonder stuff.

Keyvan · 1 year ago

If AMD can pull this off, then it'll be a nightmare for NVIDIA in the gaming market. They have more FreeSync screens, a more open standard, and better value for the money. I've said this before, though, and have been disappointed. That's why I'm saying "IF" they can pull it off... they still have to get the devs onboard to program for the new offerings.

fuchs · 1 year ago

deep learning is more important than rendering some random game.

RanC · 1 year ago

And documentaries are more important than porn, but I think both of us know which one is the money maker as of right now. If you're short on cash and it makes money, you simply don't say no to it. They aren't mutually exclusive either. Neither do you say no to superb profit margins if you want to do expensive R&D. So what was your point again?

Besides, rendering some random games is what paved the road to HPC, deep learning, AI etc for Nvidia.

Gen · 11 months ago

Which is why Vega can be a flop as a gaming card but still incredibly profitable for AMD. That and Altcoins.

We all know that if not for the coin mining craze there might not be an AMD today... but we also know how long you couldn't buy a 480/580 at a sane price because of it.

gamertaboo · 1 year ago

Can't believe they are going to wait until May to release Vega. They are out of their minds waiting that long. They should just release it alongside Ryzen in a month.

dwearle1 · 1 year ago

As this article indicates, it may just be a strategic marketing move by BOTH GPU manufacturers to wait for the other one to drop their next-gen GPU - if they drop too soon, they could be shooting themselves in the foot, so to speak.

RanC · 1 year ago

Yeah, I don't really see any reason to hurry. From what I can tell, AMD is promising a product that is essentially a 1080 Ti, but with some features which Nvidia decided to just not bother consumers with and instead push them for Volta (Pascal workstation cards have them though - looking at you, Unified Memory vs. High-Bandwidth Cache). I honestly think they did it out of greed and because they didn't perceive AMD as a big enough threat... until suddenly you start seeing GP102 chips flooding the market without being neutered, at a price you supposedly just can't refuse. AMD has only one chance to do this right; better safe than sorry anyway. It might be the comeback of the decade or the biggest disappointment of the century. If there's any room for something between those two options, I think it's going to be pretty narrow.

I'm not particularly on either side of the fence; I admire both companies currently for very different reasons. I have to say though, it's been way too long since I last owned an AM.. ATI card. It's been way too long since I updated my GPU too, but hey, it's not my fault Nvidia can't (won't, really) offer a reasonable upgrade for the 780 Ti, which has gotten me this far reasonably well.

Windows 10 · 1 year ago

If there is a card going against the GTX 1070 it cannot be over $400. There are a lot of 1070s dipping below $400 on Amazon right now.

0V3RKILL · 1 year ago

All I'm going to say is that I am glad I waited. Time to replace this Twin Frozr 290X. It's been very good to me, I have to say.

SkankwOn · 1 year ago

We still gotta wait my friend ... 2 months!

I have the MSI Gaming Twin Frozr 290X also.

Death of Chaos · 1 year ago

Seeing as how there's a 500MHz difference in clock speed between the chip tested and the 1080, I'd say that's not too bad. If this is a competing card for the 1070, that's still impressive in its own right, because it's hitting roughly the same scores as a 1070 at a clock speed some 400MHz lower. Would that mean it would run cooler and more power-efficient than a 1070? Obviously that isn't taking into account the build of the card; it's possible that it could run hotter and all that at lower clocks. I'm still looking forward to what the actual, final card will present us with.

crashman95t446 · 1 year ago

Problem is the oft-repeated phrase 'power hog'. As great as it is for competition, and to keep Nvidia prices reasonable, where electricity costs are high Intel and Nvidia still have the edge. Although Ryzen rocks for x264/HEVC.

Gen · 11 months ago

Intel don't have the edge. Nvidia do. Ryzen has been shown to dominate Atoms and Skylake at super low draw (sub-20W), and there is no low-draw Kaby.

Pascal is so many miles ahead that I think it will take AMD more time to catch it. The reason Vega is less efficient may also be the reason it is so good at coin mining, however, so fixing that may be against AMD's interests.

It's just a matter of time before Ryzen starts coming out in mobile form though.

Dragonstongue · 6 months ago

The reason they "appear" power efficient is that Nvidia chopped a great deal out of Pascal to make it sip power (who knows - the power meters read the digital footprint, Nvidia uses fancy digital circuitry, and it would not be the first time they played BS games). But of course it uses less power: there is not as much to power. They chose to chop a large chunk of things away to open up the power budget and clock the transistors up more.

Clock per clock, density versus density, Polaris and Vega are by far the more complete package, BUT this comes at more power to drive it all. If AMD could clock Polaris or Vega up to similar clockspeeds they would royally own Pascal big time, but they cannot.

At least Polaris/Vega are not built as "cheap as possible" while still demanding a "branding tax" like Nvidia does with its products, where they cheap out on VReg design, use lower-quality capacitors, and are that much more prone to a shorter life (lower thermal threshold).

Anyway, I would really love to see prices stabilise some. AMD and Nvidia both need to stomp on their customers (Asus/MSI etc.) so their customers (me and you) can get cards closer to the price they should be... putting an extra $50-$120 on top of an already expensive product is not good for anyone except the caviar-eating execs ^.^

muniix · 7 months ago

I've been using a Vega 64 for simulations using the Vulkan compute API and Julia, taking full advantage of the HSA architecture and the reduced need to marshal and copy data to/from the card compared with the Nvidia card I used previously, and I'm seeing a large performance improvement. Granted, my code is written correctly for an architecture that lets the developer be productive and happy. I have no idea where you're getting inferior performance numbers from; I can only think that most developers are effing morons.

Shriven · 1 year ago

I just worry about Nvidia locking off driver development access to certain titles before release. Shady shit, but not having a working driver on launch day is putting me off AMD. That, and past experience.

Dave James · 1 year ago

The recent AMD drivers have been really solid and they've also been a lot better at getting launch day driver fixes out too.

I reckon if they can get the performance at a good price they deserve to do well.

Recko · 1 year ago

AMD drivers and their ReLive software have actually surpassed Nvidia's now. AMD has also been releasing drivers on launch days, but more often they release them before a game's launch day.

meLAW · 1 year ago

Mhm, I wonder if it's really that smart to bring out the top-tier model first, then fill up with the lower models afterwards. Yes, it promises more hype and more profit right away, but that strategy also demands fully operational chips prior to launch. Whereas the other way round, optimisation of the fabrication process can run in parallel with the launch sequence.

Bitdestroyer · 1 year ago

That is specifically why they released the 4xx series as budget cards... they already have offerings to compete at those levels.

Ramboy · 1 year ago

My ancient GTX 560 died recently (not totally). I can still play some games, but on low settings. I'm looking for a replacement card. Should I go buy an RX 480 8GB now, or should I wait for the 500 series? And also, when is the release date of the new Radeon series?

7UKECREAT0R · 1 year ago

I would say choose whether you want a budget card right now (RX 480) or, if you want a more expensive, more powerful card, wait for the 500 series. You probably already have a new one, but just helping :)

Dragonstongue · 6 months ago

RX x80 cards are not "budget cards". The "budget" tier for AMD is x50/x60, and for Nvidia it tends to be similar x50/x60 naming, e.g. RX 460/550-560 or 1050/1050 Ti.

The 480/580 and 1070/1080 are "performance" level, not budget. You can tell by the price: anything $250+ is not a budget card; the budget ones are usually $180 or less.

You can't tell by the amount of memory used, only by the product number designation.

Salty Mac · 1 year ago

Yeah, even if you got a good 480, the 580 is not a big jump in performance. Hope whatever you got is making you happy!

msroadkill612 · 11 months ago

To be the devil's advocate: if the most recent incarnations of AMD's GPUs share any distinct and meaningful hardware legacy with Vega, then it's tempting to stay close to the Vega ecosystem.

Dunno, but I don't think much changed between the 480 & 580 - both are 14nm Polaris?

If your GPU is that old, you're easy to please for a while.

You could go below the level miners want and escape that premium: spend $150 on an AMD Polaris GPU, wait a year and get Vega, then sell the Polaris for $80?

Some folks mainly want modern connectors on their PC - too much power draw is a pain - and the cheapest such card is ~$90 new. It should sell OK.

The brand escapes me, but it's a well-known Asian one - I hear good things about their factory-OC 470 cards. Cheap, though they may only be 4GB.

FC_Nightingale · 1 year ago

Well, the 1080 Ti was announced and drops in a couple of days. What's your answer, AMD??

dmoody19 · 1 year ago

The answer is "YAWN... HO HUMMMM....." SO? lol

ŊU | Xxx · 1 year ago

Waiting in patience. I promised myself never to go AMD again, but 8 years later I now own a FreeSync screen thanks to Nvidia GREED, and I'm waiting for the new GPU release :) Not going back unless AMD really disappoints me again, which I hope they don't... Looks like AMD has done a great job now and also competes on CPUs too.

Give us a release date :)))

ju-uh77 · 1 year ago

I agree with you on Nvidia greed - it'll be a cold day in hell before I ever spend the cash they want for a G-Sync monitor. On the other side, I also think AMD was dumb as shit for giving FreeSync away. They should have charged $45 for it, so it didn't adjust the price of the monitor much, and they could have made some extra cash for R&D rather than giving the chip away with all the monitor makers getting it free and still jacking up the damn prices. I like the free mindset, but I've seen it over and over again either not get adopted or some other company get fat off the freebies.

wiesner8 · 1 year ago

does this mean i need to sell my dual R9 Fury X cards :(

Salty Mac · 1 year ago

no way, dual r9 fury x do some work in games.

[HFA]Dragonstongue · 1 year ago

Did you seriously state "WFCCTECH as from the always trustworthy"? LMAO. Even a blind squirrel can manage to get a few nuts now and then; I would not trust one to feed me, however.

Dave James · 1 year ago

Hehe, no it really wasn't serious. Was hoping folk might sense the sarcasm there ;)

daroule1982 · 1 year ago

Couldn't have happened at a better time. I just went shopping for a new GPU after I found out my GTX 750 Ti wasn't so great for game modding. This is going to arrive just in the nick of time!

Teemu · 1 year ago

It is an overheater. More watts than the Ti, with only the same basic cooler as the Ti. Why wouldn't the overheating issues of the Ti limit Vega?

UbajaraMalok · 1 year ago

Those guys at AMD are fucking kidding! No Vega until the end of July!

Suros# · 11 months ago

What's this guy talking about when he says you'll need a 1000 watt power supply? I've got a GTX 770 pulling 230 watts, and my 750W supply is easily overkill for it. Pretty sure I don't need an extra 250W to supply a ~300W card. Estimated draw of my whole build is only around 450-500W under extreme load.

msroadkill612 · 11 months ago

Does anyone give a rat's about power usage when the power is needed?

What matters to most, imo, is how little is wasted when it's idling, yet that barely rates a mention usually.

xGhostFace0621x · 10 months ago

I really can't wait to see what Vega has to offer. This reign of Nvidia GPUs being the go-to for gaming needs to end. We need some pretty good competition in the market.

danteandvirgil · 10 months ago

Sounds like AMD is releasing another set of cards that on paper carry significantly higher specs than the competition, yet only just reach the same, if not a lower, level of performance. Aside from the HBC - if developers actually use it - performance-wise there's nothing to be excited about. If the lower Vega is as good as a GTX 1070, then I fail to see what's special about it. The GTX 1070 is a 980 Ti, and that's old now. Still powerful without question, but it means it's taken AMD quite a while to catch up to it. How is it AMD cards carry twice the raw TFLOP rating of Nvidia cards yet fail to demonstrate it... Despite what benchmarks have already shown, I wouldn't be surprised if the GTX 1070 outpaces all three cards. AMD has been cursed with bad drivers for years, and this is always the topic of discussion. Nvidia cards see few problems in that regard. Trust goes far.

MadMage999 · 10 months ago

Let's talk about the elephant in the room: the GTX 1080 Ti. Where are those numbers? No way I'm buying a 1080 or an RX Vega!

MRlacp · 10 months ago

These tests don't mean anything, because the AMD RX Vega 64 is superior to the GeForce GTX 1080. On other review sites the RX Vega 64 appears clearly ahead of the GeForce GTX 1080 in the benchmarks. It sits between the 1080 and 1080 Ti in the majority of the tests.

7UKECREAT0R · 9 months ago

Can it run my minecraft at 8 f p s???/?

mikeweatherford7 · 9 months ago

These cards respond extremely well to undervolting. The difference can be dramatic: higher stable clocks, at much lower temps, and lower fan levels. My reference Vega 56 can do 1550MHz (actual full 3D load clock) at 1040mV, with a 950MHz HBM clock. At these settings it's significantly faster than a 1070, on the heels of a 1080, at reasonable temps and noise levels. This result seems to be typical among Vega 56 owners.

Tkon · 5 months ago

What the article does not mention is that, although HBM2 may not have been justified for gaming purposes, it was apparently very valuable for mining. In fact the Vega 56 surpasses the Vega 64 in mining efficiency, i.e. hash rate per watt consumed. And both of them completely destroy the other rival cards in XMR mining performance, exceeding the closest match by 50%. That is why these cards have literally vanished from the market. People are selling even used ones at a ridiculous 4x MSRP! Sometimes I wonder if AMD should maybe make two distinct kinds of cards, one for graphics and the other for mining...

Iluv2raceit · 5 months ago

With the super-inflation of Vega card prices, it just doesn't make any sense to buy a Vega 56 or 64 card. Isn't the whole point of buying an AMD card to save money and get great performance? Well, that point is now moot, thanks to the bitcoin mining hype. The fact that Vega cards are not any better or more powerful than Nvidia cards for bitcoin mining has apparently been lost on those who don't know how to do their own research.

joki8688 · 1 year ago

A GPU as good as the 1070, maybe the 1080 (in real applications), released a year later, when Pascal will be even cheaper and NVIDIA will have released a faster version of Pascal...

And if you think those cards will be cheaper than year-old Pascal, you are CRAZY (new type of memory, for example). AMD was always cheaper for a reason: because they had older and slower products. In the best scenario we will have the same products, for the same price, just in different colours...

I'm sorry, but I don't care about Vega.

jr2 · 1 year ago

Cared enough to troll...
