Dragonstongue profile
Dragonstongue Avatar
Dragonstongue commented on
2 Months ago
AMD and Nvidia graphics card prices may start to rise with Ethereum ASICs "a non-starter"
Dragonstongue Avatar
"may start to rise" really, oh so you mean over the last two years because of mining this had ZERO impact on pricing from the sellers (such as newegg, amazon, ebay, kijiji) greed let alone potentially from the "makers" (Asus, MSI, XFX big time on XFX etc)

give me a F^^^ing break.

Mining has been around for quite a few years now (all the way back to the AMD Radeon 4870/5870 time frame); it only recently became remotely worth doing on Ngreedia cards, because their mining performance lagged a massive amount behind Radeons, something like 3-8x.

I call BS on any seller, analyst or whoever saying "it's because of mining that GPUs are priced the way they are" while totally discounting greed from the makers/sellers, VRAM shortages, and the millions of units sold to other "sources", professional or otherwise.

This so-called return to near-MSRP pricing? Ahahaha. I have not seen ANY evidence of it, not with Radeon 200/300/400/500/Vega GPUs, and it's a similar story on the Ngreedia side of the equation.

seems if you want "fair price" you have to take the chance (and have the money) to buy them as soon as they are available, otherwise you run into something that should be price ~$150 Canadian MAX being $180+ tax/shipping, the ones that should be $200-$250 MAX end up being $400-$700 plus tax/shipping, and the ones that should be $350-$550 routinely going for $500-$1100 (or more for all these brackets) if you are even remotely willing to pay the massive % of overpricing in the range of 15-150% (or you go for the worst model selection and have to overpay just to send it back because the quality control is absolute crap from what it used to be with MOST makers, seems they really do not care to QC test things as thoroughly as they should, not make sure they are firmly packed/handled when it comes to shipping, why? because they are not the ones who paid the big bucks to buy the stupid things it seems IMO)

As for placing the blame on "GPU prices may go up because the ASICs are not selling as well": what a flat-out load of BS. Way back when FPGAs and ASICs were released for Bitcoin, they had virtually zero impact on GPU pricing, whereas NOW I have the feeling it is nothing more than greed talking. Why sell for $250 and make a scant 5% profit margin when you can overprice it to $350-$550 and make a solid 25-50%+ profit instead? "They really, really want it, they have no choice, miner or not, ahaha, let's go buy some champagne."
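For what it's worth, that margin math roughly checks out. Here is a minimal sketch of the arithmetic; the unit cost is my own assumption, back-calculated from the $250 / 5% figure above, not a real bill-of-materials number:

```cpp
#include <cstdio>

// Gross margin as a fraction of the selling price.
static double margin(double price, double cost) {
    return (price - cost) / price;
}

int main() {
    // Assumed unit cost: back-calculated so a $250 sale yields ~5% margin.
    const double cost = 250.0 * (1.0 - 0.05);  // ~$237.50, illustrative only

    const double prices[] = {250.0, 350.0, 550.0};
    for (double price : prices) {
        std::printf("sell at $%.0f -> %.0f%% margin\n",
                    price, 100.0 * margin(price, cost));
    }
    // Prints roughly 5%, 32% and 57%, i.e. the "25-50%+" range described above.
    return 0;
}
```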

Seems if you ARE NOT mining and don't have mommy/daddy to pay for it, and actually have to work for a living (even on a good income), you get told to bend over and hold your knees!

Really, really sucks, because there have been some really nice CPUs, motherboards, cases, etc. released over the last 2-3 years, yet most of the "nice things" are obscenely overpriced unless you are "fortunate enough" to be USD-based, where they cannot greed-price things nearly as easily (or risk being dragged into a class-action lawsuit). But do they really care? Nah, because they make $$$ hand over fist; if it mattered, someone like Nv would not use the lowest-quality capacitors and so on.

Really sucks to be buying in CAD or EUR etc., I can tell you that outright.

As for Intel and its 10nm woes: yeah, for the "leader" in fab design they really screwed that one up. They had a good lead on everyone, then all of a sudden AMD "caught up" across the 45nm/32nm/20nm nodes and beyond (unless Intel let them catch up?).

I just hope prices stabilize BEFORE the "new gen" of CPU/GPU/RAM releases, otherwise many folks who are "waiting to buy" may just say F it and get a console, go without, or freak out when they realize how much "extra" it costs compared to 3-5 years back. It really is disgusting: some awesome parts, but man are they pricey (even though they are cheaper to make now than ever before).
Dragonstongue Avatar
Dragonstongue commented on
5 Months ago
Best free Steam games
Dragonstongue Avatar
IMO Path of Exile may once have been great, but the developer focused on making it pretty rather than making it run well.

Everyone I know has horrendous problems playing that game: some have systems WAY more expensive and top-of-the-line than mine could ever hope to be, others are about the same as mine, and others are very low-spec.

Seems it really does not matter which generation of CPU or graphics card, how much RAM, or which drivers or OS you are using: either it plays perfectly, or (more usually) it is a mix of sometimes butter-smooth and sometimes an amazing lag fest.

All those pretty spell effects trash whatever performance you should have. For example, my system can hit the 140fps range, but in certain zones it drops to an at-best-steady 40fps, and as soon as certain monster, player, or environment effects kick in it becomes a very jittery lag fest.

The engine they are using was not designed for the pretty light show they put on top of it; hell, it is not even properly optimized for DX11 hardware or multi-core support.

To each their own. I played it for ~2 years all told and spent some $ to "support it". One patch everything is excellent, the next it's broken, another patch it's back to almost as good, another it's worse than the first breakage, and so on. You file a support ticket to see if the dev can help: no answer AT ALL. You ask the community for help: you either get told the same old "try this, try that" even after showing "proof" that you did, or you get told "STFU and go play something else then".

PoE may be "considered" F2P, but I consider it "freemium": if you do not put at least a little money into it, it becomes extremely limited, mainly because you run out of stash space super quick and the dev refuses to increase character bag space, so it's a constant back-and-forth to town, and some "parts" for crafting are crazy expensive for no real reason.

All in all, IMO, what is by design a great game in its own right (it thinks outside the box) becomes a broken mess the further you get into it, unless you are VERY lucky and your system plays it well.

So no, I do NOT consider it a Diablo 3 killer; at least D3 plays perfectly for pretty much everyone, instead of just the "few" of those who play it.

Going to try out that golf one though, let's hope it's fun ^.^

I really wish Armored Warfare had not kicked Obsidian Entertainment out; it was another game shaping up to be top-notch until the "owner" screwed over OE and the player base (many of whom, myself included, put in a fair amount of $ to support it).

I wish they made a new F2P Bomberman on Steam... miss that game. Rampart would be another good one to kill some time with and play competitively. I had a great idea for an advanced chess game, but I am not a coder :(
Dragonstongue Avatar
Dragonstongue commented on
6 Months ago
Intel have quietly unveiled their quad-core AMD hybrid Core i7 8809G processor... kinda
Dragonstongue Avatar
If anything it just goes to show AMD has "room" to raise its power budget and pair a beastlier, less power/TDP-constrained Ryzen with a Vega variant. I am surprised they did not do so, unless it was their plan all along to "give" Intel this "win", i.e. above 35W TDP becomes Intel territory, because at 35W or below Ryzen has that sewn up tight.

I would imagine the Ryzen-based parts will gain a good chunk of extra performance over what we currently see once Vega is "fully active"; some things are not currently fully enabled, so there is at least some bottleneck, or at least it is not fully able to "stretch its legs".

I would also imagine the folks who built both the graphics core and the processor core know a bit of "secret sauce" for coupling them together very, very well. Alas, they were working on a limited budget with limited developer $ for many years, that is, until 2016/2017 came around, as we have seen.

(Not to mention the Ryzen-based chips can at least handle extra heat without running crazy hot for no reason, e.g. a better thermal interface/design.)

There is also a "new bug" recently discovered in x86 that affects something like 80%+ of all processors released; but since AMD chips are x64 with x86 ability "tacked on", they do not seem to be affected by it at all.

^.^
Dragonstongue Avatar
Dragonstongue responded to Gen's comment in
1 Year ago
AMD Vega reviews, news, performance, and availability
Gen Avatar

Intel don't have the edge. Nvidia do. Ryzen has been shown to dominate Atoms and Skylake at super low draw (sub 20W) and there is no low draw Kaby.


Pascal is so many miles ahead that I think it will take AMD more time to catch it. The reason it is less efficient may also be the reason it is so good at coin mining, however, so that may be against AMD's interests.


It's just a matter of time before Ryzen starts coming out in mobile form though.

Dragonstongue Avatar

The reason they "appear" power-efficient is that Nv chopped a great deal out of Pascal to make it sip power (who knows, the power meters read the digital footprint, Nv uses fancy digital circuitry, and it would not be the first time they played BS games). But of course it uses less power: there is not as much to power. They chose to chop a large chunk of things away to open up the power budget and clock the transistors higher.


Clock for clock, density for density, Polaris and Vega are by far the more complete package, BUT that comes at the cost of more power to drive it all. If AMD could clock Polaris or Vega up to similar clock speeds they would royally own Pascal, but they cannot.


At least Polaris/Vega are not built as "cheap as possible" while still demanding a "branding tax", the way Nv does with products that cheap out on voltage regulator (VRM) design, use lower-quality capacitors, and are that much more prone to a shorter life (lower thermal threshold).


Anyways, I would really love to see prices stabilize some. AMD and Nvidia both need to stomp on their customers (Asus/MSI etc.) so that their customers (me and you) can get cards closer to the price they should be. Putting an extra $50-$120 on top of an already expensive product is not good for anyone except the caviar-eating execs ^.^

Dragonstongue Avatar
Dragonstongue responded to deksroning's comment in
10 Months ago
AMD’s Vega isn't finished yet, Vega 11 goes into production to replace Polaris
deksroning Avatar

Vega 56 already outperforms GTX 1070.

Also bear in mind that when Vega 56 is undervolted on the core and overclocked on the memory, its performance gets close to or exceeds the GTX 1080 (with lower power draw than the 1080).


AMD overvolts their GPUs to increase yields... hence undervolting fixes power efficiency and performance (as it also removes thermal throttling).

But AMD is also making their GPU's on a manuf. process that's not suitable for high clock speeds. Nvidia has access to Samsung's manuf. process that allowed them to make Pascal (which is nothing more than an overclocked Maxwell).

Finally, Vega's architecture apparently is not optimized for games per Raja Koduri's statement, and there are several features in Vega that need to be used by developers in order to see more performance.

Dragonstongue Avatar

Nvidia uses TSMC, NOT Samsung. GF (GlobalFoundries) uses/shares the IBM/Samsung 14nm process, whereas TSMC uses its own 16nm process. The main reason the Nv 1000 series is "so fast" is that they optimized the design to cut out the "extras" and focus purely on gaming tasks, not the advanced stuff used in hashing or a "true" DX12 feature set, which Polaris/Vega are capable of far, far more than the GTX 1000 series.


If Nvidia were to use more or less the same transistor density and extras to get everything DX12 offers, without relying on software tricks/hacks to make them "seem" as fast as they appear, their apparent efficiency would go into the toilet and their raw clock speed advantage would drop like a stone. They can push transistors to a higher frequency if the design is "lean", read: optimized for clock speed alone; but if it is "fat" enough to do more, it simply cannot clock as high.


Polaris/Vega may not be able to clock as high, but they deliver a lot of performance for the clocks they CAN hit. Much like Ryzen, which may have a clock speed deficit versus various Core i models yet competes on near-even footing at a lower clock speed, they might suffer some in apps/games "optimized" for high clocks/IPC, just as happens against the GTX 1000 series.


Anyways, long story short: Vega was meant to be a workhorse, just like pretty much every Radeon ever released, whereas GTX cards for years now have been moving more and more towards pure gaming grunt with a leaner, less "fat" design.


Also, to the guy below me: supposedly it is NOT Volta that was rumored to use GDDR6 (it was supposed to use either HBM2 or HBM3 once AMD's "first access" is gone); more likely it will be faster GDDR5X for Nv, with AMD using GDDR6 first.


Historically AMD adopted the newer memory standards and much of the newer feature set in the various DX/OpenGL versions first, and Nv came in AFTER others proved them useful, so Nv could then shove multi-millions of $ down the throats of folks like MSFT to optimize/tweak for their specific needs at the cost of gamers/devs.


Tessellation is a prime example of that. Had MSFT NOT basically forced AMD to build to NV's whims, with NV crying "unfair advantage" after AMD had spent many years and many millions of $ implementing it generation after generation, AMD would likely have stomped Nv to the curb. Instead, NV was allowed to "trick" the software to make themselves appear so much better at it, even if the density of the final image was subpar, just faster than AMD. Same with PhysX, before NV took away Radeons' ability to use it (because it made NV cards many tiers higher look like crap in comparison).


Radeons tend to be about raw horsepower/grunt, whereas GTX cards have tended (for many years now) to be the highly tuned type in comparison.


anyways :)

Belimawr Avatar
Belimawr responded to Dragonstongue's comment in
3 Years ago
AMD unveil new Radeon GPU range: the R7300 and R9300 series
Dragonstongue Avatar

See, that is not 100% true. The ONLY card Nvidia currently has that 100% supports DX12 is the 980 Ti; the rest are some support here, some support there. DX11.1 is essentially a subset of DX12: to be fully compliant with DX12 you also have to support 11.1 to reach DX12, 12.1, or 12.2 level support. I was looking at this yesterday. Nvidia chose not to support DX11.1, so many of the DX12 features will be nothing but software-driven, not hardware, on many of Nvidia's cards; something like the advanced tiling used for tessellation will still be usable. But the fact of the matter is that the MOST IMPORTANT DX12 features are currently only available, and hardware-driven, on ALL GCN-based products and basically just the 980 Ti, because Nvidia basically decided to pick and choose what they will and will not support. So for them to even remotely claim FULL DX12 support is an outright lie, whereas AMD CAN say it without lying at all; they chose to skip some of the basically unneeded things in DX12, but besides that, all GCN cards are DX11.1 compliant and so also get the lion's share of what DX12 brings to the table. And that's fact: not software-driven, but HARDWARE, which is bar none always better.
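The optional-feature tiers being argued about here can be queried directly from the driver on whatever card you have. A minimal sketch, assuming a Windows box with MSVC and the D3D12 SDK headers; it only prints a handful of the option fields and doesn't settle the 11.1 argument either way:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device on the default adapter at the minimum feature level we accept.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device)))) {
        std::printf("No D3D12-capable device/driver found\n");
        return 1;
    }

    // Ask the runtime which optional DX12 features this GPU/driver exposes and at what tier;
    // anything not exposed comes back as NOT_SUPPORTED (0) or FALSE.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    std::printf("Resource binding tier:           %d\n", static_cast<int>(opts.ResourceBindingTier));
    std::printf("Tiled resources tier:            %d\n", static_cast<int>(opts.TiledResourcesTier));
    std::printf("Conservative rasterization tier: %d\n", static_cast<int>(opts.ConservativeRasterizationTier));
    std::printf("Rasterizer-ordered views:        %s\n", opts.ROVsSupported ? "yes" : "no");

    device->Release();
    return 0;
}
```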

Belimawr Avatar

Guess you should go tell them they are lying...


http://blogs.nvidia.com/blog/2015/01/21/windows-10-nvidia-dx12/


Shouldn't be that surprising really, when it's Nvidia and Epic that MS has been working with for nearly every DX12 demo.
