Trick or treat? Gigabyte's killer clown hints at an imminent launch of Nvidia's GTX 1070 Ti

Gigabyte Clown 1070 Ti

Update September 21, 2017: Gigabyte have alluded to the rumoured Nvidia GTX 1070 Ti on Facebook with a rather spectacular Photoshop fail of a Stephen King's It-style movie poster. If you thought killer clowns were horrifying, wait until you see this...

Your graphics card deserves a good screen, so feed it one of the best gaming monitors around to see it blossom.

The recent cinema release of Stephen King's It has obviously stirred up some creativity over at Gigabyte. They've created their own movie poster, with the title inverted to spell 'TI', a spooky clown holding an Aorus graphics card, and a ten dollar bill floating away from a wallet on the end of a red balloon... Just wait till the World Clown Association hears about this one, Gigabyte.

Gigabyte Killer Clown 1070 Ti

The horror theme and the ‘coming soon’ hashtag seen in the image could imply a near-Halloween launch for the card.

The top of Gigabyte’s TI movie poster also features the tagline, “You’ll float too,” referencing the cash floating away from your wallet, I suppose. Ideally, not too much money will have to float from my wallet to pick up a 1070 Ti, considering a GTX 1080 sits just above the rumoured card in the product stack.

We assume this Facebook post is about the 1070 Ti, but all we know for sure is that it's teasing some Ti model. It's not outside the realms of possibility that the poster references the entry of another 1080 Ti into Gigabyte's lineup.

Original story September 14, 2017: There are fresh rumours filtering out of forums in Scandinavia and the East that Nvidia are prepping a new Pascal-based hammer to bash that final nail into the AMD Vega coffin. That card is apparently the Nvidia GTX 1070 Ti.

Before we get too excited, the only source for this is an image that’s been circulating the interwebs, appearing to show an Asus rig with a GTX 1070 Ti GPU inside it. That’s it. No one’s been able to confirm whether there is a shred of truth to the rumour or whether it’s just some Asus intern dropping extraneous letters onto their marketing shizzle.

But that hasn't stopped the rumour getting fleshed out with alleged specifications: a GP104 GPU with either 2,048 or 2,304 CUDA cores and 8GB of video memory. For reference, the existing GTX 1070 has 1,920 cores and the GTX 1080 has 2,560. The 2,048-core figure sounds more likely, as a 2,304-core card would sit far too close to the GTX 1080 for comfort.

Nvidia GTX 1070 Ti

There’s always a chance that this is just evidence of Nvidia getting bored. With the spluttering launch that has been AMD’s RX Vega cards, there’s almost no impetus for the GeForce engineers to move any quicker with the new Nvidia Volta GPU architecture. Any new Pascal release would kinda be like kicking a sickly puppy.

The existing Pascal cards are quick, efficient, and available, which makes it very difficult for anyone to make a case for buying new AMD graphics silicon. Nvidia’s CEO has gone on record during a recent super-serious investor briefing saying that for the foreseeable future, “Pascal is just unbeatable.”

So, the only reason for Nvidia to release an updated Pascal chip in the GTX 1070 Ti is that they're either bored or determined to plant another brogue into AMD's bruised ribs. Maybe this is why Radeon Technologies Group's Raja Koduri has decided to take a break from GPU whispering until the start of next year: he knows there's more punishment on the horizon.

Poor Vega...

The other big question everyone's asking is where the hell the GTX 1070 Ti will fit into the existing graphics stack. There's not a lot of clear air between the current GTX 1070 and GTX 1080, in either performance or pricing, so dropping a new card in between will surely cannibalise sales of its older siblings. I guess there's always the possibility of a price cut for the GTX 1070 to make a little space for the Ti card, but given the current pricing struggles of AMD's GPUs, there's little need for that.

Whatever the truth of it, however, none of this makes for pleasant reading for AMD's graphics card fans.
