Nvidia GTX 980 Ti Review

Memory bandwidth can still get you far if the card is designed around it and it goes beyond what GDDR5 can deliver. In games I'm unsure how it will work out; obviously the really heavy loads come at 4K. Also remember that HBM isn't just capable of higher bandwidth: the latency is much, much lower thanks to the architecture and the fact the stacks sit so close to the core. For games right now, latency is probably more important than peak bandwidth, I would think. Also bear in mind it's running at 500MHz and still outpacing GDDR5 by a long way. So another way to think of it is GDDR5 overclocked by another couple of GHz (example :p), i.e. just a faster version. How that translates to games, we will see :)
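
To put rough numbers on that (illustrative, commonly quoted figures rather than anything from this review: a 4096-bit HBM1 interface at 500MHz, i.e. 1Gbps per pin with double data rate, against a 384-bit GDDR5 interface at 7Gbps effective), here is a quick back-of-the-envelope sketch:

```python
# Back-of-the-envelope peak memory bandwidth comparison (illustrative figures only).
# Peak bandwidth (GB/s) = bus width (bits) * effective per-pin rate (Gbps) / 8

def bandwidth_gb_s(bus_width_bits, per_pin_gbps):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits * per_pin_gbps / 8

# HBM1: 4 stacks x 1024-bit at 500 MHz, double data rate -> 1 Gbps per pin
hbm1 = bandwidth_gb_s(4096, 1.0)     # ~512 GB/s

# GDDR5 on a 384-bit bus at 7 Gbps effective (980 Ti / Titan X class)
gddr5 = bandwidth_gb_s(384, 7.0)     # ~336 GB/s

print(f"HBM1:  {hbm1:.0f} GB/s")
print(f"GDDR5: {gddr5:.0f} GB/s")
```

So even at 500MHz the wide-and-slow HBM bus comes out well ahead on peak bandwidth, which is the point above.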

I agree; the real advantage of HBM is going to be latency, as that could mean smoother gameplay and better frametimes.

The extra bandwidth that comes with HBM will, I think, do very little; once you have enough, any more goes unused.
 
Say what?...

A word he used in describing it.

I looked it up in the Concise OED since... I misspelled it.

spondulicks
pl. n. (British informal): money.

ORIGIN
C19: of unknown origin.

I know a little bit of British slang, but...

We are a people separated by a common language ;)

(Could be worse, ever try to understand South Pacific Pidgin?)
 

No idea what you just said mate :huh:...
 
I'm a bit curious why they suddenly rushed the 980 Ti out. Is it because they want as much revenue as they can before the 390X arrives, or will AMD actually be able to take the torch for best GPU this time? It just seems odd, since they already had the Titan out and could presumably have earned good money on that before releasing this. We will just have to see.
 

Not a lot of us can afford the Titan at its price point, so Nvidia releasing something good now is a no-brainer. They're hitting a much lower price point with a high-performance GPU and racking up sales as they go. Many of the people buying from them now were waiting for Fury to drop; the GTX 980 Ti forced them into action.

Crazy like a Fox.
 

As Chrazey said, it's very limited stock, so it does seem rushed. But I see your reasoning; the only downside is that Titan X owners must feel a bit let down because of this, but that's the PC race I guess.
 
I'm not a Titan X owner; I was considering one though, and I wasn't aware this was coming.

Well, OK. Either way, it was obvious that the 980 Ti would be to the Titan X what the 780 Ti was to the Titan:

780 < Titan < 780 Ti < Titan Black (at stock, without end-user OC)
 

Why would Titan X owners feel let down? Neither the 980 Ti nor the AMD Fury X will get close to matching the Titan X @ 2160p.

12GB of memory is a very solid argument, and it's the reason a lot of people bought the Titan X.
 

But it's a silly amount of VRAM and a silly argument.

They've paid £400 for 6GB of extra VRAM. And no matter which way you look at it, or argue it, that's ridiculous.

Up until the Titan X launched, people weren't using anywhere near 6GB of VRAM. Just because you have it, and can use it, doesn't mean it's making a difference or making anything better.

Now granted, the Titan X may just be fast enough to one day actually use that extra 6GB of VRAM, but you would be a dreamer to think it will. Why? Because we get what the consoles already have. That means the textures, maps, etc. are already designed and made, and developers won't go back and redo it all to make use of what a PC could offer.

So, for example, we may never even see a real, actual 4K game. What we will see is those console textures upscaled to 4K and nothing more. And that means games won't use any more VRAM than the consoles have available, which is about 6GB once you take away the roughly 2GB they reserve for themselves.

So for this generation, at least, 6GB is the magic number.

Having said all of that, what's the point in talking sense or reason in this community? It's clearly all about showing off and having the best, no matter the price, and that's what has alienated me and made me not give a crap any more. Over time technology gets cheaper. In 1990 a passable TV set cost about a grand; now you can get a flat-screen TV for about £150. That's how it works. It's the same with refrigerators, microwaves, mobile phones* and so on. The RC car I just bought for £350 cost over £600 in 1990. So over time these things have actually become cheaper, not more expensive.

GPUs are no longer objects for gamers to play games with. They're status symbols: stupidly priced show-off items that are for anything but gaming.

And it shouldn't be like that. I remember back in the late 90s a top-end GPU or accelerator would cost you about £150, and it stayed that way until Nvidia got rid of 3dfx. Then all of a sudden it was £500 or more. And the more time goes on, the more they raise the price and the more people they price out of their own market.

And the more they do that, the more fanboys and brand whores show up, and the more expensive it all becomes. Because it's no longer about an item you plug into a PC to play games with; it's all about showing off and rubbing other people's noses in it.

Honestly? Pick up a Titan X, show it to someone who doesn't give a crap about computers, and tell them you paid a grand for it. Then note the look on their face.

Says it all really....

* Mobile phones: you can get a dual-core 1.3GHz 5" Android phone for £60 or thereabouts these days, and it will do pretty much anything a £600+ iPhone will do. At least in the mobile phone market there are other competitors and manufacturers to keep prices sensible. I had a Swees 5" phone for over a year and I absolutely loved it.
 

Check the memory usage

[Screenshot: Mw2Wzrj.jpg, showing the reported VRAM usage]

4GB AMD Fury and 6GB 980 Ti cards are not going to get the job done.

Game, set, and match to me, I believe. :)

Satisfied Titan X owner. :)
 
Just because you are using that much VRAM on a Titan X does not mean you would use the same on other cards though. That's what you're missing. I use less VRAM on my Titan Blacks at 4k in GTAV than people running a 980 @ 1440p.

It's all down to the drivers and how they handle textures on each specific card they are talking to.

I've tried in vain to make my Titan Blacks use more than 4gb but I've yet to see it.

Edit: As for the Fury? No idea, because I don't know yet how those cards actually handle memory. What I do know is that for years and years certain cards from certain manufacturers have set aside some of your physical RAM to use as a sort of buffer.

But until HBM launches and we get to understand it? No idea. Tom seemed pretty convinced that it wasn't a problem, and he knows more than we do about Fury.
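
For anyone who wants to see what their own card actually reports, here is a minimal sketch using NVML via the pynvml package (the package and GPU index 0 are assumptions on my part, nothing from the posts above; nvidia-smi on the command line shows the same counters):

```python
# Minimal sketch: read the VRAM usage the driver reports via NVML.
# Assumes the pynvml package is installed and GPU index 0 is the card of interest.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)    # first GPU in the system
mem = nvmlDeviceGetMemoryInfo(handle)     # .total, .used and .free are in bytes
print(f"VRAM used: {mem.used / 1024**3:.2f} GiB of {mem.total / 1024**3:.2f} GiB")
nvmlShutdown()
```

Bear in mind this shows what has been allocated on the card, which is exactly the allocated-versus-needed distinction being argued about here.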
 

I have also got original Titans, and I can assure you that if you load up Watch Dogs @ 2160p maxed on yours, you will run out of memory.

The game uses 8GB, something the original Titans don't have.



Unless AMD are going to rewrite every game ever made, you will find that the HBM cards handle memory exactly the same way as GDDR5 cards.

4GB is 4GB.
 

On a game, Watch Dogs, that isn't even worth playing because it's rubbish.

I'm just not sold any more, dude. We get what, three decent games a year? And every single year we need to upgrade and spend another thousand pounds for those three games.

This year the only game I've actually played is GTAV. The rest do nothing for me at all because I don't like racing games and I hate the theme of The Witcher.

Oh sorry no, Dying Light was pretty good too.

Other than that? I really don't fancy paying £1600+ a year to play one ferking game. Sorry.

Next year Nvidia will launch another Titan-like card, and next year you will need that card to play the one sodding decent game we get a year.

It's madness. Utter, utter madness.

When my Titan Blacks choke (and note I said when, not if) I am buying a console. You get every single game that comes out, and they cost £300 or so every six years or so.

Consoles have also become cheaper. I remember a mate of mine paying about £600 for a PS3; the PS4 was what, £350?

Nvidia can continue to shrink their market and increase their prices, but they can do it to someone else. The absolute last thing I wanted to happen to the GPU sector was "Apple" or "Beats headphones".

But that's exactly what has happened to it.
 
You guys should remember that games will use as much VRAM as they can. If you have 4GB it will use up to 4GB; if you had 8GB on the same chip, the game would extend its usage closer to 8GB. BF4 is a great example of this. Just because a card has 12GB of VRAM and software reports it's using 8GB doesn't mean the game actually needs all of it. It just keeps more data in the pool so it won't have to process it again later. A good modern engine will scale pretty well.
Oh, and 4GB of VRAM doesn't necessarily behave like just 4GB in practice. With improved colour compression algorithms and other VRAM tech, the amount of memory actually required can end up lower, depending on how efficient the algorithms are. But yes, physically 4GB = 4GB.
Just thought I'd clear that up; hopefully it makes sense.
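
To illustrate that "use as much VRAM as you can" behaviour with a toy analogy (this is not any real engine's code, just the allocation-versus-need idea):

```python
# Toy model of a texture cache that opportunistically fills whatever VRAM budget
# exists and evicts the oldest entries only when forced to.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()        # texture name -> size in MB, oldest first
        self.used_mb = 0

    def request(self, name, size_mb):
        """Make a texture resident, loading it only if it is not already cached."""
        if name in self.resident:
            self.resident.move_to_end(name)  # recently used: keep it resident
            return
        # Evict the oldest textures only when the new one would not fit.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, old_size = self.resident.popitem(last=False)
            self.used_mb -= old_size
        self.resident[name] = size_mb
        self.used_mb += size_mb

# The same frames rendered against a 4GB budget and a 12GB budget: the bigger
# card "uses" far more VRAM, but both needed exactly the same textures.
scene = ["road", "sky", "car", "building", "tree", "water", "crowd", "interior"]
for budget_mb in (4096, 12288):
    cache = TextureCache(budget_mb)
    for tex in scene * 2:
        cache.request(tex, 1024)
    print(f"{budget_mb} MB budget -> {cache.used_mb} MB reported in use")
```

Same workload, different budgets: the reported number tracks the size of the card, not what a frame strictly needs.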
 