When Nvidia lost "G" from "GPU" - my thoughts about GTX 750 (Ti) a.k.a. GM107

agent_x007


Hello everyone.

I recently stumbled upon a problem we haven't had so far (and it's quite scary to me), but let me start from the beginning:

In Q1 2014, Nvidia launched a new product called GM107, or as they called it:
"first-generation Maxwell GPU".
Everything seemed great:
it consumed a lot less power than the previous GK107 (a Kepler-based GPU) while still delivering a performance gain.
But there was a small problem with it.
In the whitepaper (LINK) we read:
From a graphics features perspective, our first-generation Maxwell GPUs offer the same API functionality as Kepler GPUs.
So we get all the architectural advances, but without any new graphics functions.

You could now say: so what, Kepler was similar, right?
And that is almost true.

But then we get to the scary part:
A few months ago Nvidia launched second-generation Maxwell, with VXGI, MFAA, a better memory compression algorithm, DX12, etc.
ALL Kepler-based GPUs support the same DX11 and TXAA graphics functions, but since the GTX 980/970, that is NOT the case with Maxwell-based GPUs.

End result:
About a week ago Nvidia released to the public a new tech demo for the "Maxwell architecture" called:
Apollo 11: LINK
It uses the new VXGI technology to get better image quality.
BUT this demo will NOT run on non-Maxwell GPUs.
AND GM107 is NOT a Maxwell GPU, according to this official NV demo.
So my problem is: what is the point of having a naming scheme for GPU architectures if GPUs from the same architecture do not support the same graphics functions?
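
(Side note on how a demo can even tell these cards apart: first-generation Maxwell (GM107/GM108) reports CUDA compute capability 5.0, while second-generation Maxwell (GM204/GM206) reports 5.2. The sketch below is purely my own illustration of that kind of check - it is NOT Nvidia's actual code - but it shows how easily a "Maxwell GPU not found" gate can exclude GM107:)

Code:
// Hypothetical illustration (not Nvidia's code): gating a feature on
// second-generation Maxwell by CUDA compute capability.
// GM107/GM108 (first-gen Maxwell) report 5.0; GM204/GM206 report 5.2.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        std::printf("No CUDA-capable GPU found\n");
        return 1;
    }
    std::printf("%s: compute capability %d.%d\n", prop.name, prop.major, prop.minor);

    // Treat only SM 5.2 (and newer) as "Maxwell" for this feature,
    // which would reproduce the lock-out behaviour described above.
    bool secondGenMaxwell = (prop.major > 5) || (prop.major == 5 && prop.minor >= 2);
    if (!secondGenMaxwell) {
        std::printf("Maxwell (second-generation) GPU not found - feature disabled\n");
        return 1;
    }
    std::printf("Second-generation Maxwell detected - VXGI-class path enabled\n");
    return 0;
}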

It's a GPU - so it stands for "Graphics Processing Unit", right?
The "G" comes first and should be the most important part from NV's perspective.
But based on this, GM107 is not a new-generation GPU.
It's a new-generation "PU", or "Processing Unit".
Or basically: a rethink of doing the same things, just better.

That's why, to me, it's not enough to call GM107 a "new generation GPU", and Nvidia apparently agrees with me by locking the Maxwell demo to second-generation Maxwell cards only.

Conclusion:
What I'm afraid of is that Nvidia is starting to lock graphics features to the markets they target.
For example: DX12, VXGI, etc. could end up locked to mid-to-high-end cards because... architecture naming is irrelevant (plus they get more ca$h from this).
And I don't want, for example, "Big Maxwell" to be called a "Pascal architecture GPU" when it won't support the same features as the rest of the "Pascal architecture" lineup.
How about you guys?

I may be overreacting here, sure. I'm just deeply concerned about what future NV products will look like from a graphics feature perspective (or GPU perspective).

Thank you all for reading.
 
It's Nvidia being Nvidia and milking things.
Whodathunkit.
(yes AMD do it too, I know)

GPU companies will always do this. I don't know why you'd trust them to support older generations of cards that are, in all fairness, not that far behind the current gen in terms of performance.
 
Not support older GPUs?
I'm talking about MAXWELL here, which was RELEASED not even a year ago!

What would you say if, to get DX11 support for example, you needed at least a $300 GPU, because DX11 support was limited to the mid/high-end market?
We're talking about a GTX 560/550 Ti class GPU without DX11.
Doesn't that seem scary at all?
 
I was also a little bemused at the Apollo tech demo for "Maxwell" after downloading it and getting the message "Maxwell GPU not found"... eh! It's a pain but, like Barnsley said, they have always done it. I guess in this market the only way you can be 100% certain with GPUs is to purchase flagship models whenever you upgrade.
 
Well, the 8600 series (or G84) supported DX10 and all the other graphics features from G80...
GF106 supported all the features of the GF104 line as well...
GK107: same story - everything that makes Kepler GK104 what it is, is there.
And now we get this "Maxwell"... yeah.
I don't see them "doing it" from the beginning.
 
We're talking about a GTX 560/550 class GPU without DX11.
Doesn't that seem scary?

No, not really. I'm genuinely surprised that cards like the 740/750 (for example) support DX11.

The x60 cards (560/660/760 and their Ti variants) have always been the mainstream gaming cards from Nvidia. I really wouldn't be surprised if Nvidia got rid of DX support for the low-end cards, as they're not even marketed (with any effort) at gamers anymore. They've not done SLI for the x50 series for a while now (excluding the 650 Ti Boost), so why support DX?

x50 and below for office/media PC duties, x60 and above for gaming.

-edit- Don't get me completely wrong, the 750 Ti is still very much a marketed gaming card, but I honestly don't know how long cards like it will last now that Nvidia have stopped being so uppity with prices.
 
No, not really. I'm genuinely surprised that cards like the 740/750 (for example) support DX11.

The x60 cards (560/660/760 and their Ti variants) have always been the mainstream gaming cards from Nvidia. I really wouldn't be surprised if Nvidia got rid of DX support for the low-end cards, as they're not even marketed (with any effort) at gamers anymore. They've not done SLI for the x50 series for a while now (excluding the 650 Ti Boost), so why support DX?


I would have loved to see what the 750/Ti could have done in SLI...
 
@up Even in SLI, they still couldn't make the Maxwell demo I wrote about earlier work :P

I think current users of the GTX 750 (Ti) should be OK, at least until DX12/DX11.3 games arrive.
But a few big games with "GPU hogging" on the AC:U scale may change that faster...
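
(For what it's worth, whether a card "supports DX11/DX12" in a game usually boils down to which Direct3D feature level the driver exposes for it. A rough C++ sketch of that query - my own illustration, not code from any real game - looks like this:)

Code:
// Rough illustration (not from any real game): asking Direct3D for the
// highest feature level a GPU's driver exposes. A DX11 title would bail
// out below FEATURE_LEVEL_11_0, no matter what the box says.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Note: on pre-DX11.1 runtimes, requesting 11_1 can fail with
    // E_INVALIDARG; a real game would retry without it.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_10_0;

    // Create no real device objects - we only want the feature level back.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, ARRAYSIZE(requested), D3D11_SDK_VERSION,
        nullptr, &achieved, nullptr);

    if (FAILED(hr)) {
        std::printf("Could not create a hardware D3D11 device at all\n");
        return 1;
    }
    std::printf("Highest feature level: 0x%04X\n", achieved);
    return (achieved >= D3D_FEATURE_LEVEL_11_0) ? 0 : 1;  // 0xB000 = 11_0
}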
 
I'm not surprised; they quite regularly go in an anti-consumer direction, whether it's only allowing PhysX to work when the GPU is being used as the primary display adapter (or along those lines at least), stopping people from using one alongside an AMD card to get the best of both teams, or not supporting FreeSync, which not only reduces monitor options but also makes it harder for them to update to new DisplayPort versions.

I'm happy to go with either side but Nvidia are not looking very appealing with recent moves.
 
It's Nvidia being Nvidia and milking things.
Whodathunkit.
(yes AMD do it too, I know)

GPU companies will always do this. I don't know why you'd trust them to support older generations of cards that are, in all fairness, not that far behind the current gen in terms of performance.

AMD doesn't do that. Well, they have CGN. Everything from the HD 7xxx series up to the 295X2 is CGN. CGN has revisions. CGN can get some stuff added for all gens (Mantle, DX12), but some other stuff (TrueAudio) can't be done for the older-gen products. They are still GCN though, and I've never heard AMD deny it; even their marketing leads say that all CGN-arch GPUs are the same in terms of features.
 
The reason the 750/Ti doesn't support VXGI is simply that the card is not powerful enough, so putting a feature on a card that won't be able to handle it makes no sense at all.
 
Tom Petersen did say in an interview that VXGI was definitely 100% compatible with AMD/Kepler-based hardware, but that they were primarily optimizing it for Maxwell for the time being. I'm inclined to say that it will most definitely be a Maxwell-only feature now, and who can blame them? Maxwell is such a work of art as far as architecture goes.

Also, while the 750 Ti still astounds me with pretty much every game I throw at it, it's still shadowed performance-wise by the GTX 480. Three generations and four years later, look how far we've come... side by side, nearly equal in performance to the aged flagship while drawing less than half the power. So, say, four years from now, mid-range cards are going to be the equivalent of the 980 and run on BIOS batteries :lol:
 
AMD doesn't do that. Well, they have GCN. Everything from the HD 7xxx series up to the 295X2 is GCN. GCN has revisions. GCN can get some stuff added for all gens (Mantle, DX12), but some other stuff (TrueAudio) can't be done for the older-gen products. They are still GCN though, and I've never heard AMD deny it; even their marketing leads say that all GCN-arch GPUs are the same in terms of features.

Yikes... fixed that for you. At least call it the right name... it's not "CGN".

Also, TrueAudio is only supported on newer hardware because they have the DSP on the PCB. It's not a software thing, it's hardware.
 
AMD doesn't do that. Well, they have CGN. Everything from the HD 7xxx series up to the 295X2 is CGN.

There are actually GCN 1, GCN 1.1 (the HD 7790 onwards, also most newer APUs and the 290/290X) and GCN 1.2 (the 285 and whatever the next gen is). 1.1 is more important for APUs, tbh.
I was referring to re-branding previous-gen cards in my comment anyway.
 
The reason the 750/Ti doesn't support VXGI is simply that the card is not powerful enough, so putting a feature on a card that won't be able to handle it makes no sense at all.

The performance of a card wasn't a problem for the features it could support (at least in the past).
Like someone said earlier:
I bet NV will make a GM207 for a GTX 950 (or some other name) with VXGI, DX12 and MFAA support.

We had, for example, the FX 5200 back in the day WITH DX9 support (true, it was HORRIBLE at it, but it was supported).
The GF 8300 GS supports DX10 and the GT 410 supports tessellation - those are more recent examples.
But they were all absolutely the slowest models of their generations.
GM107 isn't THAT slow.
 