Nvidia Next-Gen GPU PCB leaks - 12GB of GDDR6 memory

With all these delays, I do wonder whether this will actually be more than just a 1080Ti replacement and instead more like the way the 1080 replaced the 980Ti, as in a lot more powerful. Waiting two and a half years for essentially a lower-TDP version of the same GPU with only a slight price decrease would be... ehhhh... maddening.
 
You are confusing Gb (gigabit) and GB (gigabyte). 8 Gb = 1 GB.

Hey, strip out the middle bit (mind-bogglingly insane) and my statement would hold true. :D

Also, sod memory manufacturers. That GB/Gb BS is such a crappy move. :mad:
 
8 Gbit = 1 GByte

So it's correct, I think.
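
For anyone who wants to sanity-check the numbers, here's a quick bit of Python. The twelve 8 Gb (1 GB) chips are just my assumption based on the 12 GB figure in the title, not anything from the leak itself:

```python
# Quick sanity check of the Gb (gigabit) vs GB (gigabyte) arithmetic.
BITS_PER_BYTE = 8

def gbit_to_gbyte(gbit):
    """Convert gigabits to gigabytes (1 byte = 8 bits)."""
    return gbit / BITS_PER_BYTE

print(gbit_to_gbyte(8))       # 1.0  -> one 8 Gb chip works out to 1 GB
print(12 * gbit_to_gbyte(8))  # 12.0 -> twelve such chips would give the 12 GB in the title
```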

Thinking about the whole gigabit and gigabyte thing in terms of VRAM, shouldn't we be measuring it in gibibits and gibibytes, since they all go to base 2 instead of base 10?


Edit: Actually, forget that. I just remembered that semiconductor memory uses the JEDEC standard, which defines kilo, mega and giga to base 2.
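
And for the base-2 vs base-10 difference, a rough illustration in Python (purely generic numbers, nothing to do with the leaked board):

```python
# Decimal (SI) prefixes vs binary (IEC) prefixes for "giga".
GB_DECIMAL = 10**9   # SI gigabyte: 1,000,000,000 bytes
GIB_BINARY = 2**30   # binary gibibyte: 1,073,741,824 bytes

# JEDEC memory specs use the K/M/G labels but mean the base-2 values,
# so a "1 GB" DRAM chip actually holds 2**30 bytes.
print(GIB_BINARY - GB_DECIMAL)   # 73741824 bytes of difference per "GB"
print(GIB_BINARY / GB_DECIMAL)   # ~1.0737, i.e. roughly 7.4% more
```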
 
Hey, strip out the middle bit (mind-bogglingly insane) and my statement would hold true. :D

Also, sod memory manufacturers. That GB/Gb BS is such a crappy move. :mad:

Bits and bytes have been around since the origins of computers, fetta; it isn't anything new.

I do agree that it confuses people who are not super into computers, though, as internet transfer speeds are often measured in Mb and not MB, which causes all kinds of confusion.

It all comes down to how PCs use a base-2 binary number system and how an on/off switch isn't that useful for most things. A single 1/0 or switch is a bit, whereas eight of these in a sequence is a byte.

So a byte (8 bits) of data can be used to select between 2^8 options, or one of 256 different values. Long story short, bytes are a lot more useful.

An interesting side note is that the old Nintendo Game Boy was an 8-bit console, which is why the games were only able to offer 251 Pokémon, as that is dangerously close to the 256-value limit of a single byte.
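
A couple of lines of Python to show the 8-bit arithmetic, if anyone wants to see it spelled out (the 0-255 range is just a standard unsigned byte, nothing console-specific):

```python
BITS_IN_BYTE = 8

# A byte can represent 2**8 = 256 distinct values...
print(2 ** BITS_IN_BYTE)  # 256

# ...which, treated as an unsigned number, run from 0 to 255 inclusive,
# so a single-byte index tops out at 256 entries.
byte_values = range(2 ** BITS_IN_BYTE)
print(min(byte_values), max(byte_values))  # 0 255
```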
 
It's quite feeble to see Nvidia in the position they are in now. I really never thought we would see the day when, so long as you were not running a stupid resolution, the games were actually behind the GPUs. What I mean is, all the time I was growing up there was always something bigger and better looking on the horizon, and the graphical changes back then were enormous. Now? A 1080Ti is enough for 4k, so unless you are running daft 5k or above, everything you need is already on the market.

Other than Metro, I have not seen a single game coming out that could make a GPU sweat, and you would have to be Bob Barking mad to buy a £700+ GPU for one game. I think those days are coming to an end.

I have mentioned before how consoles hold back the PC. This is truer now than ever. These games are actually being made and optimised for a console over the PC, meaning they will only push the graphics about as far as a console can go. Sure, they will add in extras on the PC like lots of AA options and such, but nothing that a console could not manage. Which is why Nvidia, in their "rabbit and tortoise"-like haste, have overtaken what there is to play. I think this is also a large part of why I just don't care any more.

It doesn't help when you put a disc into an XB1X and, after a couple of hours of downloading, have a game that looks almost as good as a game you spent two grand to play. Actually no, I will change that: from 30ft away on a very large 4k TV it looks every bit as good.

You will have to be a bloody good salesman to make me spend another bag of sand to play a game, the same game, I may add, at higher FPS.
 
Hey, strip out the middle bit (mind-bogglingly insane) and my statement would hold true. :D

Also, sod memory manufacturers. That GB/Gb BS is such a crappy move. :mad:

Pretty much all ISPs use it too, but yeah, it's confusing.

I've seen so many people say they have an X Gb connection and then cry because the highest download speed they've ever seen is nowhere near X GB/sec. :p
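
Quick Python sketch of why that happens, using a made-up 1 Gb line as the example:

```python
def mbit_to_mbyte_per_s(mbit_per_s):
    """Convert an advertised megabit/s line speed to megabytes/s."""
    return mbit_per_s / 8  # 8 bits per byte

advertised = 1000  # a hypothetical "1 Gb" connection, in Mb/s
print(mbit_to_mbyte_per_s(advertised))  # 125.0 MB/s at best, nowhere near 1 GB/s
```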
 
I have mentioned before how consoles hold back the PC. This is truer now than ever. These games are actually being made and optimised for a console over the PC, meaning they will only push the graphics about as far as a console can go. Sure, they will add in extras on the PC like lots of AA options and such, but nothing that a console could not manage. Which is why Nvidia, in their "rabbit and tortoise"-like haste, have overtaken what there is to play. I think this is also a large part of why I just don't care any more.

It doesn't help when you put a disc into an XB1X and, after a couple of hours of downloading, have a game that looks almost as good as a game you spent two grand to play. Actually no, I will change that: from 30ft away on a very large 4k TV it looks every bit as good.

I have been saying this for years and always got flamed, but it is an honest truth. Devs are just too lazy and cheap to develop for true PC hardware.
 
I have been saying this for years and always got flamed, but it is an honest truth. Devs are just too lazy and cheap to develop for true PC hardware.

I think it's a little harsh to say lazy. Maybe limited is more the word, given how hard EA drives devs to release something before they are satisfied. Cheap is definitely true; everyone tries to cut costs wherever possible. And now, with the popularity of e-sports culture and gaming becoming ever so "normal" in life, it's easier than ever.

Some devs are passionate about their IP and would happily delay everything to get it polished. After all, you want to be proud of your launch. But when some big publisher pushes you towards an almost impossible deadline, you need to deprioritise some things. Multi-GPU support no longer being a viable investment is a good example of this.
 
I have been saying this for years and always got flamed, but it is an honest truth. Devs are just too lazy and cheap to develop for true PC hardware.

This is extremely ignorant.

Maybe it's just the fact that it's far more difficult to develop for PC than a console.
 