Godfall will use 12GB of VRAM at 4K Ultra settings - The RTX 3080 has 10GB...

HWUB did a chat about it this morning (VRAM-gate). They said that the 1% lows don't completely fall apart, but the FPS does drop.

Time will tell, I guess.
 
That's a lot of VRAM. Should be a gorgeous game. If this becomes the norm, the 3070/3080, as Alien has always said, will become inferior products, even if only in absolute terms.

I wonder if it's a deliberate attempt by AMD to outdo Nvidia.
 
That's a lot of VRAM. Should be a gorgeous game. If this becomes the norm, the 3070/3080, as Alien has always said, will become inferior products, even if only in absolute terms.

I wonder if it's a deliberate attempt by AMD to outdo Nvidia.

This is an AMD-sponsored title, so yeah, some of it could be deliberate.

The problem, as I see it, is that many games will be made to console parameters before the PC, meaning they won't bother to completely redo textures and so on for the PC.

Everyone knows that devs will do as little as possible for the cash-in. We all know this; maybe we all agree on it? IDK. This is why all of the awesome new features of DX12 have fallen by the wayside. Remember implicit and explicit multi-GPU? Remember all of those other performance-boosting features? Yeah, so far none of them have even been touched. Now, whilst I doubt that will change much, I would take a wild guess and say that the Xbox uses DX. That is in fact where it got its name from (the DirectX Box, or just Xbox for short). That was the whole reason Microsoft developed it in the first place.

So, if people are coding for that, why would they bother to code for anything superfluous that won't improve the payday?

In any usual event I would say yeah, the 30 series will be absolutely fine. However, that is the mentality that caught me out before. I just totally didn't factor in the unknown, which is what it is when the consoles evolve. The entire dev cycle, game demands and so on all completely change.

I even stuck my neck out saying 10GB was more than enough for the 3080. However, I also said that Nvidia know exactly how to play the market. They want you on Ampere as long as THEY want you on Ampere and no longer. As soon as they launch something else they want you to buy it; this is all planned and thought about in advance.
 
That's a lot of VRAM. Should be a gorgeous game. If this becomes the norm, the 3070/3080, as Alien has always said, will become inferior products, even if only in absolute terms.

I wonder if it's a deliberate attempt by AMD to outdo Nvidia.

TBH, this is why AMD's Infinity Cache is so clever. Yes, it takes up a lot of die space, but so does a wider memory bus.

With a 256-bit memory bus, AMD can make a 16GB card with eight 2GB GDDR6 modules. With a 192-bit bus, they will have 12GB on lower-end cards. That's plenty.
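If it helps to see the module maths spelled out, here's a minimal Python sketch. It's nothing beyond the arithmetic above, assuming the standard 32-bit interface per GDDR6/GDDR6X chip and one chip per channel; the example configurations are just illustrations.

```python
# Capacity follows directly from bus width and chip density:
# each GDDR6/GDDR6X chip has a 32-bit interface, so the bus width
# fixes the chip count, and capacity = chips x chip size.
def vram_config(bus_width_bits, chip_size_gb):
    chips = bus_width_bits // 32          # one 32-bit channel per chip
    return chips, chips * chip_size_gb

for bus, size in ((256, 2), (192, 2), (320, 1), (320, 2)):
    chips, capacity = vram_config(bus, size)
    print(f"{bus}-bit bus with {size}GB chips -> {chips} chips, {capacity}GB")

# 256-bit with 2GB chips -> 8 chips, 16GB   (RX 6000 class)
# 192-bit with 2GB chips -> 6 chips, 12GB   (the lower-end cards mentioned above)
# 320-bit with 1GB chips -> 10 chips, 10GB  (RTX 3080 as shipped)
# 320-bit with 2GB chips -> 10 chips, 20GB  (the mooted 20GB 3080)
```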

Nvidia snookered themselves with GDDR6X, as Micron has no 2GB modules to give them. They needed both the wider memory bus and the higher speeds of GDDR6X over plain GDDR6 to get their bandwidth. More chips mean more power, and GDDR6X has to be more costly than normal GDDR6 memory.

Nvidia has no real answer to this. They need Micron to release 2GB GDDR6X modules for 20GB RTX 3080 cards, but that would kill the RTX 3090's market niche. A 16GB RTX 3070 could also be bandwidth-starved when compared with AMD.

The problem that AMD has now will be normalising higher levels of VRAM usage.
 
12GB of VRAM?

I believe Godfall's 12GB VRAM requirement for 4K is just an artificial amount. This AMD-sponsored/bundled game is clearly promoting RX 6000 cards with 16GB of VRAM over the RTX 3080 with 10GB.
 
The problem that AMD has now will be normalising higher levels of VRAM usage.

Usually when you give a dev options they will take them tbh.

One thing we never see on PCs is optimisation like the consoles get later in their life. Like, if you go back and compare a launch title to one near the end of life, the launch titles look terrible and don't run much better.

Sadly GPU technology in PCs is never around long enough.
 
I believe Godfall's 12GB VRAM requirement for 4K is just an artificial amount. This AMD-sponsored/bundled game is clearly promoting RX 6000 cards with 16GB of VRAM over the RTX 3080 with 10GB.


Never heard of such a thing happening..... :D


Nvidia and their 10GB cards are just stupid.
It was pretty clear that for 4K gaming and new titles, 10GB would be a potential issue.


The whole 3080 series is a mess.
With the 20xx series it was the price; with the 30xx series it is everything else.
The price was a lie... there is no way to get a 3080 for 699 euros here in the middle of Europe.
 
Usually when you give a dev options they will take them tbh.

One thing we never see on PCs is optimisation like the consoles get later in their life. Like, if you go back and compare a launch title to one near the end of life, the launch titles look terrible and don't run much better.

Sadly GPU technology in PCs is never around long enough.
I don't think that's quite 100% the reason. If you took, say, an HD 7870 XT, it would run a lot of 2019 games at 1080p about as well as it ran 2013 games, while the games would look a lot better too. A lot of the progress comes from better techniques generally, and from making better use of the overall philosophy of architectures/APIs, rather than from fully hardware-specific optimisations.
 
I don't think that's quite true. If you took, say, an HD 7870 XT, it would run a lot of 2019 games at 1080p about as well as it ran 2013 games, while the games would look a lot better too. A lot of the progress comes from better techniques generally, and from making better use of the overall philosophy of architectures/APIs, rather than from fully hardware-specific optimisations.

The hardware uptick in consoles was quite small at first. Well, in console terms it was pretty big, but in PC terms? It was lol, really. You got what, a 470 in the XB1X? A base entry-level GPU.

So the coders will work to the parameters they have in the consoles. They get it running on them first and foremost, then they create what they need to get it running on a PC, and usually nothing more. In some cases (like Rockstar in their later years) they hold off on the cash-in, get it running nicely, make it better etc., but seriously, how many actually bother to do that? I can't tell you how many "PC games" I have fired up in the past few years (since the consoles went x86) and had to wait until 15 minutes into the game to even access the settings. You fire up the game and it just immediately gets going like it would on a console. They didn't even bother to code in a frickin' PC menu or anything else.

This uptick, however, is absolutely enormous, which will push even entry-level gaming PCs to their limits, given the console is about equal to a £1,000 PC.

The cost of entry is about to go up. Well, unless Nvidia can make loads of these wonderful cheap Ampere GPUs (yes, there is sarcasm in there).
 
I don't think that's quite 100% the reason. If you took, say, an HD 7870 XT, it would run a lot of 2019 games at 1080p about as well as it ran 2013 games, while the games would look a lot better too. A lot of the progress comes from better techniques generally, and from making better use of the overall philosophy of architectures/APIs, rather than from fully hardware-specific optimisations.

As someone who tests a lot of games, I'll say that I had to abandon R9 380 and GTX 960 testing because 2GB of VRAM wasn't enough anymore for new titles without big compromises.

Strangely enough, it was Vulkan and DX12 games that really hated insufficient VRAM. A lot of it was due to how developers controlled those kinds of situations in their engines.

When many games are made with Xbox One as a baseline, and Xbox One features 8GB of memory, 2GB just isn't enough in the long term. That said, it did take a few years for 2GB cards to start running into issues.
 
Nvidia making a mistake with VRAM? Noooo, *cough 970. *cough 780.

First things first. It's not a mistake. AT ALL. 970 wasn't a mistake. It was completely deliberate. Everything they do is deliberate. They knew exactly what they were doing with the 970, and they know exactly what they are doing now.
 
I found my HD 7870 XT still did quite well in a fair few games; the first game I ever couldn't run at 1080p was Warzone, but then I'm not the biggest gamer. However, games like Forza Horizon 4 (and any new racing games) and quite a few online shooters (including non-Warzone Modern Warfare) handled stuff pretty okay. Sometimes I'd have to turn the rendering res down, but not any lower than you'd typically find it turned down on an Xbone.
 
First things first. It's not a mistake. AT ALL. 970 wasn't a mistake. It was completely deliberate. Everything they do is deliberate. They knew exactly what they were doing with the 970, and they know exactly what they are doing now.
Oh, I know it's deliberate. I'm still mad that they gimped the 780 with 3GiB rather than 4GiB; my card went from top end to irrelevant overnight because of that.
 
Oh, I know it's deliberate. I'm still mad that they gimped the 780 with 3GiB rather than 4GiB; my card went from top end to irrelevant overnight because of that.

That die could not handle 4GB, dude. It could only handle multiples of 3GB, like many other archs, so 6GB would have been the next step up (the Titan), and to be fair to Nvidia, you did have a choice. There were also a select few 6GB 780s.
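For what it's worth, the "multiples" point falls straight out of the bus width, same arithmetic as earlier in the thread. A rough sketch; the 256MB/512MB figures are the GDDR5 chip densities available at the time:

```python
# GK110 (GTX 780/Titan) has a 384-bit bus, i.e. twelve 32-bit GDDR5 chips,
# so capacity can only be 12 x chip density: 3GB or 6GB, never 4GB.
BUS_BITS = 384
CHIPS = BUS_BITS // 32                    # 12 chips
for density_mb in (256, 512):             # GDDR5 densities of the day
    print(f"{CHIPS} x {density_mb}MB = {CHIPS * density_mb // 1024}GB")

# 12 x 256MB = 3GB
# 12 x 512MB = 6GB
```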

This is just a stark warning. Stop.early.adopting.stuff.

Otherwise, Nvidia's plans go perfectly and you double dip. Wait for stability, wait for drivers and support, and wait for things to settle out.

The problem is? Control. No one seems to have any, any more.
 
Two questions:


  1. How much VRAM does a current game at 4k actually use?
  2. Does Godfall support DLSS?
Since I'm still playing at 1080p and will most likely only upgrade to higher refresh rates instead of resolution, I would have no problem with Nvidia's offerings, and my 1070 Strix of course doesn't even get to half of its 8GB of VRAM.

Now, if Godfall supports DLSS, then 4K gaming would be no problem even with 10GB of VRAM, right? Or did I get something wrong there? If I didn't, then I see absolutely no problem, especially since I haven't seen the VRAM completely used up even once.
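On the first question, you can just watch it yourself while playing. Here's a minimal sketch using the pynvml bindings to NVIDIA's management library (assuming they're installed; nvidia-smi on the command line gives the same numbers). Bear in mind these tools report VRAM that has been allocated, which isn't necessarily the same as what the game strictly needs.

```python
# Reads current VRAM usage on the first Nvidia GPU via NVML.
# Sketch only - assumes the pynvml package (nvidia-ml-py) is installed.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # GPU 0
info = pynvml.nvmlDeviceGetMemoryInfo(handle)      # bytes: .total / .used / .free
print(f"VRAM used: {info.used / 1024**3:.1f} GiB of {info.total / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```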
 
Two questions:


  1. How much VRAM does a current game at 4k actually use?
  2. Does Godfall support DLSS?
Since I'm still playing at 1080p and will most likely only upgrade to higher refresh rates instead of resolution, I would have no problem with Nvidia's offerings, and my 1070 Strix of course doesn't even get to half of its 8GB of VRAM.

Now, if Godfall supports DLSS, then 4K gaming would be no problem even with 10GB of VRAM, right? Or did I get something wrong there? If I didn't, then I see absolutely no problem, especially since I haven't seen the VRAM completely used up even once.

It doesn't support DLSS. And it depends on the game TBH. DOOM Eternal can do well over 8GB at 4K max settings.
 
It doesn't support DLSS. And it depends on the game TBH. DOOM Eternal can do well over 8GB at 4K max settings.

Watch Dogs is another. 8GB is barely even enough for 1440p. At 4K? Forget it.

Now, I studied that one very carefully, and apparently it's a mixture between last gen and next gen. So basically they got half of it done, then got handed a new data sheet. As such, I accept it may not be very well behaved.

Some of the early DX11 titles were system killers. It did get much better over time though, thankfully.
 