AMD says that the era of 4GB graphics cards is over - Future games need more VRAM

Well, it's a 1080p card and IMO you don't need current gen Ultra textures at that resolution. Dropping to High would equalize the FPS in most cases.
 
Wonder if AMD got wind that Nvidia will be releasing their 3060 (or whatever it ends up being called) as a 4GB model, and are trying to make those cards look redundant before they come out - or to push Nvidia into putting more RAM on their mid-range cards, which would push prices up or hurt their profitability.
Either way they're right: as processors get quicker and can push more data to the GPU to render, more VRAM is going to be needed, just as more RAM is needed for the processor. I wonder how soon the combined 16GB is going to be a bottleneck in these new consoles?
 
Nah, I think it's more that the new consoles have more, so games will need more. And it will get to a point where lowering settings just won't work, same as with the 780 and all of those 3GB cards.

As consoles get more VRAM and resolutions climb, the textures just become bigger.
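To put some rough numbers on that point: an uncompressed texture's memory footprint grows with the square of its resolution, and a full mipmap chain adds about a third on top. This is a back-of-the-envelope sketch with my own illustrative figures (4 bytes per pixel, no block compression), not numbers from the article:

```python
def texture_mib(side_px: int, bytes_per_pixel: int = 4, mipmaps: bool = True) -> float:
    """Approximate VRAM footprint of one square RGBA texture, in MiB."""
    size = side_px * side_px * bytes_per_pixel
    if mipmaps:
        # Full mip chain is a geometric series: 1 + 1/4 + 1/16 + ... = 4/3
        size = size * 4 / 3
    return size / (1024 * 1024)

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_mib(side):.0f} MiB uncompressed")
```

Each doubling of texture resolution quadruples the footprint, which is why a console generation that targets higher-resolution assets pulls VRAM requirements up with it even at the same screen resolution.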
 
An Nvidia XX60 series card with 4GB would be a terrible idea for Nvidia. The 1060 from 2016 had 6GB.

TBH, these days I feel sorry for anyone who was suckered into getting the 3GB model of the 1060. I still think that card was false advertising - it wasn't just a 1060 with less memory, it had fewer CUDA cores too.

This is designed to make Nvidia's 4GB cards like the GTX 1650 and 1650 Super look bad. They don't have 8GB versions like the RX 5500 XT.
 
It's like I said dude, lower end cards are solely designed not to last.

It's just to keep you coming back. AMD and Nvidia know exactly where the gaming market is headed long before we do, and they know what to put out as a product and what to market. A 3GB 1060 got the job done then - right then, in that moment. That doesn't mean it was ever going to last.

Feel bad for me! I bought two sodding Fury X cards, they were expensive, and long before they had outlived their usefulness they were running out of VRAM and black-screening my rig, FFS. And it wasn't a lack of horsepower - they were literally running out of VRAM. AMD did fix the black screens, but then the cards were running from the paging file and frame rates crawled to single digits.

VRAM is important. Very important. Once you run out no matter how good your card is you've had it.
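The "crawled to single digits" behaviour above has a simple bandwidth explanation: once assets spill out of VRAM, the GPU has to fetch them from system RAM over PCIe, which is an order of magnitude slower than local graphics memory. A rough sketch with illustrative 2020-era figures (a 448 GB/s GDDR6 card and PCIe 3.0 x16 at ~15.75 GB/s peak, my assumptions, not measurements):

```python
GDDR6_GBPS = 448.0      # e.g. a 256-bit GDDR6 bus (illustrative)
PCIE3_X16_GBPS = 15.75  # PCIe 3.0 x16 theoretical peak, one direction

def frame_time_ms(bytes_touched_gb: float, bandwidth_gbps: float) -> float:
    """Time to stream a given amount of asset data at a given bandwidth."""
    return bytes_touched_gb / bandwidth_gbps * 1000.0

# Suppose a frame touches 2 GB of textures:
local = frame_time_ms(2.0, GDDR6_GBPS)      # comfortably inside a 60 FPS budget
spill = frame_time_ms(2.0, PCIE3_X16_GBPS)  # over 100 ms per frame
print(f"local VRAM: {local:.1f} ms, spilled over PCIe: {spill:.1f} ms")
```

Around 127 ms per frame works out to roughly 8 FPS, which matches the single-digit frame rates described when a card starts paging - raw shader horsepower can't compensate for that.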
 
I would love to see 12GB at the entry level, and even 16GB if you're after heavy modding and want to future-proof a bit.
 
They aren't wrong - even at 1080p there are significant performance gains. Assets are only going to get bigger in file size; even if you lower settings, they will still need more VRAM.

It's a new console generation, which means a new PC generation. Remember, the point of PC gaming is to have the newest, best tech, so we get more powerful hardware than the consoles - but AAA titles are designed console-first, and it's likely devs will dedicate more of the consoles' shared memory to graphics than to system memory. They can also change that split on the fly.

It's time for gamers to upgrade to cards with 8 gigs minimum. I have a 2080 Ti and I can max out its 11 gigs of VRAM. So... more VRAM is better... Ideally we'd start to see 16-gig-plus cards very soon.

Sorry for rambling I'm under the influence.
 
Realistically, no card is designed to 'last'. All cards are designed to be replaced at some point. So you can make it sound like the executives planning their lineup are all sitting in their fancy chairs, sipping fancy coffee, trying to find ways to make their budget cards age quicker. Business-wise, I don't think that pays off - not in the long run, and not if you're struggling financially. AMD predicted a future in which 4GB of VRAM would be beneficial. Nvidia decided to cut costs on VRAM capacity with the 780 and 780 Ti, while AMD decided to splurge - right from the beginning of the design process, I imagine. AMD did the same with Polaris: their €250 card had the same amount of VRAM as their competitor's €650 graphics card (and more than AMD's own flagship). All that seems to have paid off, both for consumers and for AMD. Not across the board, but in many areas.

You also have the cost-to-performance ratio. You can't expect budget cards to have the absolute best of the best. A lot of people buying budget cards aren't playing AAA titles at max settings. They're playing esports titles that are fine on 2GB, or they're playing old games or MOBAs. And AMD know this. This slide is for a specific market, not ALL the people who bought GTX 1060 3GB models. I disagree that those cards are useless. I know people still happily gaming on GTX 760 2GB cards because they don't care about modern games, and the ones they do play are not hard to run.

These companies are segmenting their products; they're not trying to devalue them after a year or two just so people will buy the next model. In fact, I don't know many people who buy a low-to-midrange card every 2-3 years. If your point were true, they'd all be upgrading every generation because they'd have to. But they're not, because they don't have to. Obviously some gamers will be conned and fall prey to 3/4GB VRAM limitations, but I hardly think it's poor practice on the part of these companies. Consumers should make informed decisions. If you're a complete newb and you have €200-300 to spend on a card, and you want to play all the newest titles for the next 3-5 years at decent settings at 60 FPS, you have a responsibility to type a few words into Google and spend half an hour researching what would be best for you. If you don't and you buy the wrong product, in my opinion that's on you, not AMD or Nvidia.
 