Is your PC ready for Starfield? The game's PC system requirements are here!

The engine will be about 5 years old at this point. All of their games are like that, because RPGs take so long to create.

Yes, but that's not what's limiting them. They could have made it very heavy if they hadn't tried to optimise it, and we will surely be seeing mods that make RTX 4090s cry with this game.

Remember, it's not only Bethesda RPGs that take a while to develop. Aside from Ubisoft and EA, which practically have step-by-step guides glued to the wall behind every employee's desk on what to do in what order to squeeze the most efficiency out of every single task (with a separate one for everything), most other companies will struggle for years when developing a game.

But honestly, it is interesting to think about it this way: what other modern game with great graphics had reasonable system requirements? The one I think of is Metro Exodus, and it also had a very long development cycle. Is that because they had more time to optimise? Or is it that developers making games for modern platforms have realised they don't need to put in half the work they used to in optimising their games to run on last-gen consoles, and now they are really just brute-forcing everything, with the result being 2023 games that somehow look the same as some 2015 games?
 
There is only so much optimising you can do. Bethesda's engines are notoriously shoddy; in fact, that is a large cause of the crashing and glitches in their past games. People seem to think it is all about optimisation. It isn't. There are many, many other factors involved, and a big one of those is GPU grunt. That scaled well until the 20 series; since then it has been pretty bad, with each generation providing smaller gains in actual raster grunt than at any point since Fermi, which was an awful gen. It is clear that Nvidia do not want gaming to progress at anyone's pace but theirs. This is the same phenomenon we saw with Intel and quad-core CPUs and so on. If no one is forcing you to release any more than you absolutely have to? You simply won't.

Metro Exodus was not down to optimising, dude. It was down to very small and incredibly linear levels, made to look much bigger than they actually were. The more open areas were very barren, with hardly anything in them. Take a look at something like Wolfenstein 2? It suffered from the same issues: looked fantastic, but the levels were extremely linear and absolutely tiny, with long load times between them.

Open world games? are always very hard to run when they come out. Even Fallout 3 was, and it was not until about four years after release that a GPU with enough grunt and VRAM could run it with the HD texture pack Bethesda quietly released shortly after launch. Even my GTX 280 struggled to run that well.

There are ways around it, of course, like doing what they have done in PUBG: make everything at about 8x scope range look like Play-Doh. They did that in Fallout 3 and Fallout 4 too.
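The "Play-Doh at distance" trick described above is essentially level-of-detail (LOD) selection: swap in cheaper models and textures the further an object sits from the camera. A minimal sketch in Python (the distance thresholds and tier names here are made up for illustration, not from any real engine):

```python
def pick_lod(distance_m: float) -> str:
    """Return which asset quality tier to render at a given distance."""
    if distance_m < 50:
        return "high"      # full-resolution mesh and textures
    elif distance_m < 200:
        return "medium"    # reduced polygon count, smaller textures
    else:
        return "low"       # the distant "Play-Doh" impostor tier

print(pick_lod(10))    # high
print(pick_lod(500))   # low
```

The payoff is that far-away objects, which cover only a few pixels anyway, cost a fraction of the geometry and texture bandwidth of the close-up versions.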

The problem is GPUs are not progressing enough. Ampere? Was not the full GPU. The 4090? Is far from the largest die it could be. And with AMD choosing not to even try to compete? They won't release the full-sized one either. Why bother, when you can charge £1800 for 2/3 of it?

And that will not change any time soon. It is compounded more and more by low-end and mid-range GPUs costing more and more. The 4060 Ti costs a lot more than the 3060 Ti did, and is no faster at all. How are games supposed to progress when that is happening?
 
Games not only can but will progress. If NVIDIA and AMD don't provide enough GPU power on the PC side, publishers will eventually get tired of the backlash over "bad PC ports" and only release on consoles.

That's also where I say YES, it IS all about optimisation. We get ever more graphically impressive games each year within each console generation, even though the console hardware doesn't change; there are a lot of ways to squeeze the very last drop of performance out of a system. The "open-world games are harder to run" argument I won't say is false, but it isn't quite like that either: open-world games ultimately have to rely on cheaper lighting, which makes indoor lighting look bad unless the game puts a loading screen in between, and they don't render everything; all open-world games make use of render distance.
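The render-distance point above boils down to distance culling: an open-world renderer only submits objects within some radius of the camera. A toy sketch (the object names and the 300 m cutoff are invented for illustration):

```python
def visible_objects(objects, camera_pos, max_distance=300.0):
    """Return only the objects close enough to the camera to draw."""
    def dist(a, b):
        # straight-line distance between two 3D points
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [o for o in objects if dist(o["pos"], camera_pos) <= max_distance]

world = [{"name": "tree", "pos": (10, 0, 5)},
         {"name": "mountain", "pos": (900, 0, 400)}]
print([o["name"] for o in visible_objects(world, (0, 0, 0))])  # ['tree']
```

Real engines refine this with frustum and occlusion culling, but the principle is the same: the cost of a frame scales with what's near you, not with the whole map.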

I said this before: 2023 is the first year where I can see myself, in the near future, just stopping investing in PC gaming and switching to console gaming. With NVIDIA pulling stunts like this, there's no point in spending ever more money on GPUs that get ever shoddier, plus I can get all those Sony and Square Enix games on a PS5 that would only get a "bad" PC port a couple of years later.

This is a complex topic and we could spend weeks here talking about how things would get better if devs used SSDs right, or how they need to account for different PC hardware, etc. But really, I guess my point is: this is an open-world game, in fact an open-galaxy game, and it does look great, if a little surreal. Yet it has very reasonable system requirements, and that is unexpected; in this day and age it looks almost impressive considering how much other modern games ask for. The fact that it started development a long while ago doesn't change the fact that it has modern visuals, way more than I expected for a Bethesda game at least.
 
That will never happen, and you know it.

Consoles now ARE PCs. There are no ports; technically there never were. It doesn't work like that.

Secondly, what you are saying is that they will get tired of money. That is the funniest thing I have ever heard, dude. PC games now cost more than console games, and developers don't have to pay a licence fee on PC: £0, zip, nothing. Every console copy sold earns Sony and MS £10-£15. If you think for one minute they would refuse to release on PC, you are nuts.

Like the word "port", I don't think you understand the technical meaning of "optimise" either. There is only so far you can push it. You will be met with limitations in any scenario, whether it be the game engine or the hardware, and that has always, but always, been the case. You seem to think they can optimise something and make it run at twice the speed it already does, like some kind of magic trick. It doesn't work that way.
 
Well, all things aside, PC is the first platform for many companies, new ones included. I don't see optimisation as a big problem: there are only so many GPUs and CPUs, and they all use the same drivers, which is what counts most, as the software just interfaces with that.

All I know is, outside of the decent recommended specs on Starfield, I'm glad I have 16GB of VRAM, because once UE5 games come out en masse you're sure going to need it.

The other thing to expect with consoles is limited hardware, but also a limited, lean amount of software. It's not like they are running a full Windows-style OS; their operating systems, while coded well, are basic by comparison, leaving more of everything free to run the game.

The only thing holding back GPU hardware on PC isn't just plain greed; it's the cost of the latest nodes as well. They are not going to lower their profit, and the more they need to spend, the more it's going to cost on top.

AMD, though, I have to admit really held back on GPUs this gen. They don't impress me like RDNA2 did, and they sure as hell could have made much beefier GPUs than they did. In all honesty, if neither Nvidia nor AMD jumps up a tier on the tech next gen, then it'll not be worth upgrading until the PS6 or whatever Xbox comes next, as there really won't be much of a reason outside of heavy ray-tracing showcases like Cyberpunk.
 
Oh but it IS only greed:

RTX 2060: TSMC 12nm, 445 mm², $349
RTX 4060 Ti: TSMC 5nm, 190 mm², $399-499

Just look at the die size difference; that's more than enough to cover the increased node costs. It also uses way cheaper VRAM modules, thanks to a general oversupply of RAM and VRAM, plus reduced prices on other discrete components given, again, high supply after the pandemic craze wore off. So why does it need to cost more?
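To make the die-size argument concrete, here's a crude dies-per-wafer estimate using the areas quoted above. The wafer prices are rough placeholder assumptions (actual TSMC pricing is not public), and the calculation ignores edge loss and defect yield:

```python
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: wafer area divided by die area, no edge/yield loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~70,686 mm^2
    return int(wafer_area // die_area_mm2)

# die areas from the post; wafer costs are illustrative placeholders
for name, area, wafer_cost in [("RTX 2060 (12nm, 445mm²)", 445, 4_000),
                               ("RTX 4060 Ti (5nm, 190mm²)", 190, 17_000)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} per die")
```

Even under these assumptions the smaller die yields well over twice as many chips per wafer, so the per-die cost increase is far smaller than the jump in wafer price.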

Here's the thing: NVIDIA's SVP of Gaming, at a Bank of America investor meeting, confirmed that RTX 40 is making NVIDIA more money than RTX 30 did (in the first X months, RTX 40 made 40% more revenue than RTX 30 had in its first X months). How? If GPU shipments are down 50%, how can it be making more money? Absurdly high margins, that's how.
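A back-of-the-envelope check on that claim: if unit shipments halve but revenue rises 40%, average revenue per unit sold must have risen sharply. The round figures below are the post's, not audited financials:

```python
units_ratio = 0.5     # shipments down 50%
revenue_ratio = 1.4   # revenue up 40%

# average revenue per GPU sold, relative to the previous generation
per_unit_ratio = revenue_ratio / units_ratio
print(f"Average revenue per GPU sold: {per_unit_ratio:.1f}x the previous gen")
```

Under those assumptions, each card sold is bringing in nearly three times as much as before, which is the "absurdly high margins" point in arithmetic form.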
 
It's like I said elsewhere, at this point there is no competition at all. And this is what happens when that is the case.

If AMD had gotten more competitive and aggressive on pricing? Nvidia would be forced to retaliate. The very fact AMD have waited on Nvidia for the past two rounds says it all. All they want to do is see what Nvidia are pricing at, then sneak in $10 less per tier. That is not competition; it is price matching.

As such? Nvidia are now where Intel was before Ryzen came out. Coasting along, not a care in the world, and giving tiny incremental updates or no updates at all in the same tier. And it will continue that way unless people get behind Intel.

If you want to make a difference? buy an ARC card. It will sting for a while, but that is the only way forward. AMD can not and will not bring it to Nvidia, but Intel have learned a lot after the spanking they took and will indeed take the fight to Nvidia if we help them to.
 
But isn't it also a matter of Intel being a giant in comparison to AMD, and hence a giant (Intel) vs a giant (Nvidia) being a much better competitive position? Intel has a lot more resources at its disposal to actually give Nvidia a good fight, at least a better one than AMD.
 
Sort of.

Intel has shrunk compared to what it used to be. However, in that time they have learned a lot, and they have really opened their ears and listened to what people want. It was quite probably the biggest tech humbling ever to happen, but yeah, it happened.

AMD does not want to compete with Nvidia. That is now clear to see, first of all because they easily could: the 7900XTX is absolutely miles from what it could be. I mean, 30% or more of it is missing. They said it was because they did not want to make £1800 GPUs, but I don't believe them. I don't believe them at all. The 7000 series uses older-node dies wrapped around a smaller one to save money, so I don't believe any of their GPUs could ever cost £1800.

What they have also done, and this is the absolute telltale? They have put their profit margins up to the same as Nvidia's. What I mean is, they now earn 70% on top of everything they make and sell, the same as Nvidia. Even though, you know? They are not Nvidia. RTG is absolutely tiny in comparison, and they really are taking the biscuit. Like I said, they are not competing. At some point they just gave up and said, "If you can't beat 'em? Join 'em." And thus they are being equally as greedy as Nvidia. Sure, they sell a lot less, but they have turned their GPUs into a milking exercise, with no care for their reputation or for gamers.

Intel, on the other hand? Yes, they have the money. If at some point they can fab their own GPUs? They WILL bring the fight to Nvidia totally. Nvidia can not fab GPUs themselves, and thus have a middleman (TSMC). Intel? Can cut that out, cut out that cost, and sell to you directly. And if they only charge 30%? Which they are clearly doing on their ARC cards now?* Then it is win-win for us.

* IDK what Intel are earning per card sold, but I can tell you now, it ain't 70%. Their dies are pretty bloody big, and they are also paying TSMC, so it is probably 30% or, heck, maybe even less than that.

But you are correct in a way yes. Intel have the means, and they have the clout, and they have the will to bring the fight to Nvidia. Whereas AMD? have just whimpered in the corner shoving donuts into their fat faces.
 
Man, I had written an answer to your previous comment; I guess I took too long and got timed out of my login. I don't think I have the willpower to write it again.

But mostly I said that I do believe I know what I'm talking about, and that I understand you feeling like I'm throwing the word optimisation around lightly. But that's how it is in my book: anything you do to achieve the same result while consuming fewer resources is increasing efficiency, which is optimisation. I don't care if you have written an entirely new piece of code to add an entirely new system to your game; that's optimisation to me.

And the argument for my point of view is simple: if there is no "porting" to PC, then how come games run so badly on PC? And it's not only because we don't have enough VRAM; they just run buggy and poorly every time. And also, how come they will get tired of money? PC gamers have been slowly turning to consoles for ages, and now they're turning way faster because there's no real PC market anymore. They won't lose money; they will just sell to the same people on a different platform. Apple has a chance to grab the PC market here; I don't believe they will, but they could.

Also, where in the hell are you looking at your game prices? I still see PC games way cheaper than console games. I was talking to an ex-PC gamer, now Xbox gamer, friend of mine just a few days ago about how games like Dark Souls never get to the same low prices on Xbox as they do on Steam. And also, no: Steam still charges 30%. GOG used to charge 30% as well, not sure if they still do. Only Epic charges 12%, but people don't use Epic much, and less than 1% of gamers are buying from devs' own launchers. Even Ubisoft, which had left Steam, came crawling back.
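For reference, here's what those storefront cuts mean for what a developer keeps from a $60 sale. These are the headline rates only, ignoring regional pricing, VAT, and Steam's revenue-based tier discounts:

```python
PRICE = 60.00  # example list price in USD

# headline revenue cuts per storefront
cuts = {"Steam (30%)": 0.30,
        "GOG (30%)": 0.30,
        "Epic Games Store (12%)": 0.12}

for store, cut in cuts.items():
    net = PRICE * (1 - cut)
    print(f"{store}: developer keeps ${net:.2f}")
```

So on a $60 title, the gap between a 30% and a 12% cut is roughly $10.80 per copy for the developer.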
 
Back to my earlier thought: I didn't say there wasn't greed; there is plenty. I said it wasn't greed alone. The PC market is doing fine at the moment in my view, at least in terms of games.

The issues with Nvidia and AMD we've been over hundreds of times; we're pretty much all clear on that, same as we've been over Intel. The issue with Intel is that they have the means to do well, but they are nowhere near a 4090-level card at the moment; they are stuck in the midrange, which for most gamers is fine. The same is true of AMD: they could have made one, they chose not to. They are not trying to take the market, just to slot into it comfortably. It's really frustrating for me, to be honest, as I'm looking to get a card next gen, but in my view there won't be anything to go to unless I go back to Nvidia, which I won't.

If AMD or Intel are going to make cards and want me to buy one, then it needs to be something worth my time; I have no use for a 4060/Ti-style card at all.

AMD started out well with RDNA, but RDNA3 is not even remotely worth the step up from my card, and I'm sadly expecting that next gen will be more of the same from everyone. In truth, to justify the prices they all need to bring the performance, and now Nvidia have gone into full greed mode while offering nothing much.

None of them are being industry leaders at the moment, and be it Nvidia, AMD, or Intel, whichever one ends up dominant will end up just the same as any other. I do not see any of them as heroes; they are all the same once at the top.

I'd like AMD to make the cards they could.

I'd like Intel to progress forward while keeping a sane head.

But at the end of the day all most anyone wants is lower nvidia prices.

It's simple, really: pick your poison, not that you have much choice. For me that's AMD, but not until they start making a meaningful card again, as RDNA3 was in my view a massive flop.
 