Programmer cuts GTA V's load times by almost 70% - Rockstar needs to fix this!

I am extremely curious to see if they will actually fix it now that they have a solution handed to them on a silver platter. They are a very, very arrogant and vain company, so I will be positively surprised if they actually go ahead and do it.
 
Would like to see this benchmarked on different system configurations. That being said, it's pretty embarrassing for the devs that coded this initially.
 
Rockstar are pretty well known for releasing unoptimised code, so this should not surprise anyone, and I highly doubt they will do anything about it. Rockstar worship money above all else, even integrity. They will ignore this.
 
Would like to see this benchmarked on different system configurations. That being said, it's pretty embarrassing for the devs that coded this initially.

Wouldn't quite agree. From the looks of it, it seems like the implementation used for sscanf had changed from the original coder's environment. I'd assume this is something that compiled and ran fine for PowerPC, or possibly with the x86 consoles' own compilers, but the code didn't get re-profiled and updated for the x86 PC port.
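To make the sscanf point concrete, here's a rough sketch of the kind of pattern involved. The function names are made up, and the hidden-strlen behaviour holds on many common C runtimes but isn't guaranteed everywhere:

```cpp
#include <cstdio>
#include <cstdlib>
#include <vector>

// Slow pattern: on many C runtimes, each sscanf call on a C string first has
// to find the end of that string (effectively a strlen over the whole
// remaining buffer), so pulling N numbers out of an M-byte buffer costs
// on the order of N * M character reads instead of M.
std::vector<long> parse_slow(const char* buf) {
    std::vector<long> values;
    long v;
    int consumed = 0;
    while (std::sscanf(buf, "%ld%n", &v, &consumed) == 1) {
        values.push_back(v);
        buf += consumed;   // advance past the number just read
    }
    return values;
}

// Faster pattern: strtol reports where it stopped, so each number is read in
// time proportional to its own length and the buffer is only scanned once.
std::vector<long> parse_fast(const char* buf) {
    std::vector<long> values;
    for (;;) {
        char* end = nullptr;
        long v = std::strtol(buf, &end, 10);
        if (end == buf) break;   // no more digits to read
        values.push_back(v);
        buf = end;               // continue from where strtol stopped
    }
    return values;
}
```

On a compiler/runtime where sscanf doesn't do that extra end-of-string scan, both versions behave fine, which is exactly how code like this could look harmless in the original environment and blow up on the PC port.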
 
Would like to see this benchmarked on different system configurations. That being said, it's pretty embarrassing for the devs that coded this initially.

I imagine that the original dev just wrote a function that reads a file and checks for duplicates and expected it to be run once. Plus there were, for example, just 100 entries in the file.

Then someone else probably added an easy way to get a single value on top of that. Then someone else had some use for going over all the values and decided to use that function as a black box.

That all went unnoticed because there were just 100 values in the file. Even at 1000, it was hardly an issue, taking a fraction of a second. Years later, it's 63,000, and that bit of code is a major bottleneck.

The problem isn't directly with either the function itself or its use, but the combination of them. Programmers all use functions without going into their code, and people often write functions inefficiently when they don't expect them to be a bottleneck. That's something that's typically found later, when profiling for performance bottlenecks.
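As a rough, hypothetical sketch of that combination (the Entry type and names here are invented, not the game's actual data): a duplicate check that scans everything seen so far costs about 5,000 comparisons for 100 entries, but around 2 billion for 63,000, whereas a hash set keeps the total work roughly linear:

```cpp
#include <cstdint>
#include <string>
#include <unordered_set>
#include <vector>

struct Entry { std::uint64_t key; std::string data; };

// "Check for duplicates" the simple way: scan everything seen so far.
// Perfectly fine for 100 entries, quadratic pain at 63,000.
bool insert_linear(std::vector<Entry>& seen, Entry e) {
    for (const Entry& prev : seen)
        if (prev.key == e.key) return false;   // duplicate, skip it
    seen.push_back(std::move(e));
    return true;
}

// Same behaviour with a hash set: each lookup/insert is roughly constant
// time, so the total work grows linearly with the number of entries.
bool insert_hashed(std::unordered_set<std::uint64_t>& keys,
                   std::vector<Entry>& seen, Entry e) {
    if (!keys.insert(e.key).second) return false;  // already present
    seen.push_back(std::move(e));
    return true;
}
```

Neither function is wrong on its own; it's calling the linear version once per entry, for every entry, that turns it into the bottleneck.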

That, to me, is the embarrassing part, that nobody has seen fit to look into the long loading times and fix them, not that someone wrote suboptimal code to begin with.
 
Yeah, I agree with ET3D. It's not that it was unoptimized to begin with; only 100 entries is incredibly small and quick.

But as the problem got bigger and bigger, you'd think that after compiling and then launching the game hundreds if not thousands of times per team, fixing the loading issues would end up saving the company hundreds of thousands of dollars, since the teams could work quicker and be more productive than the alternative, which is probably to walk away for a quick break, take longer than you need, and come back after the game has already loaded.

Should have been a priority later in development.
 
The load times have been rather horrid since the beginning, so I doubt they began by parsing a mere 100-1000 entries.

I had a strong hate/love relationship with early GTA V online, as the servers could crash occasionally, and then getting everyone back into the same lobby could easily take like 15 minutes.
 