Idk, it's actually ridiculously rare to get truly random data request patterns in consumer workloads, and most of the ones you do get are so low intensity they'll never hit the drive's limits; the only common example that does is the pagefile, really. Most software is still designed with the limitations of HDDs in mind, so queue depths beyond 4 are often avoided and the OS generally tries to make every queued command as close to sequential as possible. Even if you do have a random access pattern, as long as it isn't latency critical you can basically treat it like a sequential one on an SSD; it's really only branchy, latency-critical software that forces constant queue-depth-1 (no queueing) access.
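To make the queue-depth point concrete, here's a rough Python sketch (POSIX-only since it uses os.pread, and testfile.bin is just a placeholder name, assumed to be at least a few MiB): a pointer-chasing loop where each offset comes out of the previous read can never have more than one request in flight, while a batch of reads whose offsets are known up front can all be handed to the kernel at once, even if those offsets look "random".

```python
import os
import struct
from concurrent.futures import ThreadPoolExecutor

BLOCK = 4096
PATH = "testfile.bin"  # hypothetical test file, at least a few MiB of data

def dependent_reads(fd, start_offset, count, file_size):
    """QD1 pattern: the next offset comes out of the data just read, so every
    read has to finish before the next one can even be issued."""
    offset = start_offset
    for _ in range(count):
        data = os.pread(fd, BLOCK, offset)
        # derive the next offset from the block contents (pointer chasing)
        offset = struct.unpack_from("<Q", data)[0] % (file_size - BLOCK)

def independent_reads(fd, offsets):
    """Higher-QD pattern: all offsets are known up front, so the requests can
    be handed to the kernel concurrently and the drive can reorder/merge them."""
    with ThreadPoolExecutor(max_workers=32) as pool:
        list(pool.map(lambda off: os.pread(fd, BLOCK, off), offsets))

if __name__ == "__main__":
    fd = os.open(PATH, os.O_RDONLY)
    size = os.fstat(fd).st_size
    dependent_reads(fd, 0, 1000, size)
    independent_reads(fd, [(i * 7919 * BLOCK) % (size - BLOCK) for i in range(1000)])
    os.close(fd)
```

The second pattern is what most consumer loads end up looking like: the offsets are already known from an index or file table, so the requests can be queued up and the drive rarely sees true QD1 random traffic.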
For 90% of consumer use cases (loading the OS, software, games, media, all the usual stuff) you're basically just pulling big chunks of predictable data into RAM, or throwing big chunks of predictable data onto the drive (installation and bulk storage rarely involve low-depth random access patterns unless you've set the drive up as a scratch disk for some specific software).