Yes, in case you didn't catch it, the title of my thread is positively weeping sarcasm from open sores.
Let me start at, well, the start (It's a VERY good place to start.)
A few weeks ago, after purchasing my yummy little new 'puter, I thought I'd sit back and engage in the sweet, sweet eye candy of Far Cry 2.
Installation? Fine. Performance check? "Well done, your system is totally ready to kick ass, son!" Which is groovy.
Hell, I played it on mostly high/highest settings and still got playable framerates.
So why the long face?
DirectX 10, that's why.
Now, even though the game is patched to the newest version, my graphics drivers are bleeding edge and my system brand spanking new, this game, for some unknown reason, HATES DX10.
So why not run in DX9.0c? A good question, and for obvious reasons I reverted the game to engine, engine number nine, bringing me back to the hectic world of malaria and Molotovs I love so dearly, and all was well.
The thing that DOES bug me, however, is that game developers seem to be pushing us towards DX10 hardware with chants of "ZOMG YOU GET REALISTIC SMOKE AND SHINY THINGZZZ." Nice. On the other hand, I like more than 12 FPS, thank you very much.
So even though I'm on DX10 hardware, with new drivers, a brand new game with a brand new patch, as mentioned, I still get an unplayable experience.
Here's a tip, Nvidia/Ubisoft: don't lie to me, please. You can scream "REALISTIC SMOKE AND GLASS" all you like, but at the end of the day, if my game resembles a jerky 1800s sepia motion picture, then I'm not going to be a happy bunny, no matter how oily the smoke.
On a parting note, I've tried DX10 Far Cry 2 on different systems vs. DX9 with the same result. Maybe I'm unlucky, or maybe, just maybe, we're not all quite so version 10 ready as they'd like us to believe.
Your thoughts? Any other major offenders? Feedback is welcome.