358.70 developer driver vs iCafe

S.T.A.L.K.E.R

I just tested the latest developer driver, 358.70, on a fully updated Windows 10 Pro x64 and compared it to an old iCafe driver that I was recently using on Win 7 x64 on another partition..

344.47 iCafe result on Win 7:

http://postimg.org/image/54xntctw9/

Latest developer driver 358.70 result on Win 10:

http://postimg.org/image/kjsymfkfn/

So as you can see, these new NVIDIA drivers are still bad on my old card: freezing, lower FPS, and a lower score. I also tested the 344.47 iCafe driver on Win 10; it works too and gains another 1-2 FPS. So if you are using an old Fermi card (GTX 400/500 series), this old iCafe driver is one of the best, with better FPS in all games. I am using another one right now that is even better. And in Windows 10 the iCafe 344.47 driver reports DX API 11.1 while the latest driver reports only 11.0. That's funny, NVIDIA..
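For anyone who wants to double-check what API level a driver actually exposes (the number a monitoring tool shows can be the runtime version rather than the hardware feature level, and the two are not the same thing), a tiny D3D11 probe is enough. This is just a minimal C++ sketch of my own, assuming the Windows SDK, not anything from the driver package:

```cpp
// Minimal sketch, assuming the Windows SDK (link against d3d11.lib).
// Asks the installed driver for the highest Direct3D feature level it exposes,
// which is one way to cross-check what a driver's "DX API" readout means.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
                                   nullptr, &got, nullptr);
    if (hr == E_INVALIDARG) {
        // Pre-11.1 runtimes (e.g. Win 7 without the platform update) reject a
        // request list containing 11_1, so retry without it.
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               &wanted[1], ARRAYSIZE(wanted) - 1, D3D11_SDK_VERSION,
                               nullptr, &got, nullptr);
    }
    if (FAILED(hr)) {
        printf("D3D11CreateDevice failed (0x%08lx)\n", (unsigned long)hr);
        return 1;
    }
    printf("Highest feature level reported by the driver: %d_%d\n",
           (got >> 12) & 0xF, (got >> 8) & 0xF);
    return 0;
}
```

For what it's worth, Fermi hardware tops out at feature level 11_0 either way; a driver can still expose the Dx11.1 runtime on top of it, which is most likely where the 11.1 vs 11.0 difference between those two drivers comes from.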

They should give Fermi a DX12 / WDDM 2.0 driver, but they are still ignoring it and not releasing one... DX12 games are already released... NVIDIA: the way the customer is meant to be played.
 
I also tested Unigine Valley on low at 1080p, and again the old iCafe 344.47 wins with about +3.5 FPS in minimum, average, and maximum FPS. That's not bad, so it looks like the older driver has Fermi improvements while the latest developer driver is slower.

It is shameful that we need to use an old iCafe driver with the latest games in order to get higher FPS.
I noticed up to 5 FPS more in Dark Souls II: Scholar of the First Sin.. NVIDIA needs to truly support all GPUs instead of leaving them to gather dust, shipping shallow generic drivers and only really improving the latest Maxwell cards. That's why I will buy an AMD GPU, because of NVIDIA's behavior and all the fiascos we saw in the past. The same happened with Kepler, which was abandoned after the 347.88 WHQL driver, and the R9 290X started beating the 780 Ti.
 
Heyyo,

They should give Fermi a DX12 / WDDM 2.0 driver, but they are still ignoring it and not releasing one... DX12 games are already released... NVIDIA: the way the customer is meant to be played.

Dx12 games? Nah... there's one, an indie game made by one guy trying to cash in on the Dx12 craze before it even properly takes off... and check the Steam reviews, it's unfinished. It should have been Early Access from what I've read. It runs mediocre in both Dx11 and Dx12 modes. :P

For actual Dx12 content? There's the Ashes of the Singularity benchmark for early backers and the 3DMark API tests. That's it. ARK postponed their Dx12 patch until they can optimize it better... my guess? The same issue that indie horror game dev is running into, where it's not stable yet on AMD or NVIDIA hardware.

Just... give it time. Dx12 is definitely not as polished as everyone hopes. NVIDIA has been improving though, as their latest drivers seem to fix the async compute issues in Ashes of the Singularity. I wouldn't know for sure though, as I rarely play RTS games and am not forking over $50 for early access to an RTS.

To be fair? I bet AMD users are also complaining about the Dx12 drivers on older GPUs, man. NVIDIA has stated before that Fermi support is in the works. It'll take time for it to come out... just give it time, brah. Things will work out in the end. :)

Let the Dx12 "beta testers" do their "beta testing" for meow. 2016 is still a bit away, and that's before actual Dx12 titles release from big-name game devs.

Heck.. if Oxide Games is so confident in their Dx12 capabilities, why haven't we seen that mysterious Dx12 version of Star Swarm go public, eh? My guess is that with Ashes of the Singularity they're trying to cash in on it too... but then again, it makes sense. They are a company, and free benchmarks don't pay the bills.
 
Heck.. if Oxide Games is so confident in their Dx12 capabilities, why haven't we seen that mysterious Dx12 version of Star Swarm go public, eh? My guess is that with Ashes of the Singularity they're trying to cash in on it too... but then again, it makes sense. They are a company, and free benchmarks don't pay the bills.

Star Swarm is a couple of years old, and DX12 was probably still on the drawing board when it came out; Mantle was the only next-gen API available at the time. I don't think they are trying to cash in on the DX12 hype. They made the engine a while back specifically for DX12. Unlike all the other DX12 content out there (not a lot, admittedly), going from DX11 to 12 certainly provides a very big boost for AMD (up to 80%), while NVIDIA either loses some performance or gains a tiny bit depending on the scene (that's an NVIDIA driver issue though, unless they fixed it recently).
 
Heyyo,

Star Swarm is a couple of years old, and DX12 was probably still on the drawing board when it came out; Mantle was the only next-gen API available at the time. I don't think they are trying to cash in on the DX12 hype. They made the engine a while back specifically for DX12. Unlike all the other DX12 content out there (not a lot, admittedly), going from DX11 to 12 certainly provides a very big boost for AMD (up to 80%), while NVIDIA either loses some performance or gains a tiny bit depending on the scene (that's an NVIDIA driver issue though, unless they fixed it recently).

When Star Swarm came out, Mantle and Dx11 were the options, and that's what's out on the internet, including on Steam. Oxide Games then released a Dx12 preview version to AnandTech, which they benchmarked... and it was never released to the public. I'd love to see how it runs on my PC... but I guess Oxide Games didn't want to bother with the free tech demo and instead focused on Ashes of the Singularity, where they're charging $50 for Early Access...
 
Heyyo,



When Star Swarm came out, Mantle and Dx11 were the options, and that's what's out on the internet, including on Steam. Oxide Games then released a Dx12 preview version to AnandTech, which they benchmarked... and it was never released to the public. I'd love to see how it runs on my PC... but I guess Oxide Games didn't want to bother with the free tech demo and instead focused on Ashes of the Singularity, where they're charging $50 for Early Access...

I paid $100 for the game, and it's worth every penny. It's also not Early Access, btw ;) Not yet anyway.. Early Access doesn't mean anything tbh; it has a bad connotation for no reason. $50 is still cheaper than a console game.. and $100 is still cheaper once you add in all that damn DLC and expansion crap they do for most console games these days.
 
I would like to see DX12 on my ancient Fermi card, hahaha, that would be interesting.
Game optimisation is key. For example, I am playing the latest Mad Max and Metal Gear Solid V: The Phantom Pain perfectly fine on almost maxed-out custom settings at 1080p, 30-45 FPS, meanwhile AC Unity is unplayable even at low resolution. It all comes down to the coding.
Same for PhysX: it runs very badly in many games even on the latest hardware, but in Alice: Madness Returns it works amazingly well; ultra vs low PhysX is almost no performance hit.. but they need to sell new GPUs, of course. We know what they are doing with drivers and game developers. Have a nice day.
 
Heyyo,

I paid $100 for the game, and it's worth every penny. It's also not Early Access, btw ;) Not yet anyway.. Early Access doesn't mean anything tbh; it has a bad connotation for no reason. $50 is still cheaper than a console game.. and $100 is still cheaper once you add in all that damn DLC and expansion crap they do for most console games these days.

There's absolutely nothing wrong with that, amigo. I'm just not big into RTS games, which is why I'm not bothering with it, is all I was saying. The only part that interests me is the Dx12 benchmarks, to see how it would run on my PC, and Early Access starts on Steam either today or tomorrow, I can't remember. :P

I would like to see DX12 on my ancient Fermi card, hahaha, that would be interesting.
Game optimisation is key. For example, I am playing the latest Mad Max and Metal Gear Solid V: The Phantom Pain perfectly fine on almost maxed-out custom settings at 1080p, 30-45 FPS, meanwhile AC Unity is unplayable even at low resolution. It all comes down to the coding.
Same for PhysX: it runs very badly in many games even on the latest hardware, but in Alice: Madness Returns it works amazingly well; ultra vs low PhysX is almost no performance hit.. but they need to sell new GPUs, of course. We know what they are doing with drivers and game developers. Have a nice day.

Ubisoft had a really bad 2014, and their 2015 hasn't been great either... which is odd, since in 2013? Hmm, they might easily have been in my top five game devs... with amazing games like AC Black Flag and Far Cry 3 giving me so much joy, and the quality of the PC versions... wow... and here we are in 2015 with broken PC versions of Watch_Dogs (which they abandoned), AC Unity (also abandoned), and Far Cry 4 with its performance issues... It shocks me that Watch_Dogs especially ran badly given the long development time... oh well, poop happens.

PhysX? Eh, it doesn't necessarily run that badly on NVIDIA hardware; half the time it's the game engines that are messed up, The Witcher 3's HairWorks being a prime example. With every patch they added "optimized HairWorks" and options for changing the tessellation of the hair... so... wth happened there? People say NVIDIA blocked them from allowing configurable tessellation, which makes me laugh since it brutalized all NVIDIA GPUs too. To me? That's another Watch_Dogs situation... I still remember release day... 10 FPS average in the locker room where the game started... and outdoors it dipped to a 6 FPS average... at least CD Projekt RED have fixed The Witcher 3's performance issues, which is good, unlike Ubisoft. :\
 
There's absolutely nothing wrong with that, amigo. I'm just not big into RTS games, which is why I'm not bothering with it, is all I was saying. The only part that interests me is the Dx12 benchmarks, to see how it would run on my PC, and Early Access starts on Steam either today or tomorrow, I can't remember. :P

Each to their own :)
Oh, and Early Access is tomorrow :)
(hopefully that's not under NDA.. )
 
To be sincere, I believe the R9 390X 8GB is the best solution right now in terms of performance per dollar... yes, the 980 Ti is better, but it's double the price... Here in Serbia I can buy a 390X for 310 USD, meanwhile a 980 Ti is 700 dollars, haha; it's ridiculous to pay double for a 10-15 FPS advantage, depending on the game title... if the game is NVIDIA-sponsored and gimped you get 15-20 FPS more; in fair titles the 390X is much closer.. and we know that both cards are weak for 4K, and you need at least two of them in CF/SLI to run current games at 60+ FPS... NVIDIA "Gimpidia" flagship GPUs were always overpriced and a wasted investment, especially the gimped 6GB 980 Ti, and there are already some games that eat even more VRAM, for example Lords of the Fallen eating 8 GB at 4K on a Titan X... so 6GB is not future-proof, and mark my words, DX12 games and the incoming UE4 games will be massive VRAM hogs. That's business: Pascal and Arctic Islands need to be sold... For 1080p and 2K the R9 390X 8GB is the best solution right now... I don't buy NVIDIA GPUs anymore, they are liars and scammers, and everybody knows about the multiple fiascos... I was planning on a 970, but after the 3.5 GB fiasco I gave up.. AMD is way better and fairer to the customer.
 
Heyyo,

Each to their own :)
Oh, and Early Access is tomorrow :)
(hopefully that's not under NDA.. )

Ah, that's pretty cool, mang! If you've got it, do you by chance have benchmark comparisons on your rig between Dx11 and Dx12? I dunno if there is a thread for that on these forums, I should really look lol. Like I said, I'm curious about the results... mainly from average gamers' builds and not so much the articles.

I used to be into single-player RTS games back in the day, but I dunno, I'm so bad at them nowadays lol. That whole APM thing and how crazy good some players are just wrecks my single-track-minded play style to bits and pieces. Company of Heroes multiplayer against other people for me? Ehh... I just get wrecked a lot haha. I'll stick to my FPS, RPGs and World of Tanks I guess heh.

To be sincere, I believe the R9 390X 8GB is the best solution right now in terms of performance per dollar... yes, the 980 Ti is better, but it's double the price... Here in Serbia I can buy a 390X for 310 USD, meanwhile a 980 Ti is 700 dollars, haha; it's ridiculous to pay double for a 10-15 FPS advantage, depending on the game title...

It is absolutely true that the R9 390X > GTX 980 for price to performance. I doubt there's anywhere in the world where something like the MSI Gaming Edition 390X is more expensive than the GTX 980 Gaming Edition. For that performance bracket? It's a waste of $100 or so to get the GTX 980. It's in a bad spot, much like the Titan Z versus the R9 295X2 before it... then again, the Titan Z was built for gamers who also like CAD... so it wasn't a true consumer dual-GPU solution, so maybe it's not that fair a comparison... but damn, that is still quite a price gap either way. :p


if the game is NVIDIA-sponsored and gimped you get 15-20 FPS more; in fair titles the 390X is much closer.. and we know that both cards are weak for 4K, and you need at least two of them in CF/SLI to run current games at 60+ FPS...

NOT true. The only shitty part is that most games with NVIDIA GameWorks have been built poorly for PC as of late. Look at Far Cry 4... one of Ubisoft's recent games that doesn't suck performance-wise... at least not completely lol.

http://www.hardocp.com/article/2015/01/07/far_cry_4_video_card_performance_review/5#.VikZ436rRaQ

Notice the AMD R9 290X scoring the same as an NVIDIA GTX 980. That's not nerfed performance at all. The R9 290X is the closest possible stand-in for the R9 390... I could go and find a 390X benchmark for Far Cry 4... but also try to find benchmarks from after January 2015, since earlier benchmarks were run on broken-performance versions of Far Cry 4. The first page of that HardOCP article even goes over that... like I said before... Far Cry 4 wasn't hit nearly as badly by a terrible release state as Watch_Dogs and AC Unity... but it was still messed up. It's still not perfect, with flickering shadows and stuff, but oh well, it's plenty playable at least. :p

Another prime example is The Witcher 3 after CD Projekt RED fixed it with many patches... it was definitely a mess at release... unlike Witcher 1 (still my favorite in the series tbh)... Witcher 2, though, ran like crap and had broken combat where the combat-roll was a joke lol.

NVIDIA "Gimpidia" flagship GPUs were always overpriced and a wasted investment, especially the gimped 6GB 980 Ti, and there are already some games that eat even more VRAM, for example Lords of the Fallen eating 8 GB at 4K on a Titan X... so 6GB is not future-proof, and mark my words, DX12 games and the incoming UE4 games will be massive VRAM hogs. That's business: Pascal and Arctic Islands need to be sold... For 1080p and 2K the R9 390X 8GB is the best solution right now... I don't buy NVIDIA GPUs anymore, they are liars and scammers, and everybody knows about the multiple fiascos... I was planning on a 970, but after the 3.5 GB fiasco I gave up.. AMD is way better and fairer to the customer.

lol Lords of the Fallen? You mean Lords of the FAIL? Using a broken game as a benchmark doesn't really count... heck, Witcher 3 barely uses any VRAM and looks better. It's up to the game developer to make a game that doesn't suck and runs properly, which is seriously lacking in 2015. The Witcher 3 (after quite a few patches, that is.. mainly 1.07), GTA V, Mad Max and MGS V are the only true beacons of optimized performance I can think of off the top of my head that both look amazing and run amazingly.


Anywho? Right meow, for price-to-performance GPUs across the performance brackets? As it stands it goes...

GTX 950 beats R7 370
R9 380 beats GTX 960
R9 390 beats GTX 970
R9 390X really beats GTX 980
R9 Fury (air-cooled) sits in its own bracket
GTX 980 Ti is equal to the R9 Fury X... so whichever at the time has a lower price is the one to buy.

Besides, I doubt the AMD R9 Fury X with only 4GB of HBM will be obsolete by 2016. As for 4K UHD? DirectX 12 and Vulkan (if it ever gets released, that is) have multi-adapter support, which means VRAM can be shared between GPUs... so even if games start gobbling more than 4GB of VRAM at 4K UHD? People could just buy a second R9 Fury X and enjoy 8GB of total VRAM spread across the two GPUs... that is, as long as multi-adapter takes off like many people are hoping it does... it might only be certain game engines though.
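For anyone curious what "total VRAM across two GPUs" would even look like on their own box, here's a small C++/DXGI sketch (my own illustration, assuming the Windows SDK, not anything from a shipped engine) that just enumerates the adapters Windows sees and sums their dedicated VRAM. Whether a Dx12 or Vulkan title actually spreads resources across those pools is entirely up to its engine:

```cpp
// Minimal sketch, assuming the Windows SDK (link against dxgi.lib).
// Lists every hardware adapter DXGI exposes and sums their dedicated VRAM --
// the pools an explicit multi-adapter engine could, in principle, spread
// resources across. Plain AFR-style SLI/CrossFire still mirrors data instead.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    unsigned long long totalMiB = 0;
    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (!(desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)) {   // skip WARP/software adapters
            unsigned long long mib = desc.DedicatedVideoMemory >> 20;
            wprintf(L"Adapter %u: %s, %llu MiB dedicated VRAM\n", i, desc.Description, mib);
            totalMiB += mib;
        }
        adapter->Release();
    }
    wprintf(L"Total dedicated VRAM across adapters: %llu MiB\n", totalMiB);
    factory->Release();
    return 0;
}
```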

Also, the GTX 970? Yes, NVIDIA did mess that card up. They should have just called it a 3.5GB GPU. That partitioning thing was silly as heck. Some people claim it doesn't affect their performance when they go past about 3.5GB, some say it does... so they should have played it safe and called it a 3.5GB card from the get-go, or not even bothered with that 500MB partition. It's still a good-performing card.. but with the R9 390 in the same price bracket and overclocking slightly better? It just makes sense, at least for overclockers, to get the R9 390.

I'm not pro-NVIDIA or pro-AMD... I just buy whatever is most cost-effective for my needs when it comes to PC gaming tbh.
 
Nice post, man... I am waiting for AMD's next GPU, Arctic Islands, that will be interesting... I am sure the next generation will be a monster GPU... Greetings.
 
Heyyo,

Nice post, man... I am waiting for AMD's next GPU, Arctic Islands, that will be interesting... I am sure the next generation will be a monster GPU... Greetings.

No doubt! It will definitely be interesting to see what happens with AMD's Arctic Islands and NVIDIA's Pascal. 2016 should be the rise of DirectX 12... so I'm curious whether both AMD and NVIDIA will ship their upcoming GPU architectures with all three tiers of Dx12 support, since right meow neither is perfect. It'll also be epic if it's true that the HBM stock issues will be resolved, which would be a win for all consumers, as HBM2-based GPUs probably won't suffer from the stock limitations and price hikes that the R9 Fury X currently suffers from a bit... go-go better manufacturing technology! :)

Either way? Dx12 will be a better life cycle for AMD, since they've always struggled a little with driver overhead in Dx11, and even before that back in the ATi days too. Cutting the CPU overhead in the drivers definitely helps unlock the full potential of AMD GPUs, as seen in the Dx12 benchmarks.

The same can be said for their CPUs, which have also suffered in the past from CPU-bound draw calls and Dx11's limited multi-threading, which relied heavily on a primary thread to dole out the work. Owners of AMD FX CPUs should see a healthy boost, which leads me to believe that AMD Zen could indeed bring AMD CPUs back into enthusiast gaming PCs. Still, I'm gonna sit back and ponder and read and observe, since I don't want to consider upgrading my PC unless the benefits outweigh the cost difference... but damn, it would be sick if AMD CPUs could come back to the glory days of the Athlon 64, which seriously challenged the Pentium 4. We'd probably see CPU price drops from both Intel and AMD, and a very competitive CPU market again.
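That "one primary thread doling out the work" point is the crux of it. As a rough illustration of how Dx12 changes that (just a minimal sketch assuming a Dx12-capable GPU and the Windows 10 SDK, not any particular engine's code), each worker thread records into its own command list and only the final submission is serialized on the queue:

```cpp
// Minimal sketch, assuming a Dx12-capable GPU/driver and the Windows 10 SDK
// (link against d3d12.lib). Each worker thread records into its OWN command
// list, so no single "immediate context" thread becomes the CPU bottleneck.
// Error handling, fencing and cleanup are omitted for brevity.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device))) {
        printf("No Dx12-capable device/driver found.\n");
        return 1;
    }

    D3D12_COMMAND_QUEUE_DESC qdesc = {};            // direct queue, default priority
    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&qdesc, __uuidof(ID3D12CommandQueue), (void**)&queue);

    const int kThreads = 4;
    std::vector<ID3D12CommandAllocator*> allocs(kThreads);
    std::vector<ID3D12GraphicsCommandList*> lists(kThreads);
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       __uuidof(ID3D12CommandAllocator), (void**)&allocs[i]);
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, allocs[i], nullptr,
                                  __uuidof(ID3D12GraphicsCommandList), (void**)&lists[i]);
    }

    // Parallel recording: a real engine would issue draw/dispatch calls here.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&lists, i] { lists[i]->Close(); });
    for (auto& w : workers) w.join();

    // Submission is still serialized on one queue, but the recording was parallel.
    std::vector<ID3D12CommandList*> submit(lists.begin(), lists.end());
    queue->ExecuteCommandLists((UINT)submit.size(), submit.data());
    printf("Recorded %d command lists on %d threads and submitted them.\n",
           kThreads, kThreads);
    return 0;
}
```

Dx11 deferred contexts tried something similar, but most of the heavy lifting still landed back on the immediate-context thread, which is part of why AMD in particular stands to gain from the new model.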
 
Heyyo,



No doubt! It will definitely be interesting to see what happens with AMD's Arctic Islands and NVIDIA's Pascal. 2016 should be the rise of DirectX 12... so I'm curious whether both AMD and NVIDIA will ship their upcoming GPU architectures with all three tiers of Dx12 support, since right meow neither is perfect. It'll also be epic if it's true that the HBM stock issues will be resolved, which would be a win for all consumers, as HBM2-based GPUs probably won't suffer from the stock limitations and price hikes that the R9 Fury X currently suffers from a bit... go-go better manufacturing technology! :)

Either way? Dx12 will be a better life cycle for AMD, since they've always struggled a little with driver overhead in Dx11, and even before that back in the ATi days too. Cutting the CPU overhead in the drivers definitely helps unlock the full potential of AMD GPUs, as seen in the Dx12 benchmarks.

The same can be said for their CPUs, which have also suffered in the past from CPU-bound draw calls and Dx11's limited multi-threading, which relied heavily on a primary thread to dole out the work. Owners of AMD FX CPUs should see a healthy boost, which leads me to believe that AMD Zen could indeed bring AMD CPUs back into enthusiast gaming PCs. Still, I'm gonna sit back and ponder and read and observe, since I don't want to consider upgrading my PC unless the benefits outweigh the cost difference... but damn, it would be sick if AMD CPUs could come back to the glory days of the Athlon 64, which seriously challenged the Pentium 4. We'd probably see CPU price drops from both Intel and AMD, and a very competitive CPU market again.


To be sincere, I am bored with PC gaming... I have been playing more on my old Xbox 360 these days; it runs cool and works great, Jasper edition... PC graphics need to advance, we are tired of DX11, and since Crysis in 2007 there has been no real step forward in graphics... The Witcher 3 was downgraded compared to the videos shown before the game was released.., Crysis 3 is meh,, a game really needs a great story and graphics to keep me focused these days... but maybe all those years of gaming have caused some saturation.. DX12 is interesting and I hope we see some serious new effects and more realistic games... I am still using my ancient i5 760 from 2010, overclocked to 4 GHz now, and it is a great CPU for gaming... DX12 makes me happy, it will prolong its life.. AMD Arctic Islands with HBM2 will be interesting. Have a nice day.
 