nVidia enable hybrid PhysX in 257.15 drivers

It's been confirmed that this was a bug in the driver - they had simply forgotten to block the use of PhysX in hybrid setups. This is also the reason why they have pulled the drivers from their website again. However, if you can live with a beta driver, you can get PhysX in a hybrid setup with it.
 
name='Lallespasser' said:
It's been confirmed that this was a bug in the driver - they had simply forgotten to block the use of PhysX in hybrid setups. This is also the reason why they have pulled the drivers from their website again. However, if you can live with a beta driver, you can get PhysX in a hybrid setup with it.

If that's true, it's no mistake - just a PhysX marketing stunt.
 
You think it was deliberate, Tom? That's interesting - you may be right. nVidia have been known to use, let's say, unconventional tactics, but I think if they allowed it to be enabled it would actually help their sales. Let's face it, Fermi was too late, too hot, and too expensive, so now most enthusiasts are sporting ATI cards. What if they actually want PhysX too? Well, with this they could have it, and Nvidia might actually sell some cards that are profitable (read: cheaper). They could always re-enable the lock once their high end catches up. Shooting themselves in the foot again, in my opinion.

nVidia have confirmed that future WHQL drivers will not have this 'bug'.
 
I find Nvidia a strange company. At first they allowed an Nvidia card to do PhysX with an ATI card rendering, then they blocked it, then they said it was now physically impossible - and now they've proved that what they told us was a lie (we already knew that).
 
I think Tom may be right. They don't want to lose business in the PhysX department, so this might be some sort of covert driver update for those using the old PhysX drivers that allowed an ATI card to render. I still think PhysX is a waste. I think in about a year we will see some actual working proof of multi-hardware physics on OpenCL and DirectCompute (probably Havok will be first).
 
I'm due my 4890 back, so I'm going to try the PhysX hybrid mod and see what happens.

Put my 9800GTX to some good use.
 
Should just point out that an 8800GT wouldn't be up to the task. Batman suggests at least a 9800GTX.

A while back I did some experiments offloading PhysX onto other cards and they actually slowed my 280 to a crawl. This is the problem, tbh: if the card doing the physics is a lot slower than the main card, it just creates a bottleneck in the system.
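The bottleneck argument above can be sketched with a toy model (the millisecond figures here are made up purely for illustration): when rendering and physics run concurrently on separate cards, the whole frame has to wait for the slower of the two stages, so a weak dedicated PhysX card can end up pacing the entire system.

```python
# Toy pipeline model - hypothetical timings, just to show the arithmetic.

def frame_time_ms(render_ms, physics_ms):
    """Frame time when rendering and physics run concurrently on
    separate cards: the slower stage sets the pace."""
    return max(render_ms, physics_ms)

# Fast main card doing both jobs itself, one after the other:
alone = 12.0 + 4.0                     # 16 ms/frame, about 62 fps

# Same main card, physics offloaded to a much slower second card:
offloaded = frame_time_ms(12.0, 25.0)  # 25 ms/frame, 40 fps

print(alone, offloaded)
```

Under these assumed numbers the "upgrade" of adding a slow PhysX card actually costs frames, which matches the experience described above.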

This whole 'use an inferior card to handle your PhysX' thing is mainly snake oil. Just another gimmick like, for example, three 280s in SLI: by the time you have done all the scaling you actually lose performance.

PhysX is pretty good, but it's no match for all of the DX11 features. I ran Batman with it off and with it on and, tbh, I had trouble noticing much difference. A bit of steam blowing out of some holes here and there and (so I am told) a better-looking cape. However, tessellation will make for better-looking fabrics, so PhysX may even be phased out anyway.

PhysX and all this other crap are just ways for the GPU companies to try and reinvent graphics again and hold a sort of monopoly, like 3dfx did with their Voodoo cards and Glide. Things have progressed so much, though, that it would be super hard to get a stranglehold on the market like in those days. Nvidia are not alone in their quackery, and they have a strong enough competitor that won't go under from something as daft as PhysX. Seriously, Ageia went broke trying to push it out there (because we didn't want it, because it simply didn't do enough), so why do Nvidia think they can mass-market it?

Oh yeah, because of their brainwashing skills, bo staff skills and nunchuck skills.

Physx = poo. Gimme a half decent DX11 title any day.
 
Actually, the popular consensus is that an 8800GT/9800GT is plenty powerful if it is doing just PhysX (see the FluidMark benchies). But I agree that PhysX is dying out - there are too few games that support it, and given that a lot of enthusiasts will be buying a 5770 this time round (by far the best-value card), I think Nvidia will see a drop in popularity.
 
I had a great discussion on another forum. Apparently there's nothing PhysX does that can't be handed to the CPU.

Which makes perfect sense. ATI and Intel both gave up on GPU-accelerated Havok, yet Havok itself is in plenty of games. The last one I noticed it get a mention in was Just Cause 2 (on the load-up screen). So what's doing the Havok? Certainly not the GPU, so it's all been handed to the CPU.
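To illustrate the point that physics effects are just ordinary arithmetic a CPU can run, here is a minimal sketch of the kind of per-frame work a physics engine does - a semi-implicit Euler integration of particles falling under gravity. This is a toy, not anything from the PhysX or Havok SDKs, but it's the same category of computation:

```python
# Minimal particle update, the sort of loop a CPU physics engine runs
# each frame. Positions/velocities are heights (metres) on one axis.

def step(positions, velocities, dt, gravity=-9.81):
    """Semi-implicit Euler: update velocity first, then position."""
    new_vel = [v + gravity * dt for v in velocities]
    new_pos = [p + v * dt for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

pos, vel = [10.0, 20.0], [0.0, 0.0]   # two particles, starting at rest
for _ in range(60):                    # simulate one second at 60 steps/s
    pos, vel = step(pos, vel, 1.0 / 60.0)

print(pos, vel)
```

After one simulated second both particles have fallen roughly five metres and reached about -9.81 m/s - nothing here needs a GPU, which is the forum's point about Havok running fine on the CPU.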

PhysX is Nvidia's desperate attempt at holding a trump card over ATI, and that's exactly how they are marketing it. Sadly it's not up to much, and I certainly don't think it can rescue them either.

The only upcoming game that utilises it is Mafia II. But they would be stupid to make it a better game on Nvidia only - all that will happen is fewer people will buy the game.

See also Crysis 2 - apparently that is now headed for 3D. Thing is, I have done a lot of reading about 3D in gaming and that's also supposed to be a load of crap. The timing is all thrown out, making games pretty much impossible to play well, and everything looks dark because the 3D glasses are basically shaded like sunglasses.

Nvidia do like to waste money on crap, huh? Maybe if they concentrated on their GPU lineup, Fermi wouldn't be so hot, expensive, etc.
 
PhysX in UT3 was a bit of a laugh, but nothing incredible. GPUs can handle PhysX better than CPUs, but PhysX support is so scarce it's kind of pointless. I tried 3D gaming once and only once - tried it on GRID and couldn't work out where my turn-in point had gone; I lapped about 10 seconds slower.

It was a relief to take the glasses off. When they get glasses-free 3D gaming right, I will invest; at the mo, Eyefinity is the "new technology in graphics" that I'm excited by. I held out six months for Fermi, saw the benches and bought the 5970s the day after.
 