Nvidia Kepler

Be sure to take all info, good or bad, with a grain of salt until official info and benchmarks are released. The card could be way better, or not even close to what they have here. In reality, with any big new release, people tend to overhype things and make their own charts of their own predictions.

As for official launch dates? You will see the first high-end cards at PAX, March 14-16th, like Nvidia launches every year. Then you will see the upgraded version six months after, and the dual-card solution again at PAX the following year.
 
Dunno, don't have BF3 and don't have a proper CPU either. But even if the difference is _that_ big, it would still be cherry-picking, as probably no one would get a 2560 screen with a 560 Ti.

I'm about to... but after that I'll buy a new GPU.
 
Only thing I would say about this is BF3 doesn't use PhysX.

Indeed, it has its own "physics" in the attached engine.

What we have to watch is the tendency to use the term PhysX for physics in general by those who don't know any better, and who, to be fair, aren't wholly interested beyond the fact that their games look good. PhysX just looks like a nice buzzword to use.

There are games that don't boast about using PhysX but use the libraries; these will use the CPU in all cases. I'm not exactly sure what this is all about. It may be some kind of weird licensing rule and the use of older versions, or it may be a way of easily getting the effects you want to work on the consoles too. Gears of War was a weird one.
 
Been wondering: any talk about the Kepler cards supporting Nvidia Surround on a single card (like the AMD equivalent currently does)? I'm really starting to get sold on the whole 3x24" monitor setup since I've been doing a lot of racing. Being able to run it on one card, to keep the wallet happy for a little while, would be a nice change of pace.
 
Games that utilize Nvidia's PhysX use an SDK supplied by Nvidia themselves, and they supply the SDK for all platforms. The PhysX library can run on both the CPU and the GPU (most likely configurable; AMD GPUs don't support running PhysX on the GPU).

Games that don't use PhysX use a pre-written physics engine like Havok (or a custom, handwritten one) that deals with the normal things related to real-world physics, and those are CPU-only.
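
For anyone curious what that configurability looks like in practice, here is a minimal scene-setup sketch against the later PhysX 3.x/4.x SDK (the 2.x SDK of this era named things differently, so treat the exact calls as assumptions); the thread count and gravity values are arbitrary placeholders:

    // Minimal PhysX scene setup (assumes the PhysX 3.x/4.x SDK; the
    // 2.x SDK used different names, e.g. NxPhysicsSDK).
    #include <PxPhysicsAPI.h>

    using namespace physx;

    static PxDefaultAllocator     gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main()
    {
        // Core SDK objects every PhysX title creates once at startup.
        PxFoundation* foundation =
            PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
        PxPhysics* physics =
            PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        // The scene description is where CPU vs. GPU simulation gets decided.
        PxSceneDesc sceneDesc(physics->getTolerancesScale());
        sceneDesc.gravity      = PxVec3(0.0f, -9.81f, 0.0f);
        sceneDesc.filterShader = PxDefaultSimulationFilterShader;

        // CPU path: a worker-thread dispatcher (2 threads is arbitrary).
        // This is the only path available on AMD cards.
        sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);

        // On a CUDA-capable Nvidia card the game could additionally attach
        // a CUDA context manager here to move simulation work onto the GPU.

        PxScene* scene = physics->createScene(sceneDesc);

        // Step the simulation once, as a 60 Hz game loop would.
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);

        scene->release();
        physics->release();
        foundation->release();
        return 0;
    }

The point of the sketch is simply that CPU vs. GPU execution is a per-scene setup decision made by the game, not something baked into the library.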
 
I can't wait for Kepler. One main point for me is power consumption at idle; even if it probably can't rival the Radeons' 7 watts (??) at idle, 28nm should decrease power consumption a fair bit. Hopefully. The juice is pricey these days in Sweden >.<
 
You have to download PhysX to run Metro 2033 on an ATi/AMD card. PhysX is a program, not a driver, so it runs on anything.
 
It was my understanding that PhysX was hardware-based, but in the absence of a PhysX-capable chip on the GPU, the PhysX driver will give the CPU all the PhysX work to do.

You can use modified drivers and stick an Nvidia card in your system to act just like the old Ageia cards (think I still have one in the cupboard, actually), and the Nvidia card will do all of your PhysX number crunching for you while your AMD cards do the rendering.

However, I did read that the guy who invented the Ageia PhysX card and engine left Nvidia and went to work for AMD.
 
Nah, they just turn it off in most games because they don't want you using an 'exclusive' feature alongside the competition's cards. You can easily enable PhysX; it just runs best on CUDA.
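
That fallback is visible in the SDK itself. As a rough sketch (again assuming the later PhysX 3.x/4.x API, with the exact signatures as assumptions since they vary by version), an application can probe for a usable CUDA context and quietly drop back to the CPU dispatcher when there isn't one:

    // Sketch: prefer GPU (CUDA) PhysX, fall back to the CPU when no usable
    // Nvidia card is present. Assumes PhysX 3.x/4.x again; the creation
    // call and scene fields vary between SDK versions.
    #include <PxPhysicsAPI.h>
    #include <gpu/PxGpu.h> // PxCreateCudaContextManager lives here in 4.x

    using namespace physx;

    void configureDispatch(PxFoundation& foundation, PxSceneDesc& sceneDesc)
    {
        // Always provide a CPU dispatcher: it is needed even when the GPU
        // does the heavy lifting, and it is the fallback on AMD hardware.
        sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);

        // Try to create a CUDA context; this only succeeds with an Nvidia
        // GPU and working CUDA drivers.
        PxCudaContextManagerDesc cudaDesc;
        PxCudaContextManager* cuda =
            PxCreateCudaContextManager(foundation, cudaDesc);

        if (cuda && cuda->contextIsValid())
        {
            // GPU path: hand the CUDA context to the scene and enable
            // GPU rigid-body dynamics.
            sceneDesc.cudaContextManager = cuda;
            sceneDesc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
        }
        else if (cuda)
        {
            // No usable context: release it and let the CPU dispatcher
            // above carry all the PhysX work instead.
            cuda->release();
        }
    }

Either way the game's physics still runs; it just runs faster when CUDA is available, which matches what people see in practice.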
 
[font="Verdana, sans-serif, FreeSans"]The new Kepler based GPU's are set to completely dominate their predecessor with a performance gain of 1.5-2x.
[/font][font="Verdana, sans-serif, FreeSans"]Sounds like the same statement is made for every generation.[/font]

[font="Verdana, sans-serif, FreeSans"][/font]

[font="Verdana, sans-serif, FreeSans"]To be honest, aren't we always being pulled up really high with these amazing figures?[/font][font="Verdana, sans-serif, FreeSans"] [/font]

[font="Verdana, sans-serif, FreeSans"]Then be slapped back into reality with actual results. Yet I'm still excited for this.[/font]

[font="Verdana, sans-serif, FreeSans"][/font]

[font="Verdana, sans-serif, FreeSans"]But I don't want my return to be filled with negativeness so all I can say is that I am very excited![/font]

[font="Verdana, sans-serif, FreeSans"]It's also nice to see people I recognise still here too. I digress.[/font]
 
[/font][font="Verdana, sans-serif, FreeSans"]Sounds like the same statement is made for every generation.[/font]

[font="Verdana, sans-serif, FreeSans"][/font]

[font="Verdana, sans-serif, FreeSans"]To be honest, aren't we always being pulled up really high with these amazing figures?[/font][font="Verdana, sans-serif, FreeSans"] [/font]

[font="Verdana, sans-serif, FreeSans"]Then be slapped back into reality with actual results. Yet I'm still excited for this.[/font]

[font="Verdana, sans-serif, FreeSans"][/font]

[font="Verdana, sans-serif, FreeSans"]But I don't want my return to be filled with negativeness so all I can say is that I am very excited![/font]

[font="Verdana, sans-serif, FreeSans"]It's also nice to see people I recognise still here too. I digress.[/font]

AMD said the same things this time round and has delivered so far. I would expect the same from Nvidia, as it's in large part the die shrink that's helping the gains.
 
A 1.5-2x performance increase over the current cards in raw numbers is likely, and I expect there'll be issues at first, with drivers and game code needing refinement to make it 2x better for gameplay too. With Fermi, last time I checked benchmark results, the GTX 480 was on par with the (dual-GPU) GTX 295; that's evidence it succeeded at being 2x better. I think microarchitecture study is kinda like biology: the pace of refinement keeps accelerating.
 
CES starts tomorrow. Nvidia's keynotes should be interesting; not sure if the Kepler stuff is public or private, but either way there should be some leaks. There always are.
 