Yes again, PhysX and CUDA were such a fail... Adobe didn't write it into their rendering engine at all, and PhysX... pssh... I mean, it's such a flop, right? It's not like it's used in 60+ recent games including Witcher 3, Hawken, PlanetSide 2, the Unreal Engine in all its current and future forms from version 3 onwards, Warframe, BioShock Infinite, Borderlands 2, Batman...
Tom needs to get his mitts on one; he'll tell it how it is. I couldn't be bothered fitting the G-Sync upgrade module (sure you've all heard of it) to a monitor though; I'm sure opening up your screen would void the warranty.
I did plenty of reading up on this last week, including watching the 1hr Montreal initial release video, and I wasn't that impressed by the end; it felt like very over-hyped nVidia marketing. I can only see this appealing to a small minority; the rest of us have very evenly matched hardware (graphics card -> monitor) that performs perfectly well already.
Isn't this why we have benchmarks, i.e. 3DMark or Unigine, so we know what our hardware is capable of running at? That way we know our monitors will support the fps -> Hz rate.
In the beginning nVidia had all the monitor manufacturers (Acer, AOC, Asus, HP, LG, Dell, Samsung) onboard, but 4 years later only Asus remains... WHY???
Still not sold on this.... yet.
It's all on the Montreal video.
Also read somewhere that if you are sporting a 144Hz monitor then G-Sync is kinda pointless, as they are already well adapted to keep up with the GPU's output rate...
Really need to see more in-depth details...
That's crazy. I just had a read elsewhere and it appears Nvidia have signed a deal with ASUS that will make G-Sync an ASUS exclusive till late 2014.
So the tech is already Nvidia-only and now it's ASUS-only too. I feel like Nvidia are shooting themselves in the foot, and we as consumers will likely have to pay silly prices to buy one at any point before ASUS's competitors get their hands on the tech. I think unless the launch prices turn out to be pretty reasonable, I will wait till 2015 to upgrade my monitor and hope that G-Sync hasn't died by then.
If this tech hits TVs, and goes open-source enough for AMD to utilise it in consoles, gaming on TVs and consoles will never be the same. Needing to push only 35+ frames instead of a solid 60 for smooth gameplay will bring HUGE benefits to the console market. Mark my words, if this gets introduced into consoles... it will BOOM.
It really does seem like it could easily become a new standard for a GPU to talk to a display...
Even if you have a 144Hz monitor and dual Titans/780s pushing the frames: if you put V-sync on you get stutter, and if you turn it off you still have potential tearing, because the monitor is STILL refreshing at a fixed rate and the GPU is still not synchronous with it... even at 60+ fps this will still make things better.
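To put some very rough numbers on that, here's a toy Python sketch of the trade-off (the frame times are made up, and this is only a crude model of double-buffered V-sync, not how any actual driver or monitor works):

```python
import math

REFRESH_HZ = 60
INTERVAL = 1000.0 / REFRESH_HZ              # ~16.7 ms per refresh

# Hypothetical per-frame render times (ms) for a card hovering around 55-70 fps.
frame_times = [12.0, 18.0, 14.0, 20.0, 13.0, 17.0, 15.0, 19.0]

# V-sync ON (double buffered): a frame can only be shown on a refresh tick and
# the GPU stalls until the swap, so each frame effectively costs a whole number
# of refresh intervals -- 18 ms of work ends up on screen for 33.3 ms.
shown_for = [math.ceil(ft / INTERVAL) * INTERVAL for ft in frame_times]
stutter_frames = sum(1 for s in shown_for if s > INTERVAL)

# V-sync OFF: the swap happens the instant the frame is done, which is almost
# never aligned with the monitor's scanout, so the screen shows the top of one
# frame and the bottom of another -> tearing on most frames.
t, swap_offsets = 0.0, []
for ft in frame_times:
    t += ft
    swap_offsets.append(round(t % INTERVAL, 1))  # where in the scanout the swap lands

print("v-sync on : frames held for 2+ refreshes:", stutter_frames)
print("v-sync off: swap points into the scanout (ms):", swap_offsets)
```

The idea with G-Sync is that the monitor simply refreshes the moment each frame is ready, so neither of those failure modes comes up in the first place.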
They need a box you plug your monitor into that then connects to your video card, so we don't have to buy new monitors.
Yes, but at 144Hz the chance of tearing is so incredibly low that it's not that big of a deal really.
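Rough numbers behind that, in case anyone's curious (just the refresh intervals, nothing G-Sync specific): any tear that does happen can only sit on screen until the next refresh, so at higher rates it's gone much sooner.

```python
# Worst-case time a torn frame can remain visible at each refresh rate.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> refresh interval ~{1000.0 / hz:.1f} ms")
```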
Not if you're playing a Source engine game xD.
This is true; it's not something that can really be shown on YouTube. I think it needs to be experienced first-hand before you decide you want G-Sync.