Nvidia release G-Sync video to show off their new Tech

Watching the video, it looks very very nice, but I just hope the monitors aren't going to be much more expensive.
 
Yes, again, PhysX and CUDA were such a fail... Adobe didn't write CUDA into their rendering engine at all, and PhysX... pssh... I mean, it's such a flop, right? Only used in 60+ recent games including The Witcher 3, Hawken, PlanetSide 2, the Unreal Engine in all its current and future forms from version 3 onwards, Warframe, BioShock Infinite, Borderlands 2, Batman...

Wasn't BioShock an AMD bundle game? I honestly didn't know about the PhysX, and I love that game :3. Another notable recent release that uses PhysX is Farming Simulator, along with almost all the other "Giants Engine" simulators. That is the only game I've played that felt more complete with PhysX.

I think PhysX is going to be used less and less now due to the AMD-powered consoles, and because only very few games with PhysX actually make any decent use of it. It says a lot about how much of a "success" it has been that many people consider it a failure and only the few games that have it parade it around. The same goes for AMD's idiotic TressFX, except they'll try to include it more and more, but you won't really notice it, so hopefully it'll die a quiet death as well. I hate any red/green-only technology (except CUDA). It's a shame that the whole G-Sync thing was created by Nvidia and not by a monitor company, as that would make it much more likely for the technology to be licensed around and be better for everyone.

Didn't Adobe originally support only Apple products as well?
 
Tom needs to get his mitts on one; he'll tell it how it is. I couldn't be bothered fitting the G-Sync upgrade module (sure you've all heard of it) to a monitor though; I'm sure if you opened up your screen it would void the warranty.
 
Incorrect, Tom needs to send one (or three, y'know whatever) to me! :)
 
I did plenty of reading up on this last week, including watching the 1hr Montreal initial release video, and I wasn't that impressed by the end; very overhyped nVidia marketing. I can only see this appealing to a small minority; the rest of us have very evenly matched hardware (graphics card -> monitor) that performs perfectly well already.

Isn't this why we have benchmarks, i.e. 3DMark or Unigine, so we know what our hardware is capable of running? That way we know our monitors will support the fps -> Hz rate.

In the beginning nVidia had all the monitor manufacturers (Acer, AOC, Asus, HP, LG, Dell, Samsung) on board, but 4 years later only Asus remains.... WHY???

Still not sold on this.... yet.
 
Do we actually know that ASUS is the only vendor? 'Cos I'm pretty sure they've said that ASUS is not going to be the only vendor of G-Sync, and that they just happen to be the first to market.
 
It's all on the Montreal video.


Also read somewhere that if you are sporting a 144Hz monitor then G-Sync is kinda pointless, as they are already perfectly adapted to sync at the GPU output ratio..

Really need to see more in depth details..
 
That's crazy. I just had a read elsewhere and it appears Nvidia have signed a deal with ASUS that will make G-Sync an ASUS exclusive till late 2014.

So the tech is already Nvidia-only compatible and now it's ASUS-only too. I feel like Nvidia are shooting themselves in the foot, and we as consumers will likely have to pay silly prices to buy one before ASUS's competitors get their hands on the tech. Unless the launch prices turn out to be pretty reasonable, I think I'll wait till 2015 to upgrade my monitor and hope that G-Sync hasn't died by then.
 
Nah, that was outdated info. BenQ and two other brands will also support G-Sync. Also, at high fps it will provide greater improvement; otherwise why would Nvidia release technology designed only to cater for the mid/low-end market? It's great that low-end cards can enjoy higher detail, but according to the review over at Guru3D, at higher fps it's just fantastic. The reviewer did state, though, that while 30Hz was enjoyable, it was still nothing quite like playing at 60/120Hz. Never saw the point of 144Hz though, since if I remember correctly, LightBoost is much better at 120Hz.

Edit: I'll try and find the source of the info about the multiple brands, but they did confirm that ASUS would not have the monopoly here.
 
They need a box you plug your monitor into that then goes to your video card, so we don't have to buy new monitors.
 
Even if you have 144Hz and dual Titans/780s pushing the frames: if you put V-sync on you get stutter, and if you turn it off you still have potential tearing, because the monitor is STILL fixed-rate refresh and the GPU is still not synchronous with it... even at 60+ fps this will still make things better.
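To put some made-up numbers on that (just a quick Python toy, the frame times are assumed, nothing measured): even with the GPU averaging ~90 fps, a fixed 144Hz panel only ever flips on its own tick, so the on-screen frame pacing jumps around, while a variable-refresh panel just follows the GPU.

```python
import math
import random

random.seed(0)
REFRESH = 1000.0 / 144                    # fixed refresh interval in ms (144 Hz panel)
# Jittery GPU frame times averaging ~90 fps (assumed numbers for illustration)
frame_times = [1000.0 / 90 * random.uniform(0.85, 1.15) for _ in range(20)]

def next_vblank(t):
    """First refresh tick at or after time t (ms)."""
    return math.ceil(t / REFRESH) * REFRESH

t = 0.0
vsync_shown, gsync_shown = [], []
for ft in frame_times:
    t += ft                               # moment the GPU finishes the frame
    vsync_shown.append(next_vblank(t))    # fixed refresh: frame waits for the next tick
    gsync_shown.append(t)                 # variable refresh: panel refreshes right away

def intervals(ts):
    return [round(b - a, 2) for a, b in zip(ts, ts[1:])]

print("GPU frame times (ms):      ", [round(ft, 2) for ft in frame_times[1:]])
print("G-Sync on-screen intervals:", intervals(gsync_shown))  # matches the GPU frame times
print("V-sync on-screen intervals:", intervals(vsync_shown))  # bounces between multiples of 6.94 ms
```

Same GPU output both times; only the panel behaviour differs, which is the whole point of the tech.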
 
Yes, again, PhysX and CUDA were such a fail... Adobe didn't write CUDA into their rendering engine at all, and PhysX... pssh... I mean, it's such a flop, right? Only used in 60+ recent games including The Witcher 3, Hawken, PlanetSide 2, the Unreal Engine in all its current and future forms from version 3 onwards, Warframe, BioShock Infinite, Borderlands 2, Batman...


If this tech hits TVs, and goes open-source enough for AMD to utilise it in consoles, gaming on TVs and consoles will never be the same; needing to push only 35+ frames instead of a solid 60 for smooth gameplay will bring HUGE benefits to the console market. Mark my words, if this gets introduced into consoles... it will BOOM.

It really does seem like it could easily be a new standard for a gpu to talk to a display...

Regarding the bold: to be fair, The Witcher 3 is not out yet... we can't say it will use PhysX (unless the devs have said so).

Now, it will make a bigger difference to consoles IMO than to PC. The reason being that any high-end rig can hit 60fps pretty consistently at very high graphics settings, while consoles, on the other hand, are limited over time. Instead of wasting 50% of the power budget to go from 30 to 60fps, they could easily set it to 30, 31, 32, 33, 34, etc., or whatever they want. This would allow them to get the best visuals at any refresh rate they want, which will allow more freedom and push visuals on multiplat games.
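To illustrate the budget maths (rough sketch, the fps targets are just assumed examples): dropping only a little below 60 frees up a surprising amount of frame time for visuals, and with a variable refresh those in-between targets actually become usable instead of having to fall all the way back to 30.

```python
# Rough frame-budget arithmetic (assumed targets, not actual console specs):
# on a fixed 30/60Hz pipeline the only realistic budgets are 33.3 ms or 16.7 ms,
# but with a variable refresh any target in between becomes usable.
for fps in (30, 35, 40, 45, 50, 60):
    budget_ms = 1000.0 / fps
    vs_60 = budget_ms / (1000.0 / 60)
    print(f"{fps:>2} fps -> {budget_ms:5.1f} ms per frame ({vs_60:.2f}x the 60 fps budget)")
```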

Playing at lower refresh rates will still be noticeable compared to 60, but it will look better and also feel smoother, which distracts the eyes from noticing the bigger difference; most console players are used to slower-paced games anyway.
 
Even if you have 144Hz and dual Titans/780s pushing the frames: if you put V-sync on you get stutter, and if you turn it off you still have potential tearing, because the monitor is STILL fixed-rate refresh and the GPU is still not synchronous with it... even at 60+ fps this will still make things better.

Yes, but at 144Hz the chance of tearing is so incredibly low that it's not that big of a deal really.
 
They need a box you plug your monitor into that then goes to your video card, so we don't have to buy new monitors.

That's not true, you have two options.

1. Buy the G-Sync self-install module when it's released; you don't just sit it on your desk, you open up your monitor's back panel and fit it (obviously only certain monitors will allow it).

2. Buy a G-Sync-ready monitor; I think the ASUS is supposed to be about £400.

Edit: I read your comment wrong, you were just saying it would be a good idea if...oh well, maybe the module will be of interest to you?
 
Not if you're playing a Source engine game xD.

Exactly... there are other games that, even on a 120~144Hz screen, still tear/stutter due to forced frame rates, V-sync or other issues.

Having the monitor refresh when the card does just makes pure, common sense... it's a shame a screen manufacturer didn't come up with this.
 
I'd get G-Sync if it wasn't Nvidia-only. I play Team Fortress 2, and while it's not the most serious game out there, you need very high and very stable frame rates (one of my friends is a very highly rated sniper, has a Titan, and still plays with an FPS config to keep the frames at a stable 200+ at all times), and the screen tearing is pretty bum. If you enable V-sync, aiming suddenly becomes incredibly hard as the mouse constantly decelerates/accelerates. Anyone who plays Counter-Strike will probably know what I'm talking about, as I believe it's the same issue.
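Roughly what's going on there, I think (toy numbers, not an actual TF2 measurement): your hand moves the mouse at a constant speed, but when V-sync makes the odd frame miss its 60Hz tick, the on-screen step per frame doubles, which reads as the sensitivity speeding up and slowing down.

```python
# Toy illustration (made-up numbers): the mouse moves at a perfectly constant
# speed, but when a V-sync'd frame misses its 60Hz tick the on-screen step
# jumps from ~16.7ms worth of movement to ~33.3ms worth, so the cursor/view
# feels like it's accelerating and decelerating even though your hand isn't.
MOUSE_SPEED = 0.5                                              # constant movement, counts per ms
frame_intervals = [16.7, 16.7, 33.3, 16.7, 16.7, 33.3, 16.7]   # two missed refresh ticks

for dt in frame_intervals:
    print(f"frame shown after {dt:4.1f} ms -> view moved {MOUSE_SPEED * dt:4.1f} counts")
```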
 
Download the video file; I think it shows it fairly well tbh. But yeah, so far it seems to be one of those "see it to believe it" things; once you've seen it... you probably won't accept anything less.
 