Nvidia release G-Sync video to show off their new tech

WYP

News Guru
Since the unveiling of Nvidia G-Sync in October, gamers have wanted to see it in action and reviewers have scratched their heads wondering how to show it.



Today Nvidia have released a video showing off how their new G-Sync technology works, allowing us to finally get a glimpse of it ourselves.

Sadly this video is not on YouTube for me to simply post, due to the requirement that it must be played at 60 FPS to correctly show how G-Sync operates, but it is available via direct download HERE via MEGA. The video delves into the pros and cons of V-Sync and how G-Sync eliminates these issues (mainly tearing).
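For anyone who can't grab the download straight away, here's a rough toy calculation of the timing problem the video covers (my own illustration in Python with made-up frame times, not Nvidia's figures): with V-Sync on a fixed 60 Hz panel, a finished frame has to wait for the next refresh tick, while a variable-refresh scheme can show it the moment it's ready.

```python
import math

REFRESH_MS = 1000 / 60                 # fixed 60 Hz panel: a refresh tick every ~16.7 ms
frame_times_ms = [14, 20, 17, 25, 16]  # made-up, uneven GPU frame times

finished = 0.0
for ft in frame_times_ms:
    finished += ft
    # V-Sync on: the completed frame is held until the next refresh boundary,
    # so any frame slower than ~16.7 ms slips a whole refresh and shows up as stutter.
    vsync_shown = math.ceil(finished / REFRESH_MS) * REFRESH_MS
    # Variable refresh (the G-Sync idea): the panel refreshes as soon as the frame is done.
    print(f"frame ready at {finished:5.1f} ms -> V-Sync shows it at {vsync_shown:5.1f} ms, "
          f"variable refresh at {finished:5.1f} ms")
```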



What do you guys think of Nvidia's G-Sync technology? I believe G-Sync or another similar tech will be the norm eventually; the question is whether Nvidia will be the ones to do it.

Please comment below

EDIT: I have found the video on YouTube. While YouTube has undoubtedly degraded the quality and lowered the framerate, you will still be able to hear what Nvidia have to say about G-Sync.

PLEASE DOWNLOAD THE VIDEO IF MAKING A QUALITY COMPARISON BETWEEN G-SYNC ON AND OFF!!


Source - PCPER
 
G-Sync's gonna be awesome, but I bought my 1440p monitor recently so no new one soon...
All I want is a 4K 120 Hz IPS G-Sync monitor for $1000, lol
 
Just downloaded the vid and have been following this here and there where I can. I personally love the idea of the tech and am almost certain it'll be implemented into future TVs etc. if Nvidia can generate enough buzz for it. I will hold out on buying a 1440p monitor (my next step up) until G-Sync is released, but whether I get one anytime soon will be very dependent on price premiums.
 
Whilst it looks cool and shows promise, the fact it's limited to Nvidia cards only is going to limit how much it'll take off, I'd say.

Nvidia are notorious for limiting stuff to their products and their products only. Remember the days where you needed an nVidia chipset for SLI, for instance? There are also features like PhysX and CUDA, both of which never really took off, partly because they're limited to nVidia only.

Personally, if I buy a monitor I'm going to want it to last for 3-4+ years and not have features that are locked to one GPU manufacturer only.

I also wonder how much it's going to push up the price of the monitor?
 
Whilst it looks cool and shows promise, the fact it's limited to Nvidia cards only is going to limit how much it'll take off, I'd say.

Nvidia are notorious for limiting stuff to their products and their products only. Remember the days where you needed an nVidia chipset for SLI, for instance? There are also features like PhysX and CUDA, both of which never really took off, partly because they're limited to nVidia only.

Personally, if I buy a monitor I'm going to want it to last for 3-4+ years and not have features that are locked to one GPU manufacturer only.

I also wonder how much it's going to push up the price of the monitor?

Very valid point. Are they shooting themselves in the foot with what could be revolutionary tech, and thus pushing another party to find a different solution to the same problem, one which doesn't break patents and is open to licensing with whoever?

Just think: if Nvidia licensed this tech to everyone, it would be money in their pocket that can go into R&D to stay on top of the GPU game.
 
Cheers for posting this. Watched the video and I must say I'm impressed.

The problem is backwards compatibility. Would I give up my Dell U3011 for this? All those people with more than one monitor, would they? As has been said before, tie yourself down to just one GPU manufacturer?

If it were an inexpensive upgrade kit for my existing monitor, I'd definitely get it. I don't think it will be, due to the care that has to be taken to open up a monitor.

However, isn't the monitor industry missing something? Why should this be based solely on Nvidia cards? If the monitor's refresh rate is governed by some internal IC, can't they design something that detects the incoming fps and refreshes the screen accordingly? It can't be that hard. It might cause some lag, but a fast circuit should minimise this to indiscernible levels.
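That's essentially the pitch: drive the scan-out from frame arrival rather than from a fixed timer. As a purely conceptual sketch (nothing to do with how the actual G-Sync module is built, and the 30 Hz fallback floor is my own assumption), the controller logic would look something like this:

```python
MAX_WAIT_MS = 1000 / 30   # assumed floor: never leave the panel unrefreshed longer than a 30 Hz period

def simulate(frame_times_ms):
    """Toy model: refresh when a new frame arrives, with a timed fallback refresh."""
    clock = 0.0
    last_refresh = 0.0
    for ft in frame_times_ms:
        clock += ft
        # If the GPU has been slow, re-display the previous frame at the
        # fallback rate so the panel never goes unrefreshed for too long.
        while clock - last_refresh > MAX_WAIT_MS:
            last_refresh += MAX_WAIT_MS
            print(f"{last_refresh:6.1f} ms: repeat previous frame (fallback refresh)")
        # New frame ready: refresh right now instead of waiting for a fixed tick.
        last_refresh = clock
        print(f"{clock:6.1f} ms: refresh with the new frame")

simulate([12, 18, 45, 16, 70, 15])   # made-up frame times in milliseconds
```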

It's an interesting space to watch for sure.
 
I have posted the video in the OP for the benefit of those who wish to quickly view/listen to it.

Please be advised that the YouTube video WILL NOT offer a good comparison between G-Sync on and G-Sync off, due to YouTube degrading the quality for the stream.
 
People should check out the latest WAN Show with Linus; he has Anand from AnandTech on air talking about G-Sync, and they both seem to agree that G-Sync is at its best up to 60 fps.

Beyond that it offers very small improvements, so if you are running a 120 Hz monitor you will not see a huge increase. And as far as I know, the only current monitor to support the DIY kit is an ASUS 120 Hz monitor (haven't checked up on it), and the price was rumoured to be around 100-150 dollars back in October.
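Some back-of-the-envelope arithmetic (mine, not from the WAN Show) shows why the gain shrinks on high-refresh panels: with a fixed refresh, the worst-case wait for a finished frame is one full refresh period, and that period is already small at 120 Hz.

```python
# Worst-case extra wait for a completed frame on a fixed-refresh panel with V-Sync on.
for refresh_hz in (60, 120, 144):
    period_ms = 1000 / refresh_hz
    print(f"{refresh_hz:>3} Hz panel: refresh period {period_ms:.1f} ms, "
          f"worst-case added wait ~{period_ms:.1f} ms")
# 60 Hz -> ~16.7 ms, 120 Hz -> ~8.3 ms, 144 Hz -> ~6.9 ms: far less left for G-Sync to remove.
```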
 
Read a review on this G-Sync and it's meant to be very good, but at the same time it's not a must...but if you do happen to see one then you'll probably fall in love and never go back.
 
I'm very excited by this, since I will hopefully be getting a GTX 800 series card next year. I'm not too concerned with the issue of licensing it to AMD; I think long term they probably will license it. I hope so, because this is a piece of tech that all GPUs should be able to use in future.
 
This video has just highlighted to me that this is staying smack bang at the top of my shopping list. Can't wait for it :)

Such a shame if it stays nVidia only, but to those saying PhysX didn't do well: they didn't invent it, they bought Ageia (still got one of their cards), and there are PLENTY of PhysX games. As for CUDA, I have one thing to say: Adobe Suite...

It would be great if it did get to the red team as well. It sounds like it should be something that becomes mandatory, because it makes 100x more sense refreshing the monitor at the speed of the incoming data than how it's done now. This needs to be a game-changing STANDARD, not something for them both to compete over.
 
It would be great if it did get to the red team as well. It sounds like it should be something that becomes mandatory, because it makes 100x more sense refreshing the monitor at the speed of the incoming data than how it's done now. This needs to be a game-changing STANDARD, not something for them both to compete over.

As much as I agree with you, you still have to keep in mind that this is ALL business to them. Let's face it, they don't really care about us if we don't buy their product. If I were Nvidia I would keep G-Sync exclusive to myself and patent it, so those who want it won't have a choice of red or green; it will only be a question of which shade of green!
 
This video has just highlighted to me that this is staying smack bang at the top of my shopping list. Can't wait for it :)

Such a shame if it stays nVidia only, but to those saying PhysX didn't do well: they didn't invent it, they bought Ageia (still got one of their cards), and there are PLENTY of PhysX games. As for CUDA, I have one thing to say: Adobe Suite...

It would be great if it did get to the red team as well. It sounds like it should be something that becomes mandatory, because it makes 100x more sense refreshing the monitor at the speed of the incoming data than how it's done now. This needs to be a game-changing STANDARD, not something for them both to compete over.

A standard like this will be the future; someday we will laugh at the idea of V-Sync and tearing (I hope).

G-Sync would be very beneficial to AMD-based PCs and the consoles; if Nvidia were to licence it out for those uses it should be very popular.

I reckon that Nvidia will keep this to themselves for a while though, and get a few more enthusiasts into their camp before releasing it to a wider pool of people.

A little off topic here, but do any of you guys notice that in Assassin's Creed 4: Black Flag there is tonnes of screen tearing and the AA options are very bad/super demanding? And that both of these problems are, for whatever reason, fixed by Nvidia's G-Sync and TXAA? I know Nvidia sponsor the game, but it looks like it was made to intentionally have those faults to help Nvidia leverage their exclusive tech.
 
Where I read the review he said it would be the future and we'd start seeing it in mobile devices, tablets etc. Although I don't know if there is tearing in mobile games...
 
Can't say I have EVER noticed that "jitter" that is quite clearly apparent in the V-SYNC vs G-SYNC section when I have V-SYNC turned on.

This video looks like marketing BS to me.

I think G-SYNC will be useful and important yeah, but I think they are overhyping it quite a lot. I also imagine that if nVidia don't make it an open technology (which of course they won't, since nVidia hates open technology) then it will fail just as much as PhysX and CUDA etc. do. If this is the case, then I know for a fact that some other company such as Samsung or LG will create a similar technology and license it to all, meaning nVidia will lose out hugely.
 
Can't say I have EVER noticed that "jitter" that is quite clearly apparent in the V-SYNC vs G-SYNC section when I have V-SYNC turned on.

This video looks like marketing BS to me.

Showing some ignorance here I'm afraid; that effect has been prevalent in games for years.


I think G-SYNC will be useful and important yeah, but I think they are overhyping it quite a lot.

I think you might be eating your hat when this stuff hits mass-market

I also imagine that if nVidia don't make it an open technology (which of course they won't, since nVidia hates open technology) then it will fail just as much as PhysX and CUDA etc. do. If this is the case, then I know for a fact that some other company such as Samsung or LG will create a similar technology and license it to all, meaning nVidia will lose out hugely.

Yes again, PhysX and CUDA were such a fail... Adobe didn't write it into their rendering engine at all, and PhysX... pssh... I mean it's such a flop, right, not being used in 60+ recent games including The Witcher 3, Hawken, PlanetSide 2, the Unreal Engine in all its current and future forms from version 3 onwards, Warframe, BioShock Infinite, Borderlands 2, Batman...

Clearly a fail /s


If this tech hits TVs, and goes open-source enough for AMD to utilise in consoles, gaming on TVs and consoles will never be the same. Needing to push only 35+ frames instead of a solid 60 for smooth gameplay will bring HUGE benefits to the console market. Mark my words, if this gets introduced into consoles... it will BOOM.

It really does seem like it could easily be a new standard for a GPU to talk to a display...
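To put a rough number on the 35+ fps point (my own worked example, assuming a 60 Hz TV, not anything Nvidia has published): at a steady 40 fps with V-Sync, frames can only appear on 16.7 ms refresh ticks, so their on-screen times alternate between 16.7 ms and 33.3 ms and the motion judders, whereas a variable-refresh panel would show every frame for an even 25 ms.

```python
import math

REFRESH_MS = 1000 / 60   # fixed 60 Hz TV
FRAME_MS = 1000 / 40     # hypothetical steady 40 fps from a console GPU

prev_shown = 0.0
for i in range(1, 7):
    ready = i * FRAME_MS
    # V-Sync: wait for the next refresh tick (tiny epsilon guards against float
    # rounding when a frame lands exactly on a tick).
    shown = math.ceil(ready / REFRESH_MS - 1e-9) * REFRESH_MS
    print(f"frame {i}: on screen for {shown - prev_shown:5.1f} ms under V-Sync "
          f"(vs an even {FRAME_MS:.1f} ms with variable refresh)")
    prev_shown = shown
```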
 
Where I read the review he said it would be the future and we'd start seeing it in mobile devices, tablets etc. Although I don't know if there is tearing in mobile games...

I don't think you can even turn V-Sync off on iOS; that's not going to happen. But on Android I know you can turn it off with root.

Just been looking around the system on my HTC One X and it seems like I can turn off V-Sync; might give it a go :)


I watched the video and I know it's going to be awesome :P It means Nvidia can sit back, and even if AMD brings out more powerful cards they can just go "ours will be smoother at all fps".
 
Showing some ignorance here I'm afraid; that effect has been prevalent in games for years.

I have never seen it myself whilst playing games. I have seen many reports on it, and many other people complain about it. In my personal experience, however, I have never seen it. I was not being ignorant; I know this problem exists, I simply meant that I myself have never experienced it.

I think you might be eating your hat when this stuff hits mass-market

No, I WANT it to hit mass market; this technology looks good. Whilst not absolutely groundbreaking, I do very much agree the concept is very good.

Yes again, PhysX and CUDA were such a fail... Adobe didn't write it into their rendering engine at all, and PhysX... pssh... I mean it's such a flop, right, not being used in 60+ recent games including The Witcher 3, Hawken, PlanetSide 2, the Unreal Engine in all its current and future forms from version 3 onwards, Warframe, BioShock Infinite, Borderlands 2, Batman...

Clearly a fail /s

You mean in the same way OpenCL and DirectCompute are also in those engines, in every single one of those games, and in much, much more? Not to mention much more versatile? Just because it has a presence does not mean it is doing well. In comparison to its competitors it is doing terribly.

Would you say that Google+ is doing well in comparison to Facebook?

If this tech hits TVs, and goes open-source enough for AMD to utilise in consoles, gaming on TVs and consoles will never be the same. Needing to push only 35+ frames instead of a solid 60 for smooth gameplay will bring HUGE benefits to the console market. Mark my words, if this gets introduced into consoles... it will BOOM.

It really does seem like it could easily be a new standard for a GPU to talk to a display...

I agree; however, this will never happen unless we are lucky, as nVidia is completely anti open technology, which is what will ruin it.
 