Graphics Card Decision

CrosairHero

New member
Hi all, I am in need of opinions and advice.

I am looking into getting a new graphics card. Currently I have a GTX 760, and I have decided to open up and look at both AMD and Nvidia. I am playing at 1920x1080 and I will be adding a second monitor down the line (I will not be using both for gaming).

So I am looking at the R9 290/290X and the Nvidia 780. I have been faithful to Nvidia, but I feel I need to look at other options.

I do not have the biggest budget in the world as I am saving my pennies.

I am looking for the biggest bang for my buck.

Games I play: Crysis 3, BF4, StarCraft 2, Assassin's Creed (all of them), Metro (both).

Any help is appreciated.....
 
Out of the cards you mentioned, the R9 290 is the best bang for your buck. It's also cheaper than a 780 and, more often than not, faster. Just get any good cooler on it and it should do wonders for you.
 
Well, I hear the Tonga GPU is promising if you're on a budget, though since you're already considering the 290(X) I guess there isn't much point.

Why not just keep the 760? These current cards aren't exactly great... the 760 is a great card. I like the 670 better, though.
 
Why not wait a month and see if the new graphics cards coming onto the market are a big upgrade from your 760? I am looking to upgrade my HD 7970, but by the end of October there will be newer products out; there are rumours of an R9 285X.
 
I am having a lot of problems with drivers. Not sure why, but I had first intended to run two 760s in SLI. I am still thinking about this config, as I am only gaming at 1080p and I'm not sure the higher-end cards are worth the money. But I am also considering waiting for the 900 series and seeing what they offer.
 
I recommend staying with what you have; unless you are running a benchmark you won't really see any difference between the cards you mentioned and the one you have. The only way you can tell whether you are running a game at 30 or 60 fps is to have an fps monitor on the screen, as the human eye cannot recognize anything over 20 or so. You will get smoother gameplay, but even that is negligible.

I'd wait for a new generation to go through the trials of release, updated drivers and stability fixes before I buy anything new.
 
I recommend staying with what you have; unless you are running a benchmark you won't really see any difference between the cards you mentioned and the one you have. The only way you can tell whether you are running a game at 30 or 60 fps is to have an fps monitor on the screen, as the human eye cannot recognize anything over 20 or so. You will get smoother gameplay, but even that is negligible.

Wow, did you REALLY say that? This is not the place to say that. Pretty much everyone here, me included, can see the difference. This is presuming you're not running a 30 Hz monitor.
http://30vs60.com
 
The only way you can tell whether you are running a game at 30 or 60 fps is to have an fps monitor on the screen, as the human eye cannot recognize anything over 20 or so. You will get smoother gameplay, but even that is negligible.

Wait, what? There is a huge difference between 30 and 60 fps. Also, the human eye doesn't see in fps.
 
I recommend staying with what you have; unless you are running a benchmark you won't really see any difference between the cards you mentioned and the one you have. The only way you can tell whether you are running a game at 30 or 60 fps is to have an fps monitor on the screen, as the human eye cannot recognize anything over 20 or so. You will get smoother gameplay, but even that is negligible.

I'd wait for a new generation to go through the trials of release, updated drivers and stability fixes before I buy anything new.

I think you got sidetracked by an offhand comment and missed the answer, so I bolded the important bits for you. :)

I am sure you can also tell the difference between 20,000 K and 20,001 K on the color temperature spectrum.
Motion pictures are filmed at 16 fps; do you see the flashes?
Human vision is not what you perceive: the image that is delivered to the brain is upside down and backwards, has a blind spot, and has lines running out from there where there is no vision at all. Most of what you perceive is added in post-processing by the brain, filling in what it thinks should be there; that's why magic works.

For the record, I mis-typed 30 where I intended to put 50, but it's all the same. The point is that a person cannot differentiate between small differences in fps with so many other variables in play, including but not limited to monitor refresh rate and calibration, ambient lighting, the effects of caffeine, tobacco, alcohol and other mind-altering drugs, state of mind, tiredness, and so many others that I'll not bother listing them all. And for the record, a person cannot tell the difference between printed material done at 300 dpi and 600 dpi at arm's length in normal ambient light. Can you see the difference through a loupe? Certainly.

Anyways, you are free to disagree with any or all of this as you choose; I still recommend the OP not bother with the card upgrade he suggested. That is just my personal opinion.

Good day.:)
 
If you're looking for a bit of a boost and a bit more power to run the newer games, then sure, SLI can help your system out. But don't forget that only the RAM of the lowest card gets used overall (the memory is mirrored across cards, not added together); 'tis the question I'm having with my 780 atm.
 
You are comparing a 100% increase (from 30 to 60) to a 0.005% increase (from 20,000 to 20,001); obviously you're going to see much less difference.

The second claim is false. Movies and TV shows are shot at a minimum of 24 fps, sometimes 25 or 30. Obviously that's still not very high, but once again, 24 is quite a difference (50% more) over 16 fps. The only exception is hand-drawn animation, which is still often displayed at 24 fps, but with every second frame a duplicate of the one before.

24 fps was chosen because it's easy to work with, and somehow quite comfortable on the eye for such a low frame rate (Google how that works), even compared to 23 or 26 fps.
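
To make the "every second frame is a duplicate" point a bit more concrete, here's a throwaway Python sketch (the frame names are made up, purely for illustration): the animators only draw 12 unique images per second, but repeating each one keeps playback at 24 fps.

```python
# Hand-drawn animation "on twos": 12 unique drawings per second,
# each shown twice, so playback still happens at 24 fps.
drawings = [f"drawing_{i:02d}" for i in range(12)]  # hypothetical frame names

frames_24fps = []
for drawing in drawings:
    frames_24fps.extend([drawing, drawing])  # every second frame duplicates the previous one

print(len(frames_24fps))  # 24 frames shown per second of animation
print(frames_24fps[:4])   # ['drawing_00', 'drawing_00', 'drawing_01', 'drawing_01']
```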

That said, movies and TV shows consist of pre-recorded frames that are simply played back, whereas the frames in your games are being rendered on the spot. That's why you can get away with Intel HD Graphics for watching movies in 1080p, yet when you play games it might not even reach the 24 fps it manages during movies, depending on the game of course.

That's why we have graphics cards: to render these images quicker and deliver us a smoother gameplay experience. It's already been proven time and time again that there is a difference between 30 and 60 fps, so I'm not even going to get into that. It's just that for some things, like TV and movies, it's not really necessary, because you don't need to react to anything and then act accordingly like you do in games.
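
Just to put rough numbers on the comparison above (this is plain arithmetic, not a measurement from any benchmark): doubling 30 fps to 60 fps halves the time each frame stays on screen, while going from 20,000 to 20,001 changes almost nothing.

```python
def pct_increase(old, new):
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

def frame_time_ms(fps):
    """Time budget per frame, in milliseconds, at a given frame rate."""
    return 1000 / fps

print(pct_increase(30, 60))        # 100.0  -> 30 to 60 fps is a 100% jump
print(pct_increase(20000, 20001))  # 0.005  -> barely a change at all
print(frame_time_ms(24))           # ~41.7 ms per frame (film playback)
print(frame_time_ms(30))           # ~33.3 ms per frame
print(frame_time_ms(60))           # ~16.7 ms the GPU has to hit for 60 fps
```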
 
You are comparing a 100% increase (from 30 to 60) to a 0.005% increase (from 20,000 to 20,001); obviously you're going to see much less difference.

The second claim is false. Movies and TV shows are shot at a minimum of 24 fps, sometimes 25 or 30. Obviously that's still not very high, but once again, 24 is quite a difference (50% more) over 16 fps. The only exception is hand-drawn animation, which is still often displayed at 24 fps, but with every second frame a duplicate of the one before.

24 fps was chosen because it's easy to work with, and somehow quite comfortable on the eye for such a low frame rate (Google how that works), even compared to 23 or 26 fps.

That said, movies and TV shows consist of pre-recorded frames that are simply played back, whereas the frames in your games are being rendered on the spot. That's why you can get away with Intel HD Graphics for watching movies in 1080p, yet when you play games it might not even reach the 24 fps it manages during movies, depending on the game of course.

That's why we have graphics cards: to render these images quicker and deliver us a smoother gameplay experience. It's already been proven time and time again that there is a difference between 30 and 60 fps, so I'm not even going to get into that. It's just that for some things, like TV and movies, it's not really necessary, because you don't need to react to anything and then act accordingly like you do in games.

To expand on your post.

TVs and movies are watched at a much, much greater distance than a video game, and that also plays into it. Your eyes notice less and less the farther away you are; it's the same reason 4K looks the same as 1080p from 10 ft away, the only difference being that 4K is showing more of the current picture. Also, since movies are rendered at 24 fps and played back at exactly the same rate, and TVs/monitors will drop their own refresh rate to match it, you have all three variables playing the same thing at the same time. It's therefore smooth, and your eyes won't catch the imperfections nearly as easily as in video games.

Video games, on the other hand, as Feronix points out, are rendered in real time on the spot, meaning you have the engine rendering everything on the fly. This is where all the imperfections happen, because it takes much longer than playing back a pre-rendered movie. Since nothing is displaying at the same rate, you can far more easily tell when you're running at low fps or when a frame is skipped. It's very noticeable with V-Sync, which is the best example: your refresh rate is constant but the render rate is not. You're also closer to the screen, which allows your eyes to pick up these annoying things.

Though even if you're dead set on the idea that your eyes won't see a difference between 30 fps and 60 fps, you sure as hell will "feel" a difference and have quicker reaction times.
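
To illustrate the V-Sync point with a toy example (the render times below are invented, just to show the effect): on a 60 Hz screen a finished frame has to wait for the next refresh, so any frame that takes even slightly longer than ~16.7 ms to render is held for two refreshes and the game effectively drops straight to 30 fps instead of degrading smoothly; that step change is the stutter you notice.

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms between display refreshes

def displayed_interval_ms(render_ms):
    """With V-Sync on, a frame is held until the next refresh, so the
    on-screen interval is always a whole number of refresh periods."""
    refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
    return refreshes * REFRESH_MS

# Hypothetical per-frame render times in milliseconds -- purely illustrative.
for render_ms in (12.0, 16.0, 17.0, 25.0, 34.0):
    shown = displayed_interval_ms(render_ms)
    print(f"render {render_ms:5.1f} ms -> shown every {shown:4.1f} ms (~{1000 / shown:.0f} fps)")
```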
 