Hey guys,
This is probably a rather dumb question to you all, but since I've never overclocked a GPU before and don't know much about actual clock speeds, I wanted to ask here before going ahead with anything...
I placed an order on a Gigabyte RTX 2080 Windforce III OC 8GB the other day (for roughly 726 pounds), and yesterday I saw that the non-OC version was mistakenly listed on the website for 632 pounds (about 5 minutes after I placed my order, it was listed at 692 pounds).
Now the base model has a core clock of 1710 MHz and the OC model is at 1800 MHz.
My question is, how much of a difference do those 90 MHz (base -> OC) really make in terms of actual performance and raw FPS? Are we talking a mere 5 FPS on average, or maybe 20 FPS?
And how much extra heat output are we talking about with such a small clock increase? Or is there a huge difference?
Since I also managed to place an order on the base model for almost 100 pounds less, I don't feel those 90 MHz are worth it. But I still wanted to ask you overclockers here, in case I'm missing something.
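For what it's worth, here's the rough back-of-the-envelope check I did myself (a minimal sketch that assumes FPS scales at best linearly with the rated boost clock, which is optimistic, since real games rarely scale that cleanly and both cards boost past their rated clocks anyway):

```python
# Optimistic upper bound: assume FPS scales linearly with the rated boost clock.
base_clock_mhz = 1710  # non-OC Windforce
oc_clock_mhz = 1800    # OC Windforce

uplift = (oc_clock_mhz - base_clock_mhz) / base_clock_mhz
print(f"Clock uplift: {uplift:.1%}")  # ~5.3%

# What that would mean at a few example frame rates (hypothetical numbers):
for fps in (60, 100, 144):
    print(f"{fps} FPS -> at most ~{fps * (1 + uplift):.0f} FPS")
```

So even in the best case it looks like low single-digit FPS, closer to the 5 FPS end than the 20 FPS end, but correct me if I'm wrong.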
Lastly, I watched Optimum Tech's used 1080 Ti GPU video yesterday, and the RX 5700 XT looked quite promising, averaging 5-20 FPS lower than the 2080 in all of the games tested at 1440p.
In the video the 2080 was drawing around 330W while the 5700 XT was at roughly 272W. I have the Dan A4-SFX, which is basically the most compact case on the market at the moment, so heat is an important factor to consider here.
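And just to put rough numbers on the heat side (going purely by the figures quoted in the video, and assuming essentially all of the power a card draws ends up as heat inside the case):

```python
# Power draw figures quoted in the Optimum Tech video (taken at face value).
rtx_2080_watts = 330
rx_5700_xt_watts = 272

delta = rtx_2080_watts - rx_5700_xt_watts
print(f"Difference: {delta} W ({delta / rtx_2080_watts:.0%} less heat with the 5700 XT)")
# -> 58 W, i.e. roughly 18% less heat to get out of a case this small
```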
My question is this: should I go for the cheaper base model 2080, or perhaps wait for a 5700 XT, in the hope that one of the manufacturers makes a card as colour neutral as the Gigabyte 2080 Windforce?
Now, as far as I'm aware, "true" G-Sync has gone a bit quiet since FreeSync took off and Nvidia started certifying monitors as G-Sync Compatible. So I'm also wondering which card would be the best fit for the LG 27" UltraGear 27GL850 monitor, or whether there's no real difference at all.
Hope this wasn't too much text, but I wanted to get my thoughts down and give you as much info as possible as a basis for your opinions.
Thanks,
Dawelio