Monitor Overclocking?

Thanks dice hunter. Well, I also have a side 1080p monitor that I use for quick tasks when needed (quick email replies while gaming, etc.), and I'm going to try to overclock it as an experiment. Then I'll move on to the 4K monitor and test whether 4K at 60 Hz or 1440p at 75 Hz (higher if possible) looks better. If it all goes hassle-free, I'll have one option for everyday use and normal gaming, and an FPS-gaming resolution.

Sorry for my English, it's been a long while since I left the US (12 years).

No problem bud :)
 
Well, I did what it said: 75 Hz is accepted, while the screen went black at 80 Hz, so it automatically reverted to 60 Hz. I set it to 75 Hz and restarted. After that I went to check the refresh rate and it had somehow reverted back to 60 again; that may be because I didn't restart after the 80 Hz attempt failed. Right now, at 75 Hz, the screen looks somehow brighter and the whites look more "white", lol! Scrolling on web pages is smoother, and so are the colour transitions when sites load. I'm playing some Dizzel and will post back if I run into any trouble.
 
So are you guys saying that if I follow this method I can run my 4K monitor (60 Hz) at 2560x1440 and achieve 75 Hz or greater?

Do you mean that since my monitor is 4K, the 1440p resolution will look better than on other monitors, due to more pixels = more colours, etc.? Because, as you said, the maximum on a 1440p panel is 2560x1440 pixels, but mine has 3840x2160.


Sorry, but it will depend on the monitor's ability to scale; it could look worse, or it could look good.
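One reason it can go either way (just a rough illustration on my part, not anything from the monitor's spec sheet): 1440p doesn't divide evenly into a 4K panel, so the scaler has to interpolate every pixel. Quick check:

```python
# Back-of-the-envelope check of how 2560x1440 maps onto a 3840x2160 panel.
native = (3840, 2160)
target = (2560, 1440)

scale_x = native[0] / target[0]   # 1.5
scale_y = native[1] / target[1]   # 1.5
print(f"scale factor: {scale_x} x {scale_y}")

# A non-integer factor (1.5) means each 1440p pixel has to cover one and a
# half physical pixels, so the scaler must blend; an integer ratio such as
# 1920x1080 -> 3840x2160 (exactly 2.0) can map one source pixel to a clean
# 2x2 block, which usually scales more cleanly.
```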

Good one ;)
[Screenshot attached: dHRPCC1.jpg]

Basically 4096x2560 on a 1920x1200 screen (of course it's at 32 Hz) ;D

And the performance drop is a little more than rendering the game at that resolution on a native panel.

Information 1:
If I want to use resolutions over 3200x2000, I have to apply them to the desktop first (otherwise they won't be available in game/program settings).

Information 2:
Using AA modes higher than 16x (i.e. 8xQ or 16xQ) will crash Crysis (at Very High) after a few moments.
Probably the GTX 780 Ti needs more VRAM, or it's a game bug.

And sorry, but I have no idea what you are doing there; since your monitor is only 1440p, it cannot physically display every pixel. Pixels can only display one colour at a time. You can get the GPU to render at a higher res and then downscale it to fit your res (OGSSAA), but that's about it... so yeah :nutkick:
 
Not sure what you're referring to, the overclocking or increasing the res? I believe the overclocking would work; however, I think the quality degrades or something as you use more bandwidth for the OC over a VGA connection. I am not very well informed on this kind of thing, so I wouldn't be able to help you.
 
And sorry, but I have no idea what you are doing there; since your monitor is only 1440p, it cannot physically display every pixel. Pixels can only display one colour at a time. You can get the GPU to render at a higher res and then downscale it to fit your res (OGSSAA), but that's about it... so yeah :nutkick:
FYI: my monitor is a native 1920x1200 Samsung T240 (16:10).
So what if it can't display every pixel of a 4096x2560 res?
Using a res higher than 1920x1200 on my monitor should be impossible, right?
That's what I was trying to say ;)

Furthermore:
Based on what you said, I think you don't quite get how downsampling works:
it's not only rendering at a higher res.
It uses the information from that process to more accurately calculate the colour of every pixel on screen.
And thanks to that, it's able to increase image quality overall, regardless of the monitor's native resolution.

Example: with 4K-to-FullHD downscaling, the GPU gets 4x the amount of data for every pixel.
You will see that on a "FullHD only" screen as smoother edges and visually better textures.

In short: you don't need a 4K screen to see the difference between 4K and FullHD.
Although I admit that downsampling won't get image quality on par with a native higher-res monitor.
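If it helps, here's a rough sketch of the idea in code (just an illustration using a plain 2x2 box filter; I'm not claiming this is exactly the filter the driver uses):

```python
import numpy as np

def downsample_2x(image):
    """Average each 2x2 block of a frame rendered at twice the panel res
    (e.g. 3840x2160 -> 1920x1080) into one output pixel.

    Every output pixel is built from 4 source samples, which is where the
    smoother edges and cleaner textures come from. Simplified sketch; real
    drivers may use a fancier filter than a plain box average.
    """
    h, w, c = image.shape
    return image.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Toy example: a 4x4 "4K" frame, black with only the rightmost column white.
frame_4k = np.zeros((4, 4, 3))
frame_4k[:, 3:] = 1.0
print(downsample_2x(frame_4k)[0])
# The output pixel covering the black/white boundary averages to 0.5 (grey):
# the extra samples get blended in instead of being thrown away.
```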
 

You do realize that you just proved that I am correct, don't you? Like I said, the monitor will never physically be able to go beyond the limits of its resolution, simply because its pixels cannot do the job of more than a single pixel (one colour ONLY). I already knew about downsampling and acknowledged it. So I don't see what your moronic and provocative posts are supposed to prove, aside from the fact that you can't get this through your thick skull!

Also, YOU may want to read the post that you quoted again. I didn't say that there were no benefits; basically, my only statement in its most basic form was that the panel will not be able to physically display more pixels than it contains.
 
So what?
I didn't say that I don't agree with your "can't display more pixels on a native screen" statement.
I was originally responding to:
unfortunately there is a limit that your panel will support. Say for instance you have a 1080p monitor, it will never go over 1920 width or 1080 height...
When I see a screenshot of Crysis using 4096x2560 on a native 1920x1200 screen, I conclude that resolutions above the native one can be used.
The additional pixels from the higher res get blended down to match the lower res (i.e. they are used regardless of the native resolution).
It may not look pretty on the desktop, BUT the usable desktop space will be larger than at native res.

So why are you so determined to say it can't be used, and why keep clinging to that "can't display more pixels on a native screen" statement?
It's THERE in my screenshot, I AM using it, and in-game quality is better with it, a fact that you acknowledged yourself in the previous post.
So who's got the "thick skull"?

If someone is confused about this, here's the basic question we don't agree on:
Do you need to display every pixel of a higher resolution to be able to use it?
My answer: no, you don't.
Newbie_NS810's answer: yes, you do.
 
Right, so tell me exactly one thing: do you or do you not see more than 1080 pixels in height and 1920 in width? Obviously you can't physically count them with your eyes. But my statement holds true: you may have applied a higher resolution, but your monitor does not support it.

Read the damn posts instead of quoting something just to pick at it.
Also, I do not know since when the issue was 'Do you need to display every pixel of a higher resolution to be able to use it?'

If you want to keep going and keep pulling words out of your arse, feel free to do so. This was never about 'Do you need to display every pixel of a higher resolution to be able to use it?'

Thick skull for sure.

Read the post you decide to quote properly, instead of acting like a smartarse when you didn't even understand its context.
 
I know it can't make the monitor DISPLAY more pixels!
That's why I agreed with you on that in the first place.

My point is that, thanks to this trick, you don't need them to see the difference between native and higher res (in practice, however, a native higher-res monitor should be better than this, of course).

As for the question I brought up before:
well, to me that is the point we can't agree on, right?
Since you keep saying "you can't make more pixels visible" and I keep telling you that "you don't need them to get better image quality".

Either way, I admit it's NOT strictly monitor OC (it's closer to an SSAA "cheat" or a DSR prototype).
But it makes games look better (and run slower), and it makes the desktop bigger.
So it does exactly the same things as a native higher res would (at least in my book).
 

****** ****, why don't you just go and reread your first and second posts? You were picking on my post without even understanding its context. Maybe you don't have a thick skull after all, but a thin one, and someone hit you too hard, because you are clearly unable to get the facts straight or interpret them.
Did I say that there were no benefits? NO.
Can the monitor physically display all the pixels of the resolution you applied? NO.
Did I say you couldn't process at a higher resolution? NO.

So far 3 - 0.

If you had simply said that it was possible to get the GPU to process the data at a higher resolution, instead of picking on my post and acting like a smartass, I would have been more than happy to accept that, as I did not mention it. However, you quoted my post, which was correct, and did indeed act like a smartass saying that it was possible, when the post was basically saying that the panel wouldn't be able to physically display more than 1920 width or 1080 height in terms of pixel count, due to the physical limitation.
 
Why don't you two just stop whining like little girls and stop this dumb bickering? It's been 4 days of you two totally ruining the thread. Get the fuck over it already. Agree to disagree and move on. Be constructive.
 
So basically, after 4 days:
You would have agreed that my thinking was correct from the beginning, but didn't because of how I quoted your original post (BTW: I quoted it simply because it was two pages earlier, that's all)... :dead4:. Let's just leave it at that.
Next time you think I made a mistake, just write me a PM, OK?
Why don't you two just stop whining like little girls and stop this dumb bickering? It's been 4 days of you two totally ruining the thread. Get the fuck over it already. Agree to disagree and move on. Be constructive.
True.
But at least we kept this thread busy :)
Nobody has written anything here except us (and now you), so we were useful in that regard.

One last thing: @NeverBackDown, good nickname ;)
 
I brought this up in another thread ages ago... lol, nice to see old topics coming back.
With this upscaling to 4K, it's basically applying AA or SSAA not just to the lines and joins but to the entire image, then scaling it back down. You will see smaller icons because it's still showing the scaled resolution, which makes some games a nightmare to navigate in the menus... it looked awesome though.

As for your screenshot, of course it will be shown in 4K, as it's a 4K image rendered without the downscale; the downscale comes after, and the result is then pumped down your cable (metaphorically speaking) and displayed at 1920x1080/1200, depending on the display, for its pixels to show. Otherwise we would only see 1/4 of the image at 1080, lol :(

As for the OCing of the display: only so many hertz worth of image will get sent down the cable, so an older panel using VGA may be limited by the bandwidth of the cable. There will be an actual FPS limit on the monitor; even if the GPU is saying 1000000 FPS, your monitor will only use what it can :) I would love the G-Sync part for my display, to see that running too with an OC on the panel; not much, but maybe to a rounder number like 150 Hz instead of 144 Hz. Still not sure why 144, but maybe it has something to do with the active 3D.
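If anyone wants to ballpark that cable limit, here's a rough estimate; the 1.25 blanking overhead and the helper below are just my own assumptions, real CVT/reduced-blanking timings differ a bit:

```python
# Rough pixel-clock estimate for a video mode, to see why refresh-rate
# headroom runs out on older links. blanking=1.25 is an assumed ~25%
# overhead for sync/blanking intervals, not an exact timing standard.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    return width * height * refresh_hz * blanking / 1e6

for mode in [(1920, 1080, 60), (1920, 1080, 75), (1920, 1080, 80),
             (2560, 1440, 75), (3840, 2160, 60)]:
    print(mode, f"~{approx_pixel_clock_mhz(*mode):.0f} MHz")

# Single-link DVI tops out around a 165 MHz pixel clock, so 1080p much past
# 60-75 Hz already gets tight there; the higher modes need dual-link DVI,
# HDMI 1.4+/2.0 or DisplayPort, and VGA adds analogue quality loss on top.
```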
 
So basically, after 4 days:
You would have agreed that my thinking was correct from the beginning, but didn't because of how I quoted your original post (BTW: I quoted it simply because it was two pages earlier, that's all)... :dead4:. Let's just leave it at that.
Next time you think I made a mistake, just write me a PM, OK?
True.

Again, read the posts properly. I will not leave it at that, simply because you are once again making your own assumptions.


If you had simply said that it was possible to get the GPU to process the data at a higher resolution, instead of picking on my post and acting like a smartass, I would have been more than happy to accept that, as I did not mention it. However, you quoted my post, which was correct, and did indeed act like a smartass saying that it was possible, when the post was basically saying that the panel wouldn't be able to physically display more than 1920 width or 1080 height in terms of pixel count, due to the physical limitation.

My gripe with you is not that you posted information; it's that you quoted a post without reading and understanding it properly, not to mention acting like a smartarse in the process.

Why don't you two just stop whining like little girls and stop this dumb bickering? It's been 4 days of you two totally ruining the thread. Get the fuck over it already. Agree to disagree and move on. Be constructive.

Right, so when some idiot comes along and quotes your post, saying that you're wrong and acting like a smartass without even reading and understanding it properly, are you just going to let it go? You might, but I won't stand for it.
 