Using OGSSAA?

remember300

Active member
Watching a few random vids, then wiki-linking my way onto tech sites, I came across rendering a game at a higher resolution and then downscaling the image to your monitor's native resolution: for example rendering at 1440p or 2160p, getting a slightly smoother image, and having it displayed at 1080p.

Using this is kind of like applying AA to the entire image rather than just the edges, at least that's what I've mainly seen and taken in. If someone has a better explanation please let me know :D
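If it helps to see the idea concretely, here's a minimal sketch in Python/NumPy of what "render high, scale down" amounts to, assuming a plain box filter (real drivers use fancier resampling): render at an integer multiple of native resolution, then average each block of rendered pixels into one display pixel, which smooths the whole image and not just polygon edges.

```python
import numpy as np

def downsample(frame, factor):
    """Box-filter downscale: average each factor x factor block of
    rendered pixels into one display pixel."""
    h, w, c = frame.shape
    assert h % factor == 0 and w % factor == 0
    # Split the image into factor x factor blocks, then average each block.
    blocks = frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# e.g. a 2160p render shown on a 1080p monitor (factor 2):
hi = np.random.rand(2160, 3840, 3)   # stand-in for the rendered frame
lo = downsample(hi, 2)
print(lo.shape)  # (1080, 1920, 3)
```

Each output pixel ends up being the mean of four rendered samples, which is why it behaves like applying AA everywhere at once.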

I have found that in most modern games it's just easier to run the built-in AA at different magnitudes.

But in older games where you don't get the option to run AA, or only get a 2xAA option, I found my image became much crisper, with the exception of some oooooollllldddd games lol :( (C&C RA1)

Has anyone else tried this or come across it? Had any success? And in what games/software?
 
I actually came across it while looking into pixel density and viewing distance for the Game Boy I'm building. Google found Linus, though I do watch some of his vids; I prefer Tom's, maybe because I like Tom going into depth about things :D

And if you know about it you must have watched it ;) lol

Have you tried it yet?
 
Nah, I watched it half awake this morning so I haven't been near my PC yet. I might try it after my upgrade :)
 
@remember300: You're talking about downsampling (which isn't exactly OGSSAA).
I got almost 4K (i.e. ~3.5K) out of my old Samsung T240 (from 1920×1200 @ 60Hz to 3360×2100 @ 54Hz), so that's nice :)

Performance-wise: it's rendering the game at a higher resolution, so it can kill any GPU.
I'm also testing some stuff (:E), so I connected a Full HD TV to my graphics card to see how "small" its image gets compared to 4K.
Result: LINK (warning, kinda BIG screenshot).
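A hedged back-of-the-envelope on why a custom mode like that ends up at 54Hz rather than 60: the pixel clock the driver has to program scales roughly with width × height × refresh (plus blanking overhead), so against a fixed clock limit a higher resolution forces a lower refresh. The 1.2 blanking factor below is just a rough assumption; real timings come from the CVT/CVT-RB formulas.

```python
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.2):
    # Very rough estimate: active pixels per frame times refresh rate,
    # padded ~20% for blanking intervals (assumption; use CVT/CVT-RB
    # formulas for real mode timings).
    return width * height * refresh_hz * blanking / 1e6

print(round(approx_pixel_clock_mhz(1920, 1200, 60)))  # native mode: ~166 MHz
print(round(approx_pixel_clock_mhz(3360, 2100, 54)))  # custom mode: ~457 MHz
print(round(approx_pixel_clock_mhz(3360, 2100, 60)))  # 60Hz would need ~508 MHz
```

Dropping from 60Hz to 54Hz keeps the virtual mode under whatever pixel-clock limit the driver will accept, which would fit the numbers quoted above.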
 
Nice. I only tried 2K, just to see; I like my games being around 60fps no matter what lol. I notice flicker and blur too much now, it's my normal viewing experience.
 

Are you seriously running a 780Ti with a Pentium 4?

On topic though: I'd never heard of running a monitor at a higher resolution than it's supposed to run at until I saw the video that Linus did.

Tried it on my AOC E2795VH; it normally runs at 1920×1080 @ 60Hz, but it will also run at 2560×1600 @ 60Hz.

Not sure if it will damage the monitor though, so I'm still running it at 1080p.

I wish it were capable of running higher than that, but it just won't.
 

A Pentium 4 and a GTX 780 Ti? Nice. I guess the GPU usage doesn't go above 10% in games :)
 
Wow! A 3.8GHz P4 "Presshot", kudos on the CPU, but really!?! You should consider that all-important upgrade so you can use your GPU to its fullest.
 
Well, I said I was "checking stuff", didn't I?
Normally the GTX 780 Ti works with the i7 from my signature, so don't worry ;)

A Pentium 4 and a GTX 780 Ti? Nice. I guess the GPU usage doesn't go above 10% in games :)

As you can see in the GPU-Z Sensors tab of the screenshot (thanks to downsampling), I managed to max out Crysis at 59% GPU usage (although that required ~3.5K resolution and 16xQ AA...).
So it's not that bad :P

But you think the Pentium 4 570J (3.8GHz) is something special?
Let me tell ya: you ain't seen nothing yet.
Basically: if you're talking "GPU bottlenecking", you don't know what you're talking about without seeing these:

1) P4 651 "Cedar Mill" @ 4.9GHz: LINK
2) Celeron D 346 (3.06GHz): LINK
3) P4EE 3.73GHz: LINK (FYI: the multiplier is NOT unlocked)

I've got 3DMark scores for them all (from 3DMark03 to the newest "2013") :)

One of the interesting things I found with these CPUs:
the 3DMark11 (and 3DMark Vantage) X-Score results (LINK) are higher than the P-Score ones (LINK).
Of course the GPU scores are lower, but still...

PS. "Downsampling" for NV GPUs: LINK
 
The latest 377.88 driver seems to have broken downsampling; whenever I dial in 4K now it just gives me a black screen :(
 