PC gaming nerds enter here

Will 1080p games look shitty on a 1440 monitor

YES or NO

JERRY

It will look great!

Consider that most TVs sold today are capable of 4K, yet most content is still 1080p. It still looks great, and you'd be hard pressed to see any difference between a native 1080p screen and a 4K screen showing the same 1080p content. Things like HDR have a far bigger impact on how good an image looks than resolution does, even with non-native sources.

For goodness sake, my 480p Wii games upscaled to 1080p on my Wii U and blown up to fill a 4K display look god damn AMAZIN', and it has everything to do with contrast and color levels.
4K scales perfectly to 1080p because it's just 4 1080p screens together, so 4 pixels become 1.

Between 1440p and 1080p there's no clean integer scaling, which means it definitely blurs a bit. You can partly address this by turning up the monitor's sharpness, but then you end up cranking the sharpness up whenever you're in a game and back down when you're not.
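
If the ratios aren't obvious, the quick math (just illustrative numbers, not anything from a real scaler) looks like this:

Code:
# Why 1080p maps cleanly onto 4K but not onto 1440p (illustrative math only).
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

def scale_factor(content, display):
    cw, ch = resolutions[content]
    dw, dh = resolutions[display]
    return dw / cw, dh / ch

print(scale_factor("1080p", "4K"))      # (2.0, 2.0) -> each source pixel becomes a clean 2x2 block
print(scale_factor("1080p", "1440p"))   # (1.333..., 1.333...) -> pixels must be interpolated, hence the blur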

p.s. No fucking way I'd ever play any game without GSYNC/FreeSync on the monitor btw, amRam. Also, some monitors that have FreeSync will now support nVidia GSYNC. Different certifications, etc., but at least if you ever change the GPU from nVidia to AMD or vice versa, the monitor still works for the adaptive sync tech.
 
1440p is basically mandatory minimum for the 30-series
running 1080p on them bottlenecks them so badly they perform at or below 20-series level cards
What is this nonsense? If anything it's the CPU/GPU pairing that matters, and even that's rarely an issue in the real world. You don't want to pair an absolute shit CPU with an amazing GPU or vice versa, but running a 30-series at 1080p just means even higher/more stable FPS as well as higher graphics settings.

With that said, I wouldn't buy a 30-series for 720p gaming (for example). It'd be pointless. Not because it would be bottlenecked, but because it's way too much performance for the screen it's driving. There's a point where more fps doesn't matter for casual gaming.
 
uh why? The sync only matters if your fps drop under threshold.
Nope. It's a somewhat complicated synchronization thing between the refreshing of the screen and the frame being sent by the GPU.

Doesn't matter if your FPS is triple your monitor's refresh rate. Tearing (and stuttering) will still happen. And to me, one of the best bits of monitor tech to come out in a decade was eliminating that tearing.

VSync fixes this problem, but it causes input lag because you're waiting on the monitor to be ready to update. Even at a 144Hz refresh rate, your GPU might have a frame ready a millisecond after the monitor just refreshed, and now that frame (and the input behind it) has to wait for the next refresh before it shows up.

GSync/FreeSync reverse that locked synchronization: the GPU sends the frame, and that's when the monitor refreshes. One tidbit with adaptive sync is that you want to cap your fps about 3 below the monitor's max refresh rate, because even if you're getting 500fps the monitor can only refresh as fast as its hardware supports, and going over the cap pushes you out of the adaptive sync range and back to tearing (or VSync behavior). Adaptive sync does add input lag as well btw, but it's exceptionally low (1-5ms) and it's worth not having your screen tearing/stuttering.
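
If the VSync wait isn't clicking, here's a toy back-of-the-envelope sketch (made-up timings, purely illustrative) of how long a finished frame sits around on a fixed 144Hz refresh versus a display that refreshes when the frame arrives:

Code:
# Toy model of how long a finished frame waits before it's shown (hypothetical timings).
refresh_hz = 144
refresh_interval_ms = 1000 / refresh_hz   # ~6.94 ms between fixed refresh ticks

def vsync_wait(frame_ready_ms):
    """With VSync, the frame has to wait for the next fixed refresh tick."""
    ticks_elapsed = frame_ready_ms // refresh_interval_ms
    next_tick = (ticks_elapsed + 1) * refresh_interval_ms
    return next_tick - frame_ready_ms

def adaptive_sync_wait(frame_ready_ms, scanout_overhead_ms=1.0):
    """With GSync/FreeSync the monitor refreshes when the frame arrives
    (inside its supported range), leaving only a small assumed overhead."""
    return scanout_overhead_ms

frame_ready = refresh_interval_ms + 1.0        # frame finished 1 ms after a refresh tick
print(round(vsync_wait(frame_ready), 2))       # ~5.94 ms of extra waiting
print(adaptive_sync_wait(frame_ready))         # 1.0 ms (assumed overhead)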
 
4K scales perfectly to 1080p because it's just 4 1080p screens together, so 4 pixels become 1.

Between 1440p and 1080p there's no clean integer scaling, which means it definitely blurs a bit. You can partly address this by turning up the monitor's sharpness, but then you end up cranking the sharpness up whenever you're in a game and back down when you're not.

p.s. No fucking way I'd ever play any game without GSYNC/FreeSync on the monitor btw, amRam. Also, some monitors that have FreeSync will now support nVidia GSYNC. Different certifications, etc., but at least if you ever change the GPU from nVidia to AMD or vice versa, the monitor still works for the adaptive sync tech.

480p doesn't "fit perfectly" into 1080p or 4K...
 
480p doesn't "fit perfectly" into 1080p or 4K...
Well there's no argument there. 480 doesn't go into 1080 or 2160 evenly, so no, it doesn't fit perfectly. This is all about native resolution, pixel density, and simple math (larger res divided by smaller res).

Let me give an example. Take two TVs of identical quality/manufacturer/size, one 4K and one 1080p. If I play the same 1080p file on both, but force the 4K one to display it at 1080p, the screens will look identical. There will be no blurring. The pixels on the 4K TV map a perfect 4:1 onto the 1080p image. Since the screens are the same size, the effective pixel density is the same when the 4K set is scaling 1080p that way. When it's running native 4K, its pixel density is... 4 times higher than the 1080p screen's.

If we did the same test with a 1440p TV (assume one exists), the pixel density is higher with 1440p of course, but there is no direct pixel relation. 1080 vertical pixels won't "fit" evenly into 1440 vertical pixels, so the TV isn't just taking 4 pixels to represent 1 pixel of the 1080p content, it has to interpolate them, which blurs the image.

I wouldn't say it's entirely coincidental that 1080 happens to fit perfectly into 2160 (4K). I would imagine it made sense from a manufacturing perspective once they were able to make things smaller, and it also meant properly supporting the massive amount of 1080p content out there without blurring. 8K (4320) screens fit perfectly for the same reason: an 8K panel running 4K would look identical to a native 4K screen, and an 8K panel running 1080p identical to a native 1080p screen (4 pixels become 1 for 4K, 16 become 1 for 1080p).
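
If anyone wants to sanity-check the 4:1 thing, here's a tiny numpy sketch (a made-up toy example, not how any real TV scaler works): blow an image up by a whole-number factor and you can group the blocks back together bit-for-bit, while a 1.333x step forces the scaler to sample between pixels.

Code:
# Toy demo: integer vs non-integer scaling (illustrative only, not a real TV scaler).
import numpy as np

src = np.arange(4 * 4, dtype=float).reshape(4, 4)    # stand-in for a tiny "1080p" image

# Integer factor (like 1080p -> 4K): every source pixel becomes a clean 2x2 block.
up2x = np.kron(src, np.ones((2, 2)))
down = up2x.reshape(4, 2, 4, 2).mean(axis=(1, 3))    # average each 2x2 block back down
print(np.array_equal(down, src))                     # True: nothing lost, nothing blurred

# Non-integer factor (like 1080p -> 1440p, a 1.333x step): output samples land between
# source pixels, so the scaler has to blend neighbours, and that blending is the blur.
print(np.linspace(0, 3, 6))                          # fractional sample positions across 4 source pixels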

I hope that makes sense.
 
Thanks for that lengthy explanation of how to multiply and divide numbers! :)

I'm not quite so sure that stretching pixels to fill a display leaves things "identical" though...
 
Ok so I ordered but haven't picked up the PC so there's time to change my mind if necessary but...

PC is a Ryzen 7 3700X with a 3060 Ti card

Monitor I'm thinking is this one:
XV340CK Pbmiipphzx - Tech Specs | Monitors | Acer United States

34" flat panel (no curve yesss!), 1440p / 144Hz with FreeSync, but doesn't list G-Sync

Random review says:

Even though the Acer XV340CKP is not certified as G-SYNC Compatible by NVIDIA, FreeSync works without any issues with compatible NVIDIA cards (GTX 10-series or newer)! You just have to enable it manually.

Good or no?
 
Thanks for that lengthy explanation of how to multiply and divide numbers! :)
:lol: I'm just trying to explain [to everyone who isn't technical] how some resolutions can be scaled down easily without blurring, or scaled up if the TV hardware/software is good enough.

I'm not quite so sure that stretching pixels to fill a display leaves things "identical" though...
Because it's not stretching, fool. Dammit, didn't we just go over this?

If the pixel density of a 4K screen is FOUR TIMES the pixel density of a 1080p screen and you want to scale 4K down to 1080p, it would be four 4K pixels to one 1080p pixel. So for the same size screen, this results in identical pixel density. 4 quarter-size pixels = 1 large pixel. Same effective pixels per inch.
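
For the spreadsheet types, pixel density is just diagonal pixels over diagonal inches. Quick back-of-the-envelope check with two hypothetical same-size 55" screens (made-up size, nothing measured):

Code:
# Rough PPI check for two hypothetical same-size 55" screens.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 55), 1))            # ~40.1 PPI, native 1080p
print(round(ppi(3840, 2160, 55), 1))            # ~80.1 PPI, native 4K (double linear PPI = 4x the pixels)
print(round(ppi(3840 // 2, 2160 // 2, 55), 1))  # ~40.1 PPI effective when the 4K panel groups pixels 4:1 for 1080p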
 
Yeah it's fairly well reviewed and it'll mount nicely to the wall...

Fucking covid, I'd love to just go see these in person and make a decision ugh...
 
Also, and this might be irrelevant, but i spend a fair bit of time staring at and interacting with large architectural drawings, and a curved screen might be weird for that?? With all the straight lines and what-not....
 
:lol: I'm just trying to explain [to everyone who isn't technical] how some resolutions can be scaled down easily without blurring, or scaled up if the TV hardware/software is good enough.

Because it's not stretching, fool. Dammit, didn't we just go over this?

If the pixel density of a 4K screen is FOUR TIMES the pixel density of a 1080p screen and you want to scale 4K down to 1080p, it would be four 4K pixels to one 1080p pixel. So for the same size screen, this results in identical pixel density. 4 quarter-size pixels = 1 large pixel. Same effective pixels per inch.

Oh wow so cool!

TECHNOLOGY IS AMAZING! :)

Anti-aliasing is a thing of the past! We just need to divide and multiply pixels in whole numbers!
 
Also, and this might be irrelevant, but i spend a fair bit of time staring at and interacting with large architectural drawings, and a curved screen might be weird for that?? With all the straight lines and what-not....

oh then probably flat is the way to go
 
Also, and this might be irrelevant, but i spend a fair bit of time staring at and interacting with large architectural drawings, and a curved screen might be weird for that?? With all the straight lines and what-not....

that's why i don't have one. photoshop on a curved screen is a bitch.
 
Yeah it's fairly well reviewed and it'll mount nicely to the wall...

Fucking covid, I'd love to just go see these in person and make a decision ugh...
I have a 27" (16:9), and that's about my limit for how much I want to turn my head left and right. I'd imagine a 34" ultrawide would be far crazier, but if that doesn't bother you, then whatever works.

Also, and this might be irrelevant, but i spend a fair bit of time staring at and interacting with large architectural drawings, and a curved screen might be weird for that?? With all the straight lines and what-not....
I've watched reviews from graphic artists who said the same thing, but they also said that after a couple of days to a week they were used to it and didn't even notice it anymore.

If that were my job/hobby, it'd probably be an area of concern for me as well.
 
Oh wow so cool!

TECHNOLOGY IS AMAZING! :)

Anti-aliasing is a thing of the past! We just need to divide and multiply pixels in whole numbers!
1.) Your monitor doesn't anti-alias.
2.) And actually, higher resolution (like 2160p) almost removes the need for anti-aliasing because of the pixel density.
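
To put a rough number on point 2 (back-of-the-envelope math, assuming a 27" screen viewed from about 60cm, nothing measured): a 4K pixel covers about half the visual angle of a 1080p pixel, so the stair-stepping that AA exists to hide is already much smaller.

Code:
# Rough angular size of one pixel, hypothetical 27" screen viewed at 60 cm.
import math

def pixel_arcmin(width_px, height_px, diagonal_in, viewing_distance_mm):
    pitch_mm = diagonal_in * 25.4 / math.hypot(width_px, height_px)   # physical size of one pixel
    return math.degrees(pitch_mm / viewing_distance_mm) * 60          # small-angle approximation, arcminutes

print(round(pixel_arcmin(1920, 1080, 27, 600), 2))   # ~1.78 arcmin per pixel at 1080p
print(round(pixel_arcmin(3840, 2160, 27, 600), 2))   # ~0.89 arcmin at 4K: jaggies are half the size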

Ya cunt.
 