1440p to 1080p - fuzzy picture!?

ddmeltzer8
Level 7
Hi, all.
I currently run a display resolution of 2560x1440, but I would like to drop it to 1920x1080 to see if I can overclock my card and get a few more fps out of it.
The problem is that when I change the res the picture goes all fuzzy on me! All lines, fonts and such in all of Win10 get fuzzy, and not even ClearType can fix it.
Is there anyone who knows a bit about this?
Thanks.

haihane
Level 13
This happens on most (if not all) monitors I've owned.

I don't know how to fix it. I had always assumed it was a problem of expectations (mine) rather than of the monitor itself.
no siggy, saw stuff that made me sad.

Hello,

Your modern LCD/LED screen is a matrix of pixels, each with a fixed physical size.
To put an image on the screen, a digital processor (the scaler) drives the panel at those physical dimensions.

So if the screen has 2560 x 1440 pixels, the digital processor will output everything to this size.
When you lower the input resolution the digital processor has to convert your image, just like resizing (stretching) a bitmap picture.
Now you have a smaller image with fewer pixels that needs to fill a larger pixel area.
The digital processor will try to fill the screen so you don't get borders, so the image is upscaled back to the native resolution of the screen, 2560 x 1440.
This causes the poor image quality.
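
If it helps to see the math, here is a rough, purely illustrative Python sketch (the numbers are just the two resolutions from this thread, not anything the monitor's scaler actually runs). It shows how panel pixels on a 2560-wide screen map back to fractional positions in a 1920-wide source line; since 1920/2560 = 0.75, only every fourth panel pixel lines up exactly with a source pixel, and the rest have to be blended from two neighbours, which is what smears lines and text:

# Illustrative only: how a 1920-pixel-wide input line maps onto a
# 2560-pixel-wide panel when upscaled. The real scaler is hardware,
# but the fractional-mapping problem is the same.

SRC_W = 1920   # rendered (input) width
DST_W = 2560   # panel's native (output) width

scale = SRC_W / DST_W          # 0.75 source pixels per panel pixel

for dst_x in range(8):         # just the first few panel pixels
    src_x = dst_x * scale      # where this panel pixel samples from
    left = int(src_x)          # nearest source pixel to the left
    frac = src_x - left        # how far between the two neighbours
    if frac == 0:
        print(f"panel pixel {dst_x}: copies source pixel {left} exactly")
    else:
        print(f"panel pixel {dst_x}: blend of source {left} ({1 - frac:.2f}) "
              f"and {left + 1} ({frac:.2f})")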

Some monitors allow borders when running a lower resolution. This preserves quality to an extent, as the smaller image isn't being stretched.
The problem you are seeing is the reason LCD/LED monitors/TVs should run at their native resolution: the input to the monitor/TV must match the physical pixel grid for the best possible image quality.

Older CRT monitors would have handled this without an issue 🙂
The short version: it's a limitation of LCD/LED monitors/TVs.

tommiboi
Level 7
Don't switch to 1080p on the desktop, but setting it in the in-game graphics settings should work fine. Whether you'll actually get more fps out of it, I'm not sure.

But I would never have paired a 1050 Ti with a 7700K. I would have gone for a 9-series GPU and bought a used 970/980, maybe even a 980 Ti... or gone super cheap with a 780 Ti.

Anyway, hope the in-game settings provide what you are aiming for. GL HF
Try more, Try harder, Try and prevail

Korth
Level 14
2560x1440 and 1920x1080 have the same 16:9 aspect ratio. Using "borders" would fill 25% of your width and 25% of your height with black pixels and shrink your 1080p image down to the center 75% of the screen ... I think it would look kinda awful, lol.
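
Rough numbers, if anyone wants to check them (just the two resolutions from this thread plugged into a few lines of Python):

# Quick check of the border math for a centred, unscaled 1080p image
# on a 1440p panel.

panel_w, panel_h = 2560, 1440
img_w, img_h = 1920, 1080

side_border = (panel_w - img_w) // 2    # black band on each side
top_border = (panel_h - img_h) // 2     # black band top and bottom

print(f"{side_border} px left/right, {top_border} px top/bottom of black")
print(f"image covers {img_w / panel_w:.0%} of the width, "
      f"{img_h / panel_h:.0%} of the height, "
      f"{(img_w * img_h) / (panel_w * panel_h):.2%} of the area")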

There might be graphics quality settings (like antialiasing) in your AMD/NVIDIA control panel which can make the scaling look better. But chances are it's largely controlled by the logic hardware built into the monitor, and basically what you see is what you get no matter how much you tweak the settings.
"All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others." - Douglas Adams

[/Korth]

Korth wrote:
2560x1440 and 1920x1080 have the same 16:9 aspect ratio. Using "borders" would fill 25% of your width and 25% of your height with black pixels and shrink your 1080p image down to the center 75% of the screen ... I think it would look kinda awful, lol.


What it looks like is one thing; the main point is that the 1080p input will not be altered, only centered. Nothing is resized and nothing is shrunk, so there is no distortion to the input image.

Just like a desktop wallpaper: stretched vs. centered, for an image smaller than your monitor's native resolution. 🙂

tommiboi
Level 7
Don't think you can do this with the normal software. But like Korth said, there might be some software that is made for downscaling a monitor. But this is software, and I really don't think you will gain more fps! The black pixels still need to be refreshed by your GPU as well, so only a hardware option would let you get more FPS.

Also, the 1050 Ti OC is already overclocked from the factory. I've OC'd lots of cards just because I could. The best results were always from AMD cards... but that's another subject.

Just doing some fast numbers.
Let's say you have 60 FPS now.
You get a 5 to 8% stable overclock. At 8% you gain 4.8 fps.
That is in theory... most likely you will gain less in real-world gaming.
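
Same back-of-the-envelope math in a couple of lines, if you want to plug in your own numbers (60 fps and 5 to 8% are only the example figures above, not measurements, and it assumes fps scales linearly with clock speed, which it usually doesn't):

# Best-case fps gain from a core overclock, assuming fps scales
# linearly with clock speed (in practice you get less than this).

baseline_fps = 60.0
for oc_percent in (5, 8):
    gain = baseline_fps * oc_percent / 100
    print(f"+{oc_percent}% clock -> about +{gain:.1f} fps "
          f"({baseline_fps + gain:.1f} fps total, best case)")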

TBH, a 1050 Ti is not suitable for a 1440p screen when you demand high FPS.

I have a 980 Ti and there are lots of games I cannot play on ultra settings, even with my overclock of 26% over default (and it comes factory OC'd at 22%), and it's custom watercooled as well.

Just saying, what you want is not really going to work out at all.

Hopefully you will find a solution. I would like to know if you find one... maybe I can set my PUBG to 1080p.

GL HF
Try more, Try harder, Try and prevail

Korth
Level 14
DX12 (and DX11.x, and DX10) have integrated trans-scaling functions. If the game/app detects an active 1920x1080 mode on a 2560x1440 panel - and it's properly coded, lol - there won't be any additional performance hit (or fps reduction), because DirectX passes the pixel calculations from software to GPU hardware. It's still rendering in hardware at 1920x1080 regardless of whatever (unused) resolution capacity the display panel has. Iff the game engine, DirectX APIs, and GPU drivers are all properly coded, lol.

You should expect raw fps performance similar to anyone else running the same game on the same GPU/CPU at 1080p. The real question is how it looks when upscaled onto a 1440p panel: if the GPU (and DX) upscales the render well then it can look great; if the monitor's embedded logic is cheap then it'll look bad, and no (1080p) software setting can really improve it.
"All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others." - Douglas Adams

[/Korth]

AlexiTQ
Level 9
What you would need is integer ("1:1") pixel mapping, i.e. each input pixel is mapped to an exact 2x2 block of output pixels. 1280x720 would be the proper resolution for that, since it's exactly half of 2560x1440 in each dimension. However, most displays don't support that and neither do the graphics drivers.
Instead, what you get is some kind of interpolation (probably bilinear), and that will always give you a noticeable reduction in image quality. If your display supports it, your best bet is to turn off the upscaling and have the image centered with black borders. This will maintain the crispness of the image (and actually look better than 1920x1080 on a 1920x1080 display), but with a smaller image size.
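
To make the integer-mapping idea concrete, here is a toy Python sketch (the 4x2 "image" is made-up data, purely for illustration): every source pixel simply becomes a 2x2 block on the panel, so no pixel values are ever blended. 1280x720 to 2560x1440 works exactly like this, just with a lot more pixels.

# Toy illustration of integer ("1:1") pixel mapping: each source pixel
# is duplicated into a 2x2 block, so nothing is interpolated or blurred.

def integer_scale_2x(image):
    """Return the image with every pixel repeated into a 2x2 block."""
    scaled = []
    for row in image:
        doubled_row = [px for px in row for _ in range(2)]  # double the width
        scaled.append(doubled_row)
        scaled.append(list(doubled_row))                     # double the height
    return scaled

# A made-up 4x2 "image"; the numbers stand in for pixel values.
src = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
]

for row in integer_scale_2x(src):
    print(row)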

Both AMD and Nvidia graphics cards have a driver setting to do either in-monitor resolution scaling or use the graphics card's own built-in scaler. If you pick monitor scaling, you have to put your trust in whatever scaler the manufacturer saw fit to put in there - by the original poster's testimony, apparently not a very good one in this particular case!

Switching over to graphics card scaling instead might provide better image quality. It's worth a shot. Chances are it won't ever look as good as using the panel's native resolution, but there can still be a big difference between a cheap, bad scaler and a good one.