mikjames
Well-known member
There seem to be a fair number of misconceptions around using a 4k TV as a computer monitor. Hopefully this post will clarify some things for those considering it.
1) The vast majority of televisions do not have the internal circuitry required to upscale 1440p to 4k, whereas the majority of 4k monitors do. Scaling 1080p to 4k is a simple calculation (1 source pixel becomes a 2x2 block of 4 panel pixels); 1440p is not, as the quick sketch below shows.
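To put numbers on that scale-factor point, here's a minimal sketch in plain Python (the only inputs are the standard 1920x1080, 2560x1440, and 3840x2160 resolutions; everything else is just for illustration):

```python
# Why 1080p -> 4k is easy and 1440p -> 4k is not: the scale factor.
TARGET = (3840, 2160)          # 4k UHD panel
SOURCES = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}

for name, (w, h) in SOURCES.items():
    sx = TARGET[0] / w
    sy = TARGET[1] / h
    if sx.is_integer() and sy.is_integer():
        note = "integer scale: every source pixel maps to an exact block of panel pixels"
    else:
        note = "non-integer scale: the scaler has to interpolate between source pixels"
    print(f"{name}: {sx}x by {sy}x -> {note}")
```

Running it prints 2.0x by 2.0x for 1080p and 1.5x by 1.5x for 1440p, which is exactly why a scaler that can only do simple pixel doubling handles the first case and punts on the second.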
A couple of questions will arise here. "Why would you want to? Wouldn't it look worse than 1080p upscaled?" I don't know first hand, but a large number of "native" 1440p monitors are actually using 4k panels these days due to economies of scale. Friends of mine with 4k monitors prefer 1440p upscaled to 1080p upscaled.
Another question I've been pondering is why Nvidia/AMD graphics cards don't upscale 1440p to 4k before sending the frame to the monitor. Aside from general incompetence, the only reason I can think of is the potential for added input lag, a problem they might prefer to avoid entirely.
2) TV manufacturers employ all sorts of skeevy engineering tricks in 4k TV panels. They can claim 4k resolution for TV/video while hoping that few will call them out on those tricks.
Some of these tricks include the use of 4:2:0 chroma subsampling, which leads to a less precise final image. This often presents as color fringing around fine text and/or fine high-contrast details in video/games. It's undetectable at a standard TV sitting distance, but I'm sitting 2-3 feet away from a 40" monitor to get the full 4k experience...
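For anyone curious what 4:2:0 actually throws away, here's a rough sketch assuming a frame already converted to YCbCr and stored as a NumPy array (the function name and the toy frame are invented for illustration, not anything a TV literally runs):

```python
import numpy as np

def simulate_420(ycbcr):
    """Rough simulation of 4:2:0 chroma subsampling on an (H, W, 3) YCbCr frame.

    Luma (Y) keeps full resolution; Cb and Cr are averaged over 2x2 pixel
    blocks and then stretched back out, roughly what the display reconstructs.
    """
    h, w = ycbcr.shape[0] // 2 * 2, ycbcr.shape[1] // 2 * 2   # crop to even size
    y, cb, cr = (ycbcr[:h, :w, i].astype(float) for i in range(3))

    def halve(chan):
        # One chroma sample per 2x2 block: color detail drops to half
        # resolution in both directions.
        return chan.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def restore(chan):
        # Nearest-neighbour upsample back to full size -- the smearing
        # that shows up as fringing around fine, high-contrast edges.
        return np.repeat(np.repeat(chan, 2, axis=0), 2, axis=1)

    return np.stack([y, restore(halve(cb)), restore(halve(cr))], axis=-1)

# Example: a colored vertical line one pixel wide loses its sharp chroma edge.
frame = np.zeros((8, 8, 3))
frame[:, 4, 1] = 200.0                   # a sharp Cb spike one pixel wide
print(simulate_420(frame)[0, 3:6, 1])    # -> [0. 100. 100.]: color bleeds into the neighbour
```

The one-pixel-wide color spike comes back smeared across two columns at half strength, which is exactly the fringing you notice around fine text at desk distance.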
TV engineers have also experimented with incomplete subpixel grids (the red, green, and blue elements that make up a single pixel), embedding white subpixels, and/or letting multiple pixels share a subpixel via a diamond layout. What this boils down to is a loss of detail and accuracy, and potentially increased input lag. The display either has to guess when mapping a square pixel onto a bastardized diamond/triangle pixel, or shove it in wherever it will fit. Not a good recipe for an accurate 4k image.
3) TVs will advertise refresh rates that are marketing gimmicks; the vast majority will not accept a 120Hz 1080p input, let alone anything more impressive.
4) TVs will generally have higher input lag, though this has improved over the years. There are certainly quite a few hidden gems with competent game modes in the TV market, but you're far more likely to get a dud if you don't do the research. Displaylag.com and Rtings.com are your friends for input lag comparisons. Rtings will also provide info on supported input resolutions/refresh rates, and a close-up of the RGB pixel structure.
5) Monitors generally make life easy for the end user: brightness, contrast, and RGB controls are typically all present in a basic menu, with minimal superfluous features to get in the way of a good image. TVs, particularly cheap ones, often have horrible out-of-box settings, with needless options cluttering the user menu and useful settings hidden away in a service menu that requires a special access code. Even if you put the time in to navigate the service menu and get everything dialed in, you may find all your settings are wiped when the TV shuts down. The safest route is to stick with TV brands that are either top tier or that also sell computer monitors: Samsung, Sony, Panasonic, LG, etc.
A couple of years ago I paid ~$300 for an Avera EQX10. "4k", "low input lag", everyone was ranting and raving about its awesomeness. DON'T MAKE THE SAME MISTAKE I DID! I enter the service menu every day using the remote to dial in my settings. I have to ensure that scaling is disabled in the Nvidia control panel to avoid excessive input lag. My options are 1080p or 4k, nothing in between, and there is a consistent dithering/splotchy effect visible in all dark/high-contrast games, regardless of configuration. It probably has a non-RGB panel. It's a garbage display, and I got exactly what I paid for. I have my eye on a Samsung NU7100, but I'm not going in blind this time; I understand its limitations regarding 1440p and a 120Hz refresh rate.