A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)
Isn’t it just a limitation of human vision? No matter how much resolution we can create, the human eye can only resolve so much detail; anything beyond that is imperceptible to us. I think I remember reading that 4K is about the maximum we can realistically appreciate at typical screen sizes and viewing distances, and anything beyond that is impractical because no one would ever notice the difference.
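A rough sanity check on that half-remembered claim, as a hedged sketch: assuming the classic 20/20 benchmark of ~1 arcminute per resolvable detail (~60 pixels per degree), and a made-up but typical setup (a 65" 16:9 TV, ~56.7" wide, viewed from 9 ft), not anything from the study:

```python
import math

def pixels_per_degree(screen_width_in, horizontal_px, distance_in):
    """Pixels packed into one degree of visual angle at the screen center."""
    px_pitch_in = screen_width_in / horizontal_px                   # width of one pixel
    one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))   # span of 1 degree at that distance
    return one_degree_in / px_pitch_in

# Assumed setup (not from the study): a 65" 16:9 TV (~56.7" wide) viewed from 9 ft.
width_in, dist_in = 56.7, 108.0
for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(width_in, px, dist_in):.0f} pixels/degree")
```

Against the ~60 pixels-per-degree figure, 1080p is already at the threshold from the couch and 4K is double it, so the claim roughly checks out for that setup; sit closer or go bigger and the numbers shift.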
The only way higher resolutions pay off is if you also blow up the physical size of the image. A 20" wide image at 720p looks good, but the same image blown up to 60" becomes noticeably pixelated. A 20" wide image at 8K looks sharp, and blown up to 60" it still looks sharp.
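Same point in numbers, as a sketch with assumed values (a 60" viewing distance and the ~1 arcminute/pixel 20/20 ballpark; none of this is from the study): hold resolution and distance fixed, scale the physical width, and the angle each pixel subtends grows with it.

```python
import math

def pixel_arcminutes(image_width_in, horizontal_px, distance_in):
    """Visual angle (in arcminutes) subtended by a single pixel."""
    px_pitch_in = image_width_in / horizontal_px
    return math.degrees(math.atan(px_pitch_in / distance_in)) * 60

# Assumed viewing distance of 60"; ~1 arcminute/pixel is the 20/20 ballpark.
for width_in in (20, 60):
    for name, px in [("720p", 1280), ("8K", 7680)]:
        arcmin = pixel_arcminutes(width_in, px, 60.0)
        print(f'{width_in}" wide at {name}: {arcmin:.2f} arcmin/pixel')
```

Below ~1 arcmin/pixel the grid disappears; 720p crosses that line as soon as it’s scaled up (~0.9 → ~2.7 arcmin), while 8K stays well under it at both sizes.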
Some displays really are that big, which is why manufacturers keep pushing for denser panels. But for the “TV in a living room” use case, yeah, we’re already at the point of diminishing returns on resolution. That’s why they’re developing other things, like higher refresh rates, OLED, and 3D.
There’s an e-ink display tech (just published in Nature, so very much still in the lab) that has a pixel density >25,000 PPI and can run at up to 200 FPS. It’s at a scale where a one-pixel-to-one-retinal-cell mapping is possible.
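A quick hedged estimate of where a density like that actually matters (the 25,000 PPI figure is from the comment above; the ~1 arcminute acuity value is the usual 20/20 assumption, not from the paper):

```python
import math

PPI = 25_000                              # density figure quoted above
pitch_in = 1 / PPI                        # ~1 micron pixel pitch
one_arcmin_rad = math.radians(1 / 60)     # ~20/20 resolvable angle (assumption)

# Distance at which one pixel subtends a full arcminute:
distance_in = pitch_in / one_arcmin_rad
print(f"{distance_in:.3f} in (~{distance_in * 25.4:.2f} mm)")
```

That comes out to roughly 3.5 mm from the eye, so this kind of density only pays off behind magnifying optics, i.e. near-eye VR/AR microdisplays rather than TVs.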
The color gamut isn’t as wide as sRGB, but it’s 100x the gamut of existing color electrophoretic displays (E Ink, like in Kindles).
here: https://www.nature.com/articles/s41586-025-09642-3