Maybe they should stop forcing this antiquated TV License scam. For those who don’t know, the UK requires a TV License to watch live TV, even though most people don’t watch the BBC anymore. I bet they’ll try to find a way to fold Netflix and the rest into the TV License eventually.
What about 12K?
I want a dumb TV with the price and specs of a smart TV. Less is more.
You would likely have to pay more since they aren’t getting to sell your information.
I second this.
The TV industry can crash and burn otherwise.
Most developing countries have cheap 1080p TVs right now, but others are still using CRTs, and still others are watching on their phones (like some of my poorer relatives, who get their entertainment fix through their phones while the big TVs in their living rooms rarely get turned on).
I think my TV is like 32" and 720p from 2012 or so. It’s fine.
Before that I had a projector which was fun sometimes but mostly too big for the room. Cool to take psychedelics and run a visualizer while it was pointed along the floor at you. You could feel the fractals on your skin. I don’t do that anymore, so a 32" TV is fine.
Real innovation:

What about Dairy Queen?
Make a TV that can accurately reproduce really high contrast ratios and don’t put any pointless software on it. Display the image from the source with as much fidelity as possible, supporting all modern display technology like VRR.
That’s all I want from a display.
The next big technological innovation will be good-looking, fast-refreshing e-ink TVs.
I doubt this. I use an e-ink android tablet as an e-reader. I like that it’s easy on the eyes. For using it to scroll Lemmy or even a web page, it’s fine. But the refresh rate (even on the best settings) makes watching a video or gif on it painful.
I don’t think anyone really wants an e-ink TV unless they want something that’s a hybrid. The things you’d use a TV for are just not e-ink things.
This makes sense to me. A hybrid would be nice. Have a calendar or some art while it’s “off”. But then, that’s probably pretty expensive. (Not that I’ve looked, I’m just assuming.)
Hmm I have considered this, and I think it is ads beamed straight to the eyeball
- The TV industry… Probably
There’s a ton of other things I want my TV to do before more pixels.
Actual functional software would be nice; better motion handling on high-speed shots (in particular sweeping landscapes or reticles in video games); higher frame rates and variable-frame-rate content; making the actual use of the TV faster, things like changing inputs or channels; oh man, so much more.
Anything but more pixels.
I still probably watch 90% 1080p and 720p stuff lol. As long as the bitrate is good it still looks really good.
Actual functional software would be nice
you do not want software on your TV.
I mean, yes and no. I like eARC, and I like being able to adjust settings other than v-hold. But I don’t want this slow crud fest that keeps telling me when my neighbour turns on Bluetooth on their iPhone.
I like eARC,
… the HDMI audio thing?
I want software on my TV.
Steam Link specifically. I like streaming to my TV via Ethernet.
You can do that with a Raspberry Pi for <$100 and without the need to have Amazon/Google/Roku/whoever tf else collecting your data.
Who says I let Amazon/Google/Roku/whoeverTfElse collect my data?
I have my TV isolated on its own network and only allow inputs from the LAN, so I can use Steam Link and Jellyfin just fine.
So get a device that can do that. You don’t need a piece of software that will never see an update to do this.
It’s funny that you think smart TVs don’t receive updates. It’s got a wifi chip for a reason.
I have a Samsung Frame because I wanted a TV that didn’t look so much like I had one, but the software is so goddam bad. The only way to switch sources quickly is to set them as favorites, which isn’t always that straightforward if you didn’t do it right away. Regardless, you have to let the home page fully render before you can even worry about that. Even the Samsung TV app, which you would think would be perfectly optimized for the hardware since the same company makes the software, is barely functional and loads like a web page on AOL in 1998.
I like my Frame because it faces a set of windows, and with all my other TVs I would have to close the blinds to see the TV in the daytime.
However, the software is straight garbage. I didn’t even know about the favourite thing … every time I change source it would spend like a minute or two trying to figure out if it could connect to it for no reason.
Couldn’t any TV be a “frame TV”?
Would you just need some trim?
No, not really
I have bad eyesight, but also 8K probably never needs to be a thing on a TV; it’s more impressive for stuff that’s closer to our eyes, like VR headsets.
Even if it was, the streaming services everyone’s using crush the bitrate down so badly it’d barely look better than 4K anyway.
They showed Skyfall on 70ft IMAX screens, and that film was shot at 2880 x 1200. It’s not all about the pixel count.
Working in entertainment and broadcasting, you learn that barely half of Americans have a 4K TV, and it’s under half worldwide. Marketing makes you think that “everyone is doing it.”
Most enterprise places default to 1080p and require people to go above and beyond to justify 4K screens.
Then we get into laptops still barely going above 720p, so a lot of people have no idea what 4K would even bring them. Most streaming content is still only 1080p anyway, so it’s not really noticeable even for those who do have 4K screens.
I’m still using a 1080p TV and I only plan to replace it if it breaks.
If you do, smart TVs are dumb and forgot how to TV
Wait, half have 4k‽
I don’t own a single 4K set. I have four 1080p dumb TVs. I will get a new one when one of these dies.
Broadcast towers put out 1080p and satellite broadcasts 1080i. The filmmaker/streaming standard is 1080p. As a filmmaker (producer/cinematographer) and broadcast operator, you’re always encoding down in scale, due to bandwidth and/or the majority of sets that can’t handle more, mostly because of FCC rules/cost.
Even the bargain basement ones with horrible pictures have 4k resolution these days
Think it is an economy of scale issue.
Yeh, do: 60fps, 30 bit color… and I guess HDR?
Do things that people can actually appreciate.
And do them in a way that utilises the new tech. 60fps looks completely different from 24fps… Work with that, it’s a new media format. Express your talent.

Sorry, the best I can do is install a camera and microphone on our next model, to spy on you and force interaction with advertisements.
I mean video conferencing from your living room. How neat is that?
Gaming was supposed to be one of the best drivers for 8K adoption.
Huh? When 4K still struggles with GPU power? And for next to no benefit?
Introducing the new DLSS 9, where we upscale 720p to 8K. Looks better than native, pinky swear.
What’s dumb is that 3D failed because of lack of resolution and brightness, and now we have more pixels than we can handle and screens so bright they can hurt to look at. The PS3 had a couple of games that showed different screens to two players wearing 3D glasses. I’d love to see full-screen couch co-op games with modern tech. 8K isn’t solving any problems.
3D failed for the exact same reason VR is failing now. Nobody wants to wear headsets at home.
Screen dimming is technically possible over HDMI/DisplayPort (via DDC/CI); no idea why it’s not properly supported and integrated into monitors, graphics drivers, Windows, and Linux. KDE shows dimming for monitors sometimes? Don’t know if that is software or real hardware dimming though.
I’m talking about brightness in the context of 3D. KDE uses DDC with USB-connected monitors, which is the same thing as using a button to lower brightness.
It’s real hardware dimming.
Nice, thought so, but wasn’t sure.
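For anyone curious, that hardware-level control is what DDC/CI exposes; on Linux you can poke at it with the `ddcutil` tool. A rough sketch, assuming your display actually implements DDC/CI on its HDMI/DP input (many TVs don’t, so these commands may simply report no capable display):

```shell
# List displays that answer DDC/CI queries over the video cable
ddcutil detect

# Read the current hardware brightness (VCP feature code 0x10)
ddcutil getvcp 10

# Set hardware brightness to 40% -- same effect as the monitor's own buttons
ddcutil setvcp 10 40
```

If `detect` finds nothing, the panel is likely ignoring DDC/CI and any dimming you see is being done in software by the compositor.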
I dunno. Oxygen Not Included looks crisp on a 4K monitor. And it makes my job easier, being able to have an absolute tonne of code on-screen and readable. I reckon I could probably use an 8K monitor for those things.
Yeah, I generally have FSR running on any 3D game made in about the last decade - even if I can run it at 4K at a reasonable framerate, my computer fans start to sound like a hoover and the whole room starts warming up. But upscaling seems a better solution than having separate monitors for work and play.
Yeah, can’t run it, and my Internet is too slow to stream it.
Pretty sure my GPU could run 4k Rimworld, just play good games instead of AAA games.
FR. What distance and size do I need to be able to actually see the difference between 1080p and 4K? Cuz my current setup does not allow me to notice anything but the massive reduction in performance, unless it’s a 2D game and then everything becomes too tiny to effectively play the game.
4K is noticeable on a standard PC.
I recently bought a 1440p screen (for productivity, not gaming) and I can fit so much more UI with the same visual fidelity compared to 1080p. Of course, the screen needs to be physically bigger in order for the text to be the same size.
So if 1080p->1440p is noticeable, 1080p->4k must be too.
Like I said, with 2D things it’s noticeable only because it makes everything smaller (there is more space because the elements inside that space are smaller). However, movies and 3D games? No difference.
Even going from 640x480 to 1024x768 makes a noticeable size difference with the 2D elements of a UI.
I’m using a 60 inch tv as a monitor to my desktop - I sit in front of it at a distance of about 2m. It feels really nice to have stuff in 4k, so it’s always 4k except the games that are too tough for my 2060 super to give me 60p.
It’s about 60”. That’s the point where we noticed pixels at 1080P. HDR is more important.
It’s very noticeable at the DPI of a 27" screen from arm’s length. Or maybe not if you can’t see very well. But on a TV from 10 feet away, I dunno if I could differentiate 1440p from 4K personally.
I have 27 inch 2k monitors at work, and it’s already enough
27" 2K is a good DPI. I personally only went up to 4K 27" because I also wanted OLED, and the 2K OLED panel I was using had some noticeable text fringing because of the subpixel layout. At 4K it’s not noticeable anymore.
For PC you can actually see the benefit between 4K and lower resolutions.
It’s for TV where it’s likely useless to have 8K
I am on a PC… You mean a smaller monitor closer to my face?
Yes, that’s what I am saying: for a monitor you can easily see the difference between 4K and lower. Whereas for a TV, if I remember correctly, on a 55" set from over 2 meters or so you can’t tell the difference between 4K and 1080p.
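That “55-inch, about 2 meters” figure falls out of a simple acuity calculation; a rough sketch, assuming roughly 1 arcminute of visual acuity (a common rule of thumb, not a precise vision-science result):

```python
import math

ACUITY_ARCMIN = 1.0  # assumed: a typical eye resolves ~1 arcminute of detail

def max_useful_distance_m(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance beyond which individual pixels subtend less than the
    acuity limit, i.e. extra resolution stops being visible."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width in inches
    pixel_in = width_in / horizontal_px             # size of one pixel
    dist_in = pixel_in / math.tan(math.radians(ACUITY_ARCMIN / 60))
    return dist_in * 0.0254                         # inches -> meters

print(round(max_useful_distance_m(55, 1920), 2))  # 1080p on a 55" set: ~2.18 m
print(round(max_useful_distance_m(55, 3840), 2))  # 4K on a 55" set: ~1.09 m
```

So past roughly 2.2 m you can’t resolve 1080p pixels on a 55" screen, which is exactly where the 4K upgrade stops being visible; by the same logic 8K only pays off at sofa distances nobody actually sits at.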
8K is theoretically good as “spare resolution,” for instance running variable resolution in games and scaling everything to it, displaying photos with less scaling for better sharpness, clearer text rendering, less flickering, stuff like that.
It’s not worth paying for. Mostly. But maybe some day it will be cheap enough to just “include” with little extra cost, kinda like how 4K TVs or 1440p monitors are cheap now.













