I want a dumb TV with the price and specs of a smart TV. Less is more.
I second this.
The TV industry can crash and burn otherwise.
Real innovation:

Make a TV that can accurately reproduce really high contrast ratios and don’t put any pointless software on it. Display the image from the source with as much fidelity as possible, supporting all modern display technology like VRR.
That’s all I want from a display.
The next big technological innovation will be good-looking and fast e-ink TVs.
Hmm I have considered this, and I think it is ads beamed straight to the eyeball
- The TV industry… Probably
There's a ton of other things I want my TV to do before more pixels.
Actual functional software would be nice; better tracking on high-speed shots (in particular sweeping landscapes or reticles in video games); higher frame rates and variable-frame-rate content; making the actual use of the TV faster (things like changing inputs or channels); oh man, so much more.
Anything but more pixels.
I still probably watch 90% 1080p and 720p stuff lol. As long as the bitrate is good it still looks really good.
Actual functional software would be nice
you do not want software on your TV.
I mean, yes and no. I like eARC, and I like being able to adjust settings other than v-hold. But I don't want this slow crud-fest that keeps telling me when my neighbour turns on Bluetooth on their iPhone.
I like eARC,
… the audio-over-HDMI thing?
I want software on my TV.
Steam Link specifically. I like streaming to my TV via Ethernet.
You can do that with a Raspberry Pi for <$100 and without the need to have Amazon/Google/Roku/whoever tf else collecting your data.
Who says I let Amazon/Google/Roku/whoeverTfElse collect my data?
I have my TV isolated on its own network and I allow inputs from the LAN, so I can use Steam Link and Jellyfin just fine.
So get a device that can do that. You don't need a piece of software that will never see an update to do this.
I have a Samsung Frame because I wanted a TV that didn't look so much like I had one, but the software is so goddamn bad. The only way to switch sources quickly is to set them as favorites, which isn't always straightforward if you didn't do it right away. Regardless, you have to let the home page fully render before you can even worry about that. Even the Samsung TV app, which you would think would be perfectly optimized for the hardware since the same company makes the software, is barely functional and loads like a web page on AOL in 1998.
I like my Frame because it faces a set of windows, and with all my other TVs … I would have to close the blinds to see the TV in the daytime.
However, the software is straight garbage. I didn't even know about the favourite thing … every time I changed source it would spend a minute or two trying to figure out whether it could connect, for no reason.
Couldn't any TV be a "frame TV"?
Would you just need some trim?
No, not really
Even if it was, the streaming services everyone's using crush the bitrate down so badly it'd barely look better than 4K anyway.
They showed Skyfall on 70 ft IMAX screens, and that film was shot at 2880 x 1200. It's not all about the pixel count.
Working in entertainment and broadcasting, you learn that barely half of Americans have a 4K TV, and it's under half worldwide. Marketing makes you think that "everyone is doing it."
Most enterprise places default to 1080p and require people to go above and beyond to justify 4K screens.
Then we get into laptops still barely going above 720p, so a lot of people have no idea what 4K would even bring them, and most streaming content is still only 1080p, so it's not really noticeable even for those who do have 4K screens.
I’m still using a 1080p TV and I only plan to replace it if it breaks.
If you do, smart TVs are dumb and forgot how to TV
Wait, half have 4k‽
Broadcast towers put out 1080p and satellite broadcasts 1080i. The filmmaker/streaming standard is 1080p. As a filmmaker (producer/cinematographer) and broadcast operator, you're always encoding to scale down, due to bandwidth and/or the majority of incapable receivers, mostly because of FCC rules/cost.
I don't own a single 4K set. I have four 1080p dumb TVs. I will get a new one when one of these dies.
Even the bargain basement ones with horrible pictures have 4k resolution these days
Yeah, do: 60 fps, 30-bit color… and I guess HDR?
Do things that people can actually appreciate.
And do them in a way that utilises the new tech. 60 fps looks completely different from 24 fps… Work with that, it's a new media format. Express your talent.
Sorry, the best I can do is install a camera and microphone on our next model, to spy on you and force interaction with advertisements.
I mean video conferencing from your living room. How neat is that?
8K is theoretically good as “spare resolution,” for instance running variable resolution in games and scaling everything to it, displaying photos with less scaling for better sharpness, clearer text rendering, less flickering, stuff like that.
It’s not worth paying for. Mostly. But maybe some day it will be cheap enough to just “include” with little extra cost, kinda like how 4K TVs or 1440p monitors are cheap now.
Also, we haven’t even got HDR figured out.
I’m still struggling to export some of my older RAWs to HDR. Heck, Lemmy doesn’t support JPEG XL, AVIF, TIFF, HEIF, nothing, so I couldn’t even post them here anyway. And even then, they’d probably only render right in Safari.
Gaming was supposed to be one of the best drivers for 8K adoption.
Whu? Where 4k still struggles with GPU power? And for next to no benefit?
Introducing the new DLSS 9, where we upscale 720p to 8K. Looks better than native, pinky swear.
What’s dumb is that 3D failed because of lack of resolution and brightness, and now we have more pixels than we can handle and screens so bright they can hurt to look at. PS3 had a couple games that showed different screens to two players wearing 3D glasses. I’d love to see full screen couch coop games with modern tech. 8K isn’t solving any problems.
3D failed for the exact same reason VR is failing now. Nobody wants to wear headsets at home.
Screen dimming is technically possible over HDMI/DisplayPort; no idea why it's not properly supported and integrated into monitors, graphics drivers, Windows and Linux. KDE shows dimming for monitors sometimes? Don't know if that's software or real hardware dimming, though.
I'm talking about brightness in the context of 3D. KDE uses DDC/CI with USB-connected monitors, which is the same thing as using a button to lower the brightness.
It’s real hardware dimming.
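KDE (and tools like ddcutil) can talk DDC/CI over the display cable's I²C lines, so the change happens inside the monitor itself, same as pressing its buttons. A minimal sketch of the idea, assuming a Linux box with ddcutil installed and the right I²C permissions:

```python
# Hardware dimming over DDC/CI by shelling out to the ddcutil CLI.
# VCP feature code 0x10 is the standard brightness control.
import subprocess

def set_brightness(percent: int, display: int = 1) -> None:
    subprocess.run(
        ["ddcutil", "--display", str(display), "setvcp", "10", str(percent)],
        check=True,
    )

set_brightness(30)  # the panel itself dims, not a software overlay
```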
Nice, I thought so but wasn't sure.
I dunno. Oxygen Not Included looks crisp on a 4K monitor. And it makes my job easier, being able to have an absolute tonne of code on-screen and readable. I reckon I could probably use an 8K monitor for those things.
Yeah, I generally have FSR running on any 3D game made in about the last decade - even if I can run it at 4K at a reasonable framerate, my computer fans start to sound like a hoover and the whole room starts warming up. But upscaling seems a better solution than having separate monitors for work and play.
Yeah, can't run it; Internet too slow to stream it.
Pretty sure my GPU could run 4k Rimworld, just play good games instead of AAA games.
FR. What distance and size do I need to be able to actually see the difference between 1080p and 4K? Cuz my current setup does not allow me to notice anything but the massive reduction in performance, unless it’s a 2D game and then everything becomes too tiny to effectively play the game.
4K is noticeable on a standard PC.
I recently bought a 1440p screen (for productivity, not gaming) and I can fit so much more UI with the same visual fidelity compared to 1080p. Of course, the screen needs to be physically bigger in order for the text to be the same size.
So if 1080p->1440p is noticeable, 1080p->4k must be too.
Like I said, with 2D things it's noticeable only because it makes everything smaller (there is more space because the elements inside that space are smaller). However, movies and 3D games? No difference.
Even going from 640x480 to 1024x768 makes a noticeable size difference with the 2D elements of a UI.
I’m using a 60 inch tv as a monitor to my desktop - I sit in front of it at a distance of about 2m. It feels really nice to have stuff in 4k, so it’s always 4k except the games that are too tough for my 2060 super to give me 60p.
It's very noticeable at the DPI of a 27" screen from arm's length. Or maybe not, if you can't see very well. But on a TV from 10 feet away, I dunno if I could differentiate 1440p from 4K personally.
I have 27 inch 2k monitors at work, and it’s already enough
27" 2K is a good DPI. I personally only went up to 4K 27" because I also wanted OLED, and the 2K OLED panel I was using had some noticeable text fringing because of the subpixel layout. At 4K it’s not noticeable anymore.
It’s about 60”. That’s the point where we noticed pixels at 1080P. HDR is more important.
On a PC you can actually see the benefit of 4K over lower resolutions.
It's on a TV where 8K is likely useless.
I am on a PC… You mean a smaller monitor closer to my face?
Yes, that's what I'm saying: on a monitor you can easily see the difference between 4K and lower. For a TV, if I remember correctly, on a 55" set from more than about 2 metres away you can't tell the difference between 4K and 1080p.
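The usual 1-arcminute visual acuity rule of thumb backs that up. A quick back-of-the-envelope sketch, assuming 20/20 vision and a 16:9 panel (approximate numbers, not gospel):

```python
import math

def max_useful_distance_m(diagonal_in: float, horizontal_px: int,
                          aspect: float = 16 / 9) -> float:
    """Distance beyond which ~20/20 vision (1 arcminute of acuity)
    can no longer resolve individual pixels."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horizontal_px
    distance_in = pixel_pitch_in / math.tan(math.radians(1 / 60))
    return distance_in * 0.0254  # inches -> metres

for px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    print(f'55" {label}: pixels blend together beyond '
          f'~{max_useful_distance_m(55, px):.1f} m')
# Roughly 2.2 m for 1080p, 1.1 m for 4K, 0.5 m for 8K on a 55" panel,
# so past ~2 m the jump from 1080p to 4K is essentially invisible.
```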
For movies, look up what the Arri Alexa Mini can do. The until very recently dominant Mini had a 3424x2202 resolution. Most movies shot digitally (which is most of them in the last ~20 years) were shot below 4K. Many had special effects done at 1080p. When movie theaters switched to digital projection, most used 2048x1080 projectors, and the shift towards 4K projectors wasn't that long ago.
The Alexa Mini LF can do 4448x3096. The Sony Venice 8K camera is only a few years old. There's a huge amount of loyalty to Arri, particularly the lenses. Panasonic and Z-Cam have 8K cameras, but like with Arri, I'd bet most filmmakers would choose the 4K/6K cameras.
Even on productions that opt for the 8K cameras, there would very likely be lower-resolution cameras in use as well, like the Alexa Mini, a 4K/6K Z-Cam, or a Sony FX3. That includes most nature documentaries. Unless it's just wide canvases, you're not going to be mounting 8K cinema cameras all over the place the way you would cheaper small cameras like an FX3. Plus there's stuff like shooting at 240 fps with 12-bit color for slow-motion playback; I'm betting none of the cinema cameras support that at 8K, so you're back down to probably 1080p. I don't know the status of 4K240 out there. That's a lot of bandwidth, storage, processing power, and cooling needed.
Then there's 70mm filming. The vast majority of productions cannot do that. There's a lot of unused footage that goes into filmmaking, and if you look at the size of a roll of 70mm film used for projecting a feature-length film, it's huge. It's boutique equipment: there aren't a ton of 70mm cinema cameras and lenses to rent, or camera operators, or labs to get the footage scanned for digital editing. 70mm will always be limited in adoption, especially now that digital is the dominant form of cinematography.
So 8K will, for many years, be the land of upscaling and, for native content, old video games. For video, I guess two 8K60 cameras rigged together for VR will lead the way, along with the occasional movie/documentary. No guarantee the movie theater will show it in 8K, though. It's taken a very long time for 4K projection to approach standard in theaters in places like the US, let alone in poorer countries or countries with fewer mega theater chains that could more comfortably afford the upgrade.
It’s about time the electronics industry as a whole realises that innovation for the sake of innovation is rarely a good thing
Look, we can’t have TVs that last 15 years anymore!
We need to keep people buying every year or two. Otherwise line not go up! Don’t you understand that this is about protecting The Economy?!
Boomer economic policy is like if Isaac Newton saw an apple falling from a tree and came to the conclusion it would always accelerate at the same speed no matter what, even though the ground, with the entire-ass planet behind it, is right fucking there.
Numbers cannot constantly go up; it's just that that's what was happening their whole lives, and they can't accept that their childhoods were a blip and not how things always were and always will be.
They just can’t wrap their heads around it. They have such shit tier empathy they can’t comprehend that they’re an exception.
To be fair Boomers didn’t create this economic policy. Their parents elected Nixon, who broke the Bretton Woods agreement “temporarily”, and then we adopted Keynesian macroeconomic policy afterwards to justify it.
Inb4 someone regurgitates a defense of this "boomer" policy and proves that it's not just them and never was. It's always been the rich and their loyal servants.
A large number of the problems we currently face and will in the future come down to boomers being worse than their predecessors at grasping, understanding, and accepting their own impermanence and unimportance on the grand stage of reality.
Most of them need to have a series of existential crises, or maybe read some fucking Sartre, so they can stop with the Me Generation bullshit. It's wild that the first generation to do LSD en masse is somehow the one that needs to experience ego death the most.
It's wild that the first generation to do LSD en masse
I want to say hippies were less than 1% of that generation, but for some reason I think it was recorded as 2-3% which would be a gross over-estimate.
But for every hippie you think of sticking daisies in rifles, there were 100 spitting on Black kids for going to the school they were legally required to go to.
It would be like if in 2080 they think we’re all catboys with blue hair and 37 facial piercings.
Sure, those people exist as a fringe demographic, but they’re not the norm.
Most hippies had more issues with peers their own age than with people their parents' age; that part of the folk tale gets left out, though, because the people who want us to think they were hippies and "grew out of it" were the ones beating hippies for being different.
All they were ever trying to do was lie to younger generations in the hopes they’d confirm to decades old social norms. Like, it’s weird how many people still don’t understand the boomers just lie about shit instinctively. They grew up in a world filled with lead and are literally incapable of caring about logical inconsistencies. They want younger generations to think they were cool, so they just fucking lied about what they were like as a generation.
If you ever run into a real deal old hippie some day, ask them what the majority of people their age was like back then.
It's not even innovation, per se. It's just Big Number Go Up.
Nobody seems to want to make a TV that makes watching TV more pleasant. They just want to turn these things into giant bespoke advertising billboards in your living room.
Show me the TV manufacturer who includes an onboard ad blocker. That’s some fucking innovation.
The galaxy-brain move is buying an old dumb TV for a pittance and using it for watching Jellyfin/Plex/streams from a browser with uBlock Origin/DNS filtering – all running on some relative's "obsolete" smart toaster from last year that they happily gift you because "the new version's bagel mode IS LIT – pun intended – but it needs the 128 GB of DDR7 RAM of the new model; it can barely toast on the old one any more".
I think this just comes down to human nature. Give people (engineers, execs) a metric that looks like a good proxy for performance and they will overcommit on that metric as it is a safer bet than thinking outside the box. I think the incremental improvements in deep learning with all those benchmarks are a similar situation.
You can't really find a dumb TV anymore. I might see how big of a monitor I can find when I'm ready to upgrade, but I doubt I'll find one big enough and cheap enough.
I hooked my computer up to the HDMI and have used that as my primary interface.
It’s not perfect, but it screens out 95% of bullshit
That doesn't help unless you've blocked your TV from network access, because they use ACR (Automatic Content Recognition), which literally scans what's being displayed over your HDMI port and then sells it off to advertisers.
Just don’t give the TV your wifi password, boom dumb TV.
That won’t save you anymore. My boss bought a smallish smart TV in contravention of my explicit instructions for use as a CCTV monitor because it was “cheap.” It nags you on power up with a popup whining about not being able to access the internet, and if you don’t feed it your Wifi password it will subsequently display that same popup every 30 minutes or so requiring you to dismiss it again. And again. And again. Apparently the play is to just annoy you into caving and letting it access your network.
Instead I packed it up and returned it. Fuck that.
If you are at a business, you should have an access point or router that is capable of blocking specific devices from WAN access. But I would create a new segmented network, block that network from WAN access entirely, put it on its own VLAN, and then connect the TV to that network.
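If you want to double-check that the isolation actually holds, sniffing the TV's traffic from a box that can see that segment works. A rough Python/scapy sketch; the TV address and subnet here are made-up examples, and it only catches what the sniffing machine can actually see:

```python
# Flag any traffic from the TV that isn't destined for the local segment.
# Requires root and `pip install scapy`; TV_IP and LAN are example values.
import ipaddress
from scapy.all import IP, sniff

TV_IP = "192.168.50.10"                        # hypothetical TV address
LAN = ipaddress.ip_network("192.168.50.0/24")  # hypothetical isolated segment

def flag_wan_traffic(pkt):
    if IP in pkt and pkt[IP].src == TV_IP:
        dst = ipaddress.ip_address(pkt[IP].dst)
        if dst not in LAN and not dst.is_multicast:
            print(f"TV tried to reach {dst} -- isolation is leaking")

sniff(filter=f"src host {TV_IP}", prn=flag_wan_traffic, store=False)
```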
We traded 3D TVs, which are amazing if you watch the right stuff, for 8K…
8K is great, but we need media in 8K to go with it.
FireWire got killed too soon.
Wha?
AFAIK FireWire is part of the Thunderbolt protocol, you can get FireWire 800 to Thunderbolt adapters. Apple even used to sell one.
Apple killed Firewire 4 years before Lightning came along.
??? My 2012 MacBook Pro had FireWire, that’s the year Lightning came out…
Apple announced it was moving away from Firewire in 2008
Too right, brother
It was way too far ahead of its time.
There’s no 8k content, and only recently do standard connectors support 8k at high refresh rates.
There’s barely any actual 4K content you can consume.
There’s barely any actual 4K content you can consume.
Ironically, there actually is if you bother pirating content, because that's the only crowd that will share full 4K Dolby Vision + Dolby Atmos/DTS:X Blu-ray rips.
Aside from that though, even 4K gaming is a struggle because GPU vendors went off the deep end with frame generation, which coincidentally is the same mistake lots of TV OEMs already made.
There’s barely any actual 4K content you can consume
I feel like that's not true. But you've gotta try. If you're streaming it, chances are it's not really any better. 4K Blu-ray (or rips of them…) though? Yeah, it's good. And since film actually has 8K+ of resolution, old movies can be rescanned at high resolution if the original film exists.
Supposedly Sony Pictures Core is one streaming service that can push nearly 4K Blu-ray bitrates… but you've gotta have really good internet. Like pulling 50-80 GB over the span of a movie's runtime.
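For a sense of scale, that works out to UHD Blu-ray-class throughput. A rough sketch, assuming a roughly two-hour runtime and using the GB figures quoted above:

```python
# Average sustained bitrate needed to pull 50-80 GB over a ~2 hour movie.
def avg_mbps(gigabytes: float, runtime_hours: float) -> float:
    return gigabytes * 8000 / (runtime_hours * 3600)  # GB -> megabits, per second

for gb in (50, 80):
    print(f"{gb} GB over 2 h ≈ {avg_mbps(gb, 2):.0f} Mbit/s sustained")
# ~56-89 Mbit/s before any overhead -- far above what typical 4K streaming
# tiers deliver, and in the same neighbourhood as a UHD Blu-ray disc.
```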
Not true on paper but true in practice. Most people don’t buy/use Blurays (or any other physical media) anymore to the point that retailers aren’t even bothering to stock them on the shelves these days. Their days are certainly numbered and then all we’ll be left with is low quality 4k streaming.
You’re probably aware of this since you mentioned bitrate, but a lot of 4K streaming services use bitrates that are too low to capture much more detail at 4K compared to a lower resolution. A lot of games will recommend/effectively require upscaling (DLSS/FSR/XeSS) to achieve good performance at 4K. All of this is still maybe better than 1440p, but it shows 4K is still kind of hard to make full use of.
There’s barely any actual 4K content you can consume.
Honestly a little surprised the IMAX guys didn’t start churning out 4k+ content given that they’ve been in the business forever.
But I guess “IMAX in your living room” isn’t as sexy when the screen is 60" rather than 60’
IMAX is 4K or less content. Its edge is special projection that can look good and brighter on huge screens.
Only imax film prints are significantly better than anything else
You don't even need IMAX for 4K; ordinary 35mm film scans to a nice 4K video just fine. Films shot on the 65mm IMAX cameras would probably make good 8K content, but most of that was educational films, not what most people apparently want to watch all the time.
The digital IMAX projections were actually a step backwards in resolution.
Films shot on the 65mm IMAX cameras would probably make good 8K content, but most of that was educational films, not what most people apparently want to watch all the time.
Sure. But the cameras exist. You can use them for other stuff.
The Hateful Eight was filmed in 70mm, and while it wasn't Tarantino's best work, it certainly looked nice.
Regular 65mm is a gulf away from 15/65 (15-perf IMAX) though. It's much harder to shoot in the IMAX format.
IMAX film is equivalent to 12K. Their digital laser projectors are only 4K.
Films shot on the 65mm IMAX cameras would probably make good 8K content
So there’s still hope that they might release The Last Buffalo in 8k 3D sometime in the future? Got it. :)
They don’t want IMAX in your living room, they want IMAX in the IMAX theater, where you pay a premium for their service.
I've got a nice 4K mini-LED TV with a 4K Blu-ray player, and there's plenty of excellent 4K content, but it's a niche market because most people aren't using physical media for movies. 4K streaming is garbage compared to UHD Blu-ray.
Which makes sense because even 1080p streaming is garbage compared to blu-ray.
Preach. I got gifted a 4k bluray player and was absolutely blown away by how good it looks compared to streaming.
People really need to understand that a lot of what "smart" TVs do is upscale the "4K" signal to something actually resembling real 4K.
Like how some 4K torrents are 3 GB, and then a 1080p rip of the same movie is 20 GB.
It's a "worse" resolution, but it looks miles better, because the TV is upscaling real 1080p to 4K instead of taking existing shitty 4K and trying to make it look better without just juicing the resolution.
So we don't need 8K content for 8K TVs to be an incentive. We need real 4K media; then an 8K TV would show a real improvement.
Yeah, you're talking about bitrate. A lot of 4K content is encoded using more efficient codecs, but if it's sourced from the streaming services the bitrate is so abysmal that it's usually a toss-up between the 1080p and 4K streams. At least the 4K usually has HDR these days, which is appreciated.
Yeah. A 1080p Blu-ray clocks in at around 20 GB. A 4K Blu-ray is 60-80 GB.
If you're downloading something smaller, it's probably lower quality.
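To put numbers on the "3 GB 4K vs 20 GB 1080p" point above, here's a rough bits-per-pixel comparison, assuming a two-hour film at 24 fps and ignoring codec differences (illustrative only):

```python
def bits_per_pixel(gigabytes: float, width: int, height: int,
                   hours: float = 2, fps: int = 24) -> float:
    """Average bits available to describe each displayed pixel."""
    total_bits = gigabytes * 8e9
    total_pixels = width * height * fps * hours * 3600
    return total_bits / total_pixels

print(f'20 GB 1080p rip: {bits_per_pixel(20, 1920, 1080):.3f} bits/pixel')
print(f' 3 GB "4K" rip:  {bits_per_pixel(3, 3840, 2160):.3f} bits/pixel')
print(f'70 GB UHD disc:  {bits_per_pixel(70, 3840, 2160):.3f} bits/pixel')
# The 3 GB "4K" encode has an order of magnitude fewer bits per pixel than
# the 1080p rip, which is why it can look worse despite the higher resolution.
```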