I use my TV as basically just a dumb display panel. As long as it can display 4K/60 correctly, there's no bug a firmware update would fix that I'd care about. I don't even use the built-in audio.
I connected my other TV to my network once when my Nvidia Shield wasn’t working. That TV is still showing advertisements in the main menu for shows that were released 3 years ago.
I completely understand what you’re saying; in general, I tend to agree that if a firmware update is available, it’s best to install it. I keep the firmware up to date on all my networking equipment, and the first thing I do when I set up a new PC is install Windows updates (or apt-get update in Linux).
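(To be precise, on a Debian/Ubuntu-style system apt-get update only refreshes the package lists; actually installing the updates takes a second step, something like:

    # refresh package lists, then install the available updates
    sudo apt-get update && sudo apt-get upgrade

Same idea either way: the machine gets patched before it does anything else.)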
I have two TVs: one in the living room and one in the bedroom. During the brief time my bedroom TV was connected to my network, it immediately started serving me advertisements. I hate ads with a passion; when it comes to network security, blocking ads probably ranks even higher than privacy for me. When it came time to replace my living room TV, I first tried to repair it, but after spending too much on a replacement mainboard that didn't do shit, I just bought a new TV. There was no way in hell I was letting it connect to the internet and download advertisements.
I have an extensive Zigbee network for home automation, 10Gb fiber links between my servers and my home office, etc. My home is very much “connected.” TVs are just one of those things that I will never, ever, under any circumstances, allow to connect to anything other than a video cable. If I’m paying $1000 or more for a device, I’ll be damned if it’s going to show me advertisements.