This is essentially what I've done. The TV has never been connected to the Internet and my Apple TV drives everything. I would still have paid 2x to support a good dumb TV project and would love to do so in the future.
I made the mistake of connecting my Vizio to the internet after owning it for years.
It was great for two days, and then it downloaded an update that absolutely wrecked the interface. What was smooth, snappy, and good enough for me now moves at a snail's pace, and the TV is practically unusable even after a factory reset.
Opening the menu takes 2.5 seconds from button push to response on a good day, for no reason other than that Vizio must have decided it was time for me to buy a new TV.
I used to like their brand. Now I'll never buy another one again.
The problem with smart TVs is that you can’t easily disable the offending software if you want to. The software is tightly coupled to the hardware. In some cases it will aggressively search for opportunities to spy on you.
Worst case, Apple jumps the shark… you can just unplug it. You at least get to keep the display.
The Apple TV’s hardware is wildly more powerful than that bundled in any smart TV. The current ATV 4K is running on the last gen flagship Apple SoC with a big passive heatsink attached while smart TVs use hardware comparable to that of a low-to-midrange Android phone from 2012-2014. Even the first gen ATV 4K from 2017 is several times more powerful than current smart TVs.
That difference in power is felt quite a lot in the user experience.
I am not sure if "powerful" matters in this context though. (That is, I expect the chipsets built into TVs to be plenty powerful for their intended purpose.)
Have you tried the average "smart" TV you find at an Airbnb? I have, and let me tell you, it does matter.
We were staying at one just a few days ago that had a cheap Samsung TV. The UI latency was so horribly laggy that simply clicking an arrow on the remote to navigate to the next menu could take up to 10 seconds to finally register on screen. It was also variable: some button presses took only 1-2 seconds to respond, others took 10, and if you pressed more than once you'd end up with a whole bunch of your delayed presses registering at once, landing you on a menu option you didn't want.
Sad to say, but the state of the art in these Android menu systems is horrible latency, most likely because the UI devs are building in new JavaScript features that run horribly slowly on older ARM processors, and they just don't give any F's about the actual user experience or testing...
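For what it's worth, that pile-up of delayed presses is the classic symptom of a blocked UI thread: the key events sit in the queue and then all fire at once when the thread finally frees up. Here's a minimal sketch of one way an app could cope, assuming a browser-style JS runtime (which a lot of TV UIs use); the 300 ms cutoff and the moveFocus() handler are made up for illustration:

    // Sketch: drop navigation key presses that sat in the event queue too long,
    // and log press-to-paint latency. Assumes a DOM environment (document,
    // performance, requestAnimationFrame). STALE_MS is an arbitrary cutoff.
    const STALE_MS = 300;

    // Hypothetical menu-navigation handler, stubbed so the sketch is self-contained.
    function moveFocus(key: string): void {
      /* move the selection up/down/left/right based on the key */
    }

    document.addEventListener("keydown", (e: KeyboardEvent) => {
      // e.timeStamp is when the press was enqueued; performance.now() is when
      // the (possibly blocked) UI thread finally got around to handling it.
      const queuedFor = performance.now() - e.timeStamp;
      if (queuedFor > STALE_MS) {
        // The user has almost certainly pressed again or given up by now, so
        // acting on this stale press would land them somewhere they didn't intend.
        return;
      }

      moveFocus(e.key);

      // requestAnimationFrame runs just before the next repaint, so this
      // approximates how long the press took to actually show up on screen.
      requestAnimationFrame(() => {
        console.log(`input-to-render: ${(performance.now() - e.timeStamp).toFixed(1)} ms`);
      });
    });

Dropping stale input is a judgment call (you might swallow a deliberate press), but it at least avoids the "five queued presses fire at once" failure described above.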
I love your reference to Airbnb. That is precisely when I get to experience what I assume the rest of the world is used to. Firing up a random TV at an Airbnb is simply painful. You're 100% right that CPU power matters. The delay on every menu is painful. The UX is just atrocious compared to my Apple TV. I cringe that people use this for their normal viewing.
In many cases the SoCs used in TVs are so underpowered that they can’t render menu screens without frame drops, or if they can they lose that ability after a software update or two because there’s so little margin.
Just how “underpowered” are we talking here? I’m having a really hard time imagining a chip which cannot render a menu at non-fractional fps. An 8088 can do this…
Like I mentioned in an earlier comment, their power is roughly on par with a 2012-2014 low-to-midrange smartphone, which is somewhere between 5% and 15% as powerful as a modern midrange-and-up smartphone.
That would be fine if they were rendering to a 720p screen or had much simpler menus like those found on most A/V receivers, but they’re usually running recent-ish Android or something similar, with fancy graphics and animations all over the place designed for newer devices, and that makes the hardware choke at the 4K resolution the majority of TVs now ship with. Exacerbating this are the terribly written, lowest-bidder smart TV apps.
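To put rough numbers on that (back-of-the-envelope arithmetic only, not specific to any particular chip): a composited UI at 4K pushes nine times the pixels per frame that 720p does, so a GPU sized for the latter has almost no headroom left once animated menus and transparency get layered on top.

    // Back-of-the-envelope pixel throughput at 60 fps (illustrative only).
    const fps = 60;
    const px720p = 1280 * 720;   // ~0.92 Mpx per frame
    const px4k = 3840 * 2160;    // ~8.29 Mpx per frame

    console.log(px4k / px720p);          // 9   (pixels per frame, 4K vs 720p)
    console.log((px720p * fps) / 1e6);   // ~55  Mpx/s to fill a 720p UI
    console.log((px4k * fps) / 1e6);     // ~498 Mpx/s to fill a 4K UI, per composited layer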
TV manufacturers will never ship an OS more suitable for the hardware though, because they’re concerned that it would make the TV look less modern than competing TVs. They also won’t ship better hardware, because that’d cut $5 per unit off their margins. As such, it’s best to just write off the integrated “smarts” and plug in a streaming box that’s not so anemic.
The Netflix app ran fine at first but had outgrown my 2018 smart TV's IQ by 2022. It freezes for a good moment, then crashes the TV's OS. Hulu as well. A factory reset was a waste of time and fixed nothing. But TCL made a few bucks more going with the cheaper CPU, and accelerated the obsolescence.
One more reason to get a dumb display...
But on the other hand, TV SoCs need to process the video signal at 4K 120Hz 4:4:4 without dropping a single frame, although most if not all of those tasks are most likely done by an ASIC embedded in the SoC. Would a modern but very cut-down GPU be able to do this, and at a low enough power draw?
There are hardware decoders on the chip that handle your video stream. The menu system is just running on a terribly outdated Android SoC... the latency comes entirely from the speed at which it can render the user interface and has nothing to do with how fast your 4K 120/240Hz panel can draw a frame.
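If you want to see that split for yourself, a sketch along these lines works from a web app on platforms that expose the Media Capabilities API (an assumption; not every TV runtime does). The powerEfficient flag is generally a hint that a fixed-function decode block is doing the work, as opposed to the general-purpose CPU/GPU that has to draw the menus:

    // Sketch: ask the platform whether a 4K HEVC stream would decode efficiently
    // (i.e. almost certainly in dedicated hardware). Assumes the Media Capabilities
    // API is available; the codec string and bitrate are illustrative values.
    async function check4kDecode(): Promise<void> {
      const info = await navigator.mediaCapabilities.decodingInfo({
        type: "file",
        video: {
          contentType: 'video/mp4; codecs="hev1.1.6.L153.B0"', // HEVC Main, Level 5.1
          width: 3840,
          height: 2160,
          bitrate: 25_000_000, // 25 Mbit/s, in the ballpark of 4K streaming
          framerate: 60,
        },
      });

      // `smooth` only says playback should keep up; `powerEfficient` is the hint
      // that a fixed-function decoder (not the CPU) is doing the heavy lifting.
      console.log(info.supported, info.smooth, info.powerEfficient);
    }

    check4kDecode().catch(console.error);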
The gripe with smart TVs comes from the fact that they show ads and slow down over time.
With an Apple TV, you can nip that in the bud. Yes, the Apple TV has its own quirks, but it’s nowhere near as hostile as the built-in “functionality” these TVs try to provide.
Not to mention that streaming services have every incentive to keep improving their Apple TV apps, compared to the TVs' built-in apps.
A TV with Samsung MySmart™ HomeOS Android UltraCrap Edition is not the same as an Apple TV, if you care about things like, I don't know... consistent framerate? No random crashes? Bearable UI latency?
Similar to how Apple CarPlay is not the same as car manufacturers' sorry-ass homegrown infotainment trash software.
From a security or absolutist point of view, yeah. From a customer point of view, though, different companies have different reputations and market positions. We might all disagree about the exact level of faith we have in them, but Apple and Vizio or whoever seem to have different reputations, for whatever that is worth.
Depends. A Smart TV is a category of TVs being sold. If you go into Best Buy asking for a Smart TV, you're not going to get a TV + streaming box/stick, just a "Smart TV" (although there is some obvious overlap here, since TV makers have partnered with Fire TV and Roku).
But when it comes to general conversation (and this context), yes, it's the same. Unless you're actually running a niche setup with local streaming, where the Apple TV box is just a nice interface you keep offline - I don't know how well that works. For everyone else, it just means avoiding overlapping TV services, and relying on Apple or Google or Amazon falls into the same traps.