
I'm confused - why aren't video codecs winner take all?

Who still uses patent-encumbered codecs, and why?

video decoding on a general-purpose cpu is difficult, so most devices that can play video include some sort of hardware video decoding chip. if you want your video to play well, you need to deliver it in a format that can be decoded by that chip, on all the devices that you want to serve.

so it takes a long time to transition to a new codec - new devices need to ship with support for it, and then you have to wait for the old devices to cycle out of use before you can fully drop support for the old codecs.
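(to make the negotiation concrete, here's a rough python sketch of how a streaming backend might pick a codec per device - the codec ranking and device capability sets are made up for illustration, not any particular service's logic:)

  # hypothetical per-device codec negotiation: prefer newer codecs,
  # fall back to H.264, which nearly every decoder chip can handle
  CODEC_PREFERENCE = ["av1", "hevc", "h264"]  # assumed ranking, newest first

  def pick_codec(device_supported: set[str]) -> str:
      """Return the best codec this device reports hardware decode for."""
      for codec in CODEC_PREFERENCE:
          if codec in device_supported:
              return codec
      return "h264"  # universal fallback

  # e.g. an older streaming box that never got AV1 hardware decode
  print(pick_codec({"h264", "hevc"}))  # -> "hevc"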


To this day no Apple TV boxes support hardware AV1 decode (which essentially means it’s not supported). Only the latest Roku Ultra devices support it. So obviously Netflix, for example, can’t switch everyone over to AV1 even if they want to.

These days, even phone-class CPUs can decode 4K video at playback rate, but they use a lot of power doing it. Not reasonable for battery-powered devices. For AC-powered devices, the problem might be heat dissipation, particularly for little streaming boxes with only passive cooling.
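(A rough way to see that cost yourself: decode a clip in software only and throw the frames away, then compare the wall-clock time to the clip's duration. This assumes ffmpeg is installed; the filename is just a placeholder.)

  import subprocess, time

  clip = "clip_4k.mp4"  # placeholder: any 4K test file you have around
  start = time.monotonic()
  # software decode only (ffmpeg's default), frames discarded via the null muxer
  subprocess.run(["ffmpeg", "-benchmark", "-i", clip, "-f", "null", "-"], check=True)
  print(f"decoded in {time.monotonic() - start:.1f}s of wall-clock time")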

Would it be possible to just ship video streaming devices with an FPGA that can be updated to support whatever hardware-accelerated codec is fashionable?

probably not at the prices that video streaming devices typically sell for.

I think the need for hardware decoding stinks, because it makes otherwise capable hardware obsolete once it can't decode newer codecs.

Hardware acceleration has been a thing since...forever. Video in general is a balancing act between storage, bandwidth, and quality. Video playback on computers is a balancing act between storage, bandwidth, power, and cost.

Video is naturally large. You've got all the pixels in a frame, tens of frames every second, and however many bits per pixel. All those frames need to be decoded and displayed in order and within fixed time constraints. If you drop frames or deliver them slowly no one is happy watching the video.
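(Back-of-the-envelope, just to make "naturally large" concrete - the numbers below assume 4K at 30 fps with 8-bit 4:2:0 chroma subsampling:)

  # raw, uncompressed bitrate of a 4K/30 stream
  width, height, fps = 3840, 2160, 30
  bits_per_pixel = 12   # 8-bit 4:2:0: 8 bits luma plus chroma averaged over the frame
  raw_bps = width * height * fps * bits_per_pixel
  print(raw_bps / 1e9)  # ~3.0 Gbit/s before any compression at all

Delivered 4K streams are typically on the order of tens of Mbit/s, so the codec is doing roughly a hundred-fold reduction, and that work has to happen somewhere.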

If you stick to video that can be effectively decoded on a general-purpose CPU with no acceleration, you're never going to keep up with the demands of actual users. It's also going to use a lot more power than an ASIC that is purpose-built to decode the video. If instead you use the beefiest CPU available to handle higher-quality video within some power envelope, your costs increase and the whole venture becomes untenable.


I hear you but I think the benefits fall mainly on streaming platforms rather than users.

Like I'm sure Netflix will lower their prices and Twitch will show fewer ads to pass the bandwidth savings on to us, right?


Would anyone pay Netflix any amount of money if they were using 1 Mbps MPEG-1 that's trivially decoded on CPUs?

The whole video/movie industry is rife with mature, hardware-implemented patents. The kind that survive challenges. They are also owned by deep pockets (not fly-by-night patent trolls). Fortunately, the industry is mature enough that some of the older patents are aging out.

The image processing industry is similar, but not as mature. I hated dealing with patents, when I was writing image processing stuff.


For whatever reason, the file sharing community seems to strongly prefer H.265 to AV1. I am assuming that either the compression at a preferred quality, or the quality at preferred bitrates, is marginally better than AV1's, and that people who don't care about copyright also don't care about patents.

I assume "file sharing community" is the euphemism for "movie pirating community", but I apologize if I made the wrong assumption.

If that's a correct guess -- I think the biggest reason is actually hardware support. When you have pirated movies, where are you going to play them? On a TV. Your TV or TV box very likely supports H.265, but very few have AV1 support.

Then the choice is apparent.


One can very well argue that 'movie pirating community' is more properly the dysphemism for 'file sharing community'. :-)

What is odd is that the power-seeders, the ones who actually re-encode, don't do both. You see H264 and H265 released alongside each other. I'm surprised it doesn't go H265/AV1 at this point.

You would dilute the seeding pool, which is already diluted enough.

What I wonder is "Why still H264?" I guess it's because some people don't buy new video cards every 6 years and don't have H265 on their hardware.

From a quick skim of hardware support on Wikipedia, it looks like encoding support for H.265 showed up in NVIDIA, AMD, et al. around 2015, whereas AV1 support didn't arrive until 2022.

So, the apparent preference could simply be 5+ years more time to do hardware-assisted transcoding.
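(If you're curious what your own box exposes, you can ask ffmpeg for its encoder list and pull out the hardware-backed H.265/AV1 entries. Caveat: this only shows what the build was compiled with, not whether your GPU actually has the silicon.)

  import subprocess

  # list the encoders this ffmpeg build knows about, keep the hardware-backed
  # H.265/AV1 ones (NVENC, Quick Sync, AMF, VAAPI)
  out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                       capture_output=True, text=True, check=True).stdout
  hw = [line.split()[1] for line in out.splitlines()
        if ("hevc" in line or "av1" in line)
        and any(tag in line for tag in ("nvenc", "qsv", "amf", "vaapi"))]
  print(hw)  # e.g. ['hevc_nvenc', 'av1_nvenc', ...] depending on the build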


Pirates are generally slow to transition formats, but AV1 is also not better than H.265 (in practice) for high-bitrate encodes.

Scene rules say to start with --crf 17 at 1080p, which is a pretty low CRF (i.e. it results in high bitrates): https://scenerules.org/html/2020_X265.html
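(For illustration only - roughly what a CRF 17 x265 encode looks like through ffmpeg's libx265 wrapper; the filenames and preset are placeholders, and the actual scene rules specify far more than just the CRF:)

  import subprocess

  # CRF 17 = high fidelity / high bitrate; lower CRF spends more bits
  subprocess.run([
      "ffmpeg", "-i", "source.mkv",
      "-c:v", "libx265", "-crf", "17", "-preset", "slow",
      "-c:a", "copy",   # pass the audio through untouched
      "encode.x265.mkv",
  ], check=True)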

AV1 would most likely result in slower encodes that look worse.


Timing. Patent-encumbered codecs get a foothold through physical media and broadcast first. Then hw manufacturers license them. Then everyone is forced to license them. Free codecs have a longer path to market, as they need to avoid the patents and get hw and sw support.

Backwards compatibility. If you host a lot of compressed video content, you probably didn't store the uncompressed versions, so any re-encode to a new codec is a further loss of fidelity. Even if you were willing to take that gamble, you have to wait until all your users are on a modern enough browser to use the new codec. Frankly, the winner that takes all is H.264, because it's already everywhere.

AV1 is still worse in practice than H.265 for high-fidelity (high bitrate) encoding. It's being improved, but even at high bitrates it has a tendency to blur.


